An Empirical Study of Fuzzy Approach with Artificial Neural Network Models


An Empirical Study of Fuzzy Approach with Artificial Neural Network Models

Memmedaga Memmedli and Ozer Ozdemir

Abstract: Time series forecasting based on a fuzzy approach using artificial neural networks is a significant topic in many scientific areas. Artificial neural network models are well suited to financial research because of their ability to model nonlinear problems. For these reasons, in this paper we present a forecasting study of the weekly closing prices of the Turkish Lira (TL) to Euro exchange rate between 2005 and 2009, a series with an important effect in economic and industrial areas. We applied three networks, the multilayer perceptron (MLP), the radial basis function (RBF) neural network and the generalized regression neural network (GRNN), to improve the forecasting of fuzzy time series with different degrees of membership, using the MSE performance measure. Empirical results show that the MLP outperforms the others in forecasting neural-network-based fuzzy time series.

Keywords: Artificial neural network models, Exchange rate, Forecasting, Fuzzy approach, Time series.

I. INTRODUCTION

Fuzzy set theory was introduced by Zadeh [1], and the fuzzy time series approach was introduced by Song and Chissom [2],[3],[4]. Since then, many fuzzy time series models have been proposed, such as first-order models [5],[6],[7], high-order models [8],[9], hybrid models [10],[11], seasonal models [12],[13], bivariate models [14],[15] and multivariate models [16],[17],[18]. These fuzzy time series models have been applied to various problem domains, such as enrollment [19],[22], temperature [23],[24] and stock indices [25],[26]. Neural networks have become popular in recent years because of their ability to solve nonlinear problems, so in this study a neural-network-based fuzzy time series model is applied to such a problem: we compare different neural network architectures for forecasting the exchange rate of TL to Euro.
The forecasting results of all the neural network models, including the multilayer perceptron, radial basis function and generalized regression neural networks, are compared with each other, and the multilayer perceptron turns out to be the best of these artificial neural network models for forecasting fuzzy time series.

Prof. Dr. M. Memmedli is with the Statistics Department, Faculty of Science, Anadolu University, Eskisehir, TURKEY (e-mail: mmammadov@anadolu.edu.tr). Lecturer O. Ozdemir is with the Department of Statistics, Faculty of Science, Anadolu University, Eskisehir, TURKEY (corresponding author; e-mail: ozerozdemir@anadolu.edu.tr).

Issue 1, Volume 6, 2012

The remainder of this paper is organized as follows. Section 2 reviews fuzzy time series. Section 3 briefly explains the most commonly used artificial neural networks. Section 4 presents the empirical analysis. Finally, Section 5 concludes the paper.

II. FUZZY TIME SERIES

Song and Chissom first proposed the definitions of fuzzy time series [2],[3]. Some general definitions are given as follows. Let U be the universe of discourse, where U = {u_1, u_2, ..., u_n}. A fuzzy set A_i of U is defined by

A_i = f_{A_i}(u_1)/u_1 + f_{A_i}(u_2)/u_2 + ... + f_{A_i}(u_n)/u_n,

where f_{A_i} is the membership function of the fuzzy set A_i, f_{A_i}: U → [0, 1], u_j is an element of the fuzzy set A_i, and f_{A_i}(u_j) ∈ [0, 1] is the degree of belongingness of u_j to A_i, for 1 ≤ j ≤ n.

Definition 1: Let Y(t) (t = ..., 0, 1, 2, ...), a subset of the real numbers, be the universe of discourse on which fuzzy sets f_i(t) are defined. If F(t) consists of f_i(t) (i = 1, 2, ...), then F(t) is called a fuzzy time series on Y(t) (t = ..., 0, 1, 2, ...) [2].

Definition 2: If there exists a fuzzy relationship R(t−1, t) such that F(t) = F(t−1) ∘ R(t−1, t), where ∘ is an operator, then F(t) is said to be caused by F(t−1). The relationship between F(t−1) and F(t) can be denoted by F(t−1) → F(t).

Definition 3: Suppose F(t−1) = A_i and F(t) = A_j; a fuzzy logical relationship is then defined as A_i → A_j, where A_i is called the left-hand side of the fuzzy logical relationship and A_j the right-hand side. Repeated fuzzy logical relationships are removed [3],[43].

Definition 4: Fuzzy logical relationships can be further grouped into fuzzy logical relationship groups according to the left-hand sides of the relationships. For example, given fuzzy logical relationships with the same left-hand side A_i:

A_i → A_j1, A_i → A_j2, ...

these can be grouped into the fuzzy logical relationship group

A_i → A_j1, A_j2, ...

Definition 5: Suppose F(t) is caused by F(t−1) only, and F(t) = F(t−1) ∘ R(t−1, t). If, for any t, R(t−1, t) is independent of t, then F(t) is called a time-invariant fuzzy time series; otherwise it is a time-variant fuzzy time series.

Song and Chissom applied both time-invariant and time-variant models to forecast enrollment at the University of Alabama [3],[4]. The time-invariant model includes the following steps: (1) define the universe of discourse and the intervals, (2) partition the intervals, (3) define the fuzzy sets, (4) fuzzify the data, (5) establish the fuzzy relationships, (6) forecast, and (7) defuzzify the forecasting results.

The time-variant model includes the following steps:
(1) Define the universe of discourse and the intervals (as in step (1) of the time-invariant model).
(2) Partition the intervals (as in step (2) of the time-invariant model).
(3) Define the fuzzy sets (as in step (3) of the time-invariant model).
(4) Fuzzify the data (as in step (4) of the time-invariant model).
(5) Establish the fuzzy relationships and forecast.
(6) Defuzzify the forecasting results.

In both models, note that establishing the fuzzy relationships R(t−1, t) and defuzzifying are the critical steps for forecasting [3],[4],[5],[8],[18],[44]. We used the following steps in the problem solution:

Step 1. Defining and partitioning the universe of discourse.
Step 2. Fuzzification.
Step 3. Neural network training.
Step 4. Neural network forecasting.
Step 5. Defuzzification.
Step 6.
Performance evaluation.

III. ARTIFICIAL NEURAL NETWORKS

In forecasting, artificial neural networks are mathematical models that imitate biological neural networks. Artificial neural networks consist of several elements, and determining the elements that affect forecasting performance should be done carefully [27],[28]. One of these elements is the network architecture. There are, however, no general rules for determining the best architecture, so many architectures should be tried to obtain reliable results. There are various types of artificial neural networks; below we give an overview of the three networks that performed best on the related data in the recent study [20],[21].

Multilayer perceptron (MLP): MLP networks are constructed of multiple layers of computational units. Each neuron in one layer is directly connected to the neurons of the subsequent layer. In many applications, the most frequently used activation function is the sigmoid function. Multilayer networks use a variety of learning techniques, the most popular being backpropagation. The term "artificial neural network" usually refers to the multilayer perceptron, a popular estimator for constructing nonlinear models of data. An MLP distinguishes itself by the presence of one or more hidden layers, whose computation nodes are correspondingly called hidden neurons or hidden units. For example, a three-layer MLP is shown in Fig. 1. The function of the hidden neurons is to intervene between the external input and the network output in some useful manner [29].

The MLP has been applied successfully to difficult problems through supervised training with the error backpropagation algorithm. This learning algorithm consists of two passes through the different layers of the network: a forward pass and a backward pass. In the forward pass, an input vector is applied to the input nodes of the network and its effect propagates through the network layer by layer, until a set of outputs is produced as the actual response of the network. During the forward pass, the synaptic weights of the network are not changed, while during the backward pass the synaptic weights are altered in accordance with an error-correction rule. The actual response of the output layer is subtracted from the desired response to produce an error signal, which is then propagated backward through the network [21].
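The two passes just described can be made concrete with a small NumPy sketch: a one-hidden-layer MLP with sigmoid activations trained by the backpropagation delta rule. The toy task (learning y = x^2), the layer sizes and the learning rate are illustrative assumptions, not the setup used in the paper (where the networks were fitted with STATISTICA).

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# toy supervised task (assumed for illustration): learn y = x^2 on [0, 1]
X = rng.uniform(0.0, 1.0, (64, 1))
y = X ** 2

W1 = rng.normal(0.0, 0.5, (1, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)   # hidden -> output
eta = 0.5                                             # learning rate

for _ in range(5000):
    # forward pass: activations flow input -> hidden -> output
    a1 = sigmoid(X @ W1 + b1)
    a2 = sigmoid(a1 @ W2 + b2)

    # backward pass: the error signal propagates back through the layers
    d2 = a2 * (1 - a2) * (y - a2)        # delta for the output units
    d1 = a1 * (1 - a1) * (d2 @ W2.T)     # delta for the hidden units

    # error-correction (delta) rule, averaged over the batch
    W2 += eta * a1.T @ d2 / len(X); b2 += eta * d2.mean(axis=0)
    W1 += eta * X.T @ d1 / len(X);  b1 += eta * d1.mean(axis=0)

mse = float(((a2 - y) ** 2).mean())
print(round(mse, 4))
```

A momentum term, weighting the previous weight increment by a coefficient α, could be added to each update to speed up training, as discussed below.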

Training the MLP network with the backpropagation rule guarantees that a local minimum of the error surface is found, though this is not necessarily the global one.

Figure 1. Architecture of a three-layer MLP

During processing in an MLP network, activations are propagated from the input units through the hidden units to the output units. At each unit, the weighted input activations a_i w_ip are summed and a bias parameter θ_p is added:

net_p = Σ_i a_i w_ip + θ_p. (1)

The resulting network input net_p is then passed through a sigmoid function (the logistic function) in order to restrict the value range of the resulting activation a_p to the interval [0, 1]:

a_p = 1 / (1 + e^(−net_p)). (2)

The network learns by adapting the weights of the connections between units until the correct output is produced. MLP networks use a variety of learning techniques, the most popular being backpropagation [21], which performs a gradient descent search on the error surface. The weight update Δw_ip, i.e. the difference between the old and the new value of the weight w_ip, is defined as

Δw_ip = η a_i δ_p, (3)

where

δ_p = a_p (1 − a_p)(t_p − a_p), if p is an output unit,
δ_p = a_p (1 − a_p) Σ_q δ_q w_pq, if p is a hidden unit, (4)

and t_p is the target output which the network must learn. In order to speed up the training process, a momentum term is often introduced into the update formula [42]:

Δw_ip(t+1) = η a_i δ_p + α Δw_ip(t). (5)

Radial basis function (RBF): This network type consists of an input layer, a hidden layer of radial units and an output layer of linear units. Typically, the radial layer has exponential activation functions and the output layer linear activation functions. The RBF network is an alternative to the more widely used MLP network and consumes less computer time for network training. An RBF network consists of three layers: an input layer, a hidden layer and an output layer. The nodes within each layer are fully connected to the previous layer. The input variables are each assigned to the nodes in the input layer and pass directly to the hidden layer without weights; the transfer functions of the hidden nodes are RBFs [30]. RBF networks are used for function approximation, pattern recognition and time series prediction problems.

Such networks have the universal approximation property [31], are well covered by the theory of interpolation [32] and arise naturally as regularized solutions of ill-posed problems [33]. Their simple structure enables learning in stages and reduces training time, which has led to the application of such networks to many practical problems [34]. RBF networks have traditionally been associated with radial functions in a three-layer network (see Fig. 2) consisting of an input layer, a hidden layer of radial units and an output layer of linear units [35],[36].

Figure 2. Architecture of an RBF network

The radial basis function determines the output from the input variable x and its distance from the center µ. As the input variable approaches the center, the output becomes larger. The Gaussian function is often used as the radial basis function; it may be written as

f(x) = exp(−(x − µ)² / (2σ²)), (6)

where x is the input, µ the center, and σ the width of the receptive field. The output of the RBF network is expressed as a linear combination of the radial basis functions:

y = Σ_{i=1}^{n} w_i φ_i, (7)

where w_i is a connection weight and φ_i the output of a basis function [45].

Generalized regression neural network (GRNN): This type of network is a kind of Bayesian network. A GRNN is based on nonlinear regression theory and is often used as a statistical tool for function approximation. The GRNN is a variant of the RBF network, but unlike the standard RBF, its weights can be calculated analytically [37]. The GRNN was devised by Specht [38], casting a statistical method of function approximation into a neural network form. Like the MLP, the GRNN is able to approximate any functional relationship between inputs and outputs [39]. Structurally, the GRNN resembles the MLP; however, unlike the MLP, the GRNN does not require an estimate of the number of hidden units to be made before training can take place. Furthermore, the GRNN differs from the classical MLP in that every weight is replaced by a distribution of weights, which minimizes the chance of ending up in a local minimum; therefore, no test and verification sets are required [40]. The GRNN has exactly four layers: input, a layer of radial centers, a layer of regression units, and output. The layer of radial centers must be trained by a clustering algorithm; one can think of the GRNN as a normalized RBF network in which there is a hidden unit centered at every training case. Figure 3 shows the general structure of the GRNN [41].

Figure 3. General structure of the GRNN

IV. EMPIRICAL ANALYSIS

This study uses the weekly closing prices of the exchange rate of TL to Euro between 2005 and 2009 as the forecasting target. The empirical analysis prepares a neural-network-based fuzzy time series model to improve forecasting performance, and reports the forecasting performance year by year according to the mean square error (MSE) performance measure. After obtaining all results, we compared them for all artificial neural networks using the MSE from 2005 to 2009 and calculated overall results. The steps of the problem solution are as follows.

Step 1. Defining and partitioning the universe of discourse. The universe of discourse for the observations, U = [starting, ending], is defined. After the length of the intervals, l, is determined, U can be partitioned into equal-length intervals u_b (b = 1, 2, ...) with corresponding midpoints m_b:

u_b = [starting + (b − 1) l, starting + b l],
m_b = starting + (b − 1/2) l.

Table 1 shows the start and end points and the interval lengths for the exchange rate of TL to Euro for each year.

Table 1. Start and end points of the universe of discourse and interval lengths for all years

For example, for the year 2005, we have U = [1.58, 1.86]

and the length of the interval is as given in Table 1.

Step 2. Fuzzification. Each linguistic observation A_t can be fuzzified into a set of degrees of membership:

A_t = µ_1/u_1 + µ_2/u_2 + ... + µ_b/u_b.

Step 3. Neural network training. A large data set is necessary for training a neural network, and we used the weekly closing prices of the exchange rate of TL to Euro for the years 2005 to 2009. Many studies have used a convenient ratio for separating in-samples from out-of-samples, ranging from 70%:30% to 90%:10%. Hence, we chose the data from January to October for training (in-sample) and November and December for forecasting (out-of-sample), so the ratio is about 83%:17%. F(t−1) = A_i is taken as input and F(t) = A_j is taken as output, since the fuzzy logical relationship is defined as A_i → A_j.

Step 4. Neural network forecasting. All training data are used to train the neural network, so we can forecast all the degrees of membership for the out-of-sample data.

Step 5. Defuzzification. We used weighted averages to defuzzify the degrees of membership:

forecast(t) = Σ_i µ_i(t) m_i / Σ_i µ_i(t),

where µ_i(t) denotes the forecasted degrees of membership and m_i the corresponding midpoints.

Step 6. Performance evaluation. We chose the MSE as the performance measure. It is calculated as

MSE = (1/n) Σ_{t=1}^{n} (forecast(t) − observation(t))²,

where n is the number of observations, covering both in-sample and out-of-sample observations.

We trained and forecasted the data for all years with the MLP, RBF and GRNN. The best results were obtained with the MLP; its results by year and its overall performance are discussed in Section 5.

V. CONCLUSION

The fuzzy approach and artificial neural networks have become effective tools for forecasting fuzzy time series, and their combination helps to improve forecasting performance, especially in handling nonlinear systems. Hence, in this study we aimed to handle a nonlinear problem by applying a neural-network-based fuzzy time series model. Differing from previous studies, we used various degrees of membership in establishing the fuzzy relationships and we applied different neural network models to improve forecasting performance. To compare these models we used a data set of the exchange rate of Turkish Lira (TL) to Euro for the years 2005 to 2009. Empirical results show that the multilayer perceptron is the best of the most commonly used artificial neural network models for forecasting fuzzy time series.

Time series forecasting using artificial neural networks has been an important issue in many scientific studies in recent years, as artificial neural networks are well suited to solving nonlinear problems. In this paper we made a forecasting study for the weekly closing prices of the exchange rate of TL to Euro between 2005 and 2009, which has an important effect in economic and industrial areas. We applied three networks, the MLP, RBF and GRNN, to improve the forecasting of fuzzy time series with different degrees of membership, using the MSE performance measure.

First, the exchange rate of TL to Euro for the years 2005 to 2009, which is a large data set for training a neural network, was used as the forecasting target and separated into in-sample (estimation) and out-of-sample (forecasting) sets; the ratio was 83%:17%. Second, we used the three networks for training and forecasting. Tables 2, 3 and 4 give the MSE results of the various artificial neural network models after forecasting this financial time series. In Table 2, results are shown by year for the best GRNN models among all GRNN models.

Table 2. MSE values by year and overall for the best GRNN models

In Table 3, results are shown by year for the best MLP models among all MLP models.

Table 3. MSE values by year and overall for the best MLP models
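For concreteness, the six-step procedure behind these MSE figures can be sketched end to end. The sketch below runs on synthetic data standing in for the weekly TL/Euro closing prices, uses a hypothetical triangular membership function with seven intervals, and replaces the trained neural network (Steps 3 and 4) with a naive persistence rule, so it illustrates only the mechanics of partitioning, fuzzification, defuzzification and MSE, not the paper's actual models.

```python
import numpy as np

rng = np.random.default_rng(1)
prices = 1.58 + 0.28 * rng.random(52)       # hypothetical weekly closes

# Step 1: define and partition the universe of discourse U = [start, end]
start, end, k = 1.58, 1.86, 7               # k equal-length intervals (assumed)
l = (end - start) / k                       # interval length
mids = start + (np.arange(k) + 0.5) * l     # midpoints m_b

def fuzzify(x):
    # Step 2: triangular degrees of membership mu_b of observation x
    mu = np.maximum(0.0, 1.0 - np.abs(x - mids) / l)
    return mu / mu.sum()

M = np.array([fuzzify(x) for x in prices])  # one membership vector per week

# Steps 3-4 stand-in: predicted memberships for week t are week t-1's
pred = M[:-1]

# Step 5: weighted-average defuzzification over the interval midpoints
forecast = (pred * mids).sum(axis=1) / pred.sum(axis=1)

# Step 6: MSE performance measure against the observed prices
mse = float(((forecast - prices[1:]) ** 2).mean())
print(round(mse, 5))
```

Replacing the persistence rule with a model that maps last week's membership vector to next week's reproduces the structure of the neural-network-based approach.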

According to the first two tables, we can say that, in general, the MLP models give better results than the GRNN models across all years and overall. In Table 4, results are shown by year for the best RBF models among all RBF models.

Table 4. MSE values by year and overall for the best RBF models

From Table 4 and the overall results in Table 5, we can conclude that MLP models without the fuzzy approach outperform the other model types on this large exchange-rate data set, both by yearly comparison and by overall results.

Table 5. Overall MSE values for all neural network models (GRNN, MLP, RBF)

Next, using neural-network-based fuzzy time series models, we obtain Tables 6, 7 and 8, which give the MSE results of the proposed method, i.e. the fuzzy approach combined with the various artificial neural network models, for this time series. In Table 6, results are shown by year for the best GRNN-based fuzzy time series models among all GRNN-based fuzzy time series models, which were generated with the STATISTICA software.

Table 6. MSE values by year and overall for the best GRNN-based fuzzy time series models

In Table 7, results are shown by year for the best MLP-based fuzzy time series models among all MLP-based fuzzy time series models, which were generated with the STATISTICA software. According to Tables 6 and 7, we can say that, in general, the MLP-based fuzzy time series models give better results than the GRNN-based fuzzy time series models across all years and overall. In Table 8, results are shown by year for the best RBF-based fuzzy time series models among all RBF-based fuzzy time series models.

Table 7. MSE values by year and overall for the best MLP-based fuzzy time series models

Table 8. MSE values by year and overall for the best RBF-based fuzzy time series models

From Table 8 and the overall results in Table 9, we can conclude that MLP-based fuzzy time series models outperform the other model types using the fuzzy approach on this large data set of weekly closing prices of the exchange rate of TL to Euro between 2005 and 2009, both by yearly comparison and by overall results.

Table 9. Overall MSE values for all neural-network-based fuzzy time series models (F-GRNN, F-MLP, F-RBF)

All empirical results show that the MLP is the best of the most commonly used artificial neural network models for forecasting fuzzy time series. Moreover, the MLP-based fuzzy time series models outperform the artificial neural network models without the fuzzy approach, as shown in Table 10, which summarizes the overall results of all model types.

Table 10. Overall MSE values for both the neural-network-based fuzzy time series models and the artificial neural network models

In terms of the yearly comparison, the MLP model

performed better than the other models; the networks were built with the STATISTICA 6.0 software. The overall performance shows the same result: the average performance over all years of the MLP-based neural-network fuzzy time series model is better than that of the others. MLP neural networks without the fuzzy approach come second, and the other neural-network-based fuzzy time series models come last in the forecasting performance evaluation on the weekly closing prices of the exchange rate of TL to Euro between 2005 and 2009. Therefore, the fuzzy approach combined with a neural network is better only when the MLP architecture is used; without the MLP, plain artificial neural network models can give better results than the other neural-network-based fuzzy time series models such as the GRNN or RBF.

REFERENCES
[1] L.A. Zadeh, "Fuzzy Sets," Information and Control, 8, 1965.
[2] Q. Song and B.S. Chissom, "Fuzzy Time Series and its Models," Fuzzy Sets and Systems, 54, 1993.
[3] Q. Song and B.S. Chissom, "Forecasting Enrollments with Fuzzy Time Series, Part 1," Fuzzy Sets and Systems, 54, 1993.
[4] Q. Song and B.S. Chissom, "Forecasting Enrollments with Fuzzy Time Series, Part 2," Fuzzy Sets and Systems, 62, 1994.
[5] S.-M. Chen, "Forecasting Enrollments Based on Fuzzy Time Series," Fuzzy Sets and Systems, 81(3), 1996.
[6] S.-M. Chen and J.R. Hwang, "Temperature Prediction Using Fuzzy Time Series," IEEE Transactions on Systems, Man, and Cybernetics, Part B, 30(2), 2000.
[7] J. Sullivan and W.H. Woodall, "A Comparison of Fuzzy Forecasting and Markov Modeling," Fuzzy Sets and Systems, 64(3), 1994.
[8] S.-M. Chen, "Forecasting Enrollments Based on High-Order Fuzzy Time Series," Cybernetics and Systems, 33(1), 2002.
[9] K. Huarng and H.-K. Yu, "An Information Criterion for Fuzzy Time Series to Forecast the Stock Exchange Index on the TAIEX," in Ninth International Conference on Fuzzy Theory and Technology, 2003.
[10] K. Huarng and H.-K. Yu, "An N-th Order Heuristic Fuzzy Time Series Model for TAIEX Forecasting," International Journal of Fuzzy Systems, 5(4), 2003.
[11] F.-M. Tseng and G.-H. Tzeng, "A Fuzzy Seasonal ARIMA Model for Forecasting," Fuzzy Sets and Systems, 126(3), 2002.
[12] P.T. Chang, "Fuzzy Seasonality Forecasting," Fuzzy Sets and Systems, 90(1), 1997.
[13] Q. Song, "Seasonal Forecasting in Fuzzy Time Series," Fuzzy Sets and Systems, 107(2), 1999.
[14] Y.-Y. Hsu, S.-M. Tse and B. Wu, "A New Approach of Bivariate Fuzzy Time Series Analysis to the Forecasting of a Stock Index," International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 11(6), 2003.
[15] H.-K. Yu and K.-H. Huarng, "A Bivariate Fuzzy Time Series Model to Forecast the TAIEX," Expert Systems with Applications, 34(4), 2008.
[16] T.A. Jilani and S.M.A. Burney, "Multivariate Stochastic Fuzzy Forecasting Models," Expert Systems with Applications, 35(3), 2008.
[17] C.-H. Cheng, G.-W. Cheng and J.-W. Wang, "Multi-Attribute Fuzzy Time Series Method Based on Fuzzy Clustering," Expert Systems with Applications, 34(2), 2008.
[18] K. Huarng, "Heuristic Models of Fuzzy Time Series for Forecasting," Fuzzy Sets and Systems, 123(3), 2001.
[19] K. Huarng, "Effective Lengths of Intervals to Improve Forecasting in Fuzzy Time Series," Fuzzy Sets and Systems, 123(3), 2001.
[20] C.M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, Oxford, 1995.
[21] S. Haykin, Neural Networks: A Comprehensive Foundation, Prentice Hall, 1999.
[22] K. Huarng and H.-K. Yu, "Ratio-Based Lengths of Intervals to Improve Fuzzy Time Series Forecasting," IEEE Transactions on Systems, Man, and Cybernetics, Part B, 36(2), 2006.
[23] L.-W. Lee, L.-H. Wang and S.-M. Chen, "Temperature Prediction and TAIFEX Forecasting Based on Fuzzy Logical Relationships and Genetic Algorithms," Expert Systems with Applications, 33, 2007.
[24] L.-W. Lee, L.-H. Wang, S.-M. Chen and Y.-H. Leu, "Handling Forecasting Problems Based on Two-Factors High-Order Fuzzy Time Series," IEEE Transactions on Fuzzy Systems, 14(3), 2006.
[25] K. Huarng and H.-K. Yu, "A Dynamic Approach to Adjusting Lengths of Intervals in Fuzzy Time Series Forecasting," Intelligent Data Analysis, 8(1), 2004.
[26] K. Huarng and H.-K. Yu, "A Type 2 Fuzzy Time Series Model for Stock Index Forecasting," Physica A, 353, 2005.
[27] J.M. Zurada, Introduction to Artificial Neural Systems, West Publishing, St. Paul, 1992.
[28] C.H. Aladag, M.A. Basaran, E. Egrioglu, U. Yolcu and V.R. Uslu, "Forecasting in High Order Fuzzy Time Series by Using Neural Networks to Define Fuzzy Relations," Expert Systems with Applications, 36, 2009.
[29] E. Pinar, K. Paydas, G. Seckin, H. Akilli, B. Sahin, M. Cobaner, S. Kocaman and M.A. Akar, "Artificial neural network approaches for prediction of backwater through arched bridge constrictions," Advances in Engineering Software, 41, 2010.
[30] K. Ghorbanian and M. Gholamrezaei, "An artificial neural network approach to compressor performance prediction," Applied Energy, 86, 2009.
[31] J. Park and I.W. Sandberg, "Universal approximation using radial-basis-function networks," Neural Computation, 3(2), 1991.
[32] M.J.D. Powell, "Radial basis functions for multivariate interpolation: a review," in J.C. Mason and M.G. Cox (Eds.), Algorithms for the Approximation of Functions and Data, Clarendon, Oxford, UK, 1987.
[33] T. Poggio and F. Girosi, "Networks for approximation and learning," Proceedings of the IEEE, 78(9), 1990.
[34] H.B. Celikoglu, "Application of radial basis function and generalized regression neural networks in non-linear utility function specification for travel mode choice modelling," Mathematical and Computer Modelling, 44, 2006.
[35] J. Moody and C. Darken, "Fast learning in networks of locally tuned processing units," Neural Computation, 1, 1989.
[36] S. Renals and R. Rohwer, "Phoneme classification experiments using radial basis functions," in Proceedings of the International Joint Conference on Neural Networks, Washington, DC, 1989.
[37] A.H. Bagherieh, J.C. Hower, A.R. Bagherieh and E. Jorjani, "Studies of the relationship between petrography and grindability for Kentucky coals using artificial neural network," International Journal of Coal Geology, 73, 2008.
[38] D. Specht, "A general regression neural network," IEEE Transactions on Neural Networks, 2(6), 1991.
[39] P.D. Wasserman, Advanced Methods in Neural Computing, Van Nostrand Reinhold, New York, NY, USA, 1993.
[40] M.M. Mostafa, "A neuro-computational intelligence analysis of the global consumer software piracy rates," Expert Systems with Applications, 38, 2011.
[41] M.M. Mostafa, "Forecasting stock exchange movements using neural networks: Empirical evidence from Kuwait," Expert Systems with Applications, 2010.
[42] M. Mammadov and E. Tas, "An Improved Version of Backpropagation Algorithm with Effective Dynamic Learning Rate and Momentum," in Proceedings of the 9th WSEAS International Conference on Applied Mathematics, Istanbul, Turkey, May 27-29, 2006.
[43] M. Memmedli and O. Ozdemir, "A Comparison Study of Performance Measures and Length of Intervals in Fuzzy Time Series by Neural Networks," in Proceedings of the 8th WSEAS International Conference on System Science and Simulation in Engineering, Genova, Italy, October 2009.
[44] M. Memmedli and O. Ozdemir, "An Application of Fuzzy Time Series to Improve ISE Forecasting," WSEAS Transactions on Mathematics, 9(1), January 2010.

[45] D. Aydın and M. Mammadov, "A Comparative Study of Hybrid, Neural Networks and Nonparametric Regression Models in Time Series Prediction," WSEAS Transactions on Mathematics, 8(10), October 2009.

Memmedaga Memmedli was born in Azerbaijan in 1947. He received a diploma from Baku State University in 1971 and a PhD from the Russian Academy of Sciences in 1977. He worked as an Associate Professor and, from 2007, as a Professor in the Department of Statistics at Anadolu University, Eskisehir, Turkey, where he is now head of the Department of Statistics, Faculty of Science. He has published many journal and conference papers in his research areas. His research interests are game theory, control theory, optimization, artificial neural networks, fuzzy modeling and nonparametric regression analysis.

Ozer Ozdemir was born in Turkey in 1982. He received his B.Sc. degree in statistics from the Department of Statistics at Anadolu University, Turkey, in 2005, followed by his M.Sc. degree, and is currently pursuing his Ph.D. there. He worked as a Research Assistant and has been a Lecturer in the Department of Statistics at Anadolu University since 2008. He has published over 20 international conference papers and journal articles in his research areas. His research interests include simulation, artificial neural networks, fuzzy logic, fuzzy modeling, time series, computer programming, statistical software and computer applications in statistics.


More information

Supporting Information

Supporting Information Supportng Informaton The neural network f n Eq. 1 s gven by: f x l = ReLU W atom x l + b atom, 2 where ReLU s the element-wse rectfed lnear unt, 21.e., ReLUx = max0, x, W atom R d d s the weght matrx to

More information

Multi-layer neural networks

Multi-layer neural networks Lecture 0 Mult-layer neural networks Mlos Hauskrecht mlos@cs.ptt.edu 5329 Sennott Square Lnear regresson w Lnear unts f () Logstc regresson T T = w = p( y =, w) = g( w ) w z f () = p ( y = ) w d w d Gradent

More information

Evaluation of classifiers MLPs

Evaluation of classifiers MLPs Lecture Evaluaton of classfers MLPs Mlos Hausrecht mlos@cs.ptt.edu 539 Sennott Square Evaluaton For any data set e use to test the model e can buld a confuson matrx: Counts of examples th: class label

More information

A Robust Method for Calculating the Correlation Coefficient

A Robust Method for Calculating the Correlation Coefficient A Robust Method for Calculatng the Correlaton Coeffcent E.B. Nven and C. V. Deutsch Relatonshps between prmary and secondary data are frequently quantfed usng the correlaton coeffcent; however, the tradtonal

More information

Designing Fuzzy Time Series Model Using Generalized Wang s Method and Its application to Forecasting Interest Rate of Bank Indonesia Certificate

Designing Fuzzy Time Series Model Using Generalized Wang s Method and Its application to Forecasting Interest Rate of Bank Indonesia Certificate The Frst Internatonal Senar on Scence and Technology, Islac Unversty of Indonesa, 4-5 January 009. Desgnng Fuzzy Te Seres odel Usng Generalzed Wang s ethod and Its applcaton to Forecastng Interest Rate

More information

Multi-Step-Ahead Prediction of Stock Price Using a New Architecture of Neural Networks

Multi-Step-Ahead Prediction of Stock Price Using a New Architecture of Neural Networks Journal of Computer & Robotcs 8(), 05 47-56 47 Mult-Step-Ahead Predcton of Stoc Prce Usng a New Archtecture of Neural Networs Mohammad Taleb Motlagh *, Hamd Khaloozadeh Department of Systems and Control,

More information

Short Term Load Forecasting using an Artificial Neural Network

Short Term Load Forecasting using an Artificial Neural Network Short Term Load Forecastng usng an Artfcal Neural Network D. Kown 1, M. Km 1, C. Hong 1,, S. Cho 2 1 Department of Computer Scence, Sangmyung Unversty, Seoul, Korea 2 Department of Energy Grd, Sangmyung

More information

Generalized Linear Methods

Generalized Linear Methods Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set

More information

Lecture Notes on Linear Regression

Lecture Notes on Linear Regression Lecture Notes on Lnear Regresson Feng L fl@sdueducn Shandong Unversty, Chna Lnear Regresson Problem In regresson problem, we am at predct a contnuous target value gven an nput feature vector We assume

More information

1 Convex Optimization

1 Convex Optimization Convex Optmzaton We wll consder convex optmzaton problems. Namely, mnmzaton problems where the objectve s convex (we assume no constrants for now). Such problems often arse n machne learnng. For example,

More information

Simulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests

Simulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests Smulated of the Cramér-von Mses Goodness-of-Ft Tests Steele, M., Chaselng, J. and 3 Hurst, C. School of Mathematcal and Physcal Scences, James Cook Unversty, Australan School of Envronmental Studes, Grffth

More information

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4)

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4) I. Classcal Assumptons Econ7 Appled Econometrcs Topc 3: Classcal Model (Studenmund, Chapter 4) We have defned OLS and studed some algebrac propertes of OLS. In ths topc we wll study statstcal propertes

More information

Support Vector Machines. Vibhav Gogate The University of Texas at dallas

Support Vector Machines. Vibhav Gogate The University of Texas at dallas Support Vector Machnes Vbhav Gogate he Unversty of exas at dallas What We have Learned So Far? 1. Decson rees. Naïve Bayes 3. Lnear Regresson 4. Logstc Regresson 5. Perceptron 6. Neural networks 7. K-Nearest

More information

Neural networks. Nuno Vasconcelos ECE Department, UCSD

Neural networks. Nuno Vasconcelos ECE Department, UCSD Neural networs Nuno Vasconcelos ECE Department, UCSD Classfcaton a classfcaton problem has two types of varables e.g. X - vector of observatons (features) n the world Y - state (class) of the world x X

More information

Pop-Click Noise Detection Using Inter-Frame Correlation for Improved Portable Auditory Sensing

Pop-Click Noise Detection Using Inter-Frame Correlation for Improved Portable Auditory Sensing Advanced Scence and Technology Letters, pp.164-168 http://dx.do.org/10.14257/astl.2013 Pop-Clc Nose Detecton Usng Inter-Frame Correlaton for Improved Portable Audtory Sensng Dong Yun Lee, Kwang Myung Jeon,

More information

AN IMPROVED PARTICLE FILTER ALGORITHM BASED ON NEURAL NETWORK FOR TARGET TRACKING

AN IMPROVED PARTICLE FILTER ALGORITHM BASED ON NEURAL NETWORK FOR TARGET TRACKING AN IMPROVED PARTICLE FILTER ALGORITHM BASED ON NEURAL NETWORK FOR TARGET TRACKING Qn Wen, Peng Qcong 40 Lab, Insttuton of Communcaton and Informaton Engneerng,Unversty of Electronc Scence and Technology

More information

The Order Relation and Trace Inequalities for. Hermitian Operators

The Order Relation and Trace Inequalities for. Hermitian Operators Internatonal Mathematcal Forum, Vol 3, 08, no, 507-57 HIKARI Ltd, wwwm-hkarcom https://doorg/0988/mf088055 The Order Relaton and Trace Inequaltes for Hermtan Operators Y Huang School of Informaton Scence

More information

LINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity

LINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity LINEAR REGRESSION ANALYSIS MODULE IX Lecture - 30 Multcollnearty Dr. Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur 2 Remedes for multcollnearty Varous technques have

More information

Structure and Drive Paul A. Jensen Copyright July 20, 2003

Structure and Drive Paul A. Jensen Copyright July 20, 2003 Structure and Drve Paul A. Jensen Copyrght July 20, 2003 A system s made up of several operatons wth flow passng between them. The structure of the system descrbes the flow paths from nputs to outputs.

More information

System identifications by SIRMs models with linear transformation of input variables

System identifications by SIRMs models with linear transformation of input variables ORIGINAL RESEARCH System dentfcatons by SIRMs models wth lnear transformaton of nput varables Hrofum Myama, Nortaka Shge, Hrom Myama Graduate School of Scence and Engneerng, Kagoshma Unversty, Japan Receved:

More information

Kernel Methods and SVMs Extension

Kernel Methods and SVMs Extension Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general

More information

International Journal of Mathematical Archive-3(3), 2012, Page: Available online through ISSN

International Journal of Mathematical Archive-3(3), 2012, Page: Available online through   ISSN Internatonal Journal of Mathematcal Archve-3(3), 2012, Page: 1136-1140 Avalable onlne through www.ma.nfo ISSN 2229 5046 ARITHMETIC OPERATIONS OF FOCAL ELEMENTS AND THEIR CORRESPONDING BASIC PROBABILITY

More information

Statistics for Economics & Business

Statistics for Economics & Business Statstcs for Economcs & Busness Smple Lnear Regresson Learnng Objectves In ths chapter, you learn: How to use regresson analyss to predct the value of a dependent varable based on an ndependent varable

More information

Solving Nonlinear Differential Equations by a Neural Network Method

Solving Nonlinear Differential Equations by a Neural Network Method Solvng Nonlnear Dfferental Equatons by a Neural Network Method Luce P. Aarts and Peter Van der Veer Delft Unversty of Technology, Faculty of Cvlengneerng and Geoscences, Secton of Cvlengneerng Informatcs,

More information

Comparison of the Population Variance Estimators. of 2-Parameter Exponential Distribution Based on. Multiple Criteria Decision Making Method

Comparison of the Population Variance Estimators. of 2-Parameter Exponential Distribution Based on. Multiple Criteria Decision Making Method Appled Mathematcal Scences, Vol. 7, 0, no. 47, 07-0 HIARI Ltd, www.m-hkar.com Comparson of the Populaton Varance Estmators of -Parameter Exponental Dstrbuton Based on Multple Crtera Decson Makng Method

More information

FUZZY GOAL PROGRAMMING VS ORDINARY FUZZY PROGRAMMING APPROACH FOR MULTI OBJECTIVE PROGRAMMING PROBLEM

FUZZY GOAL PROGRAMMING VS ORDINARY FUZZY PROGRAMMING APPROACH FOR MULTI OBJECTIVE PROGRAMMING PROBLEM Internatonal Conference on Ceramcs, Bkaner, Inda Internatonal Journal of Modern Physcs: Conference Seres Vol. 22 (2013) 757 761 World Scentfc Publshng Company DOI: 10.1142/S2010194513010982 FUZZY GOAL

More information

Resource Allocation with a Budget Constraint for Computing Independent Tasks in the Cloud

Resource Allocation with a Budget Constraint for Computing Independent Tasks in the Cloud Resource Allocaton wth a Budget Constrant for Computng Independent Tasks n the Cloud Wemng Sh and Bo Hong School of Electrcal and Computer Engneerng Georga Insttute of Technology, USA 2nd IEEE Internatonal

More information

Time Series Forecasting Using Artificial Neural Networks under Dempster Shafer Evidence Theory and Trimmed-winsorized Means

Time Series Forecasting Using Artificial Neural Networks under Dempster Shafer Evidence Theory and Trimmed-winsorized Means Internatonal Journal of Informaton and Computaton Technology. ISSN 0974-2239 Volume 3, Number 5 (2013), pp. 383-390 Internatonal Research Publcatons House http://www. rphouse.com /jct.htm Tme Seres Forecastng

More information

FORECASTING EXCHANGE RATE USING SUPPORT VECTOR MACHINES

FORECASTING EXCHANGE RATE USING SUPPORT VECTOR MACHINES Proceedngs of the Fourth Internatonal Conference on Machne Learnng and Cybernetcs, Guangzhou, 8- August 005 FORECASTING EXCHANGE RATE USING SUPPORT VECTOR MACHINES DING-ZHOU CAO, SU-LIN PANG, YUAN-HUAI

More information

Determining Transmission Losses Penalty Factor Using Adaptive Neuro Fuzzy Inference System (ANFIS) For Economic Dispatch Application

Determining Transmission Losses Penalty Factor Using Adaptive Neuro Fuzzy Inference System (ANFIS) For Economic Dispatch Application 7 Determnng Transmsson Losses Penalty Factor Usng Adaptve Neuro Fuzzy Inference System (ANFIS) For Economc Dspatch Applcaton Rony Seto Wbowo Maurdh Hery Purnomo Dod Prastanto Electrcal Engneerng Department,

More information

Efficient Weather Forecasting using Artificial Neural Network as Function Approximator

Efficient Weather Forecasting using Artificial Neural Network as Function Approximator Effcent Weather Forecastng usng Artfcal Neural Network as Functon Approxmator I. El-Fegh, Z. Zuba and S. Abozgaya Abstract Forecastng s the referred to as the process of estmaton n unknown stuatons. Weather

More information

Complement of Type-2 Fuzzy Shortest Path Using Possibility Measure

Complement of Type-2 Fuzzy Shortest Path Using Possibility Measure Intern. J. Fuzzy Mathematcal rchve Vol. 5, No., 04, 9-7 ISSN: 30 34 (P, 30 350 (onlne Publshed on 5 November 04 www.researchmathsc.org Internatonal Journal of Complement of Type- Fuzzy Shortest Path Usng

More information

Using T.O.M to Estimate Parameter of distributions that have not Single Exponential Family

Using T.O.M to Estimate Parameter of distributions that have not Single Exponential Family IOSR Journal of Mathematcs IOSR-JM) ISSN: 2278-5728. Volume 3, Issue 3 Sep-Oct. 202), PP 44-48 www.osrjournals.org Usng T.O.M to Estmate Parameter of dstrbutons that have not Sngle Exponental Famly Jubran

More information

PERFORMANCE COMPARISON BETWEEN BACK PROPAGATION, RPE AND MRPE ALGORITHMS FOR TRAINING MLP NETWORKS

PERFORMANCE COMPARISON BETWEEN BACK PROPAGATION, RPE AND MRPE ALGORITHMS FOR TRAINING MLP NETWORKS PERFORMANCE COMPARISON BETWEEN BACK PROPAGATION, RPE AND MRPE ALGORITHMS FOR TRAINING MLP NETWORKS Mohd Yusoff Mashor School of Electrcal and Electronc Engneerng, Unversty Scence Malaysa, Pera Branch Campus,

More information

Lecture 10 Support Vector Machines II

Lecture 10 Support Vector Machines II Lecture 10 Support Vector Machnes II 22 February 2016 Taylor B. Arnold Yale Statstcs STAT 365/665 1/28 Notes: Problem 3 s posted and due ths upcomng Frday There was an early bug n the fake-test data; fxed

More information

Department of Quantitative Methods & Information Systems. Time Series and Their Components QMIS 320. Chapter 6

Department of Quantitative Methods & Information Systems. Time Series and Their Components QMIS 320. Chapter 6 Department of Quanttatve Methods & Informaton Systems Tme Seres and Ther Components QMIS 30 Chapter 6 Fall 00 Dr. Mohammad Zanal These sldes were modfed from ther orgnal source for educatonal purpose only.

More information

CHAPTER-5 INFORMATION MEASURE OF FUZZY MATRIX AND FUZZY BINARY RELATION

CHAPTER-5 INFORMATION MEASURE OF FUZZY MATRIX AND FUZZY BINARY RELATION CAPTER- INFORMATION MEASURE OF FUZZY MATRI AN FUZZY BINARY RELATION Introducton The basc concept of the fuzz matr theor s ver smple and can be appled to socal and natural stuatons A branch of fuzz matr

More information

COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS

COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS Avalable onlne at http://sck.org J. Math. Comput. Sc. 3 (3), No., 6-3 ISSN: 97-537 COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS

More information

College of Computer & Information Science Fall 2009 Northeastern University 20 October 2009

College of Computer & Information Science Fall 2009 Northeastern University 20 October 2009 College of Computer & Informaton Scence Fall 2009 Northeastern Unversty 20 October 2009 CS7880: Algorthmc Power Tools Scrbe: Jan Wen and Laura Poplawsk Lecture Outlne: Prmal-dual schema Network Desgn:

More information

3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X

3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X Statstcs 1: Probablty Theory II 37 3 EPECTATION OF SEVERAL RANDOM VARIABLES As n Probablty Theory I, the nterest n most stuatons les not on the actual dstrbuton of a random vector, but rather on a number

More information

A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS

A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS HCMC Unversty of Pedagogy Thong Nguyen Huu et al. A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS Thong Nguyen Huu and Hao Tran Van Department of mathematcs-nformaton,

More information

The Chaotic Robot Prediction by Neuro Fuzzy Algorithm (2) = θ (3) = ω. Asin. A v. Mana Tarjoman, Shaghayegh Zarei

The Chaotic Robot Prediction by Neuro Fuzzy Algorithm (2) = θ (3) = ω. Asin. A v. Mana Tarjoman, Shaghayegh Zarei The Chaotc Robot Predcton by Neuro Fuzzy Algorthm Mana Tarjoman, Shaghayegh Zare Abstract In ths paper an applcaton of the adaptve neurofuzzy nference system has been ntroduced to predct the behavor of

More information

A Neuro-Fuzzy System on System Modeling and Its. Application on Character Recognition

A Neuro-Fuzzy System on System Modeling and Its. Application on Character Recognition A Neuro-Fuzzy System on System Modelng and Its Applcaton on Character Recognton C. J. Chen 1, S. M. Yang 2, Z. C. Wang 3 1 Department of Avaton Servce Management Alethea Unversty Tawan, ROC 2,3 Department

More information

Admin NEURAL NETWORKS. Perceptron learning algorithm. Our Nervous System 10/25/16. Assignment 7. Class 11/22. Schedule for the rest of the semester

Admin NEURAL NETWORKS. Perceptron learning algorithm. Our Nervous System 10/25/16. Assignment 7. Class 11/22. Schedule for the rest of the semester 0/25/6 Admn Assgnment 7 Class /22 Schedule for the rest of the semester NEURAL NETWORKS Davd Kauchak CS58 Fall 206 Perceptron learnng algorthm Our Nervous System repeat untl convergence (or for some #

More information

Numerical Heat and Mass Transfer

Numerical Heat and Mass Transfer Master degree n Mechancal Engneerng Numercal Heat and Mass Transfer 06-Fnte-Dfference Method (One-dmensonal, steady state heat conducton) Fausto Arpno f.arpno@uncas.t Introducton Why we use models and

More information

Wavelet chaotic neural networks and their application to continuous function optimization

Wavelet chaotic neural networks and their application to continuous function optimization Vol., No.3, 04-09 (009) do:0.436/ns.009.307 Natural Scence Wavelet chaotc neural networks and ther applcaton to contnuous functon optmzaton Ja-Ha Zhang, Yao-Qun Xu College of Electrcal and Automatc Engneerng,

More information

Comparison of Regression Lines

Comparison of Regression Lines STATGRAPHICS Rev. 9/13/2013 Comparson of Regresson Lnes Summary... 1 Data Input... 3 Analyss Summary... 4 Plot of Ftted Model... 6 Condtonal Sums of Squares... 6 Analyss Optons... 7 Forecasts... 8 Confdence

More information

Comparative Studies of Law of Conservation of Energy. and Law Clusters of Conservation of Generalized Energy

Comparative Studies of Law of Conservation of Energy. and Law Clusters of Conservation of Generalized Energy Comparatve Studes of Law of Conservaton of Energy and Law Clusters of Conservaton of Generalzed Energy No.3 of Comparatve Physcs Seres Papers Fu Yuhua (CNOOC Research Insttute, E-mal:fuyh1945@sna.com)

More information

Foundations of Arithmetic

Foundations of Arithmetic Foundatons of Arthmetc Notaton We shall denote the sum and product of numbers n the usual notaton as a 2 + a 2 + a 3 + + a = a, a 1 a 2 a 3 a = a The notaton a b means a dvdes b,.e. ac = b where c s an

More information

The Study of Teaching-learning-based Optimization Algorithm

The Study of Teaching-learning-based Optimization Algorithm Advanced Scence and Technology Letters Vol. (AST 06), pp.05- http://dx.do.org/0.57/astl.06. The Study of Teachng-learnng-based Optmzaton Algorthm u Sun, Yan fu, Lele Kong, Haolang Q,, Helongang Insttute

More information

ADVANCED MACHINE LEARNING ADVANCED MACHINE LEARNING

ADVANCED MACHINE LEARNING ADVANCED MACHINE LEARNING 1 ADVANCED ACHINE LEARNING ADVANCED ACHINE LEARNING Non-lnear regresson technques 2 ADVANCED ACHINE LEARNING Regresson: Prncple N ap N-dm. nput x to a contnuous output y. Learn a functon of the type: N

More information

Neural Networks. Perceptrons and Backpropagation. Silke Bussen-Heyen. 5th of Novemeber Universität Bremen Fachbereich 3. Neural Networks 1 / 17

Neural Networks. Perceptrons and Backpropagation. Silke Bussen-Heyen. 5th of Novemeber Universität Bremen Fachbereich 3. Neural Networks 1 / 17 Neural Networks Perceptrons and Backpropagaton Slke Bussen-Heyen Unverstät Bremen Fachberech 3 5th of Novemeber 2012 Neural Networks 1 / 17 Contents 1 Introducton 2 Unts 3 Network structure 4 Snglelayer

More information

Atmospheric Environmental Quality Assessment RBF Model Based on the MATLAB

Atmospheric Environmental Quality Assessment RBF Model Based on the MATLAB Journal of Envronmental Protecton, 01, 3, 689-693 http://dxdoorg/10436/jep0137081 Publshed Onlne July 01 (http://wwwscrporg/journal/jep) 689 Atmospherc Envronmental Qualty Assessment RBF Model Based on

More information

Pattern Classification

Pattern Classification Pattern Classfcaton All materals n these sldes ere taken from Pattern Classfcaton (nd ed) by R. O. Duda, P. E. Hart and D. G. Stork, John Wley & Sons, 000 th the permsson of the authors and the publsher

More information

Interactive Bi-Level Multi-Objective Integer. Non-linear Programming Problem

Interactive Bi-Level Multi-Objective Integer. Non-linear Programming Problem Appled Mathematcal Scences Vol 5 0 no 65 3 33 Interactve B-Level Mult-Objectve Integer Non-lnear Programmng Problem O E Emam Department of Informaton Systems aculty of Computer Scence and nformaton Helwan

More information

Statistics II Final Exam 26/6/18

Statistics II Final Exam 26/6/18 Statstcs II Fnal Exam 26/6/18 Academc Year 2017/18 Solutons Exam duraton: 2 h 30 mn 1. (3 ponts) A town hall s conductng a study to determne the amount of leftover food produced by the restaurants n the

More information

A new Approach for Solving Linear Ordinary Differential Equations

A new Approach for Solving Linear Ordinary Differential Equations , ISSN 974-57X (Onlne), ISSN 974-5718 (Prnt), Vol. ; Issue No. 1; Year 14, Copyrght 13-14 by CESER PUBLICATIONS A new Approach for Solvng Lnear Ordnary Dfferental Equatons Fawz Abdelwahd Department of

More information

Speeding up Computation of Scalar Multiplication in Elliptic Curve Cryptosystem

Speeding up Computation of Scalar Multiplication in Elliptic Curve Cryptosystem H.K. Pathak et. al. / (IJCSE) Internatonal Journal on Computer Scence and Engneerng Speedng up Computaton of Scalar Multplcaton n Ellptc Curve Cryptosystem H. K. Pathak Manju Sangh S.o.S n Computer scence

More information

Internet Engineering. Jacek Mazurkiewicz, PhD Softcomputing. Part 3: Recurrent Artificial Neural Networks Self-Organising Artificial Neural Networks

Internet Engineering. Jacek Mazurkiewicz, PhD Softcomputing. Part 3: Recurrent Artificial Neural Networks Self-Organising Artificial Neural Networks Internet Engneerng Jacek Mazurkewcz, PhD Softcomputng Part 3: Recurrent Artfcal Neural Networks Self-Organsng Artfcal Neural Networks Recurrent Artfcal Neural Networks Feedback sgnals between neurons Dynamc

More information

Comparison of the COG Defuzzification Technique and Its Variations to the GPA Index

Comparison of the COG Defuzzification Technique and Its Variations to the GPA Index Amercan Journal of Computatonal and Appled Mathematcs 06, 6(): 87-93 DOI: 0.93/.acam.06060.03 Comparson of the COG Defuzzfcaton Technque and Its Varatons to the GPA Index Mchael Gr. Voskoglou Department

More information

DETERMINATION OF TEMPERATURE DISTRIBUTION FOR ANNULAR FINS WITH TEMPERATURE DEPENDENT THERMAL CONDUCTIVITY BY HPM

DETERMINATION OF TEMPERATURE DISTRIBUTION FOR ANNULAR FINS WITH TEMPERATURE DEPENDENT THERMAL CONDUCTIVITY BY HPM Ganj, Z. Z., et al.: Determnaton of Temperature Dstrbuton for S111 DETERMINATION OF TEMPERATURE DISTRIBUTION FOR ANNULAR FINS WITH TEMPERATURE DEPENDENT THERMAL CONDUCTIVITY BY HPM by Davood Domr GANJI

More information

Gadjah Mada University, Indonesia. Yogyakarta State University, Indonesia Karangmalang Yogyakarta 55281

Gadjah Mada University, Indonesia. Yogyakarta State University, Indonesia Karangmalang Yogyakarta 55281 Reducng Fuzzy Relatons of Fuzzy Te Seres odel Usng QR Factorzaton ethod and Its Applcaton to Forecastng Interest Rate of Bank Indonesa Certfcate Agus aan Abad Subanar Wdodo 3 Sasubar Saleh 4 Ph.D Student

More information

Appendix B: Resampling Algorithms

Appendix B: Resampling Algorithms 407 Appendx B: Resamplng Algorthms A common problem of all partcle flters s the degeneracy of weghts, whch conssts of the unbounded ncrease of the varance of the mportance weghts ω [ ] of the partcles

More information

Power law and dimension of the maximum value for belief distribution with the max Deng entropy

Power law and dimension of the maximum value for belief distribution with the max Deng entropy Power law and dmenson of the maxmum value for belef dstrbuton wth the max Deng entropy Bngy Kang a, a College of Informaton Engneerng, Northwest A&F Unversty, Yanglng, Shaanx, 712100, Chna. Abstract Deng

More information

Chapter 13: Multiple Regression

Chapter 13: Multiple Regression Chapter 13: Multple Regresson 13.1 Developng the multple-regresson Model The general model can be descrbed as: It smplfes for two ndependent varables: The sample ft parameter b 0, b 1, and b are used to

More information

Logistic Regression. CAP 5610: Machine Learning Instructor: Guo-Jun QI

Logistic Regression. CAP 5610: Machine Learning Instructor: Guo-Jun QI Logstc Regresson CAP 561: achne Learnng Instructor: Guo-Jun QI Bayes Classfer: A Generatve model odel the posteror dstrbuton P(Y X) Estmate class-condtonal dstrbuton P(X Y) for each Y Estmate pror dstrbuton

More information

On the correction of the h-index for career length

On the correction of the h-index for career length 1 On the correcton of the h-ndex for career length by L. Egghe Unverstet Hasselt (UHasselt), Campus Depenbeek, Agoralaan, B-3590 Depenbeek, Belgum 1 and Unverstet Antwerpen (UA), IBW, Stadscampus, Venusstraat

More information

Online Classification: Perceptron and Winnow

Online Classification: Perceptron and Winnow E0 370 Statstcal Learnng Theory Lecture 18 Nov 8, 011 Onlne Classfcaton: Perceptron and Wnnow Lecturer: Shvan Agarwal Scrbe: Shvan Agarwal 1 Introducton In ths lecture we wll start to study the onlne learnng

More information

MAXIMUM A POSTERIORI TRANSDUCTION

MAXIMUM A POSTERIORI TRANSDUCTION MAXIMUM A POSTERIORI TRANSDUCTION LI-WEI WANG, JU-FU FENG School of Mathematcal Scences, Peng Unversty, Bejng, 0087, Chna Center for Informaton Scences, Peng Unversty, Bejng, 0087, Chna E-MIAL: {wanglw,

More information

CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE

CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE Analytcal soluton s usually not possble when exctaton vares arbtrarly wth tme or f the system s nonlnear. Such problems can be solved by numercal tmesteppng

More information

Feature Selection: Part 1

Feature Selection: Part 1 CSE 546: Machne Learnng Lecture 5 Feature Selecton: Part 1 Instructor: Sham Kakade 1 Regresson n the hgh dmensonal settng How do we learn when the number of features d s greater than the sample sze n?

More information

APPLICATION OF RBF NEURAL NETWORK IMPROVED BY PSO ALGORITHM IN FAULT DIAGNOSIS

APPLICATION OF RBF NEURAL NETWORK IMPROVED BY PSO ALGORITHM IN FAULT DIAGNOSIS Journal of Theoretcal and Appled Informaton Technology 005-01 JATIT & LLS. All rghts reserved. ISSN: 199-8645 www.jatt.org E-ISSN: 1817-3195 APPLICATION OF RBF NEURAL NETWORK IMPROVED BY PSO ALGORITHM

More information

2 STATISTICALLY OPTIMAL TRAINING DATA 2.1 A CRITERION OF OPTIMALITY We revew the crteron of statstcally optmal tranng data (Fukumzu et al., 1994). We

2 STATISTICALLY OPTIMAL TRAINING DATA 2.1 A CRITERION OF OPTIMALITY We revew the crteron of statstcally optmal tranng data (Fukumzu et al., 1994). We Advances n Neural Informaton Processng Systems 8 Actve Learnng n Multlayer Perceptrons Kenj Fukumzu Informaton and Communcaton R&D Center, Rcoh Co., Ltd. 3-2-3, Shn-yokohama, Yokohama, 222 Japan E-mal:

More information

2016 Wiley. Study Session 2: Ethical and Professional Standards Application

2016 Wiley. Study Session 2: Ethical and Professional Standards Application 6 Wley Study Sesson : Ethcal and Professonal Standards Applcaton LESSON : CORRECTION ANALYSIS Readng 9: Correlaton and Regresson LOS 9a: Calculate and nterpret a sample covarance and a sample correlaton

More information

The Minimum Universal Cost Flow in an Infeasible Flow Network

The Minimum Universal Cost Flow in an Infeasible Flow Network Journal of Scences, Islamc Republc of Iran 17(2): 175-180 (2006) Unversty of Tehran, ISSN 1016-1104 http://jscencesutacr The Mnmum Unversal Cost Flow n an Infeasble Flow Network H Saleh Fathabad * M Bagheran

More information

Removal of Hidden Neurons by Crosswise Propagation

Removal of Hidden Neurons by Crosswise Propagation Neural Informaton Processng - etters and Revews Vol.6 No.3 arch 25 ETTER Removal of dden Neurons by Crosswse Propagaton Xun ang Department of anagement Scence and Engneerng Stanford Unversty CA 9535 USA

More information

A New Evolutionary Computation Based Approach for Learning Bayesian Network

A New Evolutionary Computation Based Approach for Learning Bayesian Network Avalable onlne at www.scencedrect.com Proceda Engneerng 15 (2011) 4026 4030 Advanced n Control Engneerng and Informaton Scence A New Evolutonary Computaton Based Approach for Learnng Bayesan Network Yungang

More information

Variations of the Rectangular Fuzzy Assessment Model and Applications to Human Activities

Variations of the Rectangular Fuzzy Assessment Model and Applications to Human Activities Varatons of the Rectangular Fuzzy Assessment Model and Applcatons to Human Actvtes MICHAEl GR. VOSKOGLOU Department of Mathematcal Scence3s Graduate Technologcal Educatonal Insttute of Western Greece Meg.

More information

Chapter 11: Simple Linear Regression and Correlation

Chapter 11: Simple Linear Regression and Correlation Chapter 11: Smple Lnear Regresson and Correlaton 11-1 Emprcal Models 11-2 Smple Lnear Regresson 11-3 Propertes of the Least Squares Estmators 11-4 Hypothess Test n Smple Lnear Regresson 11-4.1 Use of t-tests

More information

On an Extension of Stochastic Approximation EM Algorithm for Incomplete Data Problems. Vahid Tadayon 1

On an Extension of Stochastic Approximation EM Algorithm for Incomplete Data Problems. Vahid Tadayon 1 On an Extenson of Stochastc Approxmaton EM Algorthm for Incomplete Data Problems Vahd Tadayon Abstract: The Stochastc Approxmaton EM (SAEM algorthm, a varant stochastc approxmaton of EM, s a versatle tool

More information

System in Weibull Distribution

System in Weibull Distribution Internatonal Matheatcal Foru 4 9 no. 9 94-95 Relablty Equvalence Factors of a Seres-Parallel Syste n Webull Dstrbuton M. A. El-Dacese Matheatcs Departent Faculty of Scence Tanta Unversty Tanta Egypt eldacese@yahoo.co

More information

Turbulence classification of load data by the frequency and severity of wind gusts. Oscar Moñux, DEWI GmbH Kevin Bleibler, DEWI GmbH

Turbulence classification of load data by the frequency and severity of wind gusts. Oscar Moñux, DEWI GmbH Kevin Bleibler, DEWI GmbH Turbulence classfcaton of load data by the frequency and severty of wnd gusts Introducton Oscar Moñux, DEWI GmbH Kevn Blebler, DEWI GmbH Durng the wnd turbne developng process, one of the most mportant

More information

A neural network with localized receptive fields for visual pattern classification

A neural network with localized receptive fields for visual pattern classification Unversty of Wollongong Research Onlne Faculty of Informatcs - Papers (Archve) Faculty of Engneerng and Informaton Scences 2005 A neural network wth localzed receptve felds for vsual pattern classfcaton

More information

Kernels in Support Vector Machines. Based on lectures of Martin Law, University of Michigan

Kernels in Support Vector Machines. Based on lectures of Martin Law, University of Michigan Kernels n Support Vector Machnes Based on lectures of Martn Law, Unversty of Mchgan Non Lnear separable problems AND OR NOT() The XOR problem cannot be solved wth a perceptron. XOR Per Lug Martell - Systems

More information

Research Article Relative Smooth Topological Spaces

Research Article Relative Smooth Topological Spaces Advances n Fuzzy Systems Volume 2009, Artcle ID 172917, 5 pages do:10.1155/2009/172917 Research Artcle Relatve Smooth Topologcal Spaces B. Ghazanfar Department of Mathematcs, Faculty of Scence, Lorestan

More information