BACKPROPAGATION NEURAL NETWORK APPROACH FOR MEAN TEMPERATURE PREDICTION


IJRRAS 9, October 2016, www.arpapress.com/volumes/vol9issue/ijrras_9.pdf

Manal A. Ashour, Soma A. ElZahaby & Mahmoud I. Abdalla
Al-Azhar University, Faculty of Science, Mathematics Department
Zagazig University, Faculty of Engineering, Electronics and Communication Department

ABSTRACT
Temperature is one of the basic components of the weather. In this paper the mean temperature is forecasted using an artificial neural network (ANN). The design of the ANN is based on four weather parameters and has been applied to Cairo, the capital of Egypt. Training and testing used meteorological data covering twenty years (1996-2016). The study builds a backpropagation neural network model to predict the mean temperature and compares its results with those obtained by a multiple linear regression (MLR) model. Several performance evaluation criteria are introduced to compare the two models. The results show improved performance of the neural network model over the multiple linear regression model.
Keywords: artificial neural network (ANN), backpropagation (BP), multiple linear regression (MLR), mean temperature forecasting.

1. INTRODUCTION
Predicting the future is a key task in many fields. Time series analysis is one of the common statistical methods used for prediction; it is applied widely in statistics and economics, where the behaviour of the dependent variable is predicted from its behaviour in the past. On the other hand, there is a more accurate and effective modern forecasting method that learns relationships from data rather than assuming a fixed relationship between variables: the artificial neural network.
Neural networks are a suitable way of representing relationships between variables. They differ from traditional methods in that they are an arithmetic system consisting of simple, interconnected elements that process data dynamically in response to external input. A neural network can be considered a data processing system whose performance characteristics simulate, in a simplified manner, the biological nervous system. The artificial neural network (ANN) is a contemporary, advanced methodology; it has attracted the attention of many researchers and scientists in various fields and disciplines, including medicine, engineering, statistics, operations research, information technology and others. In recent years artificial neural networks have gained increasing importance in the processing and analysis of time series and in forecasting, due to their great flexibility compared to the conventional methods established in this area, as well as their ability to self-learn and adapt to any model.

2. ARTIFICIAL NEURAL NETWORK
An artificial neural network is a computational technique designed to simulate the way the human brain processes a specific task, by massive distributed parallel processing. A neural network is made up of simple processing units called neurons. Neurons are mathematical elements with a nervous-system-like property: they store practical knowledge and empirical information and make it available to the user by adjusting the weights. It is surprising that the speed of a computer exceeds the speed of nerve cells by some ten billion times; in spite of that, a person can recognise a familiar face in a tenth of a second using nerve cells whose individual response time is only a small fraction of that, so the whole computation can involve at most on the order of a hundred sequential steps. Scientists have no complete explanation of how such slow cells reach solutions so quickly; the key is that neurons process data in parallel, and this parallelism gives the brain its speed.
This was enough to entice many researchers to mimic human neural networks using computer simulation.

It was possible to simplify the cell components: in principle, it is sufficient to take only some of their functions into account, use a small number of them, and represent them mathematically to obtain artificial neurons. An artificial neuron is a processing unit; Figure 1 shows its simplest representation.

Figure 1. A model of an artificial neuron: inputs x1, ..., xn are weighted by w1, ..., wn, summed, and passed through an activation function to produce the output.

Multiple network types have been proposed, the most important being the feed-forward network. In this structure the signal travels forward only, from input to output: the output of any neuron in a layer affects only the next layer, and there are no connections between neurons within the same layer. Most networks of this kind consist of an input layer, an output layer and at least one hidden layer between them, each made up of a number of neurons. A supervised neural network needs a teacher during the training phase to show it the desired output for each input; once training reaches correct results, the external teacher is no longer needed. Training is carried out using several methods and algorithms, of which the most important is the backpropagation method. The process starts by computing the error between the desired output and the calculated output; this error is then propagated back from the output layer through the hidden layers and finally to the input layer. During backpropagation the weights are changed in the direction that drives the error towards zero. This training method is used with feed-forward networks: "feed-forward" names the network structure and "backpropagation" names the training method. In this study the sigmoid function is used as the activation function because it is very easy to differentiate.
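The neuron of Figure 1 computes a weighted sum of its inputs and passes it through the sigmoid activation. A minimal sketch in Python (the paper itself contains no code, so the function names here are purely illustrative):

```python
import math

def sigmoid(z):
    """Logistic activation; its derivative is simply sigmoid(z) * (1 - sigmoid(z))."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron_output(inputs, weights, bias=0.0):
    """The neuron of Figure 1: weighted sum of the inputs, then the activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

# Example: two inputs weighted equally; z = 1.0*0.5 + 2.0*0.5 = 1.5.
y = neuron_output([1.0, 2.0], [0.5, 0.5])
```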
Backpropagation enjoys many benefits, the most important being that it seeks a minimum of the squared error, that it can deal with noisy data, and that it can handle both linearly and non-linearly separable functions. Training the network consists of six basic steps: (1) give the network random initial weights; (2) supply the network with the inputs prepared for training; (3) apply the feed-forward pass to compute the network outputs; (4) compare the actual output with the desired output and determine the error value; (5) propagate the error back through the network and correct the weights in the direction that decreases the error; (6) decrease the total error over all inputs used in training. The training process must be repeated many times to reach a low error value. Some important issues deserve attention when designing and training a network; neglecting them leads to an incomplete or ineffective neural network. Among them are overfitting, underfitting, network size, normalization and the learning rate. To avoid training problems, the data are divided into three sets: a training set, a validation set and a testing set. Choosing an appropriate network size is one of the toughest challenges in artificial neural network design. Besides the many available choices of activation function for each neuron in each layer, there is the problem of choosing the number of neurons per layer and the number of hidden layers. Without any doubt, all of these choices must be made before training starts, and a poor compromise in the choice of network size leads to unacceptable results. The easiest and most widely used method for choosing the network size is trial and error: the designer tries a number of networks and chooses the best. This experimentation should be somewhat systematic so that it does not take too long.
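The six training steps listed above can be sketched for a tiny one-hidden-layer network with sigmoid activations. This is an illustrative NumPy implementation, not the authors' code: the network sizes, learning rate and toy logical-OR data are placeholders standing in for the weather data.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Step 1: random initial weights (here U(-1, 1), as in Section 3).
W1 = rng.uniform(-1.0, 1.0, (2, 4))   # input -> hidden
W2 = rng.uniform(-1.0, 1.0, (4, 1))   # hidden -> output

# Step 2: inputs prepared for training, with their desired outputs.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [1.]])

lr = 0.5  # learning rate: too small is slow, too large is unstable
initial_mse = float(np.mean((sigmoid(sigmoid(X @ W1) @ W2) - T) ** 2))

for epoch in range(10000):              # Step 6: repeat to lower the total error
    H = sigmoid(X @ W1)                 # Step 3: feed-forward pass
    Y = sigmoid(H @ W2)
    err = Y - T                         # Step 4: error = calculated - desired
    dY = err * Y * (1.0 - Y)            # Step 5: back-propagate the error ...
    dH = (dY @ W2.T) * H * (1.0 - H)
    W2 -= lr * H.T @ dY                 # ... and correct the weights downhill
    W1 -= lr * X.T @ dH

final_mse = float(np.mean((sigmoid(sigmoid(X @ W1) @ W2) - T) ** 2))
```

On this toy task the total squared error falls steadily as the weight updates follow the negative gradient, which is exactly the behaviour the six steps describe.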
The designer can start with a simple network and grow it bit by bit, adding layers or neurons, until it reaches acceptable results; alternatively, one can begin with a complex network and simplify it gradually until an acceptable compromise between complexity and performance is reached [4,5,6]. One of the choices the designer must make during the training process is the learning rate. This variable determines how quickly the weights adapt: if the learning rate is small, training takes a long time; if it is large, the weights may fluctuate and move away from the required values, driving the training process into instability. In practice, the designer can begin with an arbitrary learning-rate value and adjust it little by little to reach a choice that combines training speed with stability.

3. ARTIFICIAL NEURAL NETWORK FOR WEATHER FORECASTING
After the First World War, Lewis Richardson supposed in 1922 [12] that mathematics could be used to predict the weather, because the atmosphere follows the laws of physics; but the equations were very complex and the computations consumed so much time that the weather front would arrive before the forecasters could complete their calculation. In addition, Richardson used weather observations taken every six hours, whereas, as the French meteorologist Rene Shabu noted, reaching even minimal accuracy in weather forecasting requires measurements taken at least every thirty minutes. Weather forecasting includes the study of many different phenomena, such as temperature and barometric pressure, and many other quantities related to the atmosphere. The study of weather phenomena has never been a mere luxury: it is an essential science that should be well supported in all countries and regions, because of the large number of applications that rely on it, and because the study of weather can save people's lives from abnormalities in the weather that could bring disasters upon a country or a whole region. For this reason, the first benefit of the study of weather is saving as many lives as possible before it is too late. Weather forecasting also has many other varied applications and benefits. Air traffic relies heavily on the weather. The study of changes in the atmosphere and of weather conditions is likewise very useful for protecting vegetation in a given area from the risk of ice formation, which leads to the death of the plants if the necessary precautions and measures are not taken.
Meteorology is also very important in studying and analysing the water situation in a given area, including the amount of rain expected and the areas most affected by it, which helps make good use of rainfall rather than letting it go to waste. It matters, too, for maritime traffic, where meteorologists help determine whether ships may sail; these applications span many branches of the science of weather. The use of neural networks to predict the various parameters of the weather is a newer systematic approach that has proved successful in forecasting. Weather is a continuous, data-intensive, multidimensional, dynamic and chaotic process; these properties make weather forecasting a major challenge. Because weather data are nonlinear and follow very irregular trends, the focus has shifted towards nonlinear prediction, and the artificial neural network has emerged as a better technique for bringing out the structural relationships between the various entities. An artificial neural network is a flexible nonlinear function that does not require restrictive assumptions about the relationship between the independent and dependent variables. Neural networks deal well with nonparametric data or small data sizes, in addition to their accuracy with parametric data. Many studies have shown that neural networks offer better prediction than other traditional statistical methods, and that they can manage analytical and predictive tasks faster, which reflects positively on the element of time. The ANN model has been designed using four basic procedures: (1) collecting the data, (2) preprocessing the data, (3) building the network, and (4) training, validating and testing the network (figure 2). This study used daily data collected from Cairo International Airport station (HECA), at about 30.1 N, 31.4 E, over the interval 1996-2016 [2].
The mean temperature, mean dew point, mean humidity, mean sea-level pressure and mean wind speed are recorded in an Excel data file whose sheets hold these variables for January of each year from 1996 to 2016. After collection, the recorded data were preprocessed; preprocessing is carried out so that the ANN trains more efficiently. Missing data are replaced by the average of the neighbouring values.
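Replacing a missing reading by the average of its neighbouring values, as described above, can be sketched as follows. This is a simple illustration; the authors do not publish their preprocessing code, and the function name is hypothetical.

```python
def fill_missing(series, missing=None):
    """Replace each missing entry by the mean of its nearest present neighbours."""
    filled = list(series)
    for i, v in enumerate(filled):
        if v is missing:
            # Nearest non-missing value before and after position i, if any.
            prev = next((filled[j] for j in range(i - 1, -1, -1) if filled[j] is not missing), None)
            nxt = next((filled[j] for j in range(i + 1, len(filled)) if filled[j] is not missing), None)
            neighbours = [x for x in (prev, nxt) if x is not None]
            filled[i] = sum(neighbours) / len(neighbours)
    return filled

# Example: a gap in a daily mean-temperature series.
fill_missing([14.0, None, 16.0])  # middle value becomes 15.0
```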

Figure 2. The block diagram of the training procedure: collect weather data, data preprocessing, design of ANN, train ANN, testing, predicted weather output.

In the artificial neural network software, all inputs and outputs are normalized between 0 and 1. Using the original data as input to a neural network may cause convergence problems [3]; neural networks provide improved performance with normalized data. Normalizing is the process of scaling data to fall within a smaller range, and it helps speed up the training phase. This step is especially important when dealing with parameters of different units and scales: all parameters should be brought to the same scale for a fair comparison between them. Two rescaling methods are in common use. Normalization scales all numeric variables into the range [0, 1] [3]; one possible formula is

x' = (x - min) / (max - min)

Alternatively, standardization transforms the data to have zero mean and unit variance [14, 15], for example using

x' = (x - mu) / sigma

where mu and sigma are the mean and the standard deviation of the data set. Both methods have drawbacks. If the data set contains outliers, normalization will squeeze the "normal" data into a very small interval, and most data sets do contain outliers. With standardization, the new data are not bounded (unlike with normalization). Normalization helps the input and output parameters correlate with each other; the outputs are later denormalized into the original data format to obtain the desired results. To forecast the temperature for the next day, we need to construct an abstract model that reflects the reality of the real system. To keep the modelling structure simple, only one hidden layer, with 15 neurons, is considered. This number was arrived at by trial and error: the performance of the network was evaluated while increasing the number of hidden neurons.
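The two rescaling formulas in this section translate directly into code. A small illustrative sketch (not part of the paper's software):

```python
def min_max_normalize(values):
    """Normalization: x' = (x - min) / (max - min), mapping the data into [0, 1]."""
    lo, hi = min(values), max(values)
    return [(x - lo) / (hi - lo) for x in values]

def standardize(values):
    """Standardization: x' = (x - mu) / sigma, giving zero mean and unit variance."""
    n = len(values)
    mu = sum(values) / n
    sigma = (sum((x - mu) ** 2 for x in values) / n) ** 0.5
    return [(x - mu) / sigma for x in values]

# Example with four daily mean temperatures.
temps = [12.0, 15.0, 18.0, 21.0]
norm = min_max_normalize(temps)  # endpoints map to 0.0 and 1.0
std = standardize(temps)         # zero mean, unit variance, but unbounded
```

Note how the standardized values are not confined to [0, 1], which is exactly the drawback mentioned above.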
This number was arrived at after analysing 5, 10, 15, 20, 25 and 30 neurons in the hidden layer. The architectures with 5 and 10 hidden neurons were faster in computation, but their convergence rate was very slow. The architectures with 20, 25 and 30 hidden neurons converged about as well as the one with 15 neurons; therefore, the architecture with 15 neurons in the hidden layer was selected (table 2). After fixing the number of hidden neurons, the number of epochs is increased until a suitable value is found [9]. The number of input neurons is four, representing the chosen weather parameters. The parameters considered in this study are the mean temperature, mean dew point, mean relative humidity, mean sea-level pressure and mean wind speed.
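The trial-and-error sweep over hidden-layer sizes can be scripted. The skeleton below is hypothetical: `train_and_score` is a placeholder for a function that trains the network of Section 3 with a given hidden-layer size and returns its validation error.

```python
def select_hidden_size(candidates, train_and_score):
    """Try each hidden-layer size and keep the one with the lowest validation error."""
    results = {n: train_and_score(n) for n in candidates}
    best = min(results, key=results.get)
    return best, results

# Example with a stand-in scoring function whose minimum happens to be at 15.
best, scores = select_hidden_size([5, 10, 15, 20, 25, 30],
                                  lambda n: abs(n - 15) * 0.01 + 0.05)
```

In practice one would also record training time per candidate, since the paper reports that the smaller architectures were faster but converged more slowly.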

Table 1. Meteorological variables

Parameter no.  Meteorological variable     Units
1              Mean temperature            degrees C
2              Relative humidity           %
3              Dew point                   degrees C
4              Sea-level pressure (SLP)    hPa
5              Wind speed                  km/h

There is no particular reason behind this choice of weather parameters; the choice is made simply to predict the mean temperature. The number of hidden neurons is 15, and the number of outputs is 1, representing the weather variable to be forecasted. All the weather data sets were therefore transformed into values between 0 and 1, and the outputs were finally denormalized into the original data format to obtain the desired result. The training goal for the network was set to a small error value, and the network was trained for a fixed number of epochs (table 2). The network is first initialized by setting all its weights to small uniformly random numbers between -1 and +1, i.e. weights ~ U(-1, 1). Next, an input pattern is applied and the output calculated (forward pass). This calculation gives an output that is completely different from the target, since all the weights are random. We then calculate the error of each neuron, which is essentially the predicted output minus the actual output. This error is used mathematically to change the weights in such a way that the error gets smaller; in other words, the output of each neuron gets closer to its target (reverse or backward pass). The process is repeated again and again until the error is minimal.

Table 2. ANN model structure

Number of hidden layers                    1
Number of hidden neurons                   15
Number of epochs                           fixed, chosen by experiment
Activation function used in input layer    linear
Activation function used in hidden layer   sigmoid
Activation function used in output layer   sigmoid
Learning method                            supervised

4. MULTIPLE LINEAR REGRESSION MODEL (MLR)
Forecasting models based on time series data are being developed for the prediction of different variables. Regression is an empirical statistical technique widely used in business. Here a multiple linear regression model is fitted to predict the weather parameters [10, 11]: the daily mean temperature is taken as the dependent parameter, with the daily mean dew point, mean relative humidity, mean sea-level pressure and mean wind speed as independent parameters. The most significantly contributing parameters are selected using enter-method regression analysis on the data set, and the best-fit multiple regression model is given below:

y = .3 x1 + .68 x2 + .5 x3 + . x4 + 75.33

where the mean temperature y is the dependent parameter, with roughly 90% of its variation contributed by the significant parameters: mean dew point x1, mean relative humidity x2, mean sea-level pressure x3 and mean wind speed x4. The MLR procedure for selecting the significant parameters for the mean temperature is given in the Appendix.

5. PERFORMANCE EVALUATION CRITERIA
Many analytical methods have been proposed for the evaluation and inter-comparison of different models; they can be grouped into graphical representations and numerical computations. The graphical criterion used here is a linear-scale plot of the actual and predicted weather parameters for the training and testing data of all models. The numerical performance criteria involve the following measures.
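The multiple linear regression of Section 4, mean temperature against the four predictors, can be fitted by ordinary least squares. The sketch below uses synthetic data, since the station data and the fitted coefficients are not reproducible here; NumPy's `lstsq` stands in for the SPSS procedure, and all coefficient values are invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Synthetic stand-ins for the four daily predictors.
X = np.column_stack([
    rng.uniform(5, 20, n),       # mean dew point
    rng.uniform(30, 90, n),      # mean relative humidity
    rng.uniform(1000, 1025, n),  # mean sea-level pressure
    rng.uniform(0, 30, n),       # mean wind speed
])
true_beta = np.array([0.8, -0.05, -0.02, 0.1])       # made-up coefficients
y = 40.0 + X @ true_beta + rng.normal(0, 0.5, n)     # "mean temperature" + noise

# Ordinary least squares: augment with a constant column for the intercept.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
intercept, coefs = beta[0], beta[1:]
y_hat = A @ beta
```

With enough data, the recovered slope coefficients land very close to the generating ones, which is the sense in which OLS "selects" the contribution of each predictor.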

Mean error (BIAS):

BIAS = (1/N) SUM (X_i - Y_i)

Mean absolute error (MAE):

MAE = (1/N) SUM |X_i - Y_i|

Root mean square error (RMSE):

RMSE = sqrt( (1/N) SUM (X_i - Y_i)^2 )

Prediction error (PE):

PE = (1/N) SUM |(X_i - Y_i) / Y_i|

Correlation coefficient (r): this is obtained by performing a regression between the predicted values and the actual values:

r = SUM (Y_i - Ybar)(X_i - Xbar) / sqrt( SUM (Y_i - Ybar)^2 * SUM (X_i - Xbar)^2 )

where SUM denotes summation over the whole test set, N is the total number of forecast outputs, Y_i and X_i are the actual and predicted values respectively for i = 1, ..., N, and Ybar and Xbar are the mean values of the actual and predicted values.

6. RESULTS AND DISCUSSION
For the best prediction, the BIAS, MAE and RMSE values should be small and PE should be sufficiently small, i.e. close to 0, while r should be close to 1 (it lies between -1 and 1), indicating better agreement between the actual and predicted values. The performance of the weather forecasting models was evaluated using SPSS. It can be seen from table 3 that the bias for the testing data is .968 for the ANN model and .97 for the MLR model, so the bias is smaller for the artificial neural network. The MAE for the ANN is likewise smaller than the MAE for the MLR model, and the RMSE of .688799 for the ANN is less than that of the MLR model. The artificial neural network model also shows a smaller PE than the MLR model. Further, the correlation coefficient is highest for the artificial neural network model, at 0.97677, in comparison with 0.94955 for the multiple linear regression model. Thus the graphical representation (figure 3) as well as the numerical estimates both favour the ANN.

Table 3. Comparison of the performance of forecasting models for mean temperature

Model   ANN        MLR
BIAS    .968       .97
MAE     .5566      .559
RMSE    .688799    .689659
PE      .6845      .859
CC      0.97677    0.94955

7. CONCLUSION
The artificial neural network is more accurate and efficient in prediction than the traditional statistical methods.
The ANN model, with a learning rate of 0.7 and momentum of 0.3, performs better than the MLR model, leading to the conclusion that this ANN can be used as an effective mean-temperature forecasting tool.
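The numerical performance criteria of Section 5 are straightforward to compute. An illustrative implementation (with `actual` and `pred` standing for the Y_i and X_i of the formulas; this is not the SPSS procedure used in the paper):

```python
import math

def bias(actual, pred):
    """Mean error: average of (predicted - actual)."""
    return sum(p - a for a, p in zip(actual, pred)) / len(actual)

def mae(actual, pred):
    """Mean absolute error."""
    return sum(abs(p - a) for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    """Root mean square error."""
    return math.sqrt(sum((p - a) ** 2 for a, p in zip(actual, pred)) / len(actual))

def pe(actual, pred):
    """Prediction error: mean absolute error relative to the actual values."""
    return sum(abs((p - a) / a) for a, p in zip(actual, pred)) / len(actual)

def corr(actual, pred):
    """Correlation coefficient r between actual and predicted values."""
    n = len(actual)
    ma, mp = sum(actual) / n, sum(pred) / n
    cov = sum((a - ma) * (p - mp) for a, p in zip(actual, pred))
    va = sum((a - ma) ** 2 for a in actual)
    vp = sum((p - mp) ** 2 for p in pred)
    return cov / math.sqrt(va * vp)
```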

Figure 3. Comparison of the actual and the predicted mean temperature using the ANN and MLR models (plots of actual vs. predicted temperature for both models).

REFERENCES
[1]. S. Mathur, "A feature based neural network model for weather forecasting", World Academy of Science, Engineering and Technology 34, 2007.
[2]. https://www.wunderground.com/
[3]. M.R. Khan and C. Ondrusek, "Short-term electric demand prognosis using artificial neural networks", Journal of Electrical Engineering, Vol. 51, pp. 296-300, 2000.
[4]. M.T. Hagan, H.B. Demuth, M.H. Beale (1996), Neural Network Design, PWS, Boston, MA.
[5]. M. Khan, C. Ondrusek (2000), "Short-term electric demand prognosis using artificial neural networks", J Elec Engg 51:296-300.
[6]. Y. Tan, J. Wang, J.M. Zurada (2001), "Nonlinear blind source separation using a radial basis function network", IEEE Transactions on Neural Networks 12:124-134.
[7]. R. Rojas, "The backpropagation algorithm", in Neural Networks: A Systematic Introduction, chapter 7, ISBN 978-3-540-60505-8.
[8]. S. Chattopadhyay, "Multilayered feed forward artificial neural network model to predict the average summer-monsoon rainfall in India", 2006.
[9]. Y. Linde, A. Buzo, R. Gray, "An algorithm for vector quantizer design", IEEE Transactions on Communications, No. 28, 1980, pp. 84-95.
[10]. P. and Sanjay Mathur, "A simple weather forecasting model using mathematical regression", Indian Research Journal of Extension Education, special issue vol., Jan.
[11]. O. Terzi, "Monthly rainfall estimation using data mining process", Applied Computational Intelligence and Soft Computing, vol. 2012.
[12]. P. Lynch, "The emergence of numerical weather prediction", Quarterly Journal of the Royal Meteorological Society, Q. J. R. Meteorol. Soc. 133: 43-44, 2007.
[13]. G. Shmueli, N.R. Patel, P.C. Bruce, Data Mining in Excel: Lecture Notes and Cases, draft, December 30, 2005. Distributed by Resampling Stats, Inc., Jackson St., Arlington, VA, USA, info@xlminer.com, www.xlminer.com.
[14]. G. Milligan, M. Cooper (1988), "A study of standardization of variables in cluster analysis", Journal of Classification, 5, 181-204.
[15]. O. van Tongeren (1995), "Cluster Analysis", pages 174-212 in R.H.G. Jongman, C.J.F. ter Braak and O.F.R. van Tongeren (Eds.), Data Analysis in Landscape and Community Ecology, Cambridge & New York: Cambridge University Press.