Data and prognosis for renewable energy


The Hong Kong Polytechnic University
Department of Electrical Engineering

Project code: FYP_27
Data and prognosis for renewable energy
by Choi Man Hin

Final Report
Bachelor of Engineering (Honours) in Electrical Engineering (41470)
Department of Electrical Engineering, The Hong Kong Polytechnic University
Supervisor: Dr. Zhao Xu
Date: 31/3/2018

Abstract

It is important to forecast the wind speed in the prognosis of wind power generation. With accurate wind speed prediction, the energy loss and the operating cost could be greatly reduced. In this paper, we study three different forecasting models, the Artificial Neural Network Model (ANN), the Moving Average Model (MA) and the Autoregressive Model (AR), for the prognosis of wind speed in Hong Kong. The data studied covers different weather parameters measured in Hong Kong, including wind speed, temperature, humidity and wind direction. We use Matlab2015 to carry out the data processing, correlation analysis and feature selection for the ANN model, with Levenberg-Marquardt backpropagation (LM) as the learning algorithm for self-training and generating the predicted outcomes. For MA and AR, Excel2016 is used to produce the forecasting results. This project investigates the intra-comparison within each model and the inter-comparison among ANN, AR and MA to find the best forecasting method for wind speed, with mean square error as the benchmark for comparing accuracy and performance. After all comparisons, we determine a suitable way to forecast the wind speed in Hong Kong among the three methods suggested. After testing, it is noticeable that ANN shows the best performance, followed by AR, while MA performs the worst of the three methods. 10 minutes ahead forecasting is more accurate than 30 minutes ahead forecasting, and the forecasting results perform best in autumn and worst in winter. Despite the long preparation and training time, the ANN model is strongly recommended for forecasting the weather condition. It is hoped that a single model integrating wind speed with other parameters can be developed to improve the accuracy of power forecasting, which will be investigated in the future.

Acknowledgements

Dr. Zhao Xu and his student Mr. Chai Song Jian have been very supportive throughout my project, which began in October last year. This project involves many mathematics and statistics concepts, and Dr. Zhao and Mr. Chai have provided me with many suggestions and much academic advice on the work. They gave me a thorough introduction to this project, covering concepts such as the Artificial Neural Network and time series forecasting, which enabled me to have a clear direction on my final year project from the beginning. Mr. Chai in particular supported me with some complicated programming code in the Matlab2015 computing program, so that I could run the forecasting models smoothly on the learning algorithms.

TABLE OF CONTENTS

Abstract
Acknowledgements
Table of contents
Introduction
Literature review
Methodology
Results
Discussion
Conclusion
References

Introduction

Renewable energy (RE) exploits natural elements to generate electricity in an environmentally friendly way, with low emission of greenhouse gases such as carbon dioxide into the atmosphere. We have a responsibility to develop RE instead of burning coal for electricity generation, to give a better environment to the next generation. Mahoney [10] suggests that engineers are now trying to improve the power quality by integrating renewable energy systems into the central power grid. Luo [2] reveals that wind and solar power development encounters many limitations, the biggest drawback being cost, with high initial and operating costs. At the same time, the wind speed varies from time to time, which leads to a non-constant power supply. As a result, a wind generation system is usually connected with traditional generators to secure stable power quality and supply: once the RE system stops working, the generators generate electricity to compensate for it. The generators are required to keep a wide range of spinning reserve, but this also leads to great energy loss. Consequently, some governments decide to delay the implementation of RE development. In this paper, we focus on the development of wind power generation. The biggest limitation of wind power development is the high levelized cost of electricity. Jung [7] points out that with accurate forecasting of the wind speed, the energy loss in the spinning reserve of the generators and in the exploitation of alternative battery systems could be greatly reduced. Kim [8] suggests that this would provide economic incentives for the government to launch wind power generation development. The historical weather data measured in Hong Kong in 2008 and 2009, including wind speed, humidity, temperature and wind direction, is given for the study of the wind speed forecasting models.
We use three different methods to make predictions of the wind speed: the Artificial Neural Network Model (ANN), the Moving Average Model (MA) and the Autoregressive Model (AR).

Objective

We suggest three different wind forecasting models - Moving Average (MA), Autoregressive (AR) and Artificial Neural Network (ANN). We aim to compare the forecasting results across different seasonal trends and forecasting horizons, and to determine the best-fit wind forecasting model for Hong Kong among the three prediction methods.

Artificial Neural Network Model (ANN)

The Artificial Neural Network Model is a non-linear analysis approach. Tascikaraoglu [9] demonstrates that the idea of the multilayer perceptron refers to the mechanism of a self-learning algorithm. It comprises three layers, namely the input, hidden and output layers. The network model exploits the historical data entered at the input layer to undergo self-training and generate the result. Park [1] suggests that throughout the training process, the network model captures all the relevant attributes to adjust the weightings and give out probability events for the different input parameters. The purpose of training is to minimize the global error E, defined as follows:

E = (1/K) Σ_{σ=1}^{K} E_σ --- (1)

E_σ = (1/2) Σ_{i=1}^{M} (O_i - T_i)^2 --- (2)

where K is the number of training patterns, E_σ refers to the error for training pattern σ, M equals the number of output nodes, and O_i and T_i correspond to the network output and the target output at the i-th output node.

[Figure: the basic concept of the ANN model, with 3 different layers]

Levenberg-Marquardt

One of the backpropagation methods, Levenberg-Marquardt, is studied in this paper. Wang [5] demonstrates that Levenberg-Marquardt is a learning method to approximate a function and to approach second-order training speed without having to compute the Hessian matrix. It is an iterative procedure that solves and propagates the following equation:

(J^T J + λI) δ = J^T E --- (3)

where J refers to the Jacobian matrix of the system, λ is the damping factor, δ is the weight update vector, and E is the error vector containing the output errors for each input vector. δ corresponds to the adjustment of the weights needed to reach a better solution.
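To make the update rule in equation (3) concrete, the following is a minimal NumPy sketch of one damped iteration, not the report's Matlab2015 code: the exponential model, the forward-difference Jacobian and all names here are illustrative assumptions.

```python
import numpy as np

def lm_step(model, x, y, params, lam=1e-2, eps=1e-6):
    """One Levenberg-Marquardt update: solve (J^T J + lam*I) delta = J^T E,
    where E = y - model(x, params) and J is the model's Jacobian in params."""
    f0 = model(x, params)
    E = y - f0                                   # error vector
    J = np.empty((x.size, params.size))
    for j in range(params.size):                 # forward-difference Jacobian
        p = params.copy()
        p[j] += eps
        J[:, j] = (model(x, p) - f0) / eps
    delta = np.linalg.solve(J.T @ J + lam * np.eye(params.size), J.T @ E)
    return params + delta                        # adjusted weights

# Illustrative fit of y = a*exp(b*x) to noiseless data
model = lambda x, p: p[0] * np.exp(p[1] * x)
x = np.linspace(0.0, 1.0, 20)
y = model(x, np.array([2.0, 1.5]))
p = np.array([1.0, 1.0])
for _ in range(50):
    p = lm_step(model, x, y, p)
print(p)  # approaches the true parameters [2.0, 1.5]
```

Note that at a fixed point (δ = 0) the equation reduces to J^T E = 0, the ordinary least-squares condition, so the damping factor λ affects only the path taken, not the solution reached.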

Moving Average Model (MA)

Given a series of numbers and a fixed subset size, the moving average of the first element is obtained by taking the average of the initial fixed subset of the number series. Borhan [6] says that it is a shifting-forward method extending along the timeline; that is, excluding the first number of the series and including the next value in the subset.

Autoregressive Model (AR)

The Autoregressive Moving Average Model (ARMA) is a linear approach for forecasting from stationary models that revert to the mean. We usually use ARMA to study a time series of data. Similar to the ANN model, AR aims to forecast future values from the historical data via autocorrelation, but ARMA does not involve a self-training process. Carli [4] suggests that the AR calculation gives different coefficients, or weightings, to the past historical data to make the future prediction. The general formula of AR is:

X_t = a_1 X_{t-1} + a_2 X_{t-2} + ... + a_p X_{t-p} + w_t

where X_t refers to the predicted value, a_1 to a_p correspond to the coefficients, X_{t-1} to X_{t-p} are the past data values, and w_t refers to the constant term.

Mean Square Error (MSE)

MSE is an estimator measuring the discrepancy between the predicted and actual values. Its formula is:

MSE = (1/n) Σ_{i=1}^{n} (y_i - ȳ_i)^2

where n refers to the number of data points, y_i denotes the predicted value and ȳ_i corresponds to the actual value. It is a common estimator, especially for such a large number of data sets.
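The three definitions above can be sketched in a few lines. This is only an illustrative Python version (the report itself uses Excel2016 for MA and AR), and the function names are assumptions:

```python
import numpy as np

def ma_forecast(series, p):
    """MA(p) one-step forecast: equal-weight average of the last p values."""
    return float(np.mean(series[-p:]))

def ar_forecast(series, coeffs, const=0.0):
    """AR(p) one-step forecast: X_t = a1*X_{t-1} + ... + ap*X_{t-p} + w_t."""
    p = len(coeffs)
    lags = series[::-1][:p]          # X_{t-1}, X_{t-2}, ..., X_{t-p}
    return float(np.dot(coeffs, lags) + const)

def mse(actual, predicted):
    """Mean square error between actual and predicted values."""
    a, f = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean((a - f) ** 2))

# Toy wind speed series for illustration
s = [5.0, 6.0, 7.0, 8.0, 9.0]
print(ma_forecast(s, 5))                 # mean of all 5 values = 7.0
print(ar_forecast(s, [0.5, 0.3, 0.2]))  # 0.5*9 + 0.3*8 + 0.2*7 = 8.3
print(mse([1.0, 2.0], [1.0, 4.0]))      # (0 + 4) / 2 = 2.0
```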

Literature review

There have been several studies on wind speed forecasting in other countries. One example is the use of an ANN algorithm to forecast the wind speed at Knock Airport. In 2014, Mukh Raj YadavKumar, Gaurav Singh and Anurag Chaturvedi [3] suggested using Levenberg-Marquardt backpropagation and the Scaled Conjugate Gradient algorithm along with Bayesian Regularization to build the forecasting models. Their results show that Levenberg-Marquardt backpropagation and Bayesian Regularization perform better than Scaled Conjugate Gradient. Their paper includes an intra-comparison of the forecasting accuracy of the three ANN methods mentioned. However, the results show a high MSE. The input matrix of their ANN model includes many different parameters at the same time, including lagged wind speed, wind direction, temperature, dew point, wet bulb, vapor pressure, relative humidity and sea level pressure. It is questionable whether feeding so many different factors as input parameters degrades the wind speed forecasting performance, and the paper does not include a cross-correlation analysis between each factor and the wind speed. In this paper, it is suggested that the input parameters of the ANN be determined by the results of autocorrelation and cross-correlation. More dimensions of comparison, such as the seasonal trend, are also investigated in this project. Hence, this paper emphasizes the importance of the historical data of lagged wind speed for forecasting future values. The simple time series forecasting method, the Moving Average Model (MA), and the more elaborate Autoregressive Model (AR), exploited to forecast wind speed in other papers, are also employed here. These two methods only make use of the past wind speed data for the prognosis of the future wind speed.
Finally, one of the ANN methods, Levenberg-Marquardt backpropagation (LM), is also included for the inter-comparison of these three methods. ANN, AR and MA are three different kinds of methods for carrying out the prognosis of wind speed. This paper is inspired by wind speed forecasting research in other countries; however, wind speed research, or even renewable energy investigation, is uncommon in Hong Kong. This paper therefore tries using AR, MA and Levenberg-Marquardt (LM) backpropagation in ANN, adopting seasonal and trend analysis, in this project.

Methodology

In this project, we mainly use the Matlab2015 and Microsoft Excel computing programs for the data analysis. Several steps precede the investigation of our forecasting models, including data processing, sample normalizing and correlation. Finally, we generate the forecasted outcomes from the three different methods, ANN, AR and MA respectively. It is noticeable that the working process differs slightly among the three models. The following shows the flow of developing each model.

ANN: Classification -> Error Processing -> Averaging -> Normalizing -> Cross Correlation -> Autocorrelation -> Parameter Selection -> Machine Learning -> Prediction

MA / AR: Classification -> Error Processing -> Averaging -> Normalizing -> Calculation -> Prediction

For ANN

Classification

The historical data measured by the Hong Kong Observatory in 2008 is studied in this paper. The parameters studied include wind speed, humidity, temperature and wind direction. The following shows the units of measurement:

Parameter | Unit
Wind speed | m/s (starting from 0)
Humidity | % (ranging from 0 to 100)
Temperature | °C (starting from -273)
Wind direction | degrees (ranging from 0 to 360), where 0° = North, 90° = East, 180° = South, 270° = West

Then, we classify the considerable amount of data systematically into 4 seasons, which are transformed and compared in the last section. With reference to the information shared by the Hong Kong Observatory, we simply divide the 2008 data by season as follows. Note that the first 2 months of data in 2009 are combined with the data in December 2008 as the values for winter.

Season | Date
Spring | 01/03/2008 to 31/05/2008
Summer | 01/06/2008 to 31/08/2008
Autumn | 01/09/2008 to 30/11/2008
Winter | 01/12/2008 to 28/02/2009

Error processing

In fact, a small portion of the data contains errors, marked with an error symbol in the files. The erroneous data would affect the learning algorithm of the models, so it is necessary to handle them. In this project, we replace each erroneous value directly with the previous value.

[Figure: example of error processing; the erroneous entry (in red) is changed to the previous value 0]

Averaging

There are over 2 million data points in total for the wind speed, humidity, temperature and wind direction, measured every minute. To increase the effectiveness of our forecasting model, we average the magnitude of the data over every 10 minutes.
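The two preprocessing steps just described (error replacement and 10-minute averaging) can be sketched as below. This is an illustrative Python version, not the report's Matlab2015/Excel workflow, and `None` is only a stand-in assumption for the error marker in the raw files:

```python
import numpy as np

def fill_errors(values):
    """Replace error readings (here represented by None) with the
    previous valid value, as in the error-processing step."""
    out = []
    prev = 0.0
    for v in values:
        if v is None:            # an error entry in the raw file
            v = prev
        out.append(v)
        prev = v
    return out

def average_blocks(values, block=10):
    """Average every `block` consecutive 1-minute readings into one
    10-minute value; a trailing partial block is dropped."""
    n = (len(values) // block) * block
    arr = np.asarray(values[:n], float).reshape(-1, block)
    return arr.mean(axis=1).tolist()

raw = [3.0, None, 4.0] + [5.0] * 17      # 20 one-minute samples
clean = fill_errors(raw)                 # None -> previous value 3.0
print(average_blocks(clean, 10))         # -> [4.5, 5.0]
```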

Normalization

Next, we normalize the averaged data using the equation X' = (X - u) / (Xmax - Xmin), where X corresponds to the variable, u refers to the mean of all data, and Xmax and Xmin are the greatest and smallest magnitudes among all data. Normalizing helps to standardize the ranges of the different parameters onto a common scale and also eliminates the influence of extreme values in the statistics. Hence, it is indispensable to carry out this process before we apply the data to the learning model.

Feature Selection

Feature selection can be classified into two ways: cross correlation and autocorrelation. This part is only included in the ANN model. We use Matlab2015 to carry it out.

Cross correlation

Cross correlation figures out the similarity between two different parameters, defining a correlation coefficient to express the level of similarity in the analysis. A greater magnitude of correlation coefficient shows a greater extent of correlation between the selected parameters. A value in [0, 1] represents positive correlation, which reveals that one variable increases with the other, while a value in [-1, 0] corresponds to negative correlation, which demonstrates that one variable increases as the other decreases. Note that in terms of negative correlation, given two coefficients of -0.8 and -0.2 respectively, -0.8 demonstrates the greater extent of negative linkage with the other variable. In this paper, we study the cross correlation among the 4 different parameters: wind speed, humidity, temperature and wind direction. As the main purpose of this section is to figure out the influence of humidity, temperature and wind direction on the wind speed, we carry out the following analyses: a) Wind Speed cross Humidity b) Wind Speed cross Temperature c) Wind Speed cross Wind Direction
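The normalization formula and the lagged correlation coefficient can be sketched as follows. This is an illustrative NumPy version (the report does these steps in Matlab2015); the sample series and names are invented for demonstration, and the same helper serves autocorrelation by correlating a series with itself:

```python
import numpy as np

def normalize(x):
    """Normalization used in the report: X' = (X - u) / (Xmax - Xmin)."""
    x = np.asarray(x, float)
    return (x - x.mean()) / (x.max() - x.min())

def cross_corr(a, b, lag):
    """Correlation coefficient between a at time t and b at time t - lag."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    if lag > 0:
        a, b = a[lag:], b[:-lag]
    return float(np.corrcoef(a, b)[0, 1])

# Invented toy series: wind speed rising while humidity falls
speed = [2.0, 3.0, 5.0, 4.0, 6.0, 7.0, 6.5, 8.0]
humid = [80.0, 78.0, 75.0, 76.0, 72.0, 70.0, 71.0, 68.0]
print(round(cross_corr(speed, humid, 0), 3))   # strongly negative here
```

For autocorrelation, `cross_corr(speed, speed, lag)` gives the coefficient between the series and its own lagged copy, which is the quantity screened against the 0.8 threshold later in the report.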

Autocorrelation

Unlike cross correlation, autocorrelation aims to identify the appropriate time series structure in a non-random way. We know that past wind speed data is strongly correlated with the future wind speed. Hence, in the autocorrelation analysis, we determine the number of previous data points highly connected to the next-moment wind speed, and thus adjust the number of input variables in our ANN models. Autocorrelation is of more significant importance than cross correlation in our forecasting model. As with cross correlation, the value of the correlation coefficient ranges over [-1, 1], where [0, 1] represents positive correlation and [-1, 0] corresponds to negative correlation. Note also that we usually consider a correlation coefficient greater than 0.8 as a high linkage with the future values. Hence, we focus on coefficient values in the range [0.8, 1].

Parameter selection

After observing the cross correlation and autocorrelation results, we determine the suitable input parameters for our ANN model: we input the parameters having a correlation coefficient greater than 0.8 (an empirical value) into the forecasting model (LM) in Matlab2015. The learning program undergoes self-training and generates the forecasting result. In this project, we use 70% of the data for training, 15% for validation and the last 15% as testing results (around 1950 to 2000 points), which are then compared among the different forecasting methods. In the training process, we compare the performance of the ANN using different numbers of neurons [2, 4, 6, 8, 10, 12, 14, 16, 18 and 20].

For MA

In the MA model, we can only use the past wind speed data for the prediction, not the other parameters. Traditionally, the moving average is just a tool to reveal the current situation, without any forecasting; in this project, I turn MA into a forecasting method. Taking MA (5) as an example, MA (5) refers to the value obtained by taking the equal average of the previous 5 data points; in the testing, we compare the difference between the MA (5) value and the actual next-moment value. The number of data points tested is about 1950 to 2000, which is around 15% of the total normalized wind speed data in every season, the same as in the ANN. In this paper, we carry out an intra-comparison to find the greatest accuracy among MA (4), MA (6) and MA (8), then use the best performer to compare with the other methods.

For AR

After classification, error processing, averaging and normalizing, we apply the formula

X_t = a_1 X_{t-1} + a_2 X_{t-2} + ... + a_p X_{t-p} + w_t

where X_t refers to the predicted value, a_1 to a_p correspond to the coefficients, X_{t-1} to X_{t-p} are the past data values, and w_t refers to the constant term. We carry out AR (4), AR (6) and AR (8), meaning we use the past 4, 6 and 8 data points, giving them the corresponding coefficients, to perform the future predictions. As with MA, we can only use the past wind speed data for the prediction, not the other parameters, in the AR model. We use around 1950 to 2000 sets of results for comparison across the different seasons and look-ahead times.

Result comparison

We use MSE as the indicator to compare performance; a smaller MSE represents greater accuracy. The mean squared error is an important benchmark exploited for measuring the performance or accuracy of an estimator, and it is essential for relaying the concepts of precision, bias and accuracy in statistical estimation:

MSE = (1/n) Σ_{i=1}^{n} (y_i - ȳ_i)^2

where n refers to the number of data points, y_i denotes the predicted value and ȳ_i corresponds to the actual value.
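The intra-comparison described here (scoring MA (4), MA (6) and MA (8) by MSE over a stretch of held-out values) can be sketched as below. This is an illustrative Python version with an invented synthetic series, not the report's Excel2016 workbook:

```python
import numpy as np

def ma_mse(series, order):
    """Walk-forward MA(order) one-step forecasts over the series, scored by MSE."""
    series = np.asarray(series, float)
    preds = [series[t - order:t].mean() for t in range(order, len(series))]
    actual = series[order:]
    return float(np.mean((actual - np.asarray(preds)) ** 2))

# Synthetic smooth series standing in for normalized wind speed
rng = np.random.default_rng(0)
wind = np.cumsum(rng.normal(0.0, 0.1, 500)) + 5.0
scores = {p: ma_mse(wind, p) for p in (4, 6, 8)}
best = min(scores, key=scores.get)
print(best, scores[best])        # the order with the smallest MSE wins
```

The same walk-forward scoring applies to the AR variants, with the AR formula replacing the equal-weight average inside the loop.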

Results

Intra-comparison of ANN

We first focus on the results of parameter selection, followed by the forecasted outcomes. For cross correlation, we have Wind Speed cross Humidity, Wind Speed cross Temperature and Wind Speed cross Wind Direction. The following shows the results of cross correlation, using different lengths of lagged time as the x-axis.

Wind Speed cross Humidity - spring

[Figure: cross correlation between wind speed and humidity in spring; x-axis shows lagged time from 0 to 1200 (full view) and from 0 to 168 (zoomed view)]

For the cross correlation between wind speed and humidity in spring, the highest positive correlation is 0.12 at lagged time 64, while the highest negative correlation occurs at lagged time 580.

Wind Speed cross Humidity - summer

[Figure: cross correlation between wind speed and humidity in summer; x-axis shows lagged time from 0 to 1200 (full view) and from 0 to 168 (zoomed view)]

For the cross correlation between wind speed and humidity in summer, the highest positive correlation is 0.15 at lagged time 90, while the highest negative correlation occurs at lagged time 1150.

Wind Speed cross Humidity - autumn

[Figure: cross correlation between wind speed and humidity in autumn; x-axis shows lagged time from 0 to 1200 (full view) and from 0 to 168 (zoomed view)]

For the cross correlation between wind speed and humidity in autumn, the highest positive correlation is 0.3 at lagged time 72, while the highest negative correlation occurs at lagged time 850.

Wind Speed cross Humidity - winter

[Figure: cross correlation between wind speed and humidity in winter; x-axis shows lagged time from 0 to 1200 (full view) and from 0 to 168 (zoomed view)]

For the cross correlation between wind speed and humidity in winter, the highest positive correlation is 0.25 at lagged time 72, while the highest negative correlation occurs at lagged time 1020.

Wind Speed cross Temperature - spring

[Figure: cross correlation between wind speed and temperature in spring; x-axis shows lagged time from 0 to 1200 (full view) and from 0 to 168 (zoomed view)]

For the cross correlation between wind speed and temperature in spring, the highest positive correlation is 0.3 at lagged time 60, while the highest negative correlation occurs at lagged time 800.

Wind Speed cross Temperature - summer

[Figure: cross correlation between wind speed and temperature in summer; x-axis shows lagged time from 0 to 1200 (full view) and from 0 to 168 (zoomed view)]

For the cross correlation between wind speed and temperature in summer, the highest positive correlation is 0.2 at lagged time 1150, while the highest negative correlation occurs at lagged time 650.

Wind Speed cross Temperature - autumn

[Figure: cross correlation between wind speed and temperature in autumn; x-axis shows lagged time from 0 to 1200 (full view) and from 0 to 168 (zoomed view)]

For the cross correlation between wind speed and temperature in autumn, the highest positive correlation is 0.14 at lagged time 0, while the highest negative correlation occurs at lagged time 1200.

Wind Speed cross Temperature - winter

[Figure: cross correlation between wind speed and temperature in winter; x-axis shows lagged time from 0 to 1200 (full view) and from 0 to 168 (zoomed view)]

For the cross correlation between wind speed and temperature in winter, the highest positive correlation is 0.18 at lagged time 300, while the highest negative correlation occurs at lagged time 650.

Wind Speed cross Wind Direction - spring

[Figure: cross correlation between wind speed and wind direction in spring; x-axis shows lagged time from 0 to 1200 (full view) and from 0 to 264 (zoomed view)]

For the cross correlation between wind speed and wind direction in spring, the highest positive correlation is 0.2 at various lagged times, while the highest negative correlation occurs at lagged time 0.

Wind Speed cross Wind Direction - summer

[Figure: cross correlation between wind speed and wind direction in summer; x-axis shows lagged time from 0 to 1200 (full view) and from 0 to 168 (zoomed view)]

For the cross correlation between wind speed and wind direction in summer, the highest positive correlation is 0.12 at lagged time 1150, while the highest negative correlation occurs at lagged time 750.

Wind Speed cross Wind Direction - autumn

[Figure: cross correlation between wind speed and wind direction in autumn; x-axis shows lagged time from 0 to 1200 (full view) and from 0 to 264 (zoomed view)]

For the cross correlation between wind speed and wind direction in autumn, the highest positive correlation occurs at lagged time 425, while the highest negative correlation is -0.2 at lagged time 48.

Wind Speed cross Wind Direction - winter

[Figure: cross correlation between wind speed and wind direction in winter; x-axis shows lagged time from 0 to 1200 (full view) and from 0 to 168 (zoomed view)]

For the cross correlation between wind speed and wind direction in winter, the highest positive correlation is 0.12 at lagged time 300, while the highest negative correlation is -0.2 at lagged time 5.

The following summarizes the cross correlation between wind speed and humidity (values taken from the graphs above):

Season | Highest correlation (+) | Highest correlation (-)
Spring | 0.12 (lag 64) | at lag 580
Summer | 0.15 (lag 90) | at lag 1150
Autumn | 0.3 (lag 72) | at lag 850
Winter | 0.25 (lag 72) | at lag 1020

Between wind speed and temperature:

Season | Highest correlation (+) | Highest correlation (-)
Spring | 0.3 (lag 60) | at lag 800
Summer | 0.2 (lag 1150) | at lag 650
Autumn | 0.14 (lag 0) | at lag 1200
Winter | 0.18 (lag 300) | at lag 650

Between wind speed and wind direction:

Season | Highest correlation (+) | Highest correlation (-)
Spring | 0.2 (various lags) | at lag 0
Summer | 0.12 (lag 1150) | at lag 750
Autumn | at lag 425 | -0.2 (lag 48)
Winter | 0.12 (lag 300) | -0.2 (lag 5)

To make a simple conclusion, all the alternative parameters - humidity, temperature and wind direction - show very low correlation with the wind speed, with most cases averaging only around 0.2 in correlation coefficient. As mentioned in the methodology, we only consider a correlation coefficient greater than 0.8 as a relevant input parameter for the ANN model. As a result, we do not involve humidity, temperature or wind direction in our ANN model training, given that they are not sufficiently correlated.

Wind Speed Autocorrelation - spring

[Figure: autocorrelation of wind speed in spring; x-axis shows lagged time from 0 to 500 (full view) and from 0 to 100 (zoomed view)]

From the full-scale autocorrelation graph, the correlation coefficient shows a decreasing trend as the number of lags increases, indicating that more recent wind speed values have a higher relationship with the next-moment wind speed. From the zoomed graph, it is observable that 10 points have correlation coefficients in the range [0.8, 1]. Hence, we use the previous 10 wind speed data points to forecast the next wind speed value in spring in our ANN model.

Wind Speed Autocorrelation - summer

[Figure: autocorrelation of wind speed in summer; x-axis shows lagged time from 0 to 500 (full view) and from 0 to 100 (zoomed view)]

From the full-scale autocorrelation graph, the correlation coefficient shows a decreasing trend as the number of lags increases, indicating that more recent wind speed values have a higher relationship with the next-moment wind speed. From the zoomed graph, it is observable that 6 points have correlation coefficients in the range [0.8, 1]. Hence, we use the previous 6 wind speed data points to forecast the next wind speed value in summer in our ANN model.

Wind Speed Autocorrelation - autumn

[Figure: autocorrelation of wind speed in autumn; x-axis shows lagged time from 0 to 500 (full view) and from 0 to 100 (zoomed view)]

From the full-scale autocorrelation graph, the correlation coefficient shows a decreasing trend as the number of lags increases, indicating that more recent wind speed values have a higher relationship with the next-moment wind speed. From the zoomed graph, it is observable that 7 points have correlation coefficients in the range [0.8, 1]. Hence, we use the previous 7 wind speed data points to forecast the next wind speed value in autumn in our ANN model.

Wind Speed Autocorrelation - winter

[Figure: autocorrelation of wind speed in winter; x-axis shows lagged time from 0 to 500 (full view) and from 0 to 100 (zoomed view)]

From the full-scale autocorrelation graph, the correlation coefficient shows a decreasing trend as the number of lags increases, indicating that more recent wind speed values have a higher relationship with the next-moment wind speed. From the zoomed graph, it is observable that 5 points have correlation coefficients in the range [0.8, 1]. Hence, we use the previous 5 wind speed data points to forecast the next wind speed value in winter in our ANN model.

From the above results, we can summarize the input variables of our ANN model in a table:

Season | Points in [0.8, 1] | 10 minutes ahead (input -> target) | 30 minutes ahead (input -> target)
Spring | 10 | m to m+9 -> m+10 | m to m+9 -> m+12
Summer | 6 | m to m+5 -> m+6 | m to m+5 -> m+8
Autumn | 7 | m to m+6 -> m+7 | m to m+6 -> m+9
Winter | 5 | m to m+4 -> m+5 | m to m+4 -> m+7

That is, 10 lagged values are entered as input parameters in spring, 6 in summer, 7 in autumn and 5 in winter. The target values are the values 10 minutes and 30 minutes later for the 10-minute and 30-minute look-ahead cases respectively. Hence, we transform our normalized data into two different tables, one for 10 minutes ahead forecasting and another for 30 minutes ahead forecasting. Using the summer data as an example, the original normalized data is listed from top to bottom, and we then transform it into the two tables shown on the next pages.

[Table: input-target patterns for 10 minutes ahead forecasting in summer]

This table is designed for the 10 minutes ahead forecasting: the values in the first 6 columns are the input variables, while the value in the 7th column is the target variable. We use every previous 6 data points to forecast the wind speed 10 minutes ahead. Each row combines an input vector with its target, and these pattern sets are processed by our ANN model in the next step.

[Table: input-target patterns for 30 minutes ahead forecasting in summer]

This table is designed for the 30 minutes ahead forecasting. Similar to the 10-minute case, the first 6 columns are the input variables and the 7th column is the target variable; however, we use every previous 6 data points to forecast the wind speed 30 minutes ahead, so the order of the data in the 7th column differs slightly between the two tables. These patterns are likewise processed by our ANN model, and the results are then generated. We use different numbers of neurons to find the best performance of the LM method.
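The construction of the two pattern tables can be sketched as below, assuming 10-minute steps so that 10 minutes ahead is 1 step and 30 minutes ahead is 3 steps after the last input. This is an illustrative Python version of the transformation, not the report's Matlab2015 code, and all names are assumptions:

```python
import numpy as np

def build_patterns(series, n_lags, steps_ahead):
    """Build (input, target) training patterns: each row of X holds n_lags
    consecutive values m .. m+n_lags-1, and y holds the value steps_ahead
    10-minute steps after the last input value."""
    series = np.asarray(series, float)
    X, y = [], []
    last = len(series) - (n_lags - 1) - steps_ahead
    for m in range(last):
        X.append(series[m:m + n_lags])
        y.append(series[m + n_lags - 1 + steps_ahead])
    return np.array(X), np.array(y)

s = np.arange(20.0)               # stand-in for normalized summer wind speed
X10, y10 = build_patterns(s, n_lags=6, steps_ahead=1)   # 10 minutes ahead
X30, y30 = build_patterns(s, n_lags=6, steps_ahead=3)   # 30 minutes ahead
print(X10.shape)                  # (14, 6): inputs m..m+5
print(y10[0], y30[0])             # targets m+6 = 6.0 and m+8 = 8.0
```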

Intra-comparison of ANN (LM) model

The following shows the performance of LM for 10-minute-ahead forecasting in spring.

[Table: MSE by number of neurons]

We can observe that LM(14) gets the best result with the lowest MSE, while LM(18) performs the worst.

[Figure: 10 mins ahead LM(14) in spring — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for the best-performing 10-minute-ahead model in spring.
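Mean square error, the benchmark used for all the comparisons that follow, is simply the average squared difference between actual and forecast values. A minimal sketch in Python (the report itself computes this in MATLAB and Excel):

```python
import numpy as np

def mse(actual, forecast):
    """Mean square error between two equal-length series."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return float(np.mean((actual - forecast) ** 2))

# A perfect forecast has MSE 0; larger values mean larger average errors
print(mse([0.2, 0.5, 0.7], [0.2, 0.5, 0.7]))  # -> 0.0
```

Because the data are normalized before training, the MSE values compared across models are on the same scale.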

The following shows the performance of LM for 10-minute-ahead forecasting in summer.

[Table: MSE by number of neurons]

We can observe that LM(4), LM(16) and LM(18) share the best result with the same lowest MSE, while LM(20) performs the worst.

[Figure: 10 mins ahead LM(4) in summer — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for one of the best-performing 10-minute-ahead models in summer.

The following shows the performance of LM for 10-minute-ahead forecasting in autumn.

[Table: MSE by number of neurons]

We can observe that LM(6) and LM(14) share the best result with the same lowest MSE, while LM(4) performs the worst.

[Figure: 10 mins ahead LM(6) in autumn — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for one of the best-performing 10-minute-ahead models in autumn.

The following shows the performance of LM for 10-minute-ahead forecasting in winter.

[Table: MSE by number of neurons]

We can observe that LM(4) and LM(18) share the best result with the same lowest MSE, while LM(8) and LM(12) perform the worst.

[Figure: 10 mins ahead LM(4) in winter — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for one of the best-performing 10-minute-ahead models in winter.

The following shows the performance of LM for 30-minute-ahead forecasting in spring.

[Table: MSE by number of neurons]

We can observe that LM(12) gets the best result with the lowest MSE, while LM(6) and LM(20) perform the worst.

[Figure: 30 mins ahead LM(12) in spring — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for one of the best-performing 30-minute-ahead models in spring.

The following shows the performance of LM for 30-minute-ahead forecasting in summer.

[Table: MSE by number of neurons]

We can observe that LM(6), LM(10) and LM(12) share the best result with the same lowest MSE, while LM(2), LM(16) and LM(20) perform the worst.

[Figure: 30 mins ahead LM(6) in summer — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for one of the best-performing 30-minute-ahead models in summer.

The following shows the performance of LM for 30-minute-ahead forecasting in autumn.

[Table: MSE by number of neurons]

We can observe that LM(14) gets the best result with the lowest MSE, and the others show similar results within a narrow range.

[Figure: 30 mins ahead LM(12) in autumn — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for one of the best-performing 30-minute-ahead models in autumn.

The following shows the performance of LM for 30-minute-ahead forecasting in winter.

[Table: MSE by number of neurons]

We can observe that LM(16) gets the best result with the lowest MSE, while LM(6) and LM(12) perform the worst.

[Figure: 30 mins ahead LM(16) in winter — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for one of the best-performing 30-minute-ahead models in winter.

In the following, we summarize the best performance in each season for 10-minute-ahead and 30-minute-ahead ANN forecasting.

[Table: best MSE by season for 10-minute-ahead and 30-minute-ahead ANN forecasting]

To give a short summary, we find that the performance of the prediction model is not directly proportional to the number of neurons in the ANN: even if we increase the number of neurons, and therefore the training time, the accuracy does not seem to improve greatly. Besides, 10-minute look-ahead forecasting shows greater accuracy than 30-minute look-ahead forecasting. In autumn especially, the MSE of the 30-minute look-ahead forecast is nearly four times that of the 10-minute forecast, and the remaining seasons also show a large MSE gap between the 10-minute and 30-minute cases.

Intra-comparison of MA model

The following shows the 10-minute-ahead forecasting results of the MA method in spring using MA(4), MA(6) and MA(8).

[Table: MSE for MA(4), MA(6) and MA(8) in spring]

We can observe that MA(4) gets the best result with the lowest MSE, while MA(8) performs the worst.

[Figure: 10 mins ahead MA(4) in spring — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for the best-performing 10-minute-ahead model in spring.

The following shows the 10-minute-ahead forecasting results of the MA method in summer using MA(4), MA(6) and MA(8).

[Table: MSE for MA(4), MA(6) and MA(8) in summer]

We can observe that MA(4) gets the best result with the lowest MSE, while MA(8) performs the worst.

[Figure: 10 mins ahead MA(4) in summer — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for the best-performing 10-minute-ahead model in summer.

The following shows the 10-minute-ahead forecasting results of the MA method in autumn using MA(4), MA(6) and MA(8).

[Table: MSE for MA(4), MA(6) and MA(8) in autumn]

We can observe that MA(4) gets the best result with the lowest MSE, while MA(8) performs the worst.

[Figure: 10 mins ahead MA(4) in autumn — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for the best-performing 10-minute-ahead model in autumn.

The following shows the 10-minute-ahead forecasting results of the MA method in winter using MA(4), MA(6) and MA(8).

[Table: MSE for MA(4), MA(6) and MA(8) in winter]

We can observe that MA(4) gets the best result with the lowest MSE, while MA(8) performs the worst.

[Figure: 10 mins ahead MA(4) in winter — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for the best-performing 10-minute-ahead model in winter.

The following shows the 30-minute-ahead forecasting results of the MA method in spring using MA(4), MA(6) and MA(8).

[Table: MSE for MA(4), MA(6) and MA(8) in spring]

We can observe that MA(6) gets the best result with the lowest MSE, while MA(8) performs the worst.

[Figure: 30 mins ahead MA(4) in spring — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for 30-minute-ahead forecasting in spring.

The following shows the 30-minute-ahead forecasting results of the MA method in summer using MA(4), MA(6) and MA(8).

[Table: MSE for MA(4), MA(6) and MA(8) in summer]

We can observe that MA(4) gets the best result with the lowest MSE, while MA(8) performs the worst.

[Figure: 30 mins ahead MA(4) in summer — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for the best-performing 30-minute-ahead model in summer.

The following shows the 30-minute-ahead forecasting results of the MA method in autumn using MA(4), MA(6) and MA(8).

[Table: MSE for MA(4), MA(6) and MA(8) in autumn]

We can observe that MA(4) gets the best result with the lowest MSE, while MA(8) performs the worst.

[Figure: 30 mins ahead MA(4) in autumn — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for the best-performing 30-minute-ahead model in autumn.

The following shows the 30-minute-ahead forecasting results of the MA method in winter using MA(4), MA(6) and MA(8).

[Table: MSE for MA(4), MA(6) and MA(8) in winter]

We can observe that MA(4) gets the best result with the lowest MSE, while MA(8) performs the worst.

[Figure: 30 mins ahead MA(4) in winter — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for the best-performing 30-minute-ahead model in winter.

In the following, we summarize the best performance in each season for 10-minute-ahead and 30-minute-ahead MA forecasting.

[Table: best MSE by season for 10-minute-ahead and 30-minute-ahead MA forecasting]

The moving average gives equal weight to the past data when predicting future values. MA(4) frequently outperforms MA(6), which in turn outperforms MA(8); this means the wind speed is more strongly correlated with the most recent data. We also found that MA, despite being the simplest method, does not give poor accuracy: in fact, the MSE levels are not very high and are better than expected.
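An equally weighted moving-average forecast of order n, as used above, is just the mean of the n most recent observations. A short Python sketch (illustrative; the report computes MA in Excel):

```python
import numpy as np

def ma_forecast(series, order):
    """MA(order) one-step forecast: the mean of the last `order` values."""
    window = np.asarray(series[-order:], dtype=float)
    return float(window.mean())

# MA(4) weights the four most recent normalized wind speeds equally
recent = [0.3, 0.4, 0.6, 0.5, 0.7]
next_step = ma_forecast(recent, order=4)  # mean of the last four values
```

A shorter window reacts faster to recent changes, which is consistent with MA(4) beating MA(6) and MA(8) here.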

Intra-comparison of AR model

The following shows the 10-minute-ahead forecasting results of the AR method in spring using AR(4), AR(6) and AR(8).

[Table: MSE for AR(4), AR(6) and AR(8) in spring]

We can observe that AR(6) and AR(8) share the best result, performing slightly better than AR(4).

[Figure: 10 mins ahead AR(8) in spring — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for the best-performing 10-minute-ahead model in spring.

The following shows the 10-minute-ahead forecasting results of the AR method in summer using AR(4), AR(6) and AR(8).

[Table: MSE for AR(4), AR(6) and AR(8) in summer]

We can observe that AR(4), AR(6) and AR(8) all show the same MSE.

[Figure: 10 mins ahead AR(4) in summer — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for one of the best-performing 10-minute-ahead models in summer.

The following shows the 10-minute-ahead forecasting results of the AR method in autumn using AR(4), AR(6) and AR(8).

[Table: MSE for AR(4), AR(6) and AR(8) in autumn]

We can observe that AR(4), AR(6) and AR(8) all show the same MSE.

[Figure: 10 mins ahead AR(6) in autumn — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for one of the best-performing 10-minute-ahead models in autumn.

The following shows the 10-minute-ahead forecasting results of the AR method in winter using AR(4), AR(6) and AR(8).

[Table: MSE for AR(4), AR(6) and AR(8) in winter]

We can observe that AR(6) and AR(8) share the best result, performing slightly better than AR(4).

[Figure: 10 mins ahead AR(8) in winter — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for the best-performing 10-minute-ahead model in winter.

The following shows the 30-minute-ahead forecasting results of the AR method in spring using AR(4), AR(6) and AR(8).

[Table: MSE for AR(4), AR(6) and AR(8) in spring]

We can observe that AR(6) and AR(8) share the best result with the lowest MSE, while AR(4) performs the worst.

[Figure: 30 mins ahead AR(8) in spring — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for the best-performing 30-minute-ahead model in spring.

The following shows the 30-minute-ahead forecasting results of the AR method in summer using AR(4), AR(6) and AR(8).

[Table: MSE for AR(4), AR(6) and AR(8) in summer]

We can observe that AR(4) and AR(6) share the best result, performing slightly better than AR(8).

[Figure: 30 mins ahead AR(4) in summer — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for one of the best-performing 30-minute-ahead models in summer.

The following shows the 30-minute-ahead forecasting results of the AR method in autumn using AR(4), AR(6) and AR(8).

[Table: MSE for AR(4), AR(6) and AR(8) in autumn]

We can observe that AR(4), AR(6) and AR(8) all show the same MSE.

[Figure: 30 mins ahead AR(8) in autumn — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for one of the best-performing 30-minute-ahead models in autumn.

The following shows the 30-minute-ahead forecasting results of the AR method in winter using AR(4), AR(6) and AR(8).

[Table: MSE for AR(4), AR(6) and AR(8) in winter]

We can observe that AR(8) gets the best result with the lowest MSE, while AR(4) performs the worst.

[Figure: 30 mins ahead AR(8) in winter — actual vs forecast over the time series, normalized values]

The above diagram shows the difference between the actual and forecast results for the best-performing 30-minute-ahead model in winter.

In the following, the best performance in each season for 10-minute-ahead and 30-minute-ahead AR forecasting is summarized.

[Table: best MSE by season for 10-minute-ahead and 30-minute-ahead AR forecasting]

As with ANN and MA, 10-minute-ahead forecasting shows better results than 30-minute-ahead forecasting. The results are satisfying; the MSEs of the 10-minute look-ahead predictions in summer and autumn are especially low.
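Unlike the moving average, an AR(p) model fits a weight to each of the p lagged values by least squares rather than weighting them equally. A Python sketch of fitting and one-step forecasting, assuming no intercept term for simplicity (the report computes AR in Excel):

```python
import numpy as np

def fit_ar(series, p):
    """Least-squares AR(p) coefficients (no intercept, for simplicity).
    Row t of the design matrix is [s[t], ..., s[t+p-1]], predicting s[t+p]."""
    series = np.asarray(series, dtype=float)
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    y = series[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def ar_forecast(series, coeffs):
    """One-step-ahead forecast from the last p observations."""
    p = len(coeffs)
    return float(np.dot(np.asarray(series, dtype=float)[-p:], coeffs))

# A sinusoid obeys an exact 2-term linear recurrence, so AR(2) fits it closely
data = np.sin(np.arange(50) * 0.1)
coeffs = fit_ar(data, p=2)
pred = ar_forecast(data, coeffs)
```

Fitted weights typically favour the most recent lags, which matches the earlier observation that wind speed correlates most strongly with the latest data.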

Inter-comparison of the best results of the ANN, MA and AR models

[Table: best 10-minute look-ahead MSE of ANN, MA and AR by season, with the average MSE of the best-performance results]

[Table: best 30-minute look-ahead MSE of ANN, MA and AR by season, with the average MSE of the best-performance results]

Comparing the best-performance results for 10-minute look-ahead forecasting: for spring, ANN performs best, followed by AR with an MSE 12.9% greater, and MA performs worst with an MSE 38.7% higher than ANN's. For summer, ANN performs best, followed by AR with an MSE 4.7% greater, and MA performs worst with an MSE 28.5% higher than ANN's. For autumn, ANN and AR perform equally well, and MA performs worst with an MSE 20.0% higher than ANN's and AR's. For winter, ANN and AR again perform equally well, and MA performs worst with an MSE 24.1% higher than ANN's and AR's. On average, ANN shows the best performance; AR is about 4% less accurate than ANN, and MA is about 28% less accurate than ANN.
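The percentage figures above are relative increases of each model's MSE over the best model's MSE. A trivial sketch, shown with hypothetical values only (the report's own MSE figures did not survive transcription):

```python
def pct_increase(mse_model, mse_best):
    """Relative increase of one model's MSE over the best model's, in percent."""
    return 100.0 * (mse_model - mse_best) / mse_best

# Hypothetical MSE values, for illustration only (not the report's figures)
best_mse = 0.020
worse_mse = 0.025
gap = pct_increase(worse_mse, best_mse)  # a model about 25% less accurate
```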


Research Note INTELLIGENT FORECASTING OF RAINFALL AND TEMPERATURE OF SHIRAZ CITY USING NEURAL NETWORKS *

Research Note INTELLIGENT FORECASTING OF RAINFALL AND TEMPERATURE OF SHIRAZ CITY USING NEURAL NETWORKS * Iranian Journal of Science & Technology, Transaction B, Vol. 28, No. B1 Printed in Islamic Republic of Iran, 24 Shiraz University Research Note INTELLIGENT FORECASTING OF RAINFALL AND TEMPERATURE OF SHIRAZ

More information

CHAPTER 6 CONCLUSION AND FUTURE SCOPE

CHAPTER 6 CONCLUSION AND FUTURE SCOPE CHAPTER 6 CONCLUSION AND FUTURE SCOPE 146 CHAPTER 6 CONCLUSION AND FUTURE SCOPE 6.1 SUMMARY The first chapter of the thesis highlighted the need of accurate wind forecasting models in order to transform

More information

Neural Networks and the Back-propagation Algorithm

Neural Networks and the Back-propagation Algorithm Neural Networks and the Back-propagation Algorithm Francisco S. Melo In these notes, we provide a brief overview of the main concepts concerning neural networks and the back-propagation algorithm. We closely

More information

GL Garrad Hassan Short term power forecasts for large offshore wind turbine arrays

GL Garrad Hassan Short term power forecasts for large offshore wind turbine arrays GL Garrad Hassan Short term power forecasts for large offshore wind turbine arrays Require accurate wind (and hence power) forecasts for 4, 24 and 48 hours in the future for trading purposes. Receive 4

More information

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others)

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others) Machine Learning Neural Networks (slides from Domingos, Pardo, others) For this week, Reading Chapter 4: Neural Networks (Mitchell, 1997) See Canvas For subsequent weeks: Scaling Learning Algorithms toward

More information

Modelling and Prediction of 150KW PV Array System in Northern India using Artificial Neural Network

Modelling and Prediction of 150KW PV Array System in Northern India using Artificial Neural Network International Journal of Engineering Science Invention ISSN (Online): 2319 6734, ISSN (Print): 2319 6726 Volume 5 Issue 5 May 2016 PP.18-25 Modelling and Prediction of 150KW PV Array System in Northern

More information

Prediction for night-time ventilation in Stanford s Y2E2 building

Prediction for night-time ventilation in Stanford s Y2E2 building Prediction for night-time ventilation in Stanford s Y2E2 building Balthazar Donon Stanford University December 16, 2016 Indoor ventilation I. Introduction In the United States, around 40% of the energy

More information

CSC321 Lecture 5: Multilayer Perceptrons

CSC321 Lecture 5: Multilayer Perceptrons CSC321 Lecture 5: Multilayer Perceptrons Roger Grosse Roger Grosse CSC321 Lecture 5: Multilayer Perceptrons 1 / 21 Overview Recall the simple neuron-like unit: y output output bias i'th weight w 1 w2 w3

More information

Forecasting River Flow in the USA: A Comparison between Auto-Regression and Neural Network Non-Parametric Models

Forecasting River Flow in the USA: A Comparison between Auto-Regression and Neural Network Non-Parametric Models Journal of Computer Science 2 (10): 775-780, 2006 ISSN 1549-3644 2006 Science Publications Forecasting River Flow in the USA: A Comparison between Auto-Regression and Neural Network Non-Parametric Models

More information

A Hybrid Model of Wavelet and Neural Network for Short Term Load Forecasting

A Hybrid Model of Wavelet and Neural Network for Short Term Load Forecasting International Journal of Electronic and Electrical Engineering. ISSN 0974-2174, Volume 7, Number 4 (2014), pp. 387-394 International Research Publication House http://www.irphouse.com A Hybrid Model of

More information

CSE 190 Fall 2015 Midterm DO NOT TURN THIS PAGE UNTIL YOU ARE TOLD TO START!!!!

CSE 190 Fall 2015 Midterm DO NOT TURN THIS PAGE UNTIL YOU ARE TOLD TO START!!!! CSE 190 Fall 2015 Midterm DO NOT TURN THIS PAGE UNTIL YOU ARE TOLD TO START!!!! November 18, 2015 THE EXAM IS CLOSED BOOK. Once the exam has started, SORRY, NO TALKING!!! No, you can t even say see ya

More information

AN ARTIFICIAL NEURAL NETWORK MODEL FOR ROAD ACCIDENT PREDICTION: A CASE STUDY OF KHULNA METROPOLITAN CITY

AN ARTIFICIAL NEURAL NETWORK MODEL FOR ROAD ACCIDENT PREDICTION: A CASE STUDY OF KHULNA METROPOLITAN CITY Proceedings of the 4 th International Conference on Civil Engineering for Sustainable Development (ICCESD 2018), 9~11 February 2018, KUET, Khulna, Bangladesh (ISBN-978-984-34-3502-6) AN ARTIFICIAL NEURAL

More information

An Improved Method of Power System Short Term Load Forecasting Based on Neural Network

An Improved Method of Power System Short Term Load Forecasting Based on Neural Network An Improved Method of Power System Short Term Load Forecasting Based on Neural Network Shunzhou Wang School of Electrical and Electronic Engineering Huailin Zhao School of Electrical and Electronic Engineering

More information

AN INTRODUCTION TO NEURAL NETWORKS. Scott Kuindersma November 12, 2009

AN INTRODUCTION TO NEURAL NETWORKS. Scott Kuindersma November 12, 2009 AN INTRODUCTION TO NEURAL NETWORKS Scott Kuindersma November 12, 2009 SUPERVISED LEARNING We are given some training data: We must learn a function If y is discrete, we call it classification If it is

More information

Thunderstorm Forecasting by using Artificial Neural Network

Thunderstorm Forecasting by using Artificial Neural Network Thunderstorm Forecasting by using Artificial Neural Network N.F Nik Ismail, D. Johari, A.F Ali, Faculty of Electrical Engineering Universiti Teknologi MARA 40450 Shah Alam Malaysia nikfasdi@yahoo.com.my

More information

Neural Networks. CSE 6363 Machine Learning Vassilis Athitsos Computer Science and Engineering Department University of Texas at Arlington

Neural Networks. CSE 6363 Machine Learning Vassilis Athitsos Computer Science and Engineering Department University of Texas at Arlington Neural Networks CSE 6363 Machine Learning Vassilis Athitsos Computer Science and Engineering Department University of Texas at Arlington 1 Perceptrons x 0 = 1 x 1 x 2 z = h w T x Output: z x D A perceptron

More information

Artifical Neural Networks

Artifical Neural Networks Neural Networks Artifical Neural Networks Neural Networks Biological Neural Networks.................................. Artificial Neural Networks................................... 3 ANN Structure...........................................

More information

Science - 4th grade practice test

Science - 4th grade practice test Name: Instructions: Bubble the correct answer. Read each choice before marking your answer. Copyright 2000-2002 Measured Progress, All Rights Reserved : Use the picture below to answer question 1. 1. A

More information

WEATHER PREDICTION FOR INDIAN LOCATION USING MACHINE LEARNING

WEATHER PREDICTION FOR INDIAN LOCATION USING MACHINE LEARNING Volume 118 No. 22 2018, 1945-1949 ISSN: 1314-3395 (on-line version) url: http://acadpubl.eu/hub ijpam.eu WEATHER PREDICTION FOR INDIAN LOCATION USING MACHINE LEARNING 1 Jitcha Shivang, 2 S.S Sridhar 1

More information

Time Series and Forecasting

Time Series and Forecasting Time Series and Forecasting Introduction to Forecasting n What is forecasting? n Primary Function is to Predict the Future using (time series related or other) data we have in hand n Why are we interested?

More information

) (d o f. For the previous layer in a neural network (just the rightmost layer if a single neuron), the required update equation is: 2.

) (d o f. For the previous layer in a neural network (just the rightmost layer if a single neuron), the required update equation is: 2. 1 Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.034 Artificial Intelligence, Fall 2011 Recitation 8, November 3 Corrected Version & (most) solutions

More information

3. This room is located in a building in New York State. On which side of the building is the window located? (1) north (3) east (2) south (4) west

3. This room is located in a building in New York State. On which side of the building is the window located? (1) north (3) east (2) south (4) west 1. The planetary winds in Earth s Northern Hemisphere generally curve to the right due to Earth s (1) orbit around the Sun (2) spin on its axis (3) magnetic field (4) force of gravity Base your answers

More information

WEATHER DEPENENT ELECTRICITY MARKET FORECASTING WITH NEURAL NETWORKS, WAVELET AND DATA MINING TECHNIQUES. Z.Y. Dong X. Li Z. Xu K. L.

WEATHER DEPENENT ELECTRICITY MARKET FORECASTING WITH NEURAL NETWORKS, WAVELET AND DATA MINING TECHNIQUES. Z.Y. Dong X. Li Z. Xu K. L. WEATHER DEPENENT ELECTRICITY MARKET FORECASTING WITH NEURAL NETWORKS, WAVELET AND DATA MINING TECHNIQUES Abstract Z.Y. Dong X. Li Z. Xu K. L. Teo School of Information Technology and Electrical Engineering

More information

Statistical NLP for the Web

Statistical NLP for the Web Statistical NLP for the Web Neural Networks, Deep Belief Networks Sameer Maskey Week 8, October 24, 2012 *some slides from Andrew Rosenberg Announcements Please ask HW2 related questions in courseworks

More information

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others)

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others) Machine Learning Neural Networks (slides from Domingos, Pardo, others) For this week, Reading Chapter 4: Neural Networks (Mitchell, 1997) See Canvas For subsequent weeks: Scaling Learning Algorithms toward

More information

That s Hot: Predicting Daily Temperature for Different Locations

That s Hot: Predicting Daily Temperature for Different Locations That s Hot: Predicting Daily Temperature for Different Locations Alborz Bejnood, Max Chang, Edward Zhu Stanford University Computer Science 229: Machine Learning December 14, 2012 1 Abstract. The problem

More information

Reading Group on Deep Learning Session 1

Reading Group on Deep Learning Session 1 Reading Group on Deep Learning Session 1 Stephane Lathuiliere & Pablo Mesejo 2 June 2016 1/31 Contents Introduction to Artificial Neural Networks to understand, and to be able to efficiently use, the popular

More information

Combined GIS, CFD and Neural Network Multi-Zone Model for Urban Planning and Building Simulation. Methods

Combined GIS, CFD and Neural Network Multi-Zone Model for Urban Planning and Building Simulation. Methods Combined GIS, CFD and Neural Network Multi-Zone Model for Urban Planning and Building Simulation Meng Kong 1, Mingshi Yu 2, Ning Liu 1, Peng Gao 2, Yanzhi Wang 1, Jianshun Zhang 1 1 School of Engineering

More information

Artificial Neural Networks. Historical description

Artificial Neural Networks. Historical description Artificial Neural Networks Historical description Victor G. Lopez 1 / 23 Artificial Neural Networks (ANN) An artificial neural network is a computational model that attempts to emulate the functions of

More information

Estimation of Pan Evaporation Using Artificial Neural Networks A Case Study

Estimation of Pan Evaporation Using Artificial Neural Networks A Case Study International Journal of Current Microbiology and Applied Sciences ISSN: 2319-7706 Volume 6 Number 9 (2017) pp. 3052-3065 Journal homepage: http://www.ijcmas.com Case Study https://doi.org/10.20546/ijcmas.2017.609.376

More information

No. 6 Determining the input dimension of a To model a nonlinear time series with the widely used feed-forward neural network means to fit the a

No. 6 Determining the input dimension of a To model a nonlinear time series with the widely used feed-forward neural network means to fit the a Vol 12 No 6, June 2003 cfl 2003 Chin. Phys. Soc. 1009-1963/2003/12(06)/0594-05 Chinese Physics and IOP Publishing Ltd Determining the input dimension of a neural network for nonlinear time series prediction

More information

Predicting Floods in North Central Province of Sri Lanka using Machine Learning and Data Mining Methods

Predicting Floods in North Central Province of Sri Lanka using Machine Learning and Data Mining Methods Thilakarathne & Premachandra Predicting Floods in North Central Province of Sri Lanka using Machine Learning and Data Mining Methods H. Thilakarathne 1, K. Premachandra 2 1 Department of Physical Science,

More information

Internet Engineering Jacek Mazurkiewicz, PhD

Internet Engineering Jacek Mazurkiewicz, PhD Internet Engineering Jacek Mazurkiewicz, PhD Softcomputing Part 11: SoftComputing Used for Big Data Problems Agenda Climate Changes Prediction System Based on Weather Big Data Visualisation Natural Language

More information

CSC 578 Neural Networks and Deep Learning

CSC 578 Neural Networks and Deep Learning CSC 578 Neural Networks and Deep Learning Fall 2018/19 3. Improving Neural Networks (Some figures adapted from NNDL book) 1 Various Approaches to Improve Neural Networks 1. Cost functions Quadratic Cross

More information

Culway Weigh-In-Motion (WIM) Compensating for Calibration Drift Preliminary Report

Culway Weigh-In-Motion (WIM) Compensating for Calibration Drift Preliminary Report Department of Transport, Energy and Infrastructure Transport Information Management Section Culway Weigh-In-Motion (WIM) Compensating for Calibration Drift Preliminary Report CARL CARUANA Traffic Registrar

More information

Artificial Neural Network : Training

Artificial Neural Network : Training Artificial Neural Networ : Training Debasis Samanta IIT Kharagpur debasis.samanta.iitgp@gmail.com 06.04.2018 Debasis Samanta (IIT Kharagpur) Soft Computing Applications 06.04.2018 1 / 49 Learning of neural

More information

Short-Term Power Production Forecasting in Smart Grid Based on Solar Power Plants

Short-Term Power Production Forecasting in Smart Grid Based on Solar Power Plants International Journal of Engineering and Applied Sciences (IJEAS) Short-Term Power Production Forecasting in Smart Grid Based on Solar Power Plants Qudsia Memon, Nurettin Çetinkaya Abstract Since the world

More information

Unit 8: Introduction to neural networks. Perceptrons

Unit 8: Introduction to neural networks. Perceptrons Unit 8: Introduction to neural networks. Perceptrons D. Balbontín Noval F. J. Martín Mateos J. L. Ruiz Reina A. Riscos Núñez Departamento de Ciencias de la Computación e Inteligencia Artificial Universidad

More information

FORECASTING YIELD PER HECTARE OF RICE IN ANDHRA PRADESH

FORECASTING YIELD PER HECTARE OF RICE IN ANDHRA PRADESH International Journal of Mathematics and Computer Applications Research (IJMCAR) ISSN 49-6955 Vol. 3, Issue 1, Mar 013, 9-14 TJPRC Pvt. Ltd. FORECASTING YIELD PER HECTARE OF RICE IN ANDHRA PRADESH R. RAMAKRISHNA

More information

Using an Artificial Neural Network to Predict Parameters for Frost Deposition on Iowa Bridgeways

Using an Artificial Neural Network to Predict Parameters for Frost Deposition on Iowa Bridgeways Using an Artificial Neural Network to Predict Parameters for Frost Deposition on Iowa Bridgeways Bradley R. Temeyer and William A. Gallus Jr. Graduate Student of Atmospheric Science 31 Agronomy Hall Ames,

More information

1. Introduction. 2. Artificial Neural Networks and Fuzzy Time Series

1. Introduction. 2. Artificial Neural Networks and Fuzzy Time Series 382 IJCSNS International Journal of Computer Science and Network Security, VOL.8 No.9, September 2008 A Comparative Study of Neural-Network & Fuzzy Time Series Forecasting Techniques Case Study: Wheat

More information

FORECASTING ACTIVITY OF THE KILAUA VOLCANE USING INTELLIGENT METHODS OF DATA ANALYSIS. Stanislav Zabielin

FORECASTING ACTIVITY OF THE KILAUA VOLCANE USING INTELLIGENT METHODS OF DATA ANALYSIS. Stanislav Zabielin 94 International Journal Information Theories and Applications, Vol. 25, Number 1, 2018 FORECASTING ACTIVITY OF THE KILAUA VOLCANE USING INTELLIGENT METHODS OF DATA ANALYSIS Stanislav Zabielin Abstract:

More information

Wind Energy Predictions of Small-Scale Turbine Output Using Exponential Smoothing and Feed- Forward Neural Network

Wind Energy Predictions of Small-Scale Turbine Output Using Exponential Smoothing and Feed- Forward Neural Network Wind Energy Predictions of Small-Scale Turbine Output Using Exponential Smoothing and Feed- Forward Neural Network Zaccheus O. Olaofe 1, 2 1 ZakkWealth Energy 2 Faculty of Engineering and Built Environment,

More information

University of Athens School of Physics Atmospheric Modeling and Weather Forecasting Group

University of Athens School of Physics Atmospheric Modeling and Weather Forecasting Group University of Athens School of Physics Atmospheric Modeling and Weather Forecasting Group http://forecast.uoa.gr Data Assimilation in WAM System operations and validation G. Kallos, G. Galanis and G. Emmanouil

More information

Research Article Hybrid Power Forecasting Model for Photovoltaic Plants Based on Neural Network with Air Quality Index

Research Article Hybrid Power Forecasting Model for Photovoltaic Plants Based on Neural Network with Air Quality Index Hindawi International Photoenergy Volume 2017, Article ID 6938713, 9 pages https://doi.org/10.1155/2017/6938713 Research Article Hybrid Power Forecasting Model for Photovoltaic Plants Based on Neural Network

More information

Input layer. Weight matrix [ ] Output layer

Input layer. Weight matrix [ ] Output layer MASSACHUSETTS INSTITUTE OF TECHNOLOGY Department of Electrical Engineering and Computer Science 6.034 Artificial Intelligence, Fall 2003 Recitation 10, November 4 th & 5 th 2003 Learning by perceptrons

More information

Artificial Neural Network Method of Rock Mass Blastability Classification

Artificial Neural Network Method of Rock Mass Blastability Classification Artificial Neural Network Method of Rock Mass Blastability Classification Jiang Han, Xu Weiya, Xie Shouyi Research Institute of Geotechnical Engineering, Hohai University, Nanjing, Jiangshu, P.R.China

More information

Neural Networks. Yan Shao Department of Linguistics and Philology, Uppsala University 7 December 2016

Neural Networks. Yan Shao Department of Linguistics and Philology, Uppsala University 7 December 2016 Neural Networks Yan Shao Department of Linguistics and Philology, Uppsala University 7 December 2016 Outline Part 1 Introduction Feedforward Neural Networks Stochastic Gradient Descent Computational Graph

More information