Data and prognosis for renewable energy


The Hong Kong Polytechnic University
Department of Electrical Engineering
Project code: FYP_27

Data and prognosis for renewable energy

by Choi Man Hin (14072258D)

Final Report
Bachelor of Engineering (Honours) in Electrical Engineering (41470)
Department of Electrical Engineering, The Hong Kong Polytechnic University

Supervisor: Dr. Zhao Xu
Date: 31/3/2018

Abstract
Forecasting the wind speed is essential to the prognosis of wind power generation: with accurate wind speed prediction, energy losses and operating costs can be greatly reduced. In this paper we study three forecasting models, the Artificial Neural Network model (ANN), the Moving Average model (MA) and the Autoregressive model (AR), for the prognosis of wind speed in Hong Kong. The data studied comprise weather parameters measured in Hong Kong in 2008, including wind speed, temperature, humidity and wind direction. We use Matlab 2015 for the data processing, correlation analysis and feature selection in the ANN model, with Levenberg-Marquardt backpropagation (LM) as the learning algorithm that carries out self-training and generates the predicted outcomes. For MA and AR, Excel 2016 is used to produce the forecasting results. This project investigates the intra-comparison and inter-comparison among ANN, AR and MA to find the best forecasting method for wind speed, with mean square error as the benchmark for comparing accuracy and performance. After testing, ANN shows the best performance, followed by AR, with MA the worst of the three methods. Forecasting 10 minutes ahead is more accurate than forecasting 30 minutes ahead, and the forecasting results are best in autumn and worst in winter. Despite the long preparation and training time, the ANN model is strongly recommended for forecasting the weather condition. It is hoped that a single model integrating wind speed with other parameters can be developed to improve the accuracy of power forecasting, which will be investigated in future work.

Acknowledgements
Dr. Zhao Xu and his student Mr. Chai Song Jian have been very supportive throughout my project, which began in October last year. This project involves many concepts from mathematics and statistics, and Dr. Zhao and Mr. Chai provided a great deal of suggestions and academic advice on the work. Their thorough introduction to topics such as the Artificial Neural Network and time series forecasting gave me a clear direction for my final year project from the very beginning. Mr. Chai in particular supported me with some complicated programming code in Matlab 2015, so that I could run the forecasting models smoothly on the learning algorithms.

TABLE OF CONTENTS
Abstract
Acknowledgements
Table of contents
Introduction
Literature Review
Methodology
Result
Discussion
Conclusion
Reference

Introduction
Renewable energy (RE) exploits natural elements to generate electricity in an environmentally friendly way, with low emission of greenhouse gases such as carbon dioxide to the atmosphere. We are responsible for developing RE in place of coal burning for electricity generation, to leave a better environment to the next generation. Mahoney [10] suggests that engineers are now trying to improve power quality by integrating renewable energy systems into the central power grid. Luo [2] reveals that wind and solar power development encounters many limitations, the biggest drawback being cost, with high initial and operating expenses. At the same time, the wind speed varies from time to time, which leads to a non-constant power supply. As a result, a wind generation system is usually connected with traditional generators to secure stable power quality and supply: when the RE system stops working, the generators generate electricity to compensate for it. The generators are required to keep a wide range of spinning reserve, which in turn leads to great energy losses. Consequently, some governments decide to delay the implementation of RE development. In this paper we focus on the development of wind power generation, whose biggest limitation is the high levelized cost of electricity. Jung [7] points out that with accurate forecasting of the wind speed, the energy lost in the generators' spinning reserve and in the exploitation of alternative battery systems could be greatly reduced, and Kim [8] suggests that this would provide the economic incentive for governments to launch wind power generation development. Historical weather data, including wind speed, humidity, temperature and wind direction measured in Hong Kong in 2008 and 2009, are given for the study of the wind speed forecasting models.
We use these data to make predictions of the wind speed with the Artificial Neural Network model (ANN), the Moving Average model (MA) and the Autoregressive model (AR).

Objective
We propose three different wind forecasting models: Moving Average (MA), Autoregressive (AR) and Artificial Neural Network (ANN). We aim to compare the forecasting results across different seasonal trends and forecasting horizons, and to determine the best-fit wind forecasting model among the three prediction methods for Hong Kong.

Artificial Neural Network Model (ANN)
The Artificial Neural Network model is a non-linear analysis approach. Tascikaraoglu [9] demonstrates that the idea of the multilayer perceptron lies in the mechanism of its self-learning algorithm. It comprises three layers, namely the input, hidden and output layers. The network model exploits the historical data entered at the input layer to undergo self-training and generate the result. Park [1] suggests that throughout the training process the network model captures all the relevant attributes, adjusting the weightings to give the probabilities of events for the different input parameters. The purpose of training is to minimize the global error E, defined as follows:

E = (1/K) Σ_{σ=1}^{K} E_σ --- (1)

E_σ = (1/2) Σ_{i=1}^{M} (O_i − T_i)^2 --- (2)

where K is the number of training patterns, E_σ is the error for training pattern σ, M is the number of output nodes, and O_i and T_i are the network output and target output at the i-th output node. The accompanying diagram illustrates the basic concept of the ANN model, with its three layers.

Levenberg-Marquardt
Levenberg-Marquardt, one of the backpropagation methods, is studied in this paper. Wang [5] demonstrates that Levenberg-Marquardt is a learning method that approximates a function and approaches second-order training speed without having to compute the Hessian matrix. It is an iterative procedure that solves and propagates the following equation:

(JᵀJ + λI)δ = JᵀE --- (3)

where J is the Jacobian matrix of the system, λ is the damping factor, δ is the weight update vector, and E is the error vector containing the output errors for each input vector; δ corresponds to the adjustment of the weights toward a better solution.
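The report carries out LM training in Matlab 2015; as an illustration only, the update equation (JᵀJ + λI)δ = JᵀE can be sketched in a few lines of Python. The linear model, toy data and damping factor below are hypothetical, chosen so that the fit converges to known weights.

```python
import numpy as np

# One Levenberg-Marquardt update solves (J^T J + lambda*I) delta = J^T E.
# Here we fit a hypothetical linear model pred = w1*x + w2 to toy data.

def lm_step(J, E, lam):
    """Solve (J^T J + lam*I) delta = J^T E for the weight update delta."""
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ E)

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])            # generated by y = 2x + 1
w = np.array([0.0, 0.0])                      # initial weights [w1, w2]

for _ in range(20):
    pred = w[0] * x + w[1]
    E = y - pred                              # error vector
    J = np.column_stack([x, np.ones_like(x)]) # Jacobian of pred w.r.t. [w1, w2]
    w = w + lm_step(J, E, lam=0.01)

print(np.round(w, 3))                         # converges to [2., 1.]
```

For a real neural network the Jacobian holds the derivatives of every output error with respect to every weight, but the update rule applied at each iteration is the same.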

Moving Average Model (MA)
Given a series of numbers and a fixed subset size, the moving average of the first element is obtained by taking the average of the initial fixed subset of the series. Borhan [6] describes it as a shifting-forward method that extends along the timeline: the first number of the subset is excluded and the next value of the series is included.

Autoregressive Model (AR)
The Autoregressive Moving Average model (ARMA) is a linear approach that forecasts from stationary, mean-reverting models, and is commonly used to study a time series of data. Similar to the ANN model, AR aims to forecast future values from historical data through autocorrelation, but it does not involve a self-training process. Carli [4] suggests that the AR calculation assigns a different coefficient, or weighting, to each past historical value to make the prediction. The general formula of AR is

X_t = a_1 X_{t-1} + a_2 X_{t-2} + ... + a_p X_{t-p} + w_t

where X_t is the predicted value, a_1 to a_p are the coefficients, X_{t-1} to X_{t-p} are the past data values, and w_t is the constant term.

Mean Square Error (MSE)
MSE is an estimator of the agreement between predicted and actual values, given by

MSE = (1/n) Σ_{i=1}^{n} (y_i − ŷ_i)^2

where n is the number of data points, y_i is the predicted value and ŷ_i is the actual value. It is a common estimator, especially for such large data sets.
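As an illustration, the MA forecast, the AR formula and the MSE described above can be sketched directly in Python (the report itself uses Excel 2016 for MA and AR; the short series and coefficients below are made up):

```python
import numpy as np

# Minimal sketches of the MA forecast, the AR formula and MSE (toy data).

def moving_average_forecast(series, k):
    """MA(k): predict the next value as the mean of the last k observations."""
    return float(np.mean(series[-k:]))

def ar_forecast(series, coeffs, const=0.0):
    """AR(p): X_t = a_1*X_{t-1} + ... + a_p*X_{t-p} + w_t."""
    p = len(coeffs)
    lags = series[::-1][:p]                  # X_{t-1}, X_{t-2}, ..., X_{t-p}
    return float(np.dot(coeffs, lags) + const)

def mse(pred, actual):
    """Mean square error between predicted and actual values."""
    pred, actual = np.asarray(pred), np.asarray(actual)
    return float(np.mean((pred - actual) ** 2))

wind = np.array([4.0, 4.2, 4.1, 4.5, 4.4])
print(moving_average_forecast(wind, 4))          # average of the last 4 values
print(ar_forecast(wind, [0.6, 0.3], const=0.4))  # 0.6*4.4 + 0.3*4.5 + 0.4
print(mse([4.3, 4.6], [4.4, 4.5]))               # ((4.3-4.4)^2 + (4.6-4.5)^2)/2
```

The same three functions, applied season by season, are all the machinery the MA and AR comparisons in this report require.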

Literature review
There has been considerable research on wind speed forecasting in other countries. One recent example uses an Artificial Neural Network algorithm to forecast the wind speed at Knock Airport. In 2014, Mukh Raj YadavKumar, Gaurav Singh and Anurag Chaturvedi [3] suggested using Levenberg-Marquardt backpropagation, the Scaled Conjugate Gradient algorithm and Bayesian Regularization to build the forecasting models. Their results show that Levenberg-Marquardt backpropagation and Bayesian Regularization perform better than Scaled Conjugate Gradient. Their paper includes an intra-comparison of the forecasting accuracy of the three ANN methods mentioned; however, the results show a high MSE. Their input matrix feeds many different parameters into the ANN model at the same time, including lagged wind speed, wind direction, temperature, dew point, wet bulb, vapor pressure, relative humidity and sea level pressure. It is questionable whether inputting such a variety of parameters degrades the wind speed forecasting performance, and the paper does not include a cross-correlation analysis between each parameter and the wind speed. In this paper, the input parameters of the ANN are instead determined by the results of autocorrelation and cross-correlation, and further dimensions of comparison, such as the seasonal trend, are also investigated. Hence, this paper emphasizes the importance of the historical lagged wind speed data for forecasting future values. The simple time series forecasting method, the Moving Average model (MA), and the Autoregressive model (AR) have been exploited to forecast wind speed in other papers; these two methods use only the past wind speed data for the prognosis of the future wind speed.
Finally, one of the ANN methods, Levenberg-Marquardt backpropagation (LM), is also included for the inter-comparison of the three approaches: ANN, AR and MA are the three kinds of method used to carry out the prognosis of wind speed. This paper is inspired by wind speed forecasting research from other countries; however, wind speed research, and renewable energy investigation in general, is uncommon in Hong Kong. This paper therefore applies AR, MA and Levenberg-Marquardt backpropagation in ANN, adopting seasonal and trend analysis in this project.

Methodology
In this project we mainly use Matlab 2015 and Microsoft Excel for the data analysis. Several steps precede the investigation of the forecasting models, including data processing, sample normalization and correlation analysis. Finally, we generate the forecast outcomes from the three methods, ANN, AR and MA. Note that the working process differs slightly among the three models. The following shows the flow of development:

ANN: Classification -> Error Processing -> Averaging -> Normalizing -> Cross Correlation -> Autocorrelation -> Parameter Selection -> Machine Learning -> Prediction

MA / AR: Classification -> Error Processing -> Averaging -> Normalizing -> Calculation -> Prediction

For ANN

Classification
The historical data measured by the Hong Kong Observatory in 2008 are studied in this paper. The parameters studied are wind speed, humidity, temperature and wind direction, with the following units of measurement:

Wind speed: m/s (starting from 0)
Humidity: % (ranging from 0 to 100)
Temperature: °C (starting from -273)
Wind direction: degrees (ranging from 0 to 360, where 0° = North, 90° = East, 180° = South, 270° = West)

Then we classify the considerable amount of data systematically into the 4 seasons, which are transformed and compared in the last section. With reference to the information shared by the Hong Kong Observatory, we simply divide the 2008 data by season as follows. It is noticeable that the first 2

months of data in 2009 are combined with the data of December 2008 as the winter values.

Spring: 01/03/2008 to 31/05/2008
Summer: 01/06/2008 to 31/08/2008
Autumn: 01/09/2008 to 30/11/2008
Winter: 01/12/2008 to 28/02/2009

Error processing
A small portion of the data contains errors, represented as 32767 in the files. Erroneous data would affect the learning algorithm of the models, so it is necessary to handle them. In this project, each erroneous value is replaced directly with the previous value; for example, a 32767 reading following a 0 is changed to 0. (The original report illustrates this step with a diagram.)

Averaging
In total there are over 2 million data points for the wind speed, humidity, temperature and wind direction, measured every minute in 2008. In order to increase the effectiveness of the forecasting model, we average the data over every 10 minutes.
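The two cleaning steps above, replacing the 32767 error code with the previous value and then averaging each 10-minute block, can be sketched as follows; the short series is illustrative only:

```python
import numpy as np

# Replace the 32767 error code with the previous value, then average each
# block of 10 one-minute readings into a 10-minute value (toy data).

def fill_errors(values, error_code=32767):
    """Replace each error reading with the value that precedes it."""
    cleaned = list(values)
    for i, v in enumerate(cleaned):
        if v == error_code and i > 0:
            cleaned[i] = cleaned[i - 1]
    return cleaned

def average_blocks(values, block=10):
    """Average consecutive blocks of `block` readings."""
    arr = np.asarray(values, dtype=float)
    n = len(arr) // block * block            # drop any incomplete trailing block
    return arr[:n].reshape(-1, block).mean(axis=1)

raw = [0, 32767, 2, 3, 4, 5, 6, 7, 8, 9]     # one erroneous 1-minute reading
cleaned = fill_errors(raw)
print(cleaned)                               # [0, 0, 2, 3, 4, 5, 6, 7, 8, 9]
print(average_blocks(cleaned))               # one 10-minute average: [4.4]
```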

Normalization
Next, we normalize the averaged data using the equation (X − u) / (Xmax − Xmin), where X is the variable, u is the mean of all data, and Xmax and Xmin are the greatest and smallest magnitudes among all data. Normalizing rescales the different parameters to a comparable range of unit width and also suppresses the influence of extreme values in the statistics; hence, it is indispensable to carry out this process before applying the data to the learning model.

Feature Selection
Feature selection is carried out in two ways, cross correlation and autocorrelation. This part is included only in the ANN model, and is done in Matlab 2015.

Cross correlation
Cross correlation measures the similarity between two different parameters, expressed as a correlation coefficient: the greater the magnitude of the coefficient, the stronger the correlation between the selected parameters. A value in [0, 1] represents positive correlation, where one variable increases with the other, while a value in [-1, 0] represents negative correlation, where one variable increases as the other decreases. Note that for negative correlation, a coefficient of -0.8 demonstrates a stronger negative linkage than a coefficient of -0.2. In this paper we study the cross correlation among the 4 parameters: wind speed, humidity, temperature and wind direction. As the main purpose of this section is to figure out the influence of humidity, temperature and wind direction on the wind speed, we carry out the following analyses: a) Wind Speed cross Humidity, b) Wind Speed cross Temperature, c) Wind Speed cross Wind Direction.
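A minimal sketch of the normalization formula and of a lagged cross-correlation coefficient, assuming short made-up series (the report performs these steps in Matlab 2015):

```python
import numpy as np

# Normalization (X - u) / (Xmax - Xmin) and a lagged cross-correlation
# coefficient between two parameters (toy series).

def normalize(x):
    """Subtract the mean and divide by the range of the data."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / (x.max() - x.min())

def cross_correlation(a, b, lag=0):
    """Correlation between a[t] and b[t - lag]."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    if lag > 0:
        a, b = a[lag:], b[:-lag]
    return float(np.corrcoef(a, b)[0, 1])

speed = np.array([4.0, 4.2, 4.1, 4.5, 4.4, 4.6, 4.8, 4.7])
humidity = np.array([70.0, 72.0, 71.0, 75.0, 74.0, 76.0, 78.0, 77.0])

print(np.round(normalize(speed), 3))                  # zero-mean, unit-range
print(round(cross_correlation(speed, humidity, 1), 3))
```

Sweeping the `lag` argument over a range of values reproduces the kind of lagged correlation curves shown in the Results section.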

Autocorrelation
Unlike cross correlation, autocorrelation aims to identify the appropriate time series structure in a non-random way. We know that past wind speed data are strongly correlated with the future wind speed; hence, in the autocorrelation analysis we determine how many previous data points are highly connected to the next-moment wind speed, and adjust the number of input variables in our ANN model accordingly. Autocorrelation is of more significant importance than cross correlation in our forecasting model. As with cross correlation, the correlation coefficient ranges over [-1, 1], where [0, 1] represents positive correlation and [-1, 0] negative correlation. We usually consider a correlation coefficient greater than 0.8 as a high linkage with the future values; hence, we focus on coefficient values in the range [0.8, 1].

Parameter selection
After observing the cross correlation and autocorrelation results, we determine the suitable input parameters for our ANN model: we input the parameters having a correlation coefficient greater than 0.8 (an empirical value) into the forecasting model (LM) in Matlab 2015. The learning program undergoes self-training and generates the forecasting result. In this project we use 70% of the data for training, 15% for validation and the last 15% (around 1950 to 2000 points) for testing, which are then compared among the different forecasting methods. In the training process, we compare the performance of the ANN with different numbers of neurons [2, 4, 6, 8, 10, 12, 14, 16, 18 and 20].

For MA
In the MA model we can only use the past wind speed data for the prediction, not the other vectors. Traditionally, a moving average is just a tool to reveal the current situation, without any forecasting; in this project, we change the use of MA into a forecasting method.
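The lag selection by autocorrelation (keeping lags whose coefficient exceeds 0.8) and the 70% / 15% / 15% split described above can be sketched as follows; the smooth synthetic signal stands in for the normalized wind speed series:

```python
import numpy as np

# Lag selection by autocorrelation (keep lags with coefficient > 0.8) and the
# 70% / 15% / 15% train / validation / test split. The signal is synthetic.

def autocorrelation(x, lag):
    """Correlation between the series and itself shifted by `lag` steps."""
    x = np.asarray(x, dtype=float)
    return float(np.corrcoef(x[lag:], x[:-lag])[0, 1])

def select_lags(x, max_lag, threshold=0.8):
    """Keep the lags whose autocorrelation exceeds the threshold."""
    return [k for k in range(1, max_lag + 1) if autocorrelation(x, k) > threshold]

def split(data, train=0.70, val=0.15):
    """Split a series into 70% training, 15% validation, 15% testing."""
    n = len(data)
    i, j = round(n * train), round(n * (train + val))
    return data[:i], data[i:j], data[j:]

t = np.arange(200)
series = np.sin(2 * np.pi * t / 50)       # smooth, strongly autocorrelated signal
print(select_lags(series, max_lag=4))     # short lags all pass the 0.8 threshold
train, val, test = split(series)
print(len(train), len(val), len(test))    # 140 30 30
```

The selected lags then fix how many past wind speed values are fed into the ANN as inputs.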
Taking MA(5) as an example, MA(5) is the value obtained by taking the equal-weight average of the previous 5 data points; in this test, we compare the

difference between the MA(5) value and the actual next-moment value. About 1950 to 2000 data points are tested, around 15% of the total normalized wind speed data in each season, the same as in the ANN. In this paper we carry out an intra-comparison to find the greatest accuracy among MA(4), MA(6) and MA(8), then use the best performer to compare with the other methods.

For AR
After classification, error processing, averaging and normalizing, we apply the formula

X_t = a_1 X_{t-1} + a_2 X_{t-2} + ... + a_p X_{t-p} + w_t

where X_t is the predicted value, a_1 to a_p are the coefficients, X_{t-1} to X_{t-p} are the past data values, and w_t is the constant term. We carry out AR(4), AR(6) and AR(8); that is, we use the past 4, 6 and 8 data points, with their corresponding coefficients, to perform the future predictions. As with MA, only the past wind speed data can be used for the prediction in the AR model, not the other vectors. We use around 1950 to 2000 sets of results for comparison across the different seasons and look-ahead times.

Result comparison
We use MSE as the indicator to compare performance; a smaller MSE represents greater accuracy. The mean squared error is an important benchmark for measuring the performance or accuracy of an estimator, and is essential for relaying the concepts of precision, bias and accuracy in statistical estimation. It is given by

MSE = (1/n) Σ_{i=1}^{n} (y_i − ŷ_i)^2

where n is the number of data points, y_i is the predicted value and ŷ_i is the actual value. It is a common estimator, especially for such a large amount of data.
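The intra-comparison just described, scoring MA(4), MA(6) and MA(8) one-step-ahead forecasts by MSE and keeping the best order, can be sketched as follows, with a synthetic random-walk series standing in for the wind speed data:

```python
import numpy as np

# Intra-comparison of MA orders: score MA(4), MA(6), MA(8) one-step-ahead
# forecasts with MSE and keep the best order. The series is synthetic.

def ma_predictions(series, k):
    """One-step-ahead MA(k) forecast for every point after the first k."""
    return np.array([series[i - k:i].mean() for i in range(k, len(series))])

def mse(pred, actual):
    """Mean square error between predicted and actual values."""
    return float(np.mean((np.asarray(pred) - np.asarray(actual)) ** 2))

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(0.0, 0.1, 300)) + 5.0   # wind-speed-like series

scores = {k: mse(ma_predictions(series, k), series[k:]) for k in (4, 6, 8)}
best = min(scores, key=scores.get)
print({k: round(v, 4) for k, v in scores.items()})
print("best order: MA(%d)" % best)
```

The same loop, run on each season's held-out tail, produces the MA entries in the comparison tables of the Results section.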

Results

Intra-comparison of ANN
We first focus on the results of parameter selection and then on the forecast outcomes. For cross correlation we have Wind Speed cross Humidity, Wind Speed cross Temperature and Wind Speed cross Wind Direction. The following shows the cross-correlation results, with the lagged time on the x-axis (each pair of plots covers lags 0 to 1200, with a zoomed view of the shorter lags).

Wind Speed cross Humidity, spring
[Cross-correlation plots: correlation against lagged time, 0 to 1200 and zoomed 0 to 168]
For the cross correlation between wind speed and humidity in spring, the highest positive correlation is 0.12 at lagged time 64, while the strongest negative correlation is -0.19 at lagged time 580.

Wind Speed cross Humidity, summer
[Cross-correlation plots: correlation against lagged time, 0 to 1200 and zoomed 0 to 168]
For the cross correlation between wind speed and humidity in summer, the highest positive correlation is 0.15 at lagged time 90, while the strongest negative correlation is -0.25 at lagged time 1150.

Wind Speed cross Humidity, autumn
[Cross-correlation plots: correlation against lagged time, 0 to 1200 and zoomed 0 to 168]
For the cross correlation between wind speed and humidity in autumn, the highest positive correlation is 0.3 at lagged time 72, while the strongest negative correlation is -0.075 at lagged time 850.

Wind Speed cross Humidity, winter
[Cross-correlation plots: correlation against lagged time, 0 to 1200 and zoomed 0 to 168]
For the cross correlation between wind speed and humidity in winter, the highest positive correlation is 0.25 at lagged time 72, while the strongest negative correlation is -0.075 at lagged time 1020.

Wind Speed cross Temperature, spring
[Cross-correlation plots: correlation against lagged time, 0 to 1200 and zoomed 0 to 168]
For the cross correlation between wind speed and temperature in spring, the highest positive correlation is 0.3 at lagged time 60, while the strongest negative correlation is -0.09 at lagged time 800.

Wind Speed cross Temperature, summer
[Cross-correlation plots: correlation against lagged time, 0 to 1200 and zoomed 0 to 168]
For the cross correlation between wind speed and temperature in summer, the highest positive correlation is 0.2 at lagged time 1150, while the strongest negative correlation is -0.175 at lagged time 650.

Wind Speed cross Temperature, autumn
[Cross-correlation plots: correlation against lagged time, 0 to 1200 and zoomed 0 to 168]
For the cross correlation between wind speed and temperature in autumn, the highest positive correlation is 0.14 at lagged time 0, while the strongest negative correlation is -0.05 at lagged time 1200.

Wind Speed cross Temperature winter The scaled is demonstrated from 0 to 1200 for the x-axis revealing the lagged time The scaled is demonstrated from 0 to 168 for the x-axis revealing the lagged time For the cross correlation between wind speed and temperature in winter, we could figure out that the highest correlation is 0.18 at lagged time 300 in the aspects of positive correlation, while he highest correlation is -0.09 at lagged time 650 in the aspects of negative correlation.

Wind Speed cross Wind Direction spring The scaled is demonstrated from 0 to 1200 for the x-axis revealing the lagged time The scaled is demonstrated from 0 to 264 for the x-axis revealing the lagged time For the cross correlation between wind speed and wind direction in spring, we could figure out that the highest correlation is 0.2 at various lagged time in the aspects of positive correlation, while he highest correlation is -0.35 at lagged time 0 in the aspects of negative correlation.

Wind Speed cross Wind Direction summer The scaled is demonstrated from 0 to 1200 for the x-axis revealing the lagged time The scaled is demonstrated from 0 to 168 for the x-axis revealing the lagged time For the cross correlation between wind speed and wind direction in summer, we could figure out that the highest correlation is 0.12 at lagged time 1150 in the aspects of positive correlation, while he highest correlation is -0.15 at lagged time 750 in the aspects of negative correlation.

Wind Speed cross Wind Direction autumn The scaled is demonstrated from 0 to 1200 for the x-axis revealing the lagged time The scaled is demonstrated from 0 to 264 for the x-axis revealing the lagged time For the cross correlation between wind speed and wind direction in autumn, we could figure out that the highest correlation is 0.005 at lagged time 425 in the aspects of positive correlation, while he highest correlation is -0.2 at lagged time 48 in the aspects of negative correlation.

Wind Speed cross Wind Direction winter The scaled is demonstrated from 0 to 1200 for the x-axis revealing the lagged time The scaled is demonstrated from 0 to 168 for the x-axis revealing the lagged time For the cross correlation between wind speed and wind direction in winter, we could figure out that the highest correlation is 0.12 at lagged time 300 in the aspects of positive correlation, while he highest correlation is -0.2 at lagged time 5 in the aspects of negative correlation.

The following table shows the cross-correlation results between wind speed and humidity:

Season   Highest correlation (+)   Highest correlation (-)
Spring   0.12                      -0.19
Summer   0.15                      -0.25
Autumn   0.3                       -0.075
Winter   0.25                      -0.075

Between wind speed and temperature:

Season   Highest correlation (+)   Highest correlation (-)
Spring   0.3                       -0.09
Summer   0.2                       -0.175
Autumn   0.14                      -0.05
Winter   0.18                      -0.09

Between wind speed and wind direction:

Season   Highest correlation (+)   Highest correlation (-)
Spring   0.2                       -0.35
Summer   0.12                      -0.15
Autumn   0.005                     -0.2
Winter   0.12                      -0.2

To conclude briefly, all the alternative parameters (humidity, temperature, and wind direction) show very weak correlation with wind speed: the correlation coefficients range from 0.005 to 0.35, averaging around 0.2 in most cases. As stated in the methodology, only parameters with a correlation coefficient greater than 0.8 are treated as relevant input parameters for the ANN model. As a result, humidity, temperature, and wind direction are not included in the ANN model training, given that they are not sufficiently correlated.
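The lagged-correlation curves summarized above can be produced by computing the Pearson correlation between one series and a shifted copy of the other. The report's own computation is not shown, so the helper below is an assumed, minimal sketch of that procedure.

```python
import numpy as np

def cross_correlation(x, y, max_lag):
    """Pearson correlation between x(t) and y(t + lag) for lag = 0..max_lag.

    Hypothetical helper: one common way to produce the lagged
    cross-correlation curves described in the text.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    corrs = []
    for lag in range(max_lag + 1):
        a = x[:len(x) - lag] if lag else x
        b = y[lag:]
        corrs.append(np.corrcoef(a, b)[0, 1])
    return np.array(corrs)

# Example: a series compared with a noisy copy of itself delayed by 3 steps;
# the correlation curve should peak at lag 3.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.roll(x, 3) + 0.1 * rng.normal(size=500)
c = cross_correlation(x, y, max_lag=10)
print(list(c).index(max(c)))  # 3
```

The same routine, applied season by season to wind speed paired with humidity, temperature, or wind direction, yields the peak positive and negative values tabulated above.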

Wind Speed Autocorrelation (spring)
[Figure: autocorrelation of wind speed vs. lag; full scale 0-500, zoomed scale 0-100.]
The full-scale autocorrelation plot shows the correlation coefficient decreasing as the lag increases: the more recent a wind-speed value, the more strongly it relates to the next moment's wind speed. In the zoomed plot, 10 lags have correlation coefficients in the range [0.8, 1]. Hence we use the previous 10 wind-speed observations to forecast the next value in spring in our ANN model.

Wind Speed Autocorrelation (summer)
[Figure: autocorrelation of wind speed vs. lag; full scale 0-500, zoomed scale 0-100.]
The full-scale autocorrelation plot again shows the correlation coefficient decreasing with increasing lag. In the zoomed plot, 6 lags have correlation coefficients in the range [0.8, 1]. Hence we use the previous 6 wind-speed observations to forecast the next value in summer in our ANN model.

Wind Speed Autocorrelation (autumn)
[Figure: autocorrelation of wind speed vs. lag; full scale 0-500, zoomed scale 0-100.]
The full-scale autocorrelation plot shows the same decreasing trend with increasing lag. In the zoomed plot, 7 lags have correlation coefficients in the range [0.8, 1]. Hence we use the previous 7 wind-speed observations to forecast the next value in autumn in our ANN model.

Wind Speed Autocorrelation (winter)
[Figure: autocorrelation of wind speed vs. lag; full scale 0-500, zoomed scale 0-100.]
The full-scale autocorrelation plot shows the same decreasing trend with increasing lag. In the zoomed plot, 5 lags have correlation coefficients in the range [0.8, 1]. Hence we use the previous 5 wind-speed observations to forecast the next value in winter in our ANN model.
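The lag-selection rule applied in each season above (count how many leading lags keep the autocorrelation in [0.8, 1]) can be sketched as follows; the function name and the AR(1)-style demo series are illustrative assumptions, not the report's code.

```python
import numpy as np

def count_high_acf_lags(series, threshold=0.8, max_lag=100):
    """Count the leading lags whose sample autocorrelation stays at or
    above `threshold`, stopping at the first lag that drops below it.

    Assumed implementation of the lag-selection rule described above.
    """
    s = np.asarray(series, dtype=float)
    n = 0
    for lag in range(1, max_lag + 1):
        r = np.corrcoef(s[:-lag], s[lag:])[0, 1]
        if r >= threshold:
            n += 1
        else:
            break  # the acf decays with lag, so stop at the first miss
    return n

# Example: a strongly persistent AR(1)-like series, whose acf decays
# geometrically, so only a handful of leading lags exceed 0.8.
rng = np.random.default_rng(1)
e = rng.normal(size=5000)
s = np.zeros(5000)
for t in range(1, 5000):
    s[t] = 0.97 * s[t - 1] + e[t]
print(count_high_acf_lags(s))
```

Running this per seasonal wind-speed series gives the counts used below: 10 (spring), 6 (summer), 7 (autumn), and 5 (winter).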

From the above results, the input variables of our ANN model can be summarized in a table.

Season   Lags with coefficient in [0.8, 1]   Forecasting equation (10 minutes ahead)   Forecasting equation (30 minutes ahead)
Spring   10                                  Input: m to m+9; Target: m+10             Input: m to m+9; Target: m+12
Summer   6                                   Input: m to m+5; Target: m+6              Input: m to m+5; Target: m+8
Autumn   7                                   Input: m to m+6; Target: m+7              Input: m to m+6; Target: m+9
Winter   5                                   Input: m to m+4; Target: m+5              Input: m to m+4; Target: m+8

Ten lagged values are entered as input parameters in the spring case, 6 in summer, 7 in autumn, and 5 in winter. The target values are the wind speeds 10 minutes and 30 minutes later for the 10-minute and 30-minute look-ahead cases respectively. Hence we transform the normalized data into two tables: one for 10-minute-ahead forecasting and one for 30-minute-ahead forecasting. Using the summer data as an example, the original 13248 normalized values are listed from top to bottom; the two transformed tables are shown on the following pages.

The first table is designed for 10-minute-ahead forecasting: the values in the first 6 columns are the input variables, and the value in the 7th column is the target. Every 6 previous observations are used to forecast the wind speed 10 minutes ahead, giving 13242 input-target sets to be processed by the ANN model in the next step. The second table is designed for 30-minute-ahead forecasting. As with the 10-minute case, the first 6 columns hold the inputs and the 7th column the target, but here every 6 previous observations forecast the wind speed 30 minutes ahead, so the ordering of the 7th column differs slightly between the two tables. For 30-minute-ahead forecasting there are 13240 input-target sets, which are likewise processed by the ANN model. The results are then generated; different numbers of neurons are tried to find the best performance of the LM method.
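The windowing transform described above can be sketched directly. Using the summer layout (6 lagged inputs; target at m+6 for 10-minute-ahead, m+8 for 30-minute-ahead), 13248 values yield 13242 and 13240 input-target sets respectively. The function below is an assumed implementation of this transform, not the report's own code.

```python
import numpy as np

def make_dataset(series, n_lags, target_offset):
    """Build (input, target) pairs: row i holds series[i .. i+n_lags-1]
    as inputs and series[i + target_offset] as the target."""
    s = np.asarray(series, dtype=float)
    n_sets = len(s) - target_offset
    X = np.stack([s[i:i + n_lags] for i in range(n_sets)])
    y = s[target_offset:target_offset + n_sets]
    return X, y

# Stand-in for the 13248 normalized summer observations
s = np.arange(13248, dtype=float)
X10, y10 = make_dataset(s, n_lags=6, target_offset=6)   # 10-min-ahead table
X30, y30 = make_dataset(s, n_lags=6, target_offset=8)   # 30-min-ahead table
print(len(X10), len(X30))  # 13242 13240
```

The same call with n_lags set to 10, 7, or 5 (and the corresponding target offsets) produces the spring, autumn, and winter tables.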

Intra-comparison of the ANN (LM) model

The following shows the performance of LM using 10-minute-ahead forecasting in spring.

Number of neurons   MSE
2                   0.0035
4                   0.0034
6                   0.0035
8                   0.0035
10                  0.0035
12                  0.0034
14                  0.0031
16                  0.0033
18                  0.0036
20                  0.0034

We can observe that LM (14) gets the best result with MSE 0.0031, while LM (18) performs the worst with MSE 0.0036.

[Figure: 10-minute-ahead LM(14) forecast vs. actual in spring; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 10-minute-ahead forecast in spring.

The following shows the performance of LM using 10-minute-ahead forecasting in summer.

Number of neurons   MSE
2                   0.0022
4                   0.0021
6                   0.0022
8                   0.0022
10                  0.0022
12                  0.0022
14                  0.0022
16                  0.0021
18                  0.0021
20                  0.0024

We can observe that LM (4), LM (16), and LM (18) get the best result with MSE 0.0021, while LM (20) performs the worst with MSE 0.0024.

[Figure: 10-minute-ahead LM(4) forecast vs. actual in summer; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for one of the best-performing 10-minute-ahead forecasts in summer.

The following shows the performance of LM using 10-minute-ahead forecasting in autumn.

Number of neurons   MSE
2                   0.0016
4                   0.0017
6                   0.0015
8                   0.0016
10                  0.0016
12                  0.0016
14                  0.0015
16                  0.0016
18                  0.0016
20                  0.0016

We can observe that LM (6) and LM (14) get the best result with MSE 0.0015, while LM (4) performs the worst with MSE 0.0017.

[Figure: 10-minute-ahead LM(6) forecast vs. actual in autumn; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for one of the best-performing 10-minute-ahead forecasts in autumn.

The following shows the performance of LM using 10-minute-ahead forecasting in winter.

Number of neurons   MSE
2                   0.0064
4                   0.0058
6                   0.0062
8                   0.0067
10                  0.0062
12                  0.0067
14                  0.0063
16                  0.0059
18                  0.0058
20                  0.0064

We can observe that LM (4) and LM (18) get the best result with MSE 0.0058, while LM (8) and LM (12) perform the worst with MSE 0.0067.

[Figure: 10-minute-ahead LM(4) forecast vs. actual in winter; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for one of the best-performing 10-minute-ahead forecasts in winter.

The following shows the performance of LM using 30-minute-ahead forecasting in spring.

Number of neurons   MSE
2                   0.0057
4                   0.0058
6                   0.0060
8                   0.0057
10                  0.0058
12                  0.0053
14                  0.0059
16                  0.0057
18                  0.0058
20                  0.0060

We can observe that LM (12) gets the best result with MSE 0.0053, while LM (6) and LM (20) perform the worst with MSE 0.0060.

[Figure: 30-minute-ahead LM(12) forecast vs. actual in spring; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 30-minute-ahead forecast in spring.

The following shows the performance of LM using 30-minute-ahead forecasting in summer.

Number of neurons   MSE
2                   0.0040
4                   0.0039
6                   0.0035
8                   0.0039
10                  0.0035
12                  0.0035
14                  0.0038
16                  0.0040
18                  0.0038
20                  0.0040

We can observe that LM (6), LM (10), and LM (12) get the best result with MSE 0.0035, while LM (2), LM (16), and LM (20) perform the worst with MSE 0.0040.

[Figure: 30-minute-ahead LM(6) forecast vs. actual in summer; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for one of the best-performing 30-minute-ahead forecasts in summer.

The following shows the performance of LM using 30-minute-ahead forecasting in autumn.

Number of neurons   MSE
2                   0.0026
4                   0.0026
6                   0.0026
8                   0.0026
10                  0.0026
12                  0.0027
14                  0.0025
16                  0.0027
18                  0.0027
20                  0.0027

We can observe that LM (14) gets the best result with MSE 0.0025, while the others show similar results ranging from 0.0026 to 0.0027.

[Figure: 30-minute-ahead LM forecast vs. actual in autumn; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for one of the best-performing 30-minute-ahead forecasts in autumn.

The following shows the performance of LM using 30-minute-ahead forecasting in winter.

Number of neurons   MSE
2                   0.0108
4                   0.0111
6                   0.0109
8                   0.0118
10                  0.0113
12                  0.0118
14                  0.0109
16                  0.0107
18                  0.0115
20                  0.0109

We can observe that LM (16) gets the best result with MSE 0.0107, while LM (8) and LM (12) perform the worst with MSE 0.0118.

[Figure: 30-minute-ahead LM(16) forecast vs. actual in winter; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for one of the best-performing 30-minute-ahead forecasts in winter.

The best performance in different seasons using 10-minute-ahead and 30-minute-ahead ANN forecasting is summarized below.

Season   10 minutes ahead (MSE)   30 minutes ahead (MSE)
Spring   0.0031                   0.0053
Summer   0.0021                   0.0035
Autumn   0.0015                   0.0025
Winter   0.0058                   0.0107

To give a short summary, the number of neurons in the ANN model is not directly proportional to the performance of the prediction model: even when we increase the number of neurons, and thus spend more training time, the accuracy does not improve substantially. Besides, 10-minute look-ahead forecasting is clearly more accurate than 30-minute look-ahead forecasting: in every season the 30-minute MSE is roughly 1.7 to 1.8 times the corresponding 10-minute MSE.
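The neuron sweep behind the tables above can be sketched as follows. The report trains with MATLAB's Levenberg-Marquardt algorithm, which scikit-learn does not provide, so this illustrative version substitutes MLPRegressor with its default solver and a small synthetic series; the numbers it prints are not the report's results.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Stand-in wind-speed series, normalized to [0, 1] as in the report
rng = np.random.default_rng(2)
s = np.cumsum(rng.normal(size=800))
s = (s - s.min()) / (s.max() - s.min())

# Summer-style windowing: 6 lagged inputs predict the next value
n_lags = 6
X = np.stack([s[i:i + n_lags] for i in range(len(s) - n_lags)])
y = s[n_lags:]
split = int(0.8 * len(X))

# Sweep hidden-layer sizes 2, 4, ..., 20 and record the test MSE of each
results = {}
for n_neurons in range(2, 21, 2):
    model = MLPRegressor(hidden_layer_sizes=(n_neurons,),
                         max_iter=500, random_state=0)
    model.fit(X[:split], y[:split])
    results[n_neurons] = mean_squared_error(y[split:],
                                            model.predict(X[split:]))

best = min(results, key=results.get)
print(best, round(results[best], 4))
```

As the summary above notes, such a sweep typically shows no monotone relationship between neuron count and MSE, which is why each season's best network is picked empirically.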

Intra-comparison of the MA model

The following shows the 10-minute-ahead forecasting results by the MA method in spring using MA (4), MA (6), and MA (8).

Spring   MSE
MA (4)   0.0043
MA (6)   0.0045
MA (8)   0.0048

We can observe that MA (4) gets the best result with MSE 0.0043, while MA (8) performs the worst with MSE 0.0048.

[Figure: 10-minute-ahead MA(4) forecast vs. actual in spring; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 10-minute-ahead forecast in spring.

The following shows the 10-minute-ahead forecasting results by the MA method in summer using MA (4), MA (6), and MA (8).

Summer   MSE
MA (4)   0.0027
MA (6)   0.0030
MA (8)   0.0034

We can observe that MA (4) gets the best result with MSE 0.0027, while MA (8) performs the worst with MSE 0.0034.

[Figure: 10-minute-ahead MA(4) forecast vs. actual in summer; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 10-minute-ahead forecast in summer.

The following shows the 10-minute-ahead forecasting results by the MA method in autumn using MA (4), MA (6), and MA (8).

Autumn   MSE
MA (4)   0.0018
MA (6)   0.0019
MA (8)   0.0021

We can observe that MA (4) gets the best result with MSE 0.0018, while MA (8) performs the worst with MSE 0.0021.

[Figure: 10-minute-ahead MA(4) forecast vs. actual in autumn; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 10-minute-ahead forecast in autumn.

The following shows the 10-minute-ahead forecasting results by the MA method in winter using MA (4), MA (6), and MA (8).

Winter   MSE
MA (4)   0.0072
MA (6)   0.0079
MA (8)   0.0085

We can observe that MA (4) gets the best result with MSE 0.0072, while MA (8) performs the worst with MSE 0.0085.

[Figure: 10-minute-ahead MA(4) forecast vs. actual in winter; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 10-minute-ahead forecast in winter.

The following shows the 30-minute-ahead forecasting results by the MA method in spring using MA (4), MA (6), and MA (8).

Spring   MSE
MA (4)   0.0062
MA (6)   0.0061
MA (8)   0.0063

We can observe that MA (6) gets the best result with MSE 0.0061, while MA (8) performs the worst with MSE 0.0063.

[Figure: 30-minute-ahead MA forecast vs. actual in spring; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 30-minute-ahead forecast in spring.

The following shows the 30-minute-ahead forecasting results by the MA method in summer using MA (4), MA (6), and MA (8).

Summer   MSE
MA (4)   0.0042
MA (6)   0.0045
MA (8)   0.0049

We can observe that MA (4) gets the best result with MSE 0.0042, while MA (8) performs the worst with MSE 0.0049.

[Figure: 30-minute-ahead MA(4) forecast vs. actual in summer; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 30-minute-ahead forecast in summer.

The following shows the 30-minute-ahead forecasting results by the MA method in autumn using MA (4), MA (6), and MA (8).

Autumn   MSE
MA (4)   0.0026
MA (6)   0.0027
MA (8)   0.0029

We can observe that MA (4) gets the best result with MSE 0.0026, while MA (8) performs the worst with MSE 0.0029.

[Figure: 30-minute-ahead MA(4) forecast vs. actual in autumn; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 30-minute-ahead forecast in autumn.

The following shows the 30-minute-ahead forecasting results by the MA method in winter using MA (4), MA (6), and MA (8).

Winter   MSE
MA (4)   0.0108
MA (6)   0.0109
MA (8)   0.0111

We can observe that MA (4) gets the best result with MSE 0.0108, while MA (8) performs the worst with MSE 0.0111.

[Figure: 30-minute-ahead MA(4) forecast vs. actual in winter; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 30-minute-ahead forecast in winter.

The best performance in different seasons using 10-minute-ahead and 30-minute-ahead MA forecasting is summarized below.

Season   10 minutes ahead (MSE)   30 minutes ahead (MSE)
Spring   0.0043                   0.0061
Summer   0.0027                   0.0042
Autumn   0.0018                   0.0026
Winter   0.0072                   0.0108

The moving average gives equal weight to the past data when predicting future values. MA (4) outperformed MA (6), which in turn outperformed MA (8), in almost every case; this suggests the wind speed is more strongly correlated with the most recent data. We also found that using the simplest method, MA, does not yield poor accuracy; in fact, the MSE levels are lower than expected.
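The MA benchmark above can be sketched in a few lines: the forecast is simply the equal-weighted mean of the last k observations, held flat for the forecast horizon. This is an assumed implementation run on a synthetic series; the MSE values it prints are not the report's.

```python
import numpy as np

def ma_forecast_mse(series, order, steps_ahead=1):
    """MA(order) forecast: the plain average of the last `order`
    observations, evaluated `steps_ahead` steps out (equal weights,
    as in the comparison above). Returns the MSE over the series."""
    s = np.asarray(series, dtype=float)
    preds, actuals = [], []
    for t in range(order, len(s) - steps_ahead + 1):
        preds.append(s[t - order:t].mean())
        actuals.append(s[t + steps_ahead - 1])
    preds, actuals = np.array(preds), np.array(actuals)
    return float(np.mean((preds - actuals) ** 2))

# Stand-in normalized wind-speed series (random-walk-like, so the most
# recent observations carry the most information)
rng = np.random.default_rng(3)
s = np.cumsum(rng.normal(size=2000))
s = (s - s.min()) / (s.max() - s.min())
for k in (4, 6, 8):
    print(k, round(ma_forecast_mse(s, k), 5))
```

On persistent series like this, averaging in older observations tends to increase the error, which mirrors the MA (4) < MA (6) < MA (8) pattern observed above.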

Intra-comparison of the AR model

The following shows the 10-minute-ahead forecasting results by the AR method in spring using AR (4), AR (6), and AR (8).

Spring   MSE
AR (4)   0.0039
AR (6)   0.0035
AR (8)   0.0035

We can observe that AR (6) and AR (8) both get the best result with MSE 0.0035, lower than AR (4)'s 0.0039.

[Figure: 10-minute-ahead AR(8) forecast vs. actual in spring; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 10-minute-ahead forecast in spring.

The following shows the 10-minute-ahead forecasting results by the AR method in summer using AR (4), AR (6), and AR (8).

Summer   MSE
AR (4)   0.0022
AR (6)   0.0022
AR (8)   0.0022

We can observe that AR (4), AR (6), and AR (8) all show the same result with MSE 0.0022.

[Figure: 10-minute-ahead AR(4) forecast vs. actual in summer; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for one of the best-performing 10-minute-ahead forecasts in summer.

The following shows the 10-minute-ahead forecasting results by the AR method in autumn using AR (4), AR (6), and AR (8).

Autumn   MSE
AR (4)   0.0015
AR (6)   0.0015
AR (8)   0.0015

We can observe that AR (4), AR (6), and AR (8) all show the same result with MSE 0.0015.

[Figure: 10-minute-ahead AR(6) forecast vs. actual in autumn; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for one of the best-performing 10-minute-ahead forecasts in autumn.

The following shows the 10-minute-ahead forecasting results by the AR method in winter using AR (4), AR (6), and AR (8).

Winter   MSE
AR (4)   0.0059
AR (6)   0.0058
AR (8)   0.0058

We can observe that AR (6) and AR (8) both get the best result with MSE 0.0058, slightly lower than AR (4)'s 0.0059.

[Figure: 10-minute-ahead AR(8) forecast vs. actual in winter; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 10-minute-ahead forecast in winter.

The following shows the 30-minute-ahead forecasting results by the AR method in spring using AR (4), AR (6), and AR (8).

Spring   MSE
AR (4)   0.0063
AR (6)   0.0060
AR (8)   0.0060

We can observe that AR (6) and AR (8) both get the best result with MSE 0.0060, while AR (4) performs the worst with MSE 0.0063.

[Figure: 30-minute-ahead AR(8) forecast vs. actual in spring; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 30-minute-ahead forecast in spring.

The following shows the 30-minute-ahead forecasting results by the AR method in summer using AR (4), AR (6), and AR (8).

Summer   MSE
AR (4)   0.0039
AR (6)   0.0039
AR (8)   0.0040

We can observe that AR (4) and AR (6) get the best result with MSE 0.0039, slightly lower than AR (8)'s 0.0040.

[Figure: 30-minute-ahead AR(4) forecast vs. actual in summer; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for one of the best-performing 30-minute-ahead forecasts in summer.

The following shows the 30-minute-ahead forecasting results by the AR method in autumn using AR (4), AR (6), and AR (8).

Autumn   MSE
AR (4)   0.0025
AR (6)   0.0025
AR (8)   0.0025

We can observe that AR (4), AR (6), and AR (8) all get the same result with MSE 0.0025.

[Figure: 30-minute-ahead AR(8) forecast vs. actual in autumn; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for one of the best-performing 30-minute-ahead forecasts in autumn.

The following shows the 30-minute-ahead forecasting results by the AR method in winter using AR (4), AR (6), and AR (8).

Winter   MSE
AR (4)   0.0102
AR (6)   0.0101
AR (8)   0.0100

We can observe that AR (8) gets the best result with MSE 0.0100, while AR (4) performs the worst with MSE 0.0102.

[Figure: 30-minute-ahead AR(8) forecast vs. actual in winter; normalized values over time steps 0-2500.]

The diagram demonstrates the difference between the actual and forecasted results for the best-performing 30-minute-ahead forecast in winter.

The best performance in different seasons using 10-minute-ahead and 30-minute-ahead AR forecasting is summarized below.

Season   10 minutes ahead (MSE)   30 minutes ahead (MSE)
Spring   0.0035                   0.0060
Summer   0.0022                   0.0039
Autumn   0.0015                   0.0025
Winter   0.0058                   0.0100

As with ANN and MA, 10-minute-ahead forecasting gives better results than 30-minute-ahead forecasting. The results are satisfying: in particular, the 10-minute look-ahead MSEs are only 0.0022 and 0.0015 in summer and autumn respectively.
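An AR(p) model, unlike MA, learns a separate weight for each lag. The report does not show its fitting procedure, so the sketch below fits the coefficients by ordinary least squares on a synthetic AR(2) series and checks that they are recovered; the function names are illustrative assumptions.

```python
import numpy as np

def fit_ar(series, order):
    """Fit AR(order) coefficients by ordinary least squares, with an
    intercept. A simple stand-in for the AR models compared above."""
    s = np.asarray(series, dtype=float)
    X = np.stack([s[i:i + order] for i in range(len(s) - order)])
    X = np.hstack([np.ones((len(X), 1)), X])   # intercept column
    y = s[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # [intercept, weight on oldest lag, ..., weight on newest lag]

# Synthetic AR(2) series: s[t] = 0.6*s[t-1] + 0.3*s[t-2] + noise
rng = np.random.default_rng(4)
e = rng.normal(size=3000)
s = np.zeros(3000)
for t in range(2, 3000):
    s[t] = 0.6 * s[t - 1] + 0.3 * s[t - 2] + e[t]

coef = fit_ar(s, order=2)
print(np.round(coef[1:], 2))  # approximately [0.3, 0.6]
```

Fitting with order 4, 6, or 8 on each seasonal wind-speed series, then computing the MSE of the one-step (or three-step) predictions, reproduces the kind of AR (4)/AR (6)/AR (8) comparison tabulated above.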

Inter-comparison of the best results of the ANN, MA, and AR models

10-minute look-ahead forecast:

Season                          ANN        MA         AR
Spring                          0.0031     0.0043     0.0035
Summer                          0.0021     0.0027     0.0022
Autumn                          0.0015     0.0018     0.0015
Winter                          0.0058     0.0072     0.0058
Average MSE of best results     0.003125   0.004      0.00325

30-minute look-ahead forecast:

Season                          ANN        MA         AR
Spring                          0.0053     0.0061     0.0060
Summer                          0.0035     0.0042     0.0039
Autumn                          0.0025     0.0026     0.0025
Winter                          0.0107     0.0108     0.0100
Average MSE of best results     0.0055     0.005925   0.0056

Comparing the best 10-minute look-ahead results: for spring, ANN performs best with MSE 0.0031, followed by AR with MSE 0.0035 (12.9% higher), while MA performs worst with MSE 0.0043 (38.7% higher than ANN's). For summer, ANN performs best with MSE 0.0021, followed by AR with MSE 0.0022 (4.8% higher), while MA performs worst with MSE 0.0027 (28.6% higher than ANN's). For autumn, ANN and AR both perform best with MSE 0.0015, while MA performs worst with MSE 0.0018 (20.0% higher). For winter, ANN and AR both perform best with MSE 0.0058, while MA performs worst with MSE 0.0072 (24.1% higher). On average, the MSEs of ANN, MA, and AR are 0.003125, 0.004, and 0.00325 respectively: ANN shows the best performance, AR is about 4% less accurate than ANN, and MA about 28% less accurate.
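The percentage gaps quoted above are relative MSE differences against the best model. A small sketch makes the convention explicit (the helper name is illustrative):

```python
# Relative MSE gap used in the comparison above: how much higher a
# model's MSE is than the best model's, in percent.
def pct_above(mse, best):
    return 100.0 * (mse - best) / best

print(round(pct_above(0.0035, 0.0031), 1))  # 12.9 (AR vs. ANN, spring, 10-min)
print(round(pct_above(0.0043, 0.0031), 1))  # 38.7 (MA vs. ANN, spring, 10-min)
```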