Artificial Neural Network-Based Short-Term Demand Forecaster
A.P. Alves da Silva 1 (alex@coep.ufrj.br)
U.P. Rodrigues 2 (ubiratan@triline.com.br)
A.J. Rocha Reis 2 (agnreis@iee.efei.br)
L.S. Moulin 3 (moulin@cepel.br)
P.C. Nascimento 2 (paulocn@iee.efei.br)

1 PEE-COPPE, Federal University of Rio de Janeiro, C.P., Rio de Janeiro, RJ, Brazil
2 Systems Engineering Group (GESis), Institute of Electrical Engineering, Federal University of Itajubá, Av. BPS, Itajubá, MG, Brazil
3 Electric Power Research Center (CEPEL), Av. Hum, s/n, Cidade Universitária - Ilha do Fundão, Rio de Janeiro, RJ, Brazil

Abstract. The importance of Short-Term Load Forecasting (STLF) has been increasing lately. With deregulation and competition, energy price forecasting has become a big business. Bus load forecasting is essential to feed the analytical methods used to determine energy prices. The variability and non-stationarity of loads are getting worse due to the dynamics of energy tariffs. Besides, the number of nodal loads to be predicted does not allow frequent interventions from load forecasting experts. More autonomous load predictors are needed in the new competitive scenario. The application of neural network-based STLF has produced sophisticated practical systems over the years. However, the question of how to maximize the generalization ability of such machines, together with the choices of architecture, activation functions, training set data and size, etc., makes up a huge number of possible combinations for the final Neural Network (NN) design, whose optimal solution has not been found yet. This paper describes a STLF system that uses a non-parametric model based on a linear model coupled with a polynomial network, identified by pruning/growing mechanisms. The load forecaster has special features for data preprocessing and confidence interval calculation, which are also described. Results of load forecasts are presented for one year, with forecasting horizons from 15 minutes to 168 hours ahead.
1. Introduction

With power systems growth and the increase in their complexity, many factors have become influential to electric power generation and consumption (e.g., load management, energy exchange, spot pricing). Therefore, the forecasting process has become even more complex, and more accurate forecasts are needed. The relationship between the load and its exogenous factors is complex and nonlinear, making it quite difficult to model through conventional techniques only (e.g., time series linear models and linear regression analysis). The ability of Neural Networks (NNs) to map complex nonlinear relationships is responsible for the growing number of their applications to load forecasting. Despite their success in that application (Khotanzad et al., 1998), there are still a number of unsolved technical issues, particularly with regard to parameterization. The main issue in the application of multilayer feedforward NNs to time series prediction is how to maximize their generalization ability. This kind of model is sensitive to the choice of the network's architecture, preprocessing of data, choice of activation functions, number of training cycles, size of training sets, learning algorithm and the validation process. This is especially true when a nonstationary system has to be tracked, i.e., when adaptation is necessary, as is the case in load forecasting. This paper describes a STLF system which includes two types of models: a linear one and a NN one. The final forecast is calculated by combining both predictions. Figure (1) shows the block diagram of the load forecaster. A non-parametric NN model has been used to automate the training and adaptation processes. With this approach, the underlying model is not known a priori; it is estimated using a large number of candidate models to describe the available data. The application of this type of model to STLF has been neglected in the literature. The paper is organized as follows.
Section 2 describes the forecasting models. In Section 3, the preprocessing of load data is presented. Techniques such as normalization, standardization, differencing, and digital filtering have been employed. Confidence interval calculation for STLF is the subject of Section 4. The proposed system is evaluated through forecasting simulations in Section 5. Finally, Section 6 presents the main conclusions of this paper.

Figure 1. Block diagram for NeuroDem.

2. Non-Parametric NN Model

The main motivation for developing non-parametric NNs is the creation of fully data-driven models, i.e., automatic selection of a candidate model of the right complexity to describe the training data. The idea is to leave only the data-gathering task to the designer. Nevertheless, the state of the art in this area has not reached that point yet. Every so-called non-parametric model still has some dependence on a few pre-set training parameters. A very useful byproduct of the automatic estimation of the model structure is the selection of the most significant input variables for synthesizing a desired mapping. Input variable selection for NN-based load forecasters has been performed using the same techniques applied to linear models. However, it has been shown that the best input variables for linear models are not among the good input variables for nonlinear ones (Drezga and Rahman, 1998). Linear methods interpret all regular structure in a data set, such as a dominant frequency, as linear correlations. Therefore, linear models are useful only if the power spectrum is a useful characterization of the relevant features of a time series. Linear models can only represent exponentially growing or periodically oscillating behavior. Therefore, all irregular behavior of a system has to be attributed to a random external input to the system. Chaos theory has shown that random input is not the only possible source of irregularity in a system's output.
The goal in creating an ARMA model is to have a residual that is white noise. This is equivalent to producing a flat power spectrum for the residual. In practice, however, this goal cannot be perfectly achieved. Suspicious anomalies in the power spectrum are very common, i.e., the residual's power spectrum is not really flat. Consequently, it is difficult to say whether the residual corresponds to white noise or whether there is still some useful information to be extracted from the time series. Neural networks can find predictable patterns that cannot be detected by classical statistical tests such as auto(cross)correlation coefficients and the power spectrum.
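As a crude illustration of the autocorrelation check mentioned above (a minimal sketch with our own function names, not a formal portmanteau test such as Ljung-Box), one can compare the residual's sample autocorrelations against the approximate 95% band expected for white noise:

```python
import numpy as np

def autocorr(x, max_lag):
    """Sample autocorrelation coefficients r_1 .. r_max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    var = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / var for k in range(1, max_lag + 1)])

def looks_white(residual, max_lag=20):
    """Crude whiteness check: every autocorrelation lies inside the
    approximate 95% band +/- 1.96/sqrt(n) expected for white noise."""
    n = len(residual)
    r = autocorr(residual, max_lag)
    return bool(np.all(np.abs(r) < 1.96 / np.sqrt(n)))
```

A strongly periodic residual fails this check immediately, which is exactly the "anomaly in the power spectrum" situation described above.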
Besides, many observed load series exhibit periods during which they are less predictable, depending on the past history of the series. This dependence on the past of the series cannot be represented by a linear model (Alves da Silva and Moulin, 2000). Linear models fail to consider the fact that certain past histories may permit more accurate forecasting than others. Therefore, differently from nonlinear models, they cannot identify the circumstances under which more accurate forecasts can be expected. The greatest challenges in NN training are related to the issues raised in the previous section. The huge number of possible combinations of all NN training parameters makes their application not very reliable. Non-parametric NN models have been proposed in the literature (Kwok and Yeung, 1997). Although the very first attempt to apply this idea to STLF dates back to 1975 (Dillon et al., 1975), it is still one of the few investigations on this subject, despite the tremendous increase in computational resources.

2.1. Neural Network Training

Non-parametric NN training uses two basic mechanisms for finding the most appropriate architecture: pruning and growing. Pruning methods assume that the initial architecture contains the optimal structure. It is common practice to start the search using an oversized network. The excessive connections and/or neurons have to be removed during training, while adjusting the remaining parts. Pruning methods have the following drawbacks:
- there is no mathematically sound initialization for the NN architecture, therefore initial guesses usually use very large structures; and
- due to the previous argument, a lot of computational effort is wasted.
As growing methods operate in the opposite direction of pruning methods, the shortcomings mentioned before are overcome. However, the incorporation of one element has to be evaluated independently of other elements that could be added later.
Therefore, pruning should be applied as a complementary procedure to growing methods, in order to remove parts of the model that become unnecessary during the constructive process.

2.2. Types of Approximation Functions

The greatest concern when applying nonlinear NNs is to avoid unnecessarily complex models, so as not to overfit the training patterns. The ideal model is the one that matches the complexity of the available data. However, it is desirable to work with a general model that can provide any required degree of nonlinearity. Among the models that can be classified as universal approximators, i.e., those which can approximate any continuous function with arbitrary precision, the following types are the most important:
- multilayer networks;
- local basis function networks;
- trigonometric polynomials;
- algebraic polynomials.
The universal approximators above can be linear in the parameters, although nonlinear in the inputs, or nonlinear in the parameters. Regularization criteria, analytic (e.g., Akaike's Information Criterion, Minimum Description Length, etc.) or based on resampling (e.g., cross-validation), have been proposed. In practice, model regularization considering nonlinearity in the parameters is very difficult. An advantage of using universal approximators that are linear in the parameters is the possibility of decoupling the exploration of the architecture space from the weight space search. Methods for selecting models nonlinear in the parameters attempt to explore both spaces simultaneously, which is an extremely hard nonconvex optimization problem. An algebraic polynomial universal function approximator is presented next. It belongs to the class of models which are linear in the parameters.
Our load forecaster combines a linear predictor based on regression with such a polynomial network (Section 2.3), in order to improve the training method adopted in reference (Dillon et al., 1975).

2.3. The Polynomial Network

The polynomial network can be interpreted as a feedforward NN with supervised learning that synthesizes a polynomial mapping P between input data and the desired output (P: R^m -> R). Each neuron can be expressed by a simple elementary polynomial of order n (e.g., for n = 2, y = A + B x_i + C x_j + D x_i^2 + E x_j^2 + F x_i x_j, where x_i and x_j are external inputs or outputs from previous layers; A, B, C, D, E and F are the polynomial coefficients, which are equivalent to network weights; and y is the neuron output). As the polynomial network is a non-parametric model, it is not necessary to define its initial structure a priori (e.g., number of neurons and layers). The neural network layers are constructed one by one, and each newly generated neuron, formed by combining external inputs or outputs from previous layers, pursues the desired output. For the estimation of the polynomial coefficients related to each newly generated neuron, only a linear regression problem is solved, since the network weights from the previous layers are kept frozen. The training process continues until the creation of a new layer begins to deteriorate the NN generalization ability, according to the adopted regularization criterion. In this case, only one neuron is saved in the last layer, i.e., the one that provides the best generalization, and only those neurons that are necessary to generate the output neuron are preserved in
the previous layers. This non-parametric method, called the Group Method of Data Handling (GMDH) (Farlow, 1984), keeps only the relevant input variables (i.e., significant past load values and past/predicted temperatures) in the remaining network. Figure (2) shows a generic example where seven input variables were initially presented, but only five of them were found to be relevant for synthesizing the desired mapping.

Figure 2. Polynomial network.

The GMDH has two major drawbacks. The first is related to the constraints on the possible architectures, since each layer is formed using only the outputs of the previous layer as inputs. A typical example of this shortcoming is when a good part of the mapping could be represented by a linear function. After the first layer, there is no way to apply regression directly to the input variables. Therefore, the final model is sub-optimal in expressivity. The second drawback is related to the stopping criterion. There is no guarantee that the generalization ability of the best neuron in each layer has a convex behavior. Reference (Kargupta and Smith, 1991) applies genetic algorithms to overcome the above-mentioned drawbacks. Another shortcoming of the GMDH training technique is the setting of the activation function order (i.e., the order of the elementary polynomial). It has been noticed that the order of the elementary polynomial affects the NN performance, since polynomial networks can suffer from excessive oscillatory behavior. The application of a dimensionality expansion of the input space can overcome the above-mentioned problem. The new input variables include the original ones plus nonlinear transformations of these variables. This idea is borrowed from the functional expansion model presented in reference (Pao, 1989). The combination of the GMDH technique with the functional expansion model produces a very powerful NN model due to the automatic input selection capability of the GMDH.
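Because the elementary polynomial is linear in the coefficients A through F, fitting one GMDH neuron reduces to an ordinary least-squares problem. The sketch below is a minimal illustration of this step only (the function names are ours, not the authors' implementation):

```python
import numpy as np

def fit_gmdh_neuron(xi, xj, y):
    """Fit one elementary polynomial neuron of order n = 2,
    y = A + B*xi + C*xj + D*xi^2 + E*xj^2 + F*xi*xj,
    by ordinary least squares (linear in the coefficients)."""
    X = np.column_stack([np.ones_like(xi), xi, xj, xi**2, xj**2, xi * xj])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def eval_gmdh_neuron(coeffs, xi, xj):
    """Evaluate the neuron output for inputs xi, xj."""
    A, B, C, D, E, F = coeffs
    return A + B*xi + C*xj + D*xi**2 + E*xj**2 + F*xi*xj
```

In the full GMDH procedure this fit is repeated for every candidate pair of inputs in a layer, with the weights of previous layers kept frozen, so no nonlinear optimization is ever needed.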
The advantages of the proposed training method are:
- the NN architecture is automatically defined by the training process;
- there is no learning parameter to be set;
- no local minima problem;
- very fast convergence.

3. Data Preprocessing

Sections 3 and 4 deal with important features of our load forecaster. The first relevant feature is related to enhancing STLF via preprocessing (Fig. (3)). The second is concerned with confidence interval estimation for those predictions.

3.1. Normalization & Differencing

Depending on the type of activation function used in the NN output neuron(s), it is necessary to normalize the output variable(s) in order to match the activation function's output range. This procedure usually helps to improve training efficiency. The basic motivation for normalizing input and output variables is to make them equally important to the training process. In some cases, normalization helps to improve the interpretability of the NN mapping as well. The normalization procedure adopted in this paper is the one in which the variables are linearly transformed according to prespecified minimum and maximum values. The process of differencing computes the differences of adjacent values of a load series, i.e., the new series represents the variations of the original one. Differencing helps to improve stationarity. For instance, a linear trend can be easily removed by differencing. Another reason for differencing is that, depending on the variable, the variations can be as important as the original values (e.g., temperature). Differencing can be interpreted as a kind of high-pass filter.
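The two preprocessing steps just described can be sketched as follows. This is an illustration only; the function names and the [-1, 1] target range are our assumptions, since the paper does not state the activation function range:

```python
import numpy as np

def minmax_normalize(series, lo, hi, out_min=-1.0, out_max=1.0):
    """Linearly transform values, mapping the prespecified interval
    [lo, hi] onto the activation-function range [out_min, out_max]."""
    return out_min + (series - lo) * (out_max - out_min) / (hi - lo)

def difference(series):
    """First differences: the new series holds the variations of the
    original one; acts as a crude high-pass filter that removes a
    linear trend."""
    return np.diff(series)
```

Note that differencing a series with a linear trend yields a constant series, which is why it improves stationarity.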
3.2. Filtering

Electric load series are formed by the aggregation of individual consumers of different natures. A good part of the information provided by a load series is useful for forecasting purposes. The rest is related to a random component. Therefore, there are two main reasons for filtering an electric load time series. First, important features of the load series can be emphasized. Second, a partition of the load series into different components can be produced, decreasing the learning effort (Rocha Reis and Alves da Silva, 2000). Digital filters have been used in this work. When applying filters for forecasting, it is necessary to avoid losing important information contained in the original time series. Linear filters have been suggested for avoiding this problem (Rorabaugh, 1997). The idea can be illustrated by the application of a single filter. In order not to lose any relevant information, the filtered series is subtracted from the original one. Therefore, by adding the output of the filter to the result of the subtraction, the original series is perfectly reconstructed. Filters can be characterized by their cutoff frequencies and widths. The width parameter needs careful specification. The smaller it is, the more load values are used for filtering each value. It is not appropriate to use too many load values before and after a certain time slot in order to filter its value. A few adjacent neighbors are supposed to contain the most useful information for this purpose, without excessively enlarging the filter width. Digital filters in the frequency domain are employed in this work. An important point to be taken into account is the problem known as circular convolution. The discrete Fourier transform wraps the time series around in a circle. This is equivalent to appending the beginning of the series at its end and vice-versa.
Therefore, for forecasting purposes, where the last known load values are usually among the most relevant data, circular convolution is a major concern. As it is not possible to avoid it, padding is adopted. Padding means attaching convenient data at the end and/or at the beginning of the load series. The objective is to avoid the influence of circular convolution on both sides of the load series used for training and on the data required for prediction. The padding scheme adopted in this work has been proposed in (Rocha Reis and Alves da Silva, 2000). It consists of appending previous load values at the beginning of the series and forecasted values at the end of it. The following procedure for filtering a load series has been used (Rorabaugh, 1997). Initially, pad as previously described. Reference (Masters, 1995) suggests that the minimum padding on each side of the series can be estimated by dividing 0.8 by the filter width. Then, compute the discrete Fourier transform, Eq. (1):

w_j = Σ_{k=0}^{n-1} [ P_k cos(2πjk/n) + i P_k sin(2πjk/n) ]   (1)

Following that, perform low-pass filtering in the frequency domain by applying an energy decay factor, Eq. (2), to w_j beyond the filter cutoff frequency j_c. The parameter l determines the filter width:

H(j) = e^{-[(j - j_c)/l]^2},  for j > j_c   (2)

Next, apply the inverse transform, Eq. (3), to return to the time domain. Finally, disregard the filtered values corresponding to the padding:

P_k^f = (1/n) Σ_{j=0}^{n-1} [ w_j^f cos(2πjk/n) - i w_j^f sin(2πjk/n) ]   (3)

Figure 3. Preprocessing scheme (from the historical measurements: normalization, low-pass filtering and differencing, feeding the artificial neural network).
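The filtering procedure of Eqs. (1)-(3) can be sketched with NumPy's FFT routines, which implement an equivalent transform pair. This is a minimal illustration under our assumptions (parameter names are ours; the padding arrays stand in for the previous/forecasted load values described above), not the authors' code:

```python
import numpy as np

def lowpass_filter(load, pad_left, pad_right, jc, l):
    """Low-pass filter a load series in the frequency domain.
    pad_left / pad_right: values appended before/after the series to
    soften circular convolution. jc: cutoff frequency index. l: width
    of the energy decay H(j) = exp(-((j - jc)/l)**2) for j > jc."""
    padded = np.concatenate([pad_left, load, pad_right])
    n = len(padded)
    w = np.fft.rfft(padded)                  # one-sided spectrum
    j = np.arange(len(w))
    H = np.ones(len(w))
    mask = j > jc
    H[mask] = np.exp(-((j[mask] - jc) / l) ** 2)
    filtered = np.fft.irfft(w * H, n=n)
    # disregard the filtered values corresponding to the padding
    return filtered[len(pad_left): len(pad_left) + len(load)]
```

Frequencies at or below the cutoff index pass unchanged; higher ones decay smoothly rather than being cut abruptly, which limits ringing in the time domain.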
4. Confidence Intervals

Despite the success of the application of NNs to STLF, there has not been a standard procedure to compute confidence intervals (CIs) and to estimate the uncertainties which are inherent to the forecasts. Ideally, forecasts shouldn't be made without a confidence measure. Point forecasts may have no meaning if the time series is noisy. There are many difficulties in calculating these indices for nonlinear models (Gershenfeld and Weigend, 1995; Husmeier and Taylor, 1997). A methodology has recently been proposed to compute CIs for NN short-term load forecasting (Alves da Silva and Moulin, 2000). Three techniques are presented: (i) Error Output (EO), (ii) Resampling (RE) and (iii) Multilinear Regression adapted to NNs (MR).

4.1. Error Output (EO)

In this first technique, NNs are trained with two outputs, which correspond to the forecast hourly load and to the forecast hourly load error (error output). Therefore, the CIs are inherent to the NN inference process. This idea assumes it is possible to capture the regularities of the forecast errors.

4.2. Resampling (RE)

In this technique, a resampling procedure is performed with the forecasting errors, considering that a resampling series is representative of load values to happen in the future. Suppose a resampling series is represented as in Fig. (4), where a recursive forecasting process is shown (forecast values are inputs to the NN as time goes on), using three lagged inputs to forecast one to four steps ahead. The known load values of instants 1, 2 and 3 are used to forecast the load of instant 4. As the actual load value of instant 4 is known, this one-step-ahead forecast error can be computed. After that, using the known load values of instants 2 and 3, and the previously forecast load value of instant 4, a two-steps-ahead forecast can be made, and the corresponding error can be computed.
The known load value of instant 3 and the forecast values of instants 4 and 5 are used to forecast the load value of instant 6, and so on. One error measure can be collected for each forecasting horizon when the maximum forecasting distance is reached, at instant 7. The same procedure is repeated to collect one more error sample for each forecasting horizon, using the known load values of instants 2 to 8 (upper dotted line). This procedure is repeated until, for some time window, the maximum distance is reached in the known resampling series. Afterwards, for each forecasting horizon, by sorting the n errors in ascending order (considering the signs) and representing them by z_(1), z_(2), ..., z_(n), the cumulative sample distribution function of the forecasting errors can be estimated as in Eq. (4):

S_n(z) = 0,    for z < z_(1)
S_n(z) = r/n,  for z_(r) <= z < z_(r+1)   (4)
S_n(z) = 1,    for z_(n) <= z

S_n(z) is the fraction of the collection of errors less than or equal to z. When n is large enough, S_n(z) is a good approximation of F(z), the true cumulative probability distribution. Therefore, confidence intervals can be estimated by keeping the intermediate z_(r)'s and discarding the extreme ones, according to the desired confidence degree. The first and the last error values of such a truncated series are taken as the respective lower and upper confidence limits.

Figure 4. Example of the resampling process.

4.3. Multilinear Regression (MR)

In this technique, a multilinear regression model is implemented in the output layer of a multilayer perceptron NN. The model's inputs are taken as the hidden layer neurons' outputs, and the regression coefficients are taken as the output neuron's connection weights. The CIs are computed according to the traditional MR theory of confidence intervals (Anderson, 1958).
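The RE technique of Section 4.2 can be sketched in two pieces: the recursive collection of per-horizon errors (Fig. 4) and the interval extraction from the sorted errors (Eq. (4)). The function names, the generic one-step predictor, and the symmetric tail split are our assumptions; this is an illustration, not the authors' implementation:

```python
import numpy as np

def collect_horizon_errors(series, predict_one, n_lags=3, max_horizon=4):
    """Recursively forecast 1..max_horizon steps ahead from every possible
    time window, feeding forecasts back as inputs, and collect the signed
    error per horizon (the procedure illustrated in Fig. 4)."""
    errors = {h: [] for h in range(1, max_horizon + 1)}
    last_start = len(series) - n_lags - max_horizon
    for start in range(last_start + 1):
        window = list(series[start:start + n_lags])
        for h in range(1, max_horizon + 1):
            forecast = predict_one(window[-n_lags:])
            actual = series[start + n_lags + h - 1]
            errors[h].append(actual - forecast)
            window.append(forecast)  # the forecast becomes an input
    return errors

def resampling_ci(errors, confidence=0.95):
    """Empirical confidence interval for one horizon: sort the signed
    errors, keep the intermediate order statistics and discard the
    extremes according to the desired confidence degree (Eq. (4))."""
    z = np.sort(np.asarray(errors))
    n = len(z)
    tail = (1.0 - confidence) / 2.0
    lo = int(np.floor(tail * n))
    hi = int(np.ceil((1.0 - tail) * n)) - 1
    return z[lo], z[hi]
```

Adding the interval endpoints to a point forecast for the corresponding horizon then yields the lower and upper confidence limits.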
In (Alves da Silva and Moulin, 2000), it has been shown that the performance of the estimation methods relies strongly on the similarity between current and past data. This is true even for the EO and MR techniques, where the direct influence of current data is guaranteed by the NN inputs. The CIs obtained by the RE technique are more trustworthy. The resampling technique provides correct intervals when the CIs are estimated from samples which are representative of the true populations, even when the forecasts are not good.

5. Results

The STLF system estimates short-term load forecasts automatically, without the intervention of a human operator. Regular intervals of 15 minutes are used for training the model. Usually, for typical days, the six most recent weeks of the load history are used for training. Holidays and other atypical days are modeled in a different way. Those models can demand a few years of historical load (e.g., Labor Day, Christmas, New Year's Day, Thanksgiving Day, etc.). Figure (5) shows a typical result of load forecasts from 15 minutes to 7 days ahead. The annual mean absolute percentage error (MAPE) is 2.26%. Special attention is given to the very short term, from 15 minutes to 24 hours ahead. The confidence intervals for this horizon are shown in Fig. (6). In Fig. (7), three load curves are presented. Curve 1 represents an average load for the previous weeks. Curve 2 shows the current load evolution acquired by SCADA or by state estimation. The load forecaster allows the user to smooth the load curve when he/she finds it appropriate. This filtering process is provided by the program, and can be activated as many times as the user wishes. Curve 3 is the result of the filtering action. Figure (7) shows the application of two successive smoothing requests.

Figure 5. Short-term load forecast for 1 week ahead.

Figure 6. Confidence intervals.
Figure 7. Filtering process of bad data.

6. Conclusions

With power systems growth and the increase in their complexity, many factors have become influential to electric power generation and consumption. Therefore, the forecasting process has become more complex, and more accurate and autonomous solutions are needed. In this paper, a non-parametric NN-based short-term load forecaster has been presented. Possibly, it is the most autonomous load forecaster ever developed. The Brazilian reality, in terms of available data, has been taken into account in its development. One year of load data has been utilized in order to test it. The obtained forecast errors are compatible with the values considered state of the art in performance terms (annual MAPE around 2%). Future work will focus on the development of a new integrated forecasting tool, which will extend the forecast horizon to the medium term (i.e., up to one year ahead).

7. Acknowledgments

The authors are supported by the Brazilian Research Council (CNPq) and by the Brazilian Ministry of Education agency CAPES.

8. References

Alves da Silva, A.P. and Moulin, L.S., November 2000, Confidence intervals for neural network based short-term load forecasting, IEEE Trans. Power Systems, Vol. 15, Issue 4, pp.
Anderson, T.W., 1958, The Statistical Analysis of Time Series, John Wiley & Sons.
Dillon, T.S., Morsztyn, K. and Phua, K., September 1975, Short term load forecasting using adaptive pattern recognition and self-organizing techniques, 5th Power System Computation Conference, Vol. 1, Paper 2.4/3.
Drezga, I. and Rahman, S., November 1998, Input variable selection for ANN-based short-term load forecasting, IEEE Trans. Power Systems, Vol. 13, No. 4, pp.
Farlow, S.J., 1984, Self-Organizing Methods in Modeling, Marcel Dekker.
Gershenfeld, N.A. and Weigend, A.S., 1995, The future of time series: learning and understanding, in Time Series Prediction: Forecasting the Future and Understanding the Past, Addison-Wesley, pp.
Husmeier, D. and Taylor, J.G., 1997, Predicting conditional probability densities of stationary stochastic time series, Neural Networks, Vol. 10, No. 3, pp.
Kargupta, H. and Smith, R.E., July 1991, System identification with evolving polynomial networks, 4th International Conference on Genetic Algorithms, San Diego, pp.
Khotanzad, A., Afkhami-Rohani, R. and Maratukulam, D., November 1998, ANNSTLF artificial neural network short-term load forecaster generation three, IEEE Trans. Power Systems, Vol. 13, No. 4, pp.
Kwok, T.-Y. and Yeung, D.-Y., May 1997, Constructive algorithms for structure learning in feedforward neural networks for regression problems, IEEE Trans. Neural Networks, Vol. 8, No. 3, pp.
Masters, T., 1995, Neural, Novel and Hybrid Algorithms for Time Series Prediction, John Wiley & Sons.
Pao, Y.-H., 1989, Adaptive Pattern Recognition and Neural Networks, Addison-Wesley.
Rocha Reis, A.J. and Alves da Silva, A.P., September 2000, Enhancing neural network based load forecasting via preprocessing (in Portuguese), Proceedings of the XIII Brazilian Conference on Automatica, Florianópolis, Brazil, pp.
Rorabaugh, C.B., 1997, Digital Filter Designer's Handbook: with C++ Algorithms, McGraw-Hill, Second Edition.

9. Copyright Notice

The authors are solely responsible for the printed material included in this paper.
MODELLING ENERGY DEMAND FORECASTING USING NEURAL NETWORKS WITH UNIVARIATE TIME SERIES S. Cankurt 1, M. Yasin 2 1&2 Ishik University Erbil, Iraq 1 s.cankurt@ishik.edu.iq, 2 m.yasin@ishik.edu.iq doi:10.23918/iec2018.26
More informationShort Term Load Forecasting Based Artificial Neural Network
Short Term Load Forecasting Based Artificial Neural Network Dr. Adel M. Dakhil Department of Electrical Engineering Misan University Iraq- Misan Dr.adelmanaa@gmail.com Abstract Present study develops short
More informationForecasting of Electric Consumption in a Semiconductor Plant using Time Series Methods
Forecasting of Electric Consumption in a Semiconductor Plant using Time Series Methods Prayad B. 1* Somsak S. 2 Spansion Thailand Limited 229 Moo 4, Changwattana Road, Pakkred, Nonthaburi 11120 Nonthaburi,
More information4. Multilayer Perceptrons
4. Multilayer Perceptrons This is a supervised error-correction learning algorithm. 1 4.1 Introduction A multilayer feedforward network consists of an input layer, one or more hidden layers, and an output
More informationSTRUCTURED NEURAL NETWORK FOR NONLINEAR DYNAMIC SYSTEMS MODELING
STRUCTURED NEURAL NETWORK FOR NONLINEAR DYNAIC SYSTES ODELING J. CODINA, R. VILLÀ and J.. FUERTES UPC-Facultat d Informàtica de Barcelona, Department of Automatic Control and Computer Engineeering, Pau
More informationP. M. FONTE GONÇALO XUFRE SILVA J. C. QUADRADO DEEA Centro de Matemática DEEA ISEL Rua Conselheiro Emídio Navarro, LISBOA PORTUGAL
Wind Speed Prediction using Artificial Neural Networks P. M. FONTE GONÇALO XUFRE SILVA J. C. QUADRADO DEEA Centro de Matemática DEEA ISEL Rua Conselheiro Emídio Navarro, 1950-072 LISBOA PORTUGAL Abstract:
More informationCSE 417T: Introduction to Machine Learning. Final Review. Henry Chai 12/4/18
CSE 417T: Introduction to Machine Learning Final Review Henry Chai 12/4/18 Overfitting Overfitting is fitting the training data more than is warranted Fitting noise rather than signal 2 Estimating! "#$
More informationGMDH-type Neural Networks with a Feedback Loop and their Application to the Identification of Large-spatial Air Pollution Patterns.
GMDH-type Neural Networks with a Feedback Loop and their Application to the Identification of Large-spatial Air Pollution Patterns. Tadashi Kondo 1 and Abhijit S.Pandya 2 1 School of Medical Sci.,The Univ.of
More informationMODELLING TRAFFIC FLOW ON MOTORWAYS: A HYBRID MACROSCOPIC APPROACH
Proceedings ITRN2013 5-6th September, FITZGERALD, MOUTARI, MARSHALL: Hybrid Aidan Fitzgerald MODELLING TRAFFIC FLOW ON MOTORWAYS: A HYBRID MACROSCOPIC APPROACH Centre for Statistical Science and Operational
More informationA Data-Driven Model for Software Reliability Prediction
A Data-Driven Model for Software Reliability Prediction Author: Jung-Hua Lo IEEE International Conference on Granular Computing (2012) Young Taek Kim KAIST SE Lab. 9/4/2013 Contents Introduction Background
More informationARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92
ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92 BIOLOGICAL INSPIRATIONS Some numbers The human brain contains about 10 billion nerve cells (neurons) Each neuron is connected to the others through 10000
More informationDeep Learning Architecture for Univariate Time Series Forecasting
CS229,Technical Report, 2014 Deep Learning Architecture for Univariate Time Series Forecasting Dmitry Vengertsev 1 Abstract This paper studies the problem of applying machine learning with deep architecture
More informationNeutron inverse kinetics via Gaussian Processes
Neutron inverse kinetics via Gaussian Processes P. Picca Politecnico di Torino, Torino, Italy R. Furfaro University of Arizona, Tucson, Arizona Outline Introduction Review of inverse kinetics techniques
More informationGlossary. The ISI glossary of statistical terms provides definitions in a number of different languages:
Glossary The ISI glossary of statistical terms provides definitions in a number of different languages: http://isi.cbs.nl/glossary/index.htm Adjusted r 2 Adjusted R squared measures the proportion of the
More informationShort Term Load Forecasting for Bakhtar Region Electric Co. Using Multi Layer Perceptron and Fuzzy Inference systems
Short Term Load Forecasting for Bakhtar Region Electric Co. Using Multi Layer Perceptron and Fuzzy Inference systems R. Barzamini, M.B. Menhaj, A. Khosravi, SH. Kamalvand Department of Electrical Engineering,
More informationAutomated Statistical Recognition of Partial Discharges in Insulation Systems.
Automated Statistical Recognition of Partial Discharges in Insulation Systems. Massih-Reza AMINI, Patrick GALLINARI, Florence d ALCHE-BUC LIP6, Université Paris 6, 4 Place Jussieu, F-75252 Paris cedex
More informationWavelet Neural Networks for Nonlinear Time Series Analysis
Applied Mathematical Sciences, Vol. 4, 2010, no. 50, 2485-2495 Wavelet Neural Networks for Nonlinear Time Series Analysis K. K. Minu, M. C. Lineesh and C. Jessy John Department of Mathematics National
More informationCSC2515 Winter 2015 Introduction to Machine Learning. Lecture 2: Linear regression
CSC2515 Winter 2015 Introduction to Machine Learning Lecture 2: Linear regression All lecture slides will be available as.pdf on the course website: http://www.cs.toronto.edu/~urtasun/courses/csc2515/csc2515_winter15.html
More informationData Mining Part 5. Prediction
Data Mining Part 5. Prediction 5.5. Spring 2010 Instructor: Dr. Masoud Yaghini Outline How the Brain Works Artificial Neural Networks Simple Computing Elements Feed-Forward Networks Perceptrons (Single-layer,
More informationRevisiting linear and non-linear methodologies for time series prediction - application to ESTSP 08 competition data
Revisiting linear and non-linear methodologies for time series - application to ESTSP 08 competition data Madalina Olteanu Universite Paris 1 - SAMOS CES 90 Rue de Tolbiac, 75013 Paris - France Abstract.
More informationProbabilistic Energy Forecasting
Probabilistic Energy Forecasting Moritz Schmid Seminar Energieinformatik WS 2015/16 ^ KIT The Research University in the Helmholtz Association www.kit.edu Agenda Forecasting challenges Renewable energy
More informationAn Analysis of the INFFC Cotton Futures Time Series: Lower Bounds and Testbed Design Recommendations
An Analysis of the INFFC Cotton Futures Time Series: Lower Bounds and Testbed Design Recommendations Radu Drossu & Zoran Obradovic rdrossu@eecs.wsu.edu & zoran@eecs.wsu.edu School of Electrical Engineering
More informationModeling Economic Time Series Using a Focused Time Lagged FeedForward Neural Network
Proceedings of Student Research Day, CSIS, Pace University, May 9th, 23 Modeling Economic Time Series Using a Focused Time Lagged FeedForward Neural Network N. Moseley ABSTRACT, - Artificial neural networks
More informationAn Inverse Vibration Problem Solved by an Artificial Neural Network
TEMA Tend. Mat. Apl. Comput., 6, No. 1 (05), 163-175. c Uma Publicação da Sociedade Brasileira de Matemática Aplicada e Computacional. An Inverse Vibration Problem Solved by an Artificial Neural Network
More informationTime series prediction
Chapter 12 Time series prediction Amaury Lendasse, Yongnan Ji, Nima Reyhani, Jin Hao, Antti Sorjamaa 183 184 Time series prediction 12.1 Introduction Amaury Lendasse What is Time series prediction? Time
More informationNeural Networks biological neuron artificial neuron 1
Neural Networks biological neuron artificial neuron 1 A two-layer neural network Output layer (activation represents classification) Weighted connections Hidden layer ( internal representation ) Input
More informationTime Series and Forecasting
Time Series and Forecasting Introduction to Forecasting n What is forecasting? n Primary Function is to Predict the Future using (time series related or other) data we have in hand n Why are we interested?
More informationA Hybrid Method of CART and Artificial Neural Network for Short-term term Load Forecasting in Power Systems
A Hybrid Method of CART and Artificial Neural Network for Short-term term Load Forecasting in Power Systems Hiroyuki Mori Dept. of Electrical & Electronics Engineering Meiji University Tama-ku, Kawasaki
More informationECE521 Lecture 7/8. Logistic Regression
ECE521 Lecture 7/8 Logistic Regression Outline Logistic regression (Continue) A single neuron Learning neural networks Multi-class classification 2 Logistic regression The output of a logistic regression
More informationPATTERN CLASSIFICATION
PATTERN CLASSIFICATION Second Edition Richard O. Duda Peter E. Hart David G. Stork A Wiley-lnterscience Publication JOHN WILEY & SONS, INC. New York Chichester Weinheim Brisbane Singapore Toronto CONTENTS
More informationA FUZZY NEURAL NETWORK MODEL FOR FORECASTING STOCK PRICE
A FUZZY NEURAL NETWORK MODEL FOR FORECASTING STOCK PRICE Li Sheng Institute of intelligent information engineering Zheiang University Hangzhou, 3007, P. R. China ABSTRACT In this paper, a neural network-driven
More informationat least 50 and preferably 100 observations should be available to build a proper model
III Box-Jenkins Methods 1. Pros and Cons of ARIMA Forecasting a) need for data at least 50 and preferably 100 observations should be available to build a proper model used most frequently for hourly or
More informationIncrease of coal burning efficiency via automatic mathematical modeling. Patrick Bangert algorithmica technologies GmbH 1 Germany
Increase of coal burning efficiency via automatic mathematical modeling Patrick Bangert algorithmica technologies GmbH 1 Germany Abstract The entire process of a coal power plant from coal delivery to
More informationA Feature Based Neural Network Model for Weather Forecasting
World Academy of Science, Engineering and Technology 4 2 A Feature Based Neural Network Model for Weather Forecasting Paras, Sanjay Mathur, Avinash Kumar, and Mahesh Chandra Abstract Weather forecasting
More informationSmall sample size generalization
9th Scandinavian Conference on Image Analysis, June 6-9, 1995, Uppsala, Sweden, Preprint Small sample size generalization Robert P.W. Duin Pattern Recognition Group, Faculty of Applied Physics Delft University
More informationReservoir Computing and Echo State Networks
An Introduction to: Reservoir Computing and Echo State Networks Claudio Gallicchio gallicch@di.unipi.it Outline Focus: Supervised learning in domain of sequences Recurrent Neural networks for supervised
More informationANN and Statistical Theory Based Forecasting and Analysis of Power System Variables
ANN and Statistical Theory Based Forecasting and Analysis of Power System Variables Sruthi V. Nair 1, Poonam Kothari 2, Kushal Lodha 3 1,2,3 Lecturer, G. H. Raisoni Institute of Engineering & Technology,
More informationLecture 4: Perceptrons and Multilayer Perceptrons
Lecture 4: Perceptrons and Multilayer Perceptrons Cognitive Systems II - Machine Learning SS 2005 Part I: Basic Approaches of Concept Learning Perceptrons, Artificial Neuronal Networks Lecture 4: Perceptrons
More informationDevelopment of Stochastic Artificial Neural Networks for Hydrological Prediction
Development of Stochastic Artificial Neural Networks for Hydrological Prediction G. B. Kingston, M. F. Lambert and H. R. Maier Centre for Applied Modelling in Water Engineering, School of Civil and Environmental
More informationShort-term wind forecasting using artificial neural networks (ANNs)
Energy and Sustainability II 197 Short-term wind forecasting using artificial neural networks (ANNs) M. G. De Giorgi, A. Ficarella & M. G. Russo Department of Engineering Innovation, Centro Ricerche Energia
More informationNeural Networks and the Back-propagation Algorithm
Neural Networks and the Back-propagation Algorithm Francisco S. Melo In these notes, we provide a brief overview of the main concepts concerning neural networks and the back-propagation algorithm. We closely
More informationEM-algorithm for Training of State-space Models with Application to Time Series Prediction
EM-algorithm for Training of State-space Models with Application to Time Series Prediction Elia Liitiäinen, Nima Reyhani and Amaury Lendasse Helsinki University of Technology - Neural Networks Research
More informationUnivariate versus Multivariate Models for Short-term Electricity Load Forecasting
Univariate versus Multivariate Models for Short-term Electricity Load Forecasting Guilherme Guilhermino Neto 1, Samuel Belini Defilippo 2, Henrique S. Hippert 3 1 IFES Campus Linhares. guilherme.neto@ifes.edu.br
More informationRelevance Vector Machines for Earthquake Response Spectra
2012 2011 American American Transactions Transactions on on Engineering Engineering & Applied Applied Sciences Sciences. American Transactions on Engineering & Applied Sciences http://tuengr.com/ateas
More information22/04/2014. Economic Research
22/04/2014 Economic Research Forecasting Models for Exchange Rate Tuesday, April 22, 2014 The science of prognostics has been going through a rapid and fruitful development in the past decades, with various
More informationDirect Method for Training Feed-forward Neural Networks using Batch Extended Kalman Filter for Multi- Step-Ahead Predictions
Direct Method for Training Feed-forward Neural Networks using Batch Extended Kalman Filter for Multi- Step-Ahead Predictions Artem Chernodub, Institute of Mathematical Machines and Systems NASU, Neurotechnologies
More informationClassification of Ordinal Data Using Neural Networks
Classification of Ordinal Data Using Neural Networks Joaquim Pinto da Costa and Jaime S. Cardoso 2 Faculdade Ciências Universidade Porto, Porto, Portugal jpcosta@fc.up.pt 2 Faculdade Engenharia Universidade
More informationCOMS 4771 Introduction to Machine Learning. Nakul Verma
COMS 4771 Introduction to Machine Learning Nakul Verma Announcements HW1 due next lecture Project details are available decide on the group and topic by Thursday Last time Generative vs. Discriminative
More informationDAMPING MODELLING AND IDENTIFICATION USING GENERALIZED PROPORTIONAL DAMPING
DAMPING MODELLING AND IDENTIFICATION USING GENERALIZED PROPORTIONAL DAMPING S. Adhikari Department of Aerospace Engineering, University of Bristol, Queens Building, University Walk, Bristol BS8 1TR (U.K.)
More informationFEATURE REDUCTION FOR NEURAL NETWORK BASED SMALL-SIGNAL STABILITY ASSESSMENT
FEATURE REDUCTION FOR NEURAL NETWORK BASED SMALL-SIGNAL STABILITY ASSESSMENT S.P. Teeuwsen teeuwsen@uni-duisburg.de University of Duisburg, Germany I. Erlich erlich@uni-duisburg.de M.A. El-Sharkawi elsharkawi@ee.washington.edu
More informationRobust Multi-Objective Optimization in High Dimensional Spaces
Robust Multi-Objective Optimization in High Dimensional Spaces André Sülflow, Nicole Drechsler, and Rolf Drechsler Institute of Computer Science University of Bremen 28359 Bremen, Germany {suelflow,nd,drechsle}@informatik.uni-bremen.de
More informationChoosing Variables with a Genetic Algorithm for Econometric models based on Neural Networks learning and adaptation.
Choosing Variables with a Genetic Algorithm for Econometric models based on Neural Networks learning and adaptation. Daniel Ramírez A., Israel Truijillo E. LINDA LAB, Computer Department, UNAM Facultad
More informationUSE OF FUZZY LOGIC TO INVESTIGATE WEATHER PARAMETER IMPACT ON ELECTRICAL LOAD BASED ON SHORT TERM FORECASTING
Nigerian Journal of Technology (NIJOTECH) Vol. 35, No. 3, July 2016, pp. 562 567 Copyright Faculty of Engineering, University of Nigeria, Nsukka, Print ISSN: 0331-8443, Electronic ISSN: 2467-8821 www.nijotech.com
More informationEEG- Signal Processing
Fatemeh Hadaeghi EEG- Signal Processing Lecture Notes for BSP, Chapter 5 Master Program Data Engineering 1 5 Introduction The complex patterns of neural activity, both in presence and absence of external
More informationACCURATE APPROXIMATION TO THE EXTREME ORDER STATISTICS OF GAUSSIAN SAMPLES
COMMUN. STATIST.-SIMULA., 28(1), 177-188 (1999) ACCURATE APPROXIMATION TO THE EXTREME ORDER STATISTICS OF GAUSSIAN SAMPLES Chien-Chung Chen & Christopher W. Tyler Smith-Kettlewell Eye Research Institute
More informationForecasting Using Time Series Models
Forecasting Using Time Series Models Dr. J Katyayani 1, M Jahnavi 2 Pothugunta Krishna Prasad 3 1 Professor, Department of MBA, SPMVV, Tirupati, India 2 Assistant Professor, Koshys Institute of Management
More informationOn the convergence of the iterative solution of the likelihood equations
On the convergence of the iterative solution of the likelihood equations R. Moddemeijer University of Groningen, Department of Computing Science, P.O. Box 800, NL-9700 AV Groningen, The Netherlands, e-mail:
More informationExploiting Machine Learning Techniques for the Enhancement of Acceptance Sampling
Exploiting Machine Learning Techniques for the Enhancement of Acceptance Sampling Aikaterini Fountoulaki, Nikos Karacapilidis, and Manolis Manatakis International Science Index, Industrial and Manufacturing
More informationFrequency Forecasting using Time Series ARIMA model
Frequency Forecasting using Time Series ARIMA model Manish Kumar Tikariha DGM(O) NSPCL Bhilai Abstract In view of stringent regulatory stance and recent tariff guidelines, Deviation Settlement mechanism
More informationMultilayer Perceptron = FeedForward Neural Network
Multilayer Perceptron = FeedForward Neural Networ History Definition Classification = feedforward operation Learning = bacpropagation = local optimization in the space of weights Pattern Classification
More informationEstimation of the Pre-Consolidation Pressure in Soils Using ANN method
Current World Environment Vol. 11(Special Issue 1), 83-88 (2016) Estimation of the Pre-Consolidation Pressure in Soils Using ANN method M. R. Motahari Department of Civil Engineering, Faculty of Engineering,
More informationA New Look at Nonlinear Time Series Prediction with NARX Recurrent Neural Network. José Maria P. Menezes Jr. and Guilherme A.
A New Look at Nonlinear Time Series Prediction with NARX Recurrent Neural Network José Maria P. Menezes Jr. and Guilherme A. Barreto Department of Teleinformatics Engineering Federal University of Ceará,
More informationDeep Learning for NLP
Deep Learning for NLP CS224N Christopher Manning (Many slides borrowed from ACL 2012/NAACL 2013 Tutorials by me, Richard Socher and Yoshua Bengio) Machine Learning and NLP NER WordNet Usually machine learning
More informationJournal of Chemical and Pharmaceutical Research, 2014, 6(5): Research Article
Available online www.jocpr.com Journal of Chemical and Pharmaceutical Research, 2014, 6(5):266-270 Research Article ISSN : 0975-7384 CODEN(USA) : JCPRC5 Anomaly detection of cigarette sales using ARIMA
More informationPredict Time Series with Multiple Artificial Neural Networks
, pp. 313-324 http://dx.doi.org/10.14257/ijhit.2016.9.7.28 Predict Time Series with Multiple Artificial Neural Networks Fei Li 1, Jin Liu 1 and Lei Kong 2,* 1 College of Information Engineering, Shanghai
More informationThis paper presents the
ISESCO JOURNAL of Science and Technology Volume 8 - Number 14 - November 2012 (2-8) A Novel Ensemble Neural Network based Short-term Wind Power Generation Forecasting in a Microgrid Aymen Chaouachi and
More informationAN INTERACTIVE WAVELET ARTIFICIAL NEURAL NETWORK IN TIME SERIES PREDICTION
AN INTERACTIVE WAVELET ARTIFICIAL NEURAL NETWORK IN TIME SERIES PREDICTION 1 JAIRO MARLON CORRÊA, 2 ANSELMO CHAVES NETO, 3 LUIZ ALBINO TEIXEIRA JÚNIOR, 4 SAMUEL BELLIDO RODRIGUES, 5 EDGAR MANUEL CARREÑO
More informationHow Accurate is My Forecast?
How Accurate is My Forecast? Tao Hong, PhD Utilities Business Unit, SAS 15 May 2012 PLEASE STAND BY Today s event will begin at 11:00am EDT The audio portion of the presentation will be heard through your
More information11/3/15. Deep Learning for NLP. Deep Learning and its Architectures. What is Deep Learning? Advantages of Deep Learning (Part 1)
11/3/15 Machine Learning and NLP Deep Learning for NLP Usually machine learning works well because of human-designed representations and input features CS224N WordNet SRL Parser Machine learning becomes
More informationForecasting Wind Ramps
Forecasting Wind Ramps Erin Summers and Anand Subramanian Jan 5, 20 Introduction The recent increase in the number of wind power producers has necessitated changes in the methods power system operators
More informationOperations Management
3-1 Forecasting Operations Management William J. Stevenson 8 th edition 3-2 Forecasting CHAPTER 3 Forecasting McGraw-Hill/Irwin Operations Management, Eighth Edition, by William J. Stevenson Copyright
More informationImproved the Forecasting of ANN-ARIMA Model Performance: A Case Study of Water Quality at the Offshore Kuala Terengganu, Terengganu, Malaysia
Improved the Forecasting of ANN-ARIMA Model Performance: A Case Study of Water Quality at the Offshore Kuala Terengganu, Terengganu, Malaysia Muhamad Safiih Lola1 Malaysia- safiihmd@umt.edu.my Mohd Noor
More informationComparing the Univariate Modeling Techniques, Box-Jenkins and Artificial Neural Network (ANN) for Measuring of Climate Index
Applied Mathematical Sciences, Vol. 8, 2014, no. 32, 1557-1568 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/ams.2014.4150 Comparing the Univariate Modeling Techniques, Box-Jenkins and Artificial
More informationLecture 5: Logistic Regression. Neural Networks
Lecture 5: Logistic Regression. Neural Networks Logistic regression Comparison with generative models Feed-forward neural networks Backpropagation Tricks for training neural networks COMP-652, Lecture
More informationAutomatic Differentiation and Neural Networks
Statistical Machine Learning Notes 7 Automatic Differentiation and Neural Networks Instructor: Justin Domke 1 Introduction The name neural network is sometimes used to refer to many things (e.g. Hopfield
More informationA Sparse Linear Model and Significance Test. for Individual Consumption Prediction
A Sparse Linear Model and Significance Test 1 for Individual Consumption Prediction Pan Li, Baosen Zhang, Yang Weng, and Ram Rajagopal arxiv:1511.01853v3 [stat.ml] 21 Feb 2017 Abstract Accurate prediction
More informationData and prognosis for renewable energy
The Hong Kong Polytechnic University Department of Electrical Engineering Project code: FYP_27 Data and prognosis for renewable energy by Choi Man Hin 14072258D Final Report Bachelor of Engineering (Honours)
More informationDemand Forecasting in Deregulated Electricity Markets
International Journal of Computer Applications (975 8887) Demand Forecasting in Deregulated Electricity Marets Anamia Electrical & Electronics Engineering Department National Institute of Technology Jamshedpur
More informationAn Adaptively Constructing Multilayer Feedforward Neural Networks Using Hermite Polynomials
An Adaptively Constructing Multilayer Feedforward Neural Networks Using Hermite Polynomials L. Ma 1 and K. Khorasani 2 1 Department of Applied Computer Science, Tokyo Polytechnic University, 1583 Iiyama,
More informationDiscussion About Nonlinear Time Series Prediction Using Least Squares Support Vector Machine
Commun. Theor. Phys. (Beijing, China) 43 (2005) pp. 1056 1060 c International Academic Publishers Vol. 43, No. 6, June 15, 2005 Discussion About Nonlinear Time Series Prediction Using Least Squares Support
More information