Forecasting strong seasonal time series with artificial neural networks

Journal of Scientific & Industrial Research
Vol. 71, October 2012, pp. 657-666

Forecasting strong seasonal time series with artificial neural networks

Ratnadip Adhikari* and R. K. Agrawal

School of Computer and Systems Sciences, Jawaharlal Nehru University, New Delhi-110067, INDIA

Received 5 May 2012; revised 6 August 2012; accepted 6 September 2012

Many practical time series exhibit trends and seasonal patterns. The traditional statistical models eliminate the effect of seasonality from a time series before making future forecasts. As a result, the computational complexity increases and the overall forecasting accuracy is substantially reduced. This paper comprehensively explores the outstanding ability of Artificial Neural Networks (ANNs) to recognize and forecast strong seasonal patterns without removing them from the raw data. Six real-world time series having dominant seasonal fluctuations are used in our work. The performance of the fitted ANN for each of these time series is compared with those of three traditional models, both directly as well as through a non-parametric statistical test. The empirical results show that properly designed ANNs are remarkably efficient in directly forecasting strong seasonal variations and outperform each of the three statistical models on all six time series. A robust algorithm together with important practical guidelines is also suggested for ANN forecasting of strong seasonal data.

Keywords: Time series forecasting; Seasonal time series; Artificial neural networks; Holt-Winters model; Box-Jenkins model; Support vector machine

Introduction

Time series modeling and forecasting has assumed indispensable importance in various areas of science and engineering. Forecasting is performed by carefully analyzing the past observations to develop a proper model, which in turn is used to generate future values.
Seasonality is a special property frequently observed in many economic and financial time series. Seasonal fluctuations are variations within a year which repeat in each season, often with a quarterly, bimonthly or monthly period. A number of models have been derived to routinely analyze and forecast seasonal data. Some of them are the Seasonal Autoregressive Integrated Moving Average (SARIMA) models 2, the Holt-Winters (HW) models 3 and the Periodic Autoregressive Moving Average (PARMA) models. Recently, the Support Vector Machine (SVM) 4-6 has also found notable applications in forecasting seasonal data. These traditional models use either deseasonalization or differencing in order to remove or adjust the seasonal factors. However, several researchers have recently criticized the approach of seasonal adjustment and removal. Miller and Williams 7 observed that the traditional deseasonalization approaches are biased in the presence of different types of trends and may even lead to overestimation of the seasonal components. Ghysels et al. 8 pointed out that univariate time series data often show adverse nonlinear properties due to seasonal adjustments. Another critical issue with such techniques is the increased model complexity. These problems are even more apparent when the seasonal pattern of the associated time series is of dominant nature. Thus, a model which can directly track the seasonal variations would be both more appealing and more effective. Unfortunately, no adequate statistical model of this kind exists in the literature so far. During the last two decades, Artificial Neural Networks (ANNs) have evolved as an attractive and efficient alternative tool for time series forecasting. Several distinguishing properties of ANNs have made them extremely popular in the forecasting domain.

*Author for correspondence: adhikari.ratan@gmail.com
Their most remarkable feature is their nonlinear, nonparametric, data-driven and self-adaptive nature 9-11. Although ANNs have found numerous applications in seasonal time series forecasting, opinions vary widely regarding their efficiency. Some researchers have claimed that ANNs cannot adequately track seasonal patterns without first removing them 12,13. On the other hand, many studies have reached just the opposite conclusion, i.e. that ANNs can achieve reasonably better forecasting accuracies with the raw data than with seasonally adjusted or differenced data 14,15. Moreover, the appropriate network structure and training algorithm must be selected with utmost care to obtain successful forecasting performances

with ANNs 9,10. At present, no rigorous theoretical procedure exists to resolve these ANN model design issues. In this paper, we comprehensively analyze the effectiveness of ANNs in tracking dominant seasonal patterns. Six practical time series (three monthly and three quarterly) with strong seasonal fluctuations are used to empirically compare the forecasting performances of ANNs with those of the SARIMA, HW and SVM methods. The sensitivity of ANNs to the choice of architecture and training algorithm is studied using different numbers of hidden nodes and six different versions of the backpropagation training method. We further present an effective ANN forecasting algorithm for univariate seasonal time series.

Materials and methods

Statistical Models for Seasonal Time Series Forecasting

In general, the seasonality in a time series can be of two types, viz. additive and multiplicative. In the additive case, the series shows steady seasonal fluctuations, whereas in the multiplicative case the size of the seasonal fluctuations varies with the mean 2,16. An important tool for studying seasonal components is provided by the sample autocorrelation coefficients 1,16, defined as:

r_k = [ Σ_{t=1}^{N-k} (y_t - ȳ)(y_{t+k} - ȳ) ] / [ Σ_{t=1}^{N} (y_t - ȳ)² ],  k = 0, 1, ..., (N - 1)    (1)

where {y_1, y_2, ..., y_N} is the time series with mean ȳ and k denotes the time lag. A graph which shows the successive autocorrelation values against the time lags is known as a correlogram and is very helpful for understanding the intrinsic properties of a time series. The correlogram of seasonal data exhibits the same kind of oscillations at the seasonal lags 16.

Seasonal Autoregressive Integrated Moving Average (SARIMA) Model

It is the most widely used statistical technique for seasonal time series forecasting, developed by Box and Jenkins 2, and is in fact a generalization of the well-known Autoregressive Integrated Moving Average (ARIMA) model.
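As an aside, the sample autocorrelation coefficients of (1) are straightforward to compute directly. The sketch below (illustrative Python, not the authors' MATLAB code) builds them for a synthetic monthly series with trend and period-12 seasonality; its correlogram peaks at the seasonal lags:

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample autocorrelation coefficients r_k, k = 0..max_lag, as in Eq. (1)."""
    y = np.asarray(y, dtype=float)
    n, mean = len(y), y.mean()
    dev = y - mean
    denom = np.sum(dev ** 2)
    return np.array([np.sum(dev[: n - k] * dev[k:]) / denom
                     for k in range(max_lag + 1)])

# Synthetic monthly series: linear trend plus a period-12 seasonal component.
t = np.arange(120)
y = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12)
r = sample_acf(y, 36)
# r[12] exceeds r at nearby non-seasonal lags, revealing the seasonal period.
```

Plotting `r` against the lag gives the correlogram described in the text.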
The SARIMA model attempts to capture the seasonal and nonseasonal relationships among the successive observations of a time series through sequences of ordinary as well as seasonal differencing of the series. The model is mathematically represented as:

φ_p(B) Φ_P(B^s) W_t = θ_q(B) Θ_Q(B^s) Z_t    (2)

Here, B is the lag or backshift operator, defined as B y_t = y_{t-1}; φ_p, Φ_P, θ_q and Θ_Q are lagged polynomials in B of orders p, P, q and Q, respectively; Z_t is a series of purely random errors; and W_t is the stationary nonseasonal series obtained after the differencing processes, i.e.

W_t = (1 - B)^d (1 - B^s)^D y_t    (3)

This model is commonly known as the SARIMA(p, d, q)×(P, D, Q)_s model. The parameters (p, P) and (q, Q) stand for the autoregressive and moving average processes respectively, while (d, D) specify the degrees of ordinary and seasonal differencing. The appropriate SARIMA model for a time series is usually selected through the three-step iterative model-building procedure suggested by Box and Jenkins 2.

Holt-Winters (HW) Model

It is a generalization of the well-known exponential smoothing technique to deal with the trend and seasonal properties of a time series 3,16. In this method, the local mean level (L_t), trend (T_t) and seasonal index (I_t) of the time series are iteratively updated on the basis of the successive observations. For the multiplicative case:

L_t = α (y_t / I_{t-s}) + (1 - α)(L_{t-1} + T_{t-1})
T_t = γ (L_t - L_{t-1}) + (1 - γ) T_{t-1}
I_t = δ (y_t / L_t) + (1 - δ) I_{t-s}
t = 0, 1, 2, ..., N    (4)

where α, γ and δ are the smoothing parameters and s is the seasonal period. After specifying the model structure, the k-step ahead forecast made at time t is given by:

ŷ_t(k) = (L_t + k T_t) I_{t-s+k},  (k = 1, 2, ..., s)    (5)

Analogous formulae are available for the additive case. The HW method is quite popular for seasonal time series forecasting due to its simplicity, reduced computational

time and reasonably good forecasting accuracy 16. A detailed study of the application of the HW method was carried out by Chatfield and Yar 3, and in this paper we implement it by precisely following their guidelines.

Fig. 1 Architectures of: a) a typical MLP ANN model, b) an SANN model

Support Vector Machine (SVM) Model

SVM is based on the Structural Risk Minimization (SRM) principle and its objective is to find a decision rule with good generalization ability through selecting some special training data points, known as the support vectors 4. Time series forecasting is a branch of Support Vector Regression (SVR), in which a maximum margin hyperplane is constructed to approximate real-valued outputs 5,17. Given a training data set of N points {x_i, y_i}_{i=1}^{N} with x_i ∈ R^n, y_i ∈ R, SVM attempts to approximate the unknown data generation function in a linear form. Using Vapnik's ε-insensitive loss function, the SVM regression is converted to a Quadratic Programming Problem (QPP) to minimize the empirical risk. Solving the QPP, the optimal decision hyperplane is given by

y(x) = Σ_{i=1}^{N_s} (α_i - α_i*) K(x, x_i) + b_opt    (6)

Here, N_s is the number of support vectors, α_i and α_i* (i = 1, 2, ..., N_s) are the Lagrange multipliers, b_opt is the optimal bias and K(x, x_i) is the kernel function. There are different choices for the SVM kernel function; a widely popular one is the Radial Basis Function (RBF) kernel, defined as K(x, y) = exp(-‖x - y‖² / 2σ²), where σ is a tuning parameter. This kernel is used in this paper and the SVM parameters are estimated by cross-validation 5,6,17.

Artificial Neural Networks for Seasonal Time Series Forecasting

ANNs are a class of flexible computing frameworks which try to mimic the intelligent working paradigm of the human brain for solving a broad range of nonlinear problems.
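Before turning to ANN architectures, the Holt-Winters recursions (4) and forecast (5) above can be made concrete. The sketch below assumes the multiplicative form; the first-season initialization of L, T and I is a simple assumption of ours, since the paper does not specify one, and in practice α, γ, δ would be tuned by minimizing in-sample error:

```python
import numpy as np

def holt_winters_multiplicative(y, s, alpha, gamma, delta, horizon):
    """Multiplicative Holt-Winters (Eqs. 4-5): update level L, trend T and
    seasonal indices I, then forecast y_hat(t+k) = (L + k*T) * I[t-s+k]."""
    y = np.asarray(y, dtype=float)
    # Crude initialization from the first two seasons (an assumption).
    L = y[:s].mean()
    T = (y[s:2 * s].mean() - y[:s].mean()) / s
    I = list(y[:s] / L)
    for t in range(s, len(y)):
        L_prev = L
        L = alpha * (y[t] / I[t - s]) + (1 - alpha) * (L_prev + T)
        T = gamma * (L - L_prev) + (1 - gamma) * T
        I.append(delta * (y[t] / L) + (1 - delta) * I[t - s])
    n = len(y)
    return [(L + k * T) * I[n - s + k - 1] for k in range(1, horizon + 1)]

# A purely seasonal series with no trend is reproduced exactly:
fc = holt_winters_multiplicative([120, 80, 110, 90] * 5, 4, 0.3, 0.2, 0.1, 4)
# -> [120.0, 80.0, 110.0, 90.0]
```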
The most widely used ANN model for time series forecasting is the Multilayer Perceptron (MLP) 10, which is characterized by a feedforward architecture of an input layer, one or more hidden layers and an output layer. Each layer contains a number of nodes which are connected to those in the immediate next layer by acyclic links. Usually, a single hidden layer is sufficient for most applications. An example of an MLP with one hidden layer is depicted in Fig. 1a. In a typical MLP with p input nodes, h hidden nodes and a single output node, the relationship between the inputs y_{t-i} (i = 1, 2, ..., p) and the output y_t is given by:

y_t = G( α_0 + Σ_{j=1}^{h} α_j F( β_{0j} + Σ_{i=1}^{p} β_{ij} y_{t-i} ) )    (7)

where α_j, β_ij (i = 1, 2, ..., p; j = 1, 2, ..., h) are the connection weights, α_0, β_0j are the bias terms, and F, G are the hidden and output layer activation functions, respectively. The MLP given by (7) is commonly referred to as a (p, h, 1) ANN model, which performs one-step ahead forecasting. Similarly, a (p, h, q) ANN structure can be used for q-step ahead forecasting, where q is the number of output nodes.
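Equation (7), with the logistic/identity activation choices used later in the paper, amounts to a few lines of code. A minimal sketch with hand-set (hypothetical) weights:

```python
import numpy as np

def mlp_one_step(y_lags, alpha0, alpha, beta0, beta):
    """One-step forecast of a (p, h, 1) MLP, Eq. (7):
    y_t = G(alpha0 + sum_j alpha_j * F(beta0_j + sum_i beta_ij * y_{t-i})),
    with logistic F and identity G."""
    F = lambda x: 1.0 / (1.0 + np.exp(-x))   # hidden-layer activation
    hidden = F(beta0 + beta @ y_lags)        # shape (h,)
    return alpha0 + alpha @ hidden           # identity output activation

# With all-zero hidden weights, every hidden node outputs F(0) = 0.5:
out = mlp_one_step(np.array([0.4, 0.7]), 0.5,
                   np.array([1.0, 1.0]), np.zeros(2), np.zeros((2, 2)))
# -> 0.5 + 1.0*0.5 + 1.0*0.5 = 1.5
```

In practice the weights would of course be fitted by one of the training algorithms discussed in the paper, not hand-set.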

Over the years, ANNs have found numerous applications in seasonal time series forecasting. However, there are notable variations among the opinions of researchers regarding their efficiency 12-14. Recently, Hamzaçebi 18 has proposed the Seasonal ANN (SANN) model, which is shown to be quite effective in forecasting seasonal data. In this model, the numbers of input and output neurons are given by the seasonal period s of the time series, as shown in Fig. 1b. Thus, the SANN model is simply an (s, h, s) ANN, where h is the number of hidden nodes. Due to the implementation simplicity and efficiency of the SANN model, we use it as our base network structure in this paper and explore its forecasting strengths for time series with dominant seasonal fluctuations.

ANN Model Selection Issues

To achieve satisfactory accuracies with ANNs, an adequate model must be carefully designed 10,19. The benefit of using the (s, h, s) SANN model is that the numbers of input and output nodes are already specified by the seasonal period s, and only an appropriate number of hidden nodes remains to be determined. For this purpose, one of the following four well-known network selection criteria is often used 19,20: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), Schwarz's Bayesian Criterion (SBC) and the bias-corrected AIC (AICc). For a seasonal time series with period s, we select the maximum number of hidden nodes as:

max_hid_nodes = s      for quarterly data
              = s/2    for monthly data    (8)

For each seasonal time series, the (s, h, s) SANN model is successively trained on the training set by varying the number of hidden nodes h from 1 to max_hid_nodes, and the network selection criteria are calculated. The optimal number of hidden nodes is then selected as the h which minimizes the value of the bias-corrected AIC (AICc).
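The selection loop just described can be sketched as follows. The AICc formula used here is the common least-squares form; the paper does not spell out its exact expression, so treat it as an assumption, and `train_and_score` is a hypothetical callable standing in for actually fitting an (s, h, s) SANN:

```python
import numpy as np

def aicc(sse, n, k):
    """Bias-corrected AIC for a least-squares model with k parameters fitted
    on n residuals (a common form; assumed, not quoted from the paper)."""
    return n * np.log(sse / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

def select_hidden_nodes(train_and_score, s, monthly):
    """Try h = 1..max_hid_nodes (Eq. 8) and keep the h minimizing AICc.
    train_and_score(h) is assumed to return (sse, n, k) for a fitted SANN."""
    max_h = s // 2 if monthly else s
    scores = {h: aicc(*train_and_score(h)) for h in range(1, max_h + 1)}
    return min(scores, key=scores.get)

# Illustrative stand-in: pretend a second hidden node cuts the SSE sharply,
# while further nodes only add parameters.
def _demo_score(h):
    return (200.0 if h == 1 else 50.0, 120, 10 * h)

best_h = select_hidden_nodes(_demo_score, 12, monthly=True)
# -> 2: the SSE drop pays for the extra parameters, later nodes do not.
```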
The activation functions of a neural network determine the relationships between the input and output nodes and also introduce an important degree of nonlinearity to the network 10. Among the various choices, we use the widely popular logistic function (F) and identity function (G) as the hidden and output layer activation functions, respectively. These are defined as:

F(x) = 1 / (1 + e^{-x}),  G(x) = x    (9)

After selecting the proper architecture and activation functions, the next major task in ANN model design is to select an appropriate training algorithm. So far, the most popular training method is the classic backpropagation (BP) algorithm 21. Despite its simplicity and popularity, the major drawbacks of the BP algorithm are its slow convergence rate and the risk of getting stuck in local minima. Due to these limitations, various modified BP methods have been developed in the literature. Important among them are the Levenberg-Marquardt (LM) 22, Resilient Propagation (RPROP) 23, Scaled Conjugate Gradient (SCG) 24, One Step Secant (OSS) 25 and Broyden-Fletcher-Goldfarb-Shanno (BFGS) 26 algorithms. Recently, Particle Swarm Optimization (PSO) 27 has also been successfully applied as an alternative to BP training. Adhikari and Agrawal 28 applied PSO-Trelea1 and PSO-Trelea2 29 for training MLP networks to forecast three seasonal time series and empirically demonstrated their superior performance over a BP variant (the RPROP algorithm). In spite of the various modified training algorithms, some crucial limitations of the standard BP technique still remain unresolved 10. For example, none of the available training algorithms can ensure a unique global optimal solution in general. Thus, it is reasonable to train a neural network with different algorithms and then suitably combine their outputs. This practice aggregates the strengths of the individual training algorithms and increases the ANN forecasting accuracy.
Here, we train the SANN model for each dataset with six training algorithms: LM, RPROP, SCG, OSS, PSO-Trelea1 and PSO-Trelea2, and combine their forecasting results through the arithmetic mean and median. At this point, we suggest the following robust algorithm for achieving efficient forecasting performances with ANNs on time series with strong seasonal fluctuations.

Algorithm: ANN forecasting of time series with strong seasonality
Inputs: The dataset Y = [y_1, y_2, ..., y_N]^T of a seasonal time series and the forecast horizon H.
Output: The forecast vector Ŷ = [ŷ_{N+1}, ŷ_{N+2}, ..., ŷ_{N+H}]^T.

Table 1 Descriptions of the six time series datasets

Time series                        | Description                                                                                              | Seasonal pattern          | Size
Airline Passengers (APTS)          | Monthly number of international airline passengers (in thousands) (January 1949 - December 1960)         | Monthly, multiplicative   | Total: 144; Testing: 12
USA Accidental Deaths (USADTS)     | Monthly number of accidental deaths in USA                                                               | Monthly, additive         | Total: 72; Testing: 12
Red Wine (RWTS)                    | Sales of Australian red wine (January 1980 - July 1995)                                                  | Monthly, multiplicative   | Total: 187; Testing: 19
Quarterly Sales (QSTS)             | Quarterly export values of a French firm for 6 years                                                     | Quarterly, multiplicative | Total: 24; Testing: 4
Quarterly Beer Production (QBPTS)  | Quarterly beer production in USA (millions of barrels) (first quarter of 1975 to fourth quarter of 1982) | Quarterly, multiplicative | Total: 32; Testing: 8
USA Expenditure (UETS)             | Quarterly USA new plant/equipment expenditures                                                           | Quarterly, multiplicative | Total: 52; Testing: 8

1. Divide the time series Y into appropriate training and validation subsets Y_train and Y_val, respectively.
2. Calculate the autocorrelation coefficients of Y and plot the correlogram.
3. From the correlogram of Y, determine (or confirm) the period of seasonality s.
4. Select the k network training algorithms: TA_1, TA_2, ..., TA_k.
5. Set i := 1.
6. While (i ≤ k)
7.     Determine the number of hidden nodes h of the (s, h, s) SANN model for Y.
8.     Train and validate the SANN model on Y_train and Y_val respectively with the training algorithm TA_i and use it to forecast the H future observations.
9.     Obtain the forecast vector Ŷ^(i) = [ŷ^(i)_{N+1}, ŷ^(i)_{N+2}, ..., ŷ^(i)_{N+H}]^T.
10.    Set i := i + 1.
11. End while.
12. Choose an appropriate function f to combine the k forecast vectors.
13. Obtain the final forecast vector as Ŷ = f(Ŷ^(1), Ŷ^(2), ..., Ŷ^(k)).
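The steps of the algorithm above reduce to a short loop once the per-algorithm training is abstracted away. In this sketch, each element of `trainers` is a hypothetical callable `(y, s, horizon) -> forecast vector` standing in for fitting an (s, h, s) SANN with one training algorithm; the combining function f is the median here, with the mean an equally valid choice:

```python
import numpy as np

def forecast_with_ensemble(y, s, horizon, trainers, combine=np.median):
    """Train one SANN per training algorithm, forecast H values with each,
    and combine the k forecast vectors element-wise (steps 5-13)."""
    forecasts = [np.asarray(train(y, s, horizon)) for train in trainers]
    return combine(np.vstack(forecasts), axis=0)

# Toy stand-ins for three training algorithms, each returning a constant
# forecast, so the element-wise median is easy to see:
trainers = [lambda y, s, h, c=c: np.full(h, c) for c in (1.0, 2.0, 9.0)]
res = forecast_with_ensemble(np.arange(24.0), 4, 3, trainers)
# -> array of three 2.0 values (median of 1, 2 and 9 at each horizon step)
```

The median is the more robust choice of f when one training run converges poorly, since a single outlying forecast vector cannot drag the combination away.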
Results and discussion

Six real-world time series with dominant seasonal fluctuations are collected from the open-source Time Series Data Library (TSDL) repository 30. Descriptions of these time series are presented in Table 1 and their time plots are shown in Fig. 2. In this work, the SARIMA, ANN and SVM models are fitted using MATLAB and the HW models are fitted in the R environment 31. The neural network toolbox of MATLAB 32 is used for implementing the four versions of the BP training algorithm, whereas the PSO toolbox by Birge 33 is used for implementing PSO-Trelea1 and PSO-Trelea2. Each ANN model is trained for 2000 epochs with each training algorithm. Every dataset is preprocessed with an appropriate data transformation before fitting the forecasting models and the obtained outputs are post-processed afterwards. The forecasting accuracies of the fitted models are evaluated in terms of the Mean Squared Error (MSE) and the Mean Absolute Percentage Error (MAPE), defined as:

MSE = (1 / N_test) Σ_{t=1}^{N_test} (y_t - ŷ_t)²
MAPE = (1 / N_test) Σ_{t=1}^{N_test} |(y_t - ŷ_t) / y_t| × 100    (10)

where y_t and ŷ_t are respectively the actual and forecasted outputs and N_test is the size of the out-of-sample

testing dataset. In forecasting applications, both of these error measures should be as small as possible. We fit SARIMA(0, 1, 1)×(0, 1, 1)_s models to the six time series, s being the corresponding seasonal period. The proper ANN models for the APTS, USADTS, RWTS, QSTS, QBPTS and UETS datasets are respectively determined to be the (12, 1, 12), (12, 2, 12), (12, 2, 12), (4, 2, 4), (4, 2, 4) and (4, 1, 4) SANNs. We use boxplots to depict the effect of changing the number of hidden nodes on ANN forecasting accuracy. The boxplots of the forecast MSE and MAPE values obtained with all six training algorithms across all the datasets are depicted in Fig. 3 and Fig. 4, respectively. Figs 3 and 4 clearly show that with a non-optimal selection of the number of hidden nodes, both the MSE and MAPE values increase drastically for each time series, indicating considerable reductions in ANN forecasting accuracy. This finding emphasizes the paramount importance of appropriate ANN model design.

Fig. 2 Time plots of: (a) APTS, (b) USADTS, (c) RWTS, (d) QSTS, (e) QBPTS, (f) UETS

Fig. 3 Boxplots of MSE: a) APTS, b) USADTS, c) RWTS, d) QSTS, e) QBPTS, f) UETS
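The two error measures in (10) translate directly into code; a minimal sketch:

```python
import numpy as np

def mse(y, y_hat):
    """Mean squared error over the test set, Eq. (10)."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return np.mean((y - y_hat) ** 2)

def mape(y, y_hat):
    """Mean absolute percentage error over the test set, Eq. (10).
    Assumes no actual value y_t is zero."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return 100.0 * np.mean(np.abs((y - y_hat) / y))

# For actuals [100, 200] and forecasts [110, 180]:
# mse -> (10^2 + 20^2) / 2 = 250.0, mape -> 100 * (0.10 + 0.10) / 2 = 10.0
```

MAPE is scale-free, which is why it complements MSE when comparing across series of very different magnitudes, as Table 1's datasets are.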

Fig. 4 Boxplots of MAPE: a) APTS, b) USADTS, c) RWTS, d) QSTS, e) QBPTS, f) UETS

Table 2 Forecasting results of the ANNs and the three traditional statistical models (MSE and MAPE of ANN (LM), ANN (RPROP), ANN (SCG), ANN (OSS), ANN (PSO-Trelea1), ANN (PSO-Trelea2), ANN (Mean), ANN (Median), SARIMA, SVM and HW for each of APTS, USADTS, RWTS, QSTS, QBPTS and UETS)

The summary of the obtained results is presented in Table 2. The MSE values for APTS, USADTS, RWTS and QSTS are given in transformed scales (i.e. original MSE = reported MSE × 10^4). The optimal HW and SVM parameters are shown in Table 3. The following important observations are evident from Table 2:
- The forecast errors obtained by the ANN models with all six training algorithms are significantly smaller than those obtained by the SARIMA, SVM and HW methods for each seasonal time series.
- The ANN models perform consistently well with each training algorithm for all six time series; however, no single training algorithm performs best for all of them.
- Considerable improvements in forecasting accuracy are achieved by combining the different ANN outputs through the arithmetic mean and median.
These empirical findings suggest that, with a proper network structure and training algorithm, an ANN model is remarkably efficient in recognizing as well as forecasting the strong seasonal fluctuation of a time series without removing it. We also observe that ANNs notably outperform the traditional SARIMA, SVM and HW methods in terms of forecasting accuracy for all six seasonal time series. Our experimental results further favor the strategy of combining the ANN outputs from different training algorithms, instead of relying on a single one. We use the term forecast diagram to refer to the graph which shows the actual and predicted observations

of a time series. The forecast diagrams obtained by the ANN models, combined through the arithmetic mean and median, for all six seasonal time series are shown in Fig. 5. We also perform the non-parametric Friedman test 34,35 in order to check whether the applied forecasting methods are equally effective or differ significantly. The results of the Friedman tests for MSE and MAPE are depicted in Figs 6a and 6b. In these figures, the mean rank of a forecasting method is marked by a circle and the horizontal bar across each circle indicates the critical difference. The performances of two methods differ significantly if their mean ranks differ by at least the critical difference, i.e. if their horizontal bars are non-overlapping.

Table 3 The optimal HW and SVM model parameters for the six time series (HW: α, β and γ, with β not applicable for some series; SVM: σ and C). Note: C is the positive regularization constant which assigns a penalty to misfit.

Fig. 5 Forecast diagrams: a) APTS, b) USADTS, c) RWTS, d) QSTS, e) QBPTS, f) UETS

Fig. 6 Friedman test result for: a) MSE, b) MAPE
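The authors run the Friedman test in MATLAB; an equivalent check can be sketched with SciPy, where each argument holds one method's errors across the six series (the numbers below are illustrative only, not the paper's results):

```python
from scipy.stats import friedmanchisquare

# One list per forecasting method: its error on each of the six series
# (illustrative values with a consistent ranking across series).
method_a = [1.0, 1.1, 0.9, 1.2, 1.0, 1.1]
method_b = [2.0, 2.2, 1.9, 2.1, 2.0, 2.3]
method_c = [3.1, 2.9, 3.0, 3.2, 3.1, 3.0]

stat, p_value = friedmanchisquare(method_a, method_b, method_c)
# A small p-value rejects the hypothesis that all methods rank equally.
```

The Friedman test ranks the methods within each series and so, like MAPE, is unaffected by the very different scales of the six datasets.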

Figs 6a and 6b reveal that the performances of the ANN models with the six training algorithms do not differ significantly among themselves but are reasonably better than those of the three traditional statistical methods. It is also clearly visible that the best forecasting accuracies in terms of both MSE and MAPE are achieved by the arithmetic mean and median, which are used to combine the ANN outputs from the different training algorithms. Moreover, Fig. 6a shows that the two ANN combination techniques differ significantly from all three statistical methods in terms of the obtained MSE values. Similarly, Fig. 6b shows that the ANNs with PSO-Trelea1 and the two combination techniques differ significantly from the SARIMA and SVM methods in terms of the obtained MAPE values.

Conclusions

Seasonal fluctuation is a distinctive characteristic of many time series, especially those from the economic and financial domains. The main goal of this paper was to meticulously explore the ability of artificial neural networks to efficiently recognize as well as forecast strong seasonal patterns. Our study shows that, contrary to the claims of some researchers, ANNs can successfully learn the inherent seasonal structures in time series without their removal from the raw data. The ANN model design issues, e.g. the selection of a proper network structure, activation functions and training algorithm, are discussed, and a robust algorithm is then suggested for achieving effective forecasting performances with ANNs on seasonal data. Empirical analysis is conducted on six real-world time series, each containing dominant seasonal fluctuations. For an unbiased study, the ANN model for each time series is trained with six different training algorithms.
The results clearly demonstrate that for each seasonal dataset, the ANNs not only achieved consistently good results but also significantly outperformed three well-known traditional statistical methods, viz. SARIMA, HW and SVM, in terms of forecasting accuracy. It is also observed that ANN forecasting accuracy is remarkably improved by combining the outputs obtained with different training algorithms. Our findings are further statistically justified through the non-parametric Friedman test.

Acknowledgements

The first author expresses his gratitude to the Council of Scientific and Industrial Research (CSIR), India for the financial support received to perform this research work.

References
1 Hipel K W & McLeod A I, Time series modeling of water resources and environmental systems (Elsevier Science Publishing Company, Amsterdam).
2 Box G E P & Jenkins G M, Time series analysis: Forecasting and control, 3rd edn (Holden-Day, California).
3 Chatfield C & Yar M, Holt-Winters forecasting: some practical issues, The Statistician 37 (1988).
4 Vapnik V, The Nature of Statistical Learning Theory (Springer-Verlag, New York).
5 Fan Y, Li P & Song Z, Dynamic least square support vector machine, in Proc 6th World Congress on Intelligent Control and Automation (IEEE, Dalian) 2006.
6 Cortez P, Sensitivity analysis for time lag selection to forecast seasonal time series using neural networks and support vector machines, in Int Joint Conf on Neural Networks (Barcelona, Spain) 18-23 July 2010.
7 Miller D M & Williams D, Damping seasonal factors: Shrinkage estimators for the X-12-ARIMA program, Int J Forecasting 20 (2004).
8 Ghysels E, Granger C W & Siklos P L, Is seasonal adjustment a linear or nonlinear data filtering process?, J Bus Econ Statist 14 (1996).
9 Zhang G P, Time series forecasting using a hybrid ARIMA and neural network model, Neurocomputing 50 (2003).
10 Zhang G, Patuwo B E & Hu M Y, Forecasting with artificial neural networks: The state of the art, Int J Forecasting 14 (1998).
11 Kamruzzaman J, Begg R & Sarker R, Artificial Neural Networks in Finance and Manufacturing (Idea Group Publishing).
12 Hill T, O'Connor M & Remus W, Neural network models for time series forecasts, Manage Sci 42 (1996).
13 Zhang G & Qi M, Neural network forecasting for seasonal and trend time series, Eur J Oper Res 160 (2005).
14 Alon I, Qi M & Sadowski R J, Forecasting aggregate retail sales: a comparison of artificial neural networks and traditional methods, J Retailing and Consumer Services 8 (2001).
15 Tseng F M, Yu H C & Tzeng G H, Combining neural network model with seasonal time series ARIMA model, Technol Forecast Soc Change 69 (2002).
16 Chatfield C, The Analysis of Time Series: An Introduction (Chapman & Hall, Washington D C).
17 Cao L & Tay E H, Support vector machine with adaptive parameters in financial time series forecasting, IEEE Trans Neural Netw 14 (2003).
18 Hamzaçebi C, Improving artificial neural networks' performance in seasonal time series forecasting, Inform Sci 178 (2008).
19 Kihoro J M, Otieno R O & Wafula C, Seasonal time series forecasting: A comparative study of ARIMA and ANN models, African J Sci and Technol 5 (2004).
20 Faraway J & Chatfield C, Time series forecasting with neural networks: A comparative study using the airline data, J Appl Statist 47 (1998).
21 Rumelhart D E, Hinton G E & Williams R J, Learning representations by back-propagating errors, Nature 323 (1986).

22 Hagan M & Menhaj M, Training feedforward networks with the Marquardt algorithm, IEEE Trans Neural Netw 5 (1994).
23 Riedmiller M & Braun H, A direct adaptive method for faster backpropagation learning: The RPROP algorithm, in IEEE Int Conf on Neural Networks (San Francisco, USA) 1993.
24 Moller M F, A scaled conjugate gradient algorithm for fast supervised learning, Neural Networks 6 (1993).
25 Battiti R, One step secant conjugate gradient, Neural Comput 4 (1992).
26 Dennis J E & Schnabel R B, Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Prentice-Hall, Englewood Cliffs, NJ).
27 Kennedy J, Eberhart R C & Shi Y, Swarm Intelligence (Morgan Kaufmann, San Francisco).
28 Adhikari R & Agrawal R K, Effectiveness of PSO based neural network for seasonal time series forecasting, in Indian Int Conf on Artificial Intelligence (IICAI) (Siddaganga Institute of Technology, Tumkur) 14-16 December 2011.
29 Trelea I C, The particle swarm optimization algorithm: convergence analysis and parameter selection, Inf Process Lett 85 (2003).
30 Hyndman R J, Time Series Data Library (TSDL), robjhyndman.com/tsdl/.
31 R Development Core Team, R: A Language and Environment for Statistical Computing (R Foundation for Statistical Computing, Vienna, Austria).
32 Demuth H, Beale M & Hagan M, Neural Network Toolbox User's Guide (The MathWorks, Natick, MA).
33 Birge B, PSOt - A Particle Swarm Optimization Toolbox for use with Matlab, in IEEE Swarm Intelligence Symp (Indianapolis, Indiana, USA) April 2003.
34 Friedman M, The use of ranks to avoid the assumption of normality implicit in the analysis of variance, J Amer Statist Assoc 32 (1937).
35 Hollander M & Wolfe D A, Nonparametric Statistical Methods (John Wiley & Sons, Inc, Hoboken, NJ) 1999.


More information

Modified Holt s Linear Trend Method

Modified Holt s Linear Trend Method Modified Holt s Linear Trend Method Guckan Yapar, Sedat Capar, Hanife Taylan Selamlar and Idil Yavuz Abstract Exponential smoothing models are simple, accurate and robust forecasting models and because

More information

(SARIMA) SARIMA 1390 **

(SARIMA)  SARIMA 1390 ** * SARIMA sharzeie@utacir ** amirhoseinghafarinejad@gmailcom (SARIMA 6 88 9 85 9 Q4 C45 D G JEL * 90 ** ( 85 000 600 55 9 000,0 9 85 (ATM ( 89 00, (ANN Automated Teller Machine Artificial Neural Networks

More information

Electric Load Forecasting Using Wavelet Transform and Extreme Learning Machine

Electric Load Forecasting Using Wavelet Transform and Extreme Learning Machine Electric Load Forecasting Using Wavelet Transform and Extreme Learning Machine Song Li 1, Peng Wang 1 and Lalit Goel 1 1 School of Electrical and Electronic Engineering Nanyang Technological University

More information

ARIMA modeling to forecast area and production of rice in West Bengal

ARIMA modeling to forecast area and production of rice in West Bengal Journal of Crop and Weed, 9(2):26-31(2013) ARIMA modeling to forecast area and production of rice in West Bengal R. BISWAS AND B. BHATTACHARYYA Department of Agricultural Statistics Bidhan Chandra Krishi

More information

MODELLING ENERGY DEMAND FORECASTING USING NEURAL NETWORKS WITH UNIVARIATE TIME SERIES

MODELLING ENERGY DEMAND FORECASTING USING NEURAL NETWORKS WITH UNIVARIATE TIME SERIES MODELLING ENERGY DEMAND FORECASTING USING NEURAL NETWORKS WITH UNIVARIATE TIME SERIES S. Cankurt 1, M. Yasin 2 1&2 Ishik University Erbil, Iraq 1 s.cankurt@ishik.edu.iq, 2 m.yasin@ishik.edu.iq doi:10.23918/iec2018.26

More information

A Support Vector Regression Model for Forecasting Rainfall

A Support Vector Regression Model for Forecasting Rainfall A Support Vector Regression for Forecasting Nasimul Hasan 1, Nayan Chandra Nath 1, Risul Islam Rasel 2 Department of Computer Science and Engineering, International Islamic University Chittagong, Bangladesh

More information

FORECASTING OF ECONOMIC QUANTITIES USING FUZZY AUTOREGRESSIVE MODEL AND FUZZY NEURAL NETWORK

FORECASTING OF ECONOMIC QUANTITIES USING FUZZY AUTOREGRESSIVE MODEL AND FUZZY NEURAL NETWORK FORECASTING OF ECONOMIC QUANTITIES USING FUZZY AUTOREGRESSIVE MODEL AND FUZZY NEURAL NETWORK Dusan Marcek Silesian University, Institute of Computer Science Opava Research Institute of the IT4Innovations

More information

A SEASONAL FUZZY TIME SERIES FORECASTING METHOD BASED ON GUSTAFSON-KESSEL FUZZY CLUSTERING *

A SEASONAL FUZZY TIME SERIES FORECASTING METHOD BASED ON GUSTAFSON-KESSEL FUZZY CLUSTERING * No.2, Vol.1, Winter 2012 2012 Published by JSES. A SEASONAL FUZZY TIME SERIES FORECASTING METHOD BASED ON GUSTAFSON-KESSEL * Faruk ALPASLAN a, Ozge CAGCAG b Abstract Fuzzy time series forecasting methods

More information

FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL

FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL B. N. MANDAL Abstract: Yearly sugarcane production data for the period of - to - of India were analyzed by time-series methods. Autocorrelation

More information

Data and prognosis for renewable energy

Data and prognosis for renewable energy The Hong Kong Polytechnic University Department of Electrical Engineering Project code: FYP_27 Data and prognosis for renewable energy by Choi Man Hin 14072258D Final Report Bachelor of Engineering (Honours)

More information

Automatic Forecasting

Automatic Forecasting Automatic Forecasting Summary The Automatic Forecasting procedure is designed to forecast future values of time series data. A time series consists of a set of sequential numeric data taken at equally

More information

A Particle Swarm Optimization (PSO) Primer

A Particle Swarm Optimization (PSO) Primer A Particle Swarm Optimization (PSO) Primer With Applications Brian Birge Overview Introduction Theory Applications Computational Intelligence Summary Introduction Subset of Evolutionary Computation Genetic

More information

Revisiting linear and non-linear methodologies for time series prediction - application to ESTSP 08 competition data

Revisiting linear and non-linear methodologies for time series prediction - application to ESTSP 08 competition data Revisiting linear and non-linear methodologies for time series - application to ESTSP 08 competition data Madalina Olteanu Universite Paris 1 - SAMOS CES 90 Rue de Tolbiac, 75013 Paris - France Abstract.

More information

Selection of the Appropriate Lag Structure of Foreign Exchange Rates Forecasting Based on Autocorrelation Coefficient

Selection of the Appropriate Lag Structure of Foreign Exchange Rates Forecasting Based on Autocorrelation Coefficient Selection of the Appropriate Lag Structure of Foreign Exchange Rates Forecasting Based on Autocorrelation Coefficient Wei Huang 1,2, Shouyang Wang 2, Hui Zhang 3,4, and Renbin Xiao 1 1 School of Management,

More information

Data Mining Part 5. Prediction

Data Mining Part 5. Prediction Data Mining Part 5. Prediction 5.5. Spring 2010 Instructor: Dr. Masoud Yaghini Outline How the Brain Works Artificial Neural Networks Simple Computing Elements Feed-Forward Networks Perceptrons (Single-layer,

More information

Deep Learning Architecture for Univariate Time Series Forecasting

Deep Learning Architecture for Univariate Time Series Forecasting CS229,Technical Report, 2014 Deep Learning Architecture for Univariate Time Series Forecasting Dmitry Vengertsev 1 Abstract This paper studies the problem of applying machine learning with deep architecture

More information

Research Article Stacked Heterogeneous Neural Networks for Time Series Forecasting

Research Article Stacked Heterogeneous Neural Networks for Time Series Forecasting Hindawi Publishing Corporation Mathematical Problems in Engineering Volume 21, Article ID 373648, 2 pages doi:1.1155/21/373648 Research Article Stacked Heterogeneous Neural Networks for Time Series Forecasting

More information

FORECASTING YIELD PER HECTARE OF RICE IN ANDHRA PRADESH

FORECASTING YIELD PER HECTARE OF RICE IN ANDHRA PRADESH International Journal of Mathematics and Computer Applications Research (IJMCAR) ISSN 49-6955 Vol. 3, Issue 1, Mar 013, 9-14 TJPRC Pvt. Ltd. FORECASTING YIELD PER HECTARE OF RICE IN ANDHRA PRADESH R. RAMAKRISHNA

More information

NONLINEAR CLASSIFICATION AND REGRESSION. J. Elder CSE 4404/5327 Introduction to Machine Learning and Pattern Recognition

NONLINEAR CLASSIFICATION AND REGRESSION. J. Elder CSE 4404/5327 Introduction to Machine Learning and Pattern Recognition NONLINEAR CLASSIFICATION AND REGRESSION Nonlinear Classification and Regression: Outline 2 Multi-Layer Perceptrons The Back-Propagation Learning Algorithm Generalized Linear Models Radial Basis Function

More information

Second-order Learning Algorithm with Squared Penalty Term

Second-order Learning Algorithm with Squared Penalty Term Second-order Learning Algorithm with Squared Penalty Term Kazumi Saito Ryohei Nakano NTT Communication Science Laboratories 2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 69-2 Japan {saito,nakano}@cslab.kecl.ntt.jp

More information

Automatic modelling of neural networks for time series prediction in search of a uniform methodology across varying time frequencies

Automatic modelling of neural networks for time series prediction in search of a uniform methodology across varying time frequencies Automatic modelling of neural networks for time series prediction in search of a uniform methodology across varying time frequencies Nikolaos Kourentzes and Sven F. Crone Lancaster University Management

More information

The particle swarm optimization algorithm: convergence analysis and parameter selection

The particle swarm optimization algorithm: convergence analysis and parameter selection Information Processing Letters 85 (2003) 317 325 www.elsevier.com/locate/ipl The particle swarm optimization algorithm: convergence analysis and parameter selection Ioan Cristian Trelea INA P-G, UMR Génie

More information

Application of Artificial Neural Networks in Evaluation and Identification of Electrical Loss in Transformers According to the Energy Consumption

Application of Artificial Neural Networks in Evaluation and Identification of Electrical Loss in Transformers According to the Energy Consumption Application of Artificial Neural Networks in Evaluation and Identification of Electrical Loss in Transformers According to the Energy Consumption ANDRÉ NUNES DE SOUZA, JOSÉ ALFREDO C. ULSON, IVAN NUNES

More information

Neural Network to Control Output of Hidden Node According to Input Patterns

Neural Network to Control Output of Hidden Node According to Input Patterns American Journal of Intelligent Systems 24, 4(5): 96-23 DOI:.5923/j.ajis.2445.2 Neural Network to Control Output of Hidden Node According to Input Patterns Takafumi Sasakawa, Jun Sawamoto 2,*, Hidekazu

More information

FORECASTING SAVING DEPOSIT IN MALAYSIAN ISLAMIC BANKING: COMPARISON BETWEEN ARTIFICIAL NEURAL NETWORK AND ARIMA

FORECASTING SAVING DEPOSIT IN MALAYSIAN ISLAMIC BANKING: COMPARISON BETWEEN ARTIFICIAL NEURAL NETWORK AND ARIMA Jurnal Ekonomi dan Studi Pembangunan Volume 8, Nomor 2, Oktober 2007: 154-161 FORECASTING SAVING DEPOSIT IN MALAYSIAN ISLAMIC BANKING: COMPARISON BETWEEN ARTIFICIAL NEURAL NETWORK AND ARIMA Raditya Sukmana

More information

A Machine Learning Approach to Define Weights for Linear Combination of Forecasts

A Machine Learning Approach to Define Weights for Linear Combination of Forecasts A Machine Learning Approach to Define Weights for Linear Combination of Forecasts Ricardo Prudêncio 1 and Teresa Ludermir 2 1 Departament of Information Science, Federal University of Pernambuco, Av. dos

More information

A New Weight Initialization using Statistically Resilient Method and Moore-Penrose Inverse Method for SFANN

A New Weight Initialization using Statistically Resilient Method and Moore-Penrose Inverse Method for SFANN A New Weight Initialization using Statistically Resilient Method and Moore-Penrose Inverse Method for SFANN Apeksha Mittal, Amit Prakash Singh and Pravin Chandra University School of Information and Communication

More information

Keywords- Source coding, Huffman encoding, Artificial neural network, Multilayer perceptron, Backpropagation algorithm

Keywords- Source coding, Huffman encoding, Artificial neural network, Multilayer perceptron, Backpropagation algorithm Volume 4, Issue 5, May 2014 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Huffman Encoding

More information

Improved the Forecasting of ANN-ARIMA Model Performance: A Case Study of Water Quality at the Offshore Kuala Terengganu, Terengganu, Malaysia

Improved the Forecasting of ANN-ARIMA Model Performance: A Case Study of Water Quality at the Offshore Kuala Terengganu, Terengganu, Malaysia Improved the Forecasting of ANN-ARIMA Model Performance: A Case Study of Water Quality at the Offshore Kuala Terengganu, Terengganu, Malaysia Muhamad Safiih Lola1 Malaysia- safiihmd@umt.edu.my Mohd Noor

More information

A new method for short-term load forecasting based on chaotic time series and neural network

A new method for short-term load forecasting based on chaotic time series and neural network A new method for short-term load forecasting based on chaotic time series and neural network Sajjad Kouhi*, Navid Taghizadegan Electrical Engineering Department, Azarbaijan Shahid Madani University, Tabriz,

More information

Analysis of Fast Input Selection: Application in Time Series Prediction

Analysis of Fast Input Selection: Application in Time Series Prediction Analysis of Fast Input Selection: Application in Time Series Prediction Jarkko Tikka, Amaury Lendasse, and Jaakko Hollmén Helsinki University of Technology, Laboratory of Computer and Information Science,

More information

On the benefit of using time series features for choosing a forecasting method

On the benefit of using time series features for choosing a forecasting method On the benefit of using time series features for choosing a forecasting method Christiane Lemke and Bogdan Gabrys Bournemouth University - School of Design, Engineering and Computing Poole House, Talbot

More information

NATCOR. Forecast Evaluation. Forecasting with ARIMA models. Nikolaos Kourentzes

NATCOR. Forecast Evaluation. Forecasting with ARIMA models. Nikolaos Kourentzes NATCOR Forecast Evaluation Forecasting with ARIMA models Nikolaos Kourentzes n.kourentzes@lancaster.ac.uk O u t l i n e 1. Bias measures 2. Accuracy measures 3. Evaluation schemes 4. Prediction intervals

More information

A Comparison of Time Series Models for Forecasting Outbound Air Travel Demand *

A Comparison of Time Series Models for Forecasting Outbound Air Travel Demand * Journal of Aeronautics, Astronautics and Aviation, Series A, Vol.42, No.2 pp.073-078 (200) 73 A Comparison of Time Series Models for Forecasting Outbound Air Travel Demand * Yu-Wei Chang ** and Meng-Yuan

More information

The Neural Support Vector Machine

The Neural Support Vector Machine The Neural Support Vector Machine M.A. Wiering a M.H. van der Ree a M.J. Embrechts b M.F. Stollenga c A. Meijster a A. Nolte d L.R.B. Schomaker a a Institute of Artificial Intelligence and Cognitive Engineering,

More information

Neural Networks and the Back-propagation Algorithm

Neural Networks and the Back-propagation Algorithm Neural Networks and the Back-propagation Algorithm Francisco S. Melo In these notes, we provide a brief overview of the main concepts concerning neural networks and the back-propagation algorithm. We closely

More information

Intelligent Modular Neural Network for Dynamic System Parameter Estimation

Intelligent Modular Neural Network for Dynamic System Parameter Estimation Intelligent Modular Neural Network for Dynamic System Parameter Estimation Andrzej Materka Technical University of Lodz, Institute of Electronics Stefanowskiego 18, 9-537 Lodz, Poland Abstract: A technique

More information

DEPARTMENT OF QUANTITATIVE METHODS & INFORMATION SYSTEMS

DEPARTMENT OF QUANTITATIVE METHODS & INFORMATION SYSTEMS DEPARTMENT OF QUANTITATIVE METHODS & INFORMATION SYSTEMS Moving Averages and Smoothing Methods ECON 504 Chapter 7 Fall 2013 Dr. Mohammad Zainal 2 This chapter will describe three simple approaches to forecasting

More information

Address for Correspondence

Address for Correspondence Research Article APPLICATION OF ARTIFICIAL NEURAL NETWORK FOR INTERFERENCE STUDIES OF LOW-RISE BUILDINGS 1 Narayan K*, 2 Gairola A Address for Correspondence 1 Associate Professor, Department of Civil

More information

Artificial Neural Network

Artificial Neural Network Artificial Neural Network Contents 2 What is ANN? Biological Neuron Structure of Neuron Types of Neuron Models of Neuron Analogy with human NN Perceptron OCR Multilayer Neural Network Back propagation

More information

Artificial Neural Networks (ANN) Xiaogang Su, Ph.D. Department of Mathematical Science University of Texas at El Paso

Artificial Neural Networks (ANN) Xiaogang Su, Ph.D. Department of Mathematical Science University of Texas at El Paso Artificial Neural Networks (ANN) Xiaogang Su, Ph.D. Department of Mathematical Science University of Texas at El Paso xsu@utep.edu Fall, 2018 Outline Introduction A Brief History ANN Architecture Terminology

More information

Empirical Approach to Modelling and Forecasting Inflation in Ghana

Empirical Approach to Modelling and Forecasting Inflation in Ghana Current Research Journal of Economic Theory 4(3): 83-87, 2012 ISSN: 2042-485X Maxwell Scientific Organization, 2012 Submitted: April 13, 2012 Accepted: May 06, 2012 Published: June 30, 2012 Empirical Approach

More information

An artificial neural networks (ANNs) model is a functional abstraction of the

An artificial neural networks (ANNs) model is a functional abstraction of the CHAPER 3 3. Introduction An artificial neural networs (ANNs) model is a functional abstraction of the biological neural structures of the central nervous system. hey are composed of many simple and highly

More information

22/04/2014. Economic Research

22/04/2014. Economic Research 22/04/2014 Economic Research Forecasting Models for Exchange Rate Tuesday, April 22, 2014 The science of prognostics has been going through a rapid and fruitful development in the past decades, with various

More information

Neural network modelling of reinforced concrete beam shear capacity

Neural network modelling of reinforced concrete beam shear capacity icccbe 2010 Nottingham University Press Proceedings of the International Conference on Computing in Civil and Building Engineering W Tizani (Editor) Neural network modelling of reinforced concrete beam

More information

Univariate ARIMA Models

Univariate ARIMA Models Univariate ARIMA Models ARIMA Model Building Steps: Identification: Using graphs, statistics, ACFs and PACFs, transformations, etc. to achieve stationary and tentatively identify patterns and model components.

More information

NON-FIXED AND ASYMMETRICAL MARGIN APPROACH TO STOCK MARKET PREDICTION USING SUPPORT VECTOR REGRESSION. Haiqin Yang, Irwin King and Laiwan Chan

NON-FIXED AND ASYMMETRICAL MARGIN APPROACH TO STOCK MARKET PREDICTION USING SUPPORT VECTOR REGRESSION. Haiqin Yang, Irwin King and Laiwan Chan In The Proceedings of ICONIP 2002, Singapore, 2002. NON-FIXED AND ASYMMETRICAL MARGIN APPROACH TO STOCK MARKET PREDICTION USING SUPPORT VECTOR REGRESSION Haiqin Yang, Irwin King and Laiwan Chan Department

More information

A Comparison of the Forecast Performance of. Double Seasonal ARIMA and Double Seasonal. ARFIMA Models of Electricity Load Demand

A Comparison of the Forecast Performance of. Double Seasonal ARIMA and Double Seasonal. ARFIMA Models of Electricity Load Demand Applied Mathematical Sciences, Vol. 6, 0, no. 35, 6705-67 A Comparison of the Forecast Performance of Double Seasonal ARIMA and Double Seasonal ARFIMA Models of Electricity Load Demand Siti Normah Hassan

More information

5 Autoregressive-Moving-Average Modeling

5 Autoregressive-Moving-Average Modeling 5 Autoregressive-Moving-Average Modeling 5. Purpose. Autoregressive-moving-average (ARMA models are mathematical models of the persistence, or autocorrelation, in a time series. ARMA models are widely

More information

Forecasting Area, Production and Yield of Cotton in India using ARIMA Model

Forecasting Area, Production and Yield of Cotton in India using ARIMA Model Forecasting Area, Production and Yield of Cotton in India using ARIMA Model M. K. Debnath 1, Kartic Bera 2 *, P. Mishra 1 1 Department of Agricultural Statistics, Bidhan Chanda Krishi Vishwavidyalaya,

More information

MODELLING TRAFFIC FLOW ON MOTORWAYS: A HYBRID MACROSCOPIC APPROACH

MODELLING TRAFFIC FLOW ON MOTORWAYS: A HYBRID MACROSCOPIC APPROACH Proceedings ITRN2013 5-6th September, FITZGERALD, MOUTARI, MARSHALL: Hybrid Aidan Fitzgerald MODELLING TRAFFIC FLOW ON MOTORWAYS: A HYBRID MACROSCOPIC APPROACH Centre for Statistical Science and Operational

More information

4. Multilayer Perceptrons

4. Multilayer Perceptrons 4. Multilayer Perceptrons This is a supervised error-correction learning algorithm. 1 4.1 Introduction A multilayer feedforward network consists of an input layer, one or more hidden layers, and an output

More information

Econometric Forecasting

Econometric Forecasting Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 1, 2014 Outline Introduction Model-free extrapolation Univariate time-series models Trend

More information

Seasonal Autoregressive Integrated Moving Average Model for Precipitation Time Series

Seasonal Autoregressive Integrated Moving Average Model for Precipitation Time Series Journal of Mathematics and Statistics 8 (4): 500-505, 2012 ISSN 1549-3644 2012 doi:10.3844/jmssp.2012.500.505 Published Online 8 (4) 2012 (http://www.thescipub.com/jmss.toc) Seasonal Autoregressive Integrated

More information

Advanced analysis and modelling tools for spatial environmental data. Case study: indoor radon data in Switzerland

Advanced analysis and modelling tools for spatial environmental data. Case study: indoor radon data in Switzerland EnviroInfo 2004 (Geneva) Sh@ring EnviroInfo 2004 Advanced analysis and modelling tools for spatial environmental data. Case study: indoor radon data in Switzerland Mikhail Kanevski 1, Michel Maignan 1

More information

Predicting Wheat Production in Iran Using an Artificial Neural Networks Approach

Predicting Wheat Production in Iran Using an Artificial Neural Networks Approach Predicting Wheat Production in Iran Using an Artificial Neural Networks Approach Reza Ghodsi, Ruzbeh Mirabdollah Yani, Rana Jalali, Mahsa Ruzbahman Industrial Engineering Department, University of Tehran,

More information

Deep Feedforward Networks

Deep Feedforward Networks Deep Feedforward Networks Liu Yang March 30, 2017 Liu Yang Short title March 30, 2017 1 / 24 Overview 1 Background A general introduction Example 2 Gradient based learning Cost functions Output Units 3

More information

ECE662: Pattern Recognition and Decision Making Processes: HW TWO

ECE662: Pattern Recognition and Decision Making Processes: HW TWO ECE662: Pattern Recognition and Decision Making Processes: HW TWO Purdue University Department of Electrical and Computer Engineering West Lafayette, INDIANA, USA Abstract. In this report experiments are

More information

Design of Time Series Model for Road Accident Fatal Death in Tamilnadu

Design of Time Series Model for Road Accident Fatal Death in Tamilnadu Volume 109 No. 8 2016, 225-232 ISSN: 1311-8080 (printed version); ISSN: 1314-3395 (on-line version) url: http://www.ijpam.eu ijpam.eu Design of Time Series Model for Road Accident Fatal Death in Tamilnadu

More information

FORECASTING COARSE RICE PRICES IN BANGLADESH

FORECASTING COARSE RICE PRICES IN BANGLADESH Progress. Agric. 22(1 & 2): 193 201, 2011 ISSN 1017-8139 FORECASTING COARSE RICE PRICES IN BANGLADESH M. F. Hassan*, M. A. Islam 1, M. F. Imam 2 and S. M. Sayem 3 Department of Agricultural Statistics,

More information

Kalman Filter and SVR Combinations in Forecasting US Unemployment

Kalman Filter and SVR Combinations in Forecasting US Unemployment Kalman Filter and SVR Combinations in Forecasting US Unemployment Georgios Sermpinis 1, Charalampos Stasinakis 1, and Andreas Karathanasopoulos 2 1 University of Glasgow Business School georgios.sermpinis@glasgow.ac.uk,

More information

Prediction of Monthly Rainfall of Nainital Region using Artificial Neural Network (ANN) and Support Vector Machine (SVM)

Prediction of Monthly Rainfall of Nainital Region using Artificial Neural Network (ANN) and Support Vector Machine (SVM) Vol- Issue-3 25 Prediction of ly of Nainital Region using Artificial Neural Network (ANN) and Support Vector Machine (SVM) Deepa Bisht*, Mahesh C Joshi*, Ashish Mehta** *Department of Mathematics **Department

More information

6.867 Machine learning

6.867 Machine learning 6.867 Machine learning Mid-term eam October 8, 6 ( points) Your name and MIT ID: .5.5 y.5 y.5 a).5.5 b).5.5.5.5 y.5 y.5 c).5.5 d).5.5 Figure : Plots of linear regression results with different types of

More information

Support Vector Ordinal Regression using Privileged Information

Support Vector Ordinal Regression using Privileged Information Support Vector Ordinal Regression using Privileged Information Fengzhen Tang 1, Peter Tiňo 2, Pedro Antonio Gutiérrez 3 and Huanhuan Chen 4 1,2,4- The University of Birmingham, School of Computer Science,

More information

Machine Learning Practice Page 2 of 2 10/28/13

Machine Learning Practice Page 2 of 2 10/28/13 Machine Learning 10-701 Practice Page 2 of 2 10/28/13 1. True or False Please give an explanation for your answer, this is worth 1 pt/question. (a) (2 points) No classifier can do better than a naive Bayes

More information

Statistical Machine Learning from Data

Statistical Machine Learning from Data January 17, 2006 Samy Bengio Statistical Machine Learning from Data 1 Statistical Machine Learning from Data Multi-Layer Perceptrons Samy Bengio IDIAP Research Institute, Martigny, Switzerland, and Ecole

More information

Frequency Forecasting using Time Series ARIMA model

Frequency Forecasting using Time Series ARIMA model Frequency Forecasting using Time Series ARIMA model Manish Kumar Tikariha DGM(O) NSPCL Bhilai Abstract In view of stringent regulatory stance and recent tariff guidelines, Deviation Settlement mechanism

More information

A Novel Activity Detection Method

A Novel Activity Detection Method A Novel Activity Detection Method Gismy George P.G. Student, Department of ECE, Ilahia College of,muvattupuzha, Kerala, India ABSTRACT: This paper presents an approach for activity state recognition of

More information

COMPUTATIONAL INTELLIGENCE (INTRODUCTION TO MACHINE LEARNING) SS16

COMPUTATIONAL INTELLIGENCE (INTRODUCTION TO MACHINE LEARNING) SS16 COMPUTATIONAL INTELLIGENCE (INTRODUCTION TO MACHINE LEARNING) SS6 Lecture 3: Classification with Logistic Regression Advanced optimization techniques Underfitting & Overfitting Model selection (Training-

More information

3 Time Series Regression

3 Time Series Regression 3 Time Series Regression 3.1 Modelling Trend Using Regression Random Walk 2 0 2 4 6 8 Random Walk 0 2 4 6 8 0 10 20 30 40 50 60 (a) Time 0 10 20 30 40 50 60 (b) Time Random Walk 8 6 4 2 0 Random Walk 0

More information

Combination of M-Estimators and Neural Network Model to Analyze Inside/Outside Bark Tree Diameters

Combination of M-Estimators and Neural Network Model to Analyze Inside/Outside Bark Tree Diameters Combination of M-Estimators and Neural Network Model to Analyze Inside/Outside Bark Tree Diameters Kyriaki Kitikidou, Elias Milios, Lazaros Iliadis, and Minas Kaymakis Democritus University of Thrace,

More information

A mutual association based nonlinear ensemble mechanism for time series forecasting

A mutual association based nonlinear ensemble mechanism for time series forecasting Appl Intell DOI 0.007/s0489-04-064-y A mutual association based nonlinear ensemble mechanism for time series forecasting Ratnadip Adhiari Springer Science+Business Media New Yor 05 Abstract Forecasting

More information

FORECASTING THE INVENTORY LEVEL OF MAGNETIC CARDS IN TOLLING SYSTEM

FORECASTING THE INVENTORY LEVEL OF MAGNETIC CARDS IN TOLLING SYSTEM FORECASTING THE INVENTORY LEVEL OF MAGNETIC CARDS IN TOLLING SYSTEM Bratislav Lazić a, Nebojša Bojović b, Gordana Radivojević b*, Gorana Šormaz a a University of Belgrade, Mihajlo Pupin Institute, Serbia

More information

ARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD

ARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD ARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD WHAT IS A NEURAL NETWORK? The simplest definition of a neural network, more properly referred to as an 'artificial' neural network (ANN), is provided

More information

Estimation of Inelastic Response Spectra Using Artificial Neural Networks

Estimation of Inelastic Response Spectra Using Artificial Neural Networks Estimation of Inelastic Response Spectra Using Artificial Neural Networks J. Bojórquez & S.E. Ruiz Universidad Nacional Autónoma de México, México E. Bojórquez Universidad Autónoma de Sinaloa, México SUMMARY:

More information

ESTIMATION OF HOURLY MEAN AMBIENT TEMPERATURES WITH ARTIFICIAL NEURAL NETWORKS 1. INTRODUCTION

ESTIMATION OF HOURLY MEAN AMBIENT TEMPERATURES WITH ARTIFICIAL NEURAL NETWORKS 1. INTRODUCTION Mathematical and Computational Applications, Vol. 11, No. 3, pp. 215-224, 2006. Association for Scientific Research ESTIMATION OF HOURLY MEAN AMBIENT TEMPERATURES WITH ARTIFICIAL NEURAL NETWORKS Ömer Altan

More information
