PM 10 Forecasting Using Kernel Adaptive Filtering: An Italian Case Study
Simone Scardapane, Danilo Comminiello, Michele Scarpiniti, Raffaele Parisi, and Aurelio Uncini

Department of Information Engineering, Electronics and Telecommunications (DIET), Sapienza University of Rome, Via Eudossiana 18, 00184, Rome
{danilo.comminiello,michele.scarpiniti,

Abstract. Short-term prediction of air pollution is gaining increasing attention in the research community, due to its social and economic impact. In this paper we study the application of a Kernel Adaptive Filtering (KAF) algorithm to the problem of predicting PM10 data in the Italian province of Ancona, and we show how this predictor is able to achieve a significantly low error with the inclusion of chemical data correlated with PM10, such as NO2.

Keywords: Air pollution, Nonlinear adaptive filtering, Kernel Adaptive Filters, Least Mean Square.

Introduction

Due to its adverse effects on human health, airborne particulate matter has gained increasing attention over the last years, both within the research community and in parliaments worldwide. Numerous studies have found a direct link between inhalation of, and long-term exposure to, particulate matter and an increase in mortality rates, particularly for lung cancers [1]. Moreover, a continuous presence of particulate matter leads to a constant decrease in visibility inside cities and to the deposition of trace elements [2]. To deal with these aspects, on 22 April 1999 the European Commission issued Council Directive 1999/30/EC, which set a ceiling for the daily concentration of particulate matter with an aerodynamic diameter of up to 10 μm (PM10), and obliged member states to issue a warning every time this ceiling is exceeded, entering an attention state.
Nevertheless, the latest data from the European Environmental Agency¹ show that, although overall pollution has decreased, some pollutants such as PM10 have stayed more or less stable and constantly exceed the required threshold. For this reason, the development of a solid system able to accurately predict the daily levels of PM10 has been, and still is, a constant priority for local administrations.

¹ air-quality-in-europe-2011

B. Apolloni et al. (Eds.): Neural Nets and Surroundings, SIST 19, pp DOI: / _10 © Springer-Verlag Berlin Heidelberg 2013
Since environmental data present great complexity and a large number of hidden factors, numerous researchers have investigated the possibility of automatically forecasting the daily concentration of PM10 using learning systems. In particular, neural networks [3] have been found to achieve a moderate error with small amounts of data, and thus represent a good solution for implementing a robust prediction system [4]. Recently, a considerable amount of effort has been put into applying other methodologies, such as Support Vector Machines [5], to the problem. In this paper, we study the application of Kernel Adaptive Filtering (KAF) techniques [6] to the task of predicting PM10 in the Italian area of Ancona. KAF algorithms are a recent development in the adaptive filtering field, and have rarely been put to use on real-world problems. For this reason, it is interesting to see how they provide a fast and efficient solution to our problem. In addition, comparisons with other standard techniques yield similar results, but the proposed approach is characterized by a lower computational cost. We also investigate the inclusion in the learning process of chemical data such as NO2, showing how this leads to a strong decrease in the prediction error, probably compensating for hidden factors that occasionally falsify the instrument readings. Several studies, for example [7,8,9,10], have addressed the problem of PM10 forecasting using cross-prediction with different chemical agents. Cross-prediction, in fact, can provide a robust estimate in all those cases in which the time series to be predicted has several contributing sources. For example, because the area of Ancona is on the seaside, many external factors, such as sea salt, can give a spurious contribution to the PM10 values, falsifying the results and alarming people even when the actual concentration is lower than the measured one.
In order to obtain a more robust estimation of the PM10 values, we choose to use cross-prediction in this work. The paper is organized as follows: Section 1 introduces the Kernel Adaptive Filtering approach. Section 2 describes the experimental setup, while Section 3 shows the experimental results. Finally, Section 4 concludes the work.

1 Kernel Adaptive Filtering

Kernel Adaptive Filtering (KAF) [6] is a recent family of learning algorithms that combines the simplicity of linear adaptive filtering with the nonlinear modeling capabilities of other learning techniques such as neural networks and Support Vector Machines. KAF algorithms are kernel methods, meaning that the original input x ∈ R^s to the problem is transformed into a high-dimensional feature vector ψ(x) through the use of a positive definite kernel function κ(x, x′) : R^s × R^s → R. With an appropriate choice of kernel function, the newly created input vector becomes linearly separable and is used to train a linear adaptive filter. Practically, as long as the original linear algorithm can be formulated in terms of inner products, the kernel trick [11] allows it to be rewritten solely in terms of kernel evaluations, thus avoiding the explicit computation of the transformed vector.
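As a concrete illustration of the kernel trick, the short sketch below (not from the paper) uses a degree-2 polynomial kernel, chosen here only because its feature map can be written out explicitly, and checks that a single kernel evaluation equals an inner product in the transformed feature space:

```python
import numpy as np

# Degree-2 homogeneous polynomial kernel k(x, y) = (x . y)^2 on R^2.
# Its explicit feature map is phi(x) = [x1^2, sqrt(2)*x1*x2, x2^2],
# so the kernel equals an inner product in a 3-dimensional feature space.

def phi(x):
    """Explicit feature map for the degree-2 polynomial kernel on R^2."""
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

def kernel(x, y):
    """Kernel evaluation: no explicit feature vectors needed."""
    return np.dot(x, y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

# Both routes give the same inner product (the "kernel trick").
assert np.isclose(np.dot(phi(x), phi(y)), kernel(x, y))
print(kernel(x, y))  # 16.0
```

The Gaussian kernel used later in the paper corresponds to an infinite-dimensional feature map, which is exactly why working through kernel evaluations, rather than explicit feature vectors, is essential.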
Algorithm 1. Summary of the KLMS algorithm
Initialize: Training Set T, η, κ
1: a[1] = η d[1]
2: while (x_n, d[n]) ∈ T available do
3:   y[n] = Σ_{j=1}^{n−1} a[n−j] κ(x_n, x_{n−j})
4:   e[n] = d[n] − y[n]
5:   a[n] = η e[n]
6: end while

Starting from linear filtering theory, a number of kernel filters have been devised, such as the Kernel Least-Mean-Square [12] that is used in this work. They all provide nonlinear approximation of any function with moderate complexity with respect to other methods such as recurrent neural networks. The main problems arising in the use of a KAF are the choice of a proper kernel, the need for regularization, and the fact that the network grows linearly with the number of processed inputs. This last problem is not addressed in the current work; for a review of methods to efficiently curtail the growth of the network, we refer the interested reader to [12].

1.1 Kernel Least Mean Square

Kernel Least Mean Square (KLMS) [12] is the simplest training algorithm in the KAF family, and is derived from Least Mean Square (LMS) [3] by the use of the kernel trick. Despite its simplicity, it has been shown to possess a self-regularizing property. The pseudo-code of KLMS is briefly summarized in Algorithm 1. It takes as input a training set T composed of pairs {(x_n, d[n])}, where x_n ∈ R^s is the input vector to the system and d[n] ∈ R the desired scalar output. A proper step size η and kernel function κ : R^s × R^s → R must be selected; then, for every processed input, the algorithm stores the current input vector x_n and the corresponding weight a[n] = η e[n]. The step size η balances speed against convergence, and it can be chosen by reasoning as in the classical LMS algorithm. The output of the filter at time n is given by

    y[n] = Σ_{j=1}^{n−1} a[n−j] κ(x_n, x_{n−j}).    (1)

In equation (1), κ(x_n, x_{n−j}) refers to the value of the kernel.
In this work we used the Gaussian kernel, which, for two generic input vectors u and u′, is

    κ(u, u′) = exp(−γ ‖u − u′‖²),    (2)

where γ is the kernel bandwidth or kernel parameter. In addition, κ(u, u′) has the interesting property of being shift-invariant. As a measure of the error we used the mean-square error, which is defined over the training set T as

    MSE_T(a_n) = (1/N) Σ_{(x_n, d[n]) ∈ T} (d[n] − y[n])².    (3)
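A minimal Python sketch of Algorithm 1 with the Gaussian kernel of equation (2) may help fix ideas. The toy series, the embedding length of 7, and the parameter values below are illustrative assumptions, not the paper's data or settings:

```python
import numpy as np

def gaussian_kernel(u, v, gamma=1.0):
    """Shift-invariant Gaussian kernel of equation (2)."""
    return np.exp(-gamma * np.sum((u - v) ** 2))

def klms(X, d, eta=0.2, gamma=1.0):
    """Kernel Least Mean Square (Algorithm 1).

    X : (N, s) array of input vectors, d : (N,) desired outputs.
    Returns the stored centers, their weights, and the prediction errors.
    """
    centers = [X[0]]
    weights = [eta * d[0]]           # a[1] = eta * d[1]
    errors = [d[0]]
    for n in range(1, len(X)):
        # y[n]: kernel expansion over all previously stored centers
        y = sum(a * gaussian_kernel(X[n], c, gamma)
                for a, c in zip(weights, centers))
        e = d[n] - y                 # e[n] = d[n] - y[n]
        weights.append(eta * e)      # a[n] = eta * e[n]
        centers.append(X[n])         # network grows by one unit per sample
        errors.append(e)
    return centers, weights, np.array(errors)

# Toy one-step prediction of a noisy nonlinear series.
rng = np.random.default_rng(0)
t = np.arange(500)
series = np.sin(0.05 * t) ** 3 + 0.01 * rng.standard_normal(500)
emb = 7                              # embedding length (168 in the paper)
X = np.array([series[n - emb:n] for n in range(emb, len(series))])
d = series[emb:]

_, _, errors = klms(X, d, eta=0.2, gamma=1.0)
mse_start = np.mean(errors[:50] ** 2)
mse_end = np.mean(errors[-50:] ** 2)
print(mse_start, mse_end)            # MSE drops as the filter adapts
```

Note how the loop makes the linear growth of the network explicit: every processed sample adds one center and one weight, which is the scalability issue mentioned above.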
Fig. 1. Hourly observations of PM10 for the year 2010 (in total 8764 observations) in Marina di Ancona

2 Experimental Setup

In order to test the proposed predictor, we used hourly observations of PM10 and NO2 for the year 2010 in Marina di Ancona (Italy)². These data are interesting since, as for every maritime region, the accuracy of the sensors can be falsified by the presence of external factors such as sea salt. We compensate for this aspect with the inclusion of the chemical data given by NO2. Data are normalized between 0 and 1, and the resulting PM10 series can be seen in Figure 1. We observe that the concentration of particulate matter has stayed more or less stable over the year, while some observations are clearly outliers. In the first experiment, the input vector x_n is an embedding of the last 168 elements of the PM10 series, {x[n−1], ..., x[n−τ]}, corresponding to the previous 7 days of observations, while the desired output is the concentration of PM10 for twelve P.M. of the next day. In the second experiment, the corresponding observations of NO2 are added to each input vector, leading to 336 elements. From the original series we randomly extracted 700 input vectors and corresponding outputs for training, and 150 elements for testing, thus generating a statistically independent testing set. Experiments are averaged over 50 different runs, and are conducted using an Intel i GHz processor at 64 bit, with 4 GB of RAM available.

² Data can be freely downloaded from

2.1 Correlation between PM10 and NO2

NO2 was chosen as a supporting time series since it has been found to be highly correlated with the particulate matter evolution. In Figure 2 we can visually see the correlation between the two time series for a brief period. To confirm this dependence, we computed the correlation coefficient ρ from the covariance matrix C = cov(X), where X is a matrix containing all the hourly observations. The correlation coefficient ρ was found to be greater than 0.4, thus confirming our initial hypothesis.

Fig. 2. Correlation between PM10 and NO2 for a brief period of the year 2010 in Marina di Ancona

Figure 2 clearly shows that, around sample 210 (emphasized with a black cross), the values of measured PM10 are greater than the values of NO2, differently from other samples. This fact could be due to external factors, such as sea salt. This latter assumption is supported by the hourly weather conditions of that day: in fact, there was a strong wind blowing from the sea towards the city.
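The construction of the input vectors and the correlation check can be sketched as follows. The two synthetic series below are hypothetical stand-ins for the PM10 and NO2 observations (the real data are not reproduced here), built so that they share a daily cycle and are therefore correlated:

```python
import numpy as np

rng = np.random.default_rng(1)
hours = 8760
base = np.sin(2 * np.pi * np.arange(hours) / 24)      # shared daily cycle
pm10 = base + 0.8 * rng.standard_normal(hours)        # stand-in for PM10
no2 = 0.9 * base + 0.8 * rng.standard_normal(hours)   # stand-in for NO2

# Normalize each series to [0, 1], as done for the data in Section 2.
def normalize(x):
    return (x - x.min()) / (x.max() - x.min())

pm10, no2 = normalize(pm10), normalize(no2)

# Correlation coefficient between the two hourly series (Section 2.1).
rho = np.corrcoef(pm10, no2)[0, 1]
print(round(rho, 2))  # clearly positive for these correlated series

# Input vectors for the second experiment: the last `lag` hours of both
# series stacked together (lag = 168 gives 336-element vectors).
lag = 168
X = np.array([np.concatenate([pm10[n - lag:n], no2[n - lag:n]])
              for n in range(lag, hours)])
print(X.shape)  # (8592, 336)
```

From a matrix of such rows, the training and testing subsets are then drawn at random, as described above.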
3 Results

Figure 3 shows the evolution of the MSE for the two cases, with and without the inclusion of the NO2 correlated chemical data. The learning rate η was set to 0.2 for the first case and to 0.05 for the second, while the kernel parameter was kept constant at γ = 1. The error is also compared with that of a standard LMS filter.

Fig. 3. Evolution of the mean square error (MSE) for KLMS in PM10 prediction with and without the inclusion of NO2 correlated data

We see that KLMS achieves a low error over the testing set in fewer than 700 iterations, a result that further improves in the second case, with the inclusion of the NO2 observations. The predictions of the trained KLMS network are shown in Figure 4. In addition, Figure 4, which has a slightly different scale on the y-axis due to the normalization of the data itself, confirms the hypothesized outliers in the data around sample 210. In fact, the predicted value for PM10 is now lower than the measured one, in accordance with the values of NO2 and confirming the external nature of that sample. We can conclude that this and similar outliers are due to sea salt. Note that the resulting network consists of 700 nodes, corresponding to the 700 training examples, each of which is composed of 336 elements. In a real-world application, computational requirements would call for a method of actively curtailing the growth of the resulting system, such as a pruning algorithm.
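For contrast with KLMS, the standard linear LMS baseline of Figure 3 can be sketched as below; the test signal, embedding length, and step size are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def lms(X, d, eta=0.2):
    """Standard (linear) LMS filter: w <- w + eta * e[n] * x_n."""
    w = np.zeros(X.shape[1])
    errors = np.empty(len(X))
    for n in range(len(X)):
        y = w @ X[n]            # linear prediction
        e = d[n] - y            # instantaneous error
        w += eta * e * X[n]     # stochastic-gradient weight update
        errors[n] = e
    return w, errors

# One-step prediction of a noisy sinusoid with a 7-lag embedding.
rng = np.random.default_rng(2)
t = np.arange(2000)
series = np.sin(0.05 * t) + 0.05 * rng.standard_normal(2000)
emb = 7
X = np.array([series[n - emb:n] for n in range(emb, len(series))])
d = series[emb:]

_, errors = lms(X, d, eta=0.2)
print(np.mean(errors[:100] ** 2), np.mean(errors[-100:] ** 2))
```

Unlike KLMS, the state here is a single fixed-size weight vector, which is exactly why LMS is cheaper but cannot model nonlinear dependencies in the data.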
Fig. 4. Predicted and measured data of the trained KLMS network

We also compared the proposed approach with a Multilayer Perceptron (MLP) [3], a Support Vector Machine [3,13] and a Nonlinear Autoregressive (NAR) model [14], with and without the inclusion of the NO2 correlated data. The MLP has one hidden layer with 50 neurons, and the learning rate is set to 0.2 for each neuron. The results obtained with these standard architectures are numerically equivalent to those of Figures 3 and 4 in terms of MSE and predicted data (for this reason they are not reported in the paper). This fact suggests that the proposed approach performs well in the prediction of correlated environmental data and obtains results similar to those found in the literature, but it presents the advantage of a lower computational cost with respect to the MLP and SVM approaches.

4 Conclusions

We demonstrated how KAF algorithms provide a fast and efficient way of training a simple network to predict environmental data such as PM10. Our system achieves a significantly low error over a testing set with the inclusion of correlated data and shows robust behavior with respect to outliers due to external factors. By iterated application, this system can become a handy tool for local legislators to accurately predict the evolution of particulate matter in their region, in the effort to strongly reduce the overall concentration of PM10 over the year. Although comparisons with other standard
techniques provide similar results, the proposed approach is characterized by a low computational cost.

References

1. Pope, C.A., Burnett, R., Thun, M.J., Calle, E.E., Krewski, D., Ito, K., Thurston, G.: Lung cancer, cardiopulmonary mortality, and long-term exposure to fine particulate air pollution. Journal of the American Medical Association 287 (2002)
2. Hooyberghs, F., Mensink, C., Dumont, G., Fierens, F., Brasseur, O.: A neural network forecast for daily average PM10 concentrations in Belgium. Atmospheric Environment 39(8) (2005)
3. Haykin, S.: Neural Networks and Learning Machines, 2nd edn. Pearson Publishing (2009)
4. Gardner, M.W., Dorling, S.R.: Artificial neural networks (the multilayer perceptron): a review of applications in the atmospheric sciences. Atmospheric Environment 32 (1998)
5. Arampongsanuwat, S., Meesad, P.: Prediction of PM10 using Support Vector Regression. In: 2011 International Conference on Information and Electronics Engineering, IPCSIT, vol. 6 (2011)
6. Liu, W., Principe, J.C., Haykin, S.: Kernel Adaptive Filtering: A Comprehensive Introduction. Wiley (2010)
7. Aurangojeb, M.: Relationship between PM10, NO2 and particle number concentration: validity of air quality controls. Procedia Environmental Sciences 6 (2011)
8. Ito, K., Thurston, J.D., Nadas, A., Lippmann, M.: Monitor-to-monitor temporal correlation of air pollution and weather variables in the North-Central U.S. Journal of Exposure Analysis and Environmental Epidemiology 11(1) (2001)
9. Lippmann, M., Ito, K., Nadas, A., Burnett, R.T.: Association of particulate matter components with daily mortality and morbidity in urban populations. Journal of Exposure Analysis and Environmental Epidemiology 9(5) (2000)
10. Rakesh, K., Abba, E.J.: Air pollution concentrations of PM2.5, PM10 and NO2 at ambient and kerbsite and their correlation in metro city - Mumbai. Environmental Monitoring and Assessment 119(1-3) (2006)
11.
Scholkopf, B.: The kernel trick for distances. In: Advances in Neural Information Processing Systems 13, Proceedings of the 2000 Conference (2000)
12. Liu, W., Pokharel, P.P., Principe, J.C.: The Kernel Least-Mean-Square algorithm. IEEE Transactions on Signal Processing 56(2) (2008)
13. Vapnik, V.N.: The Nature of Statistical Learning Theory. Springer (1995)
14. Conover, W.J.: Practical Nonparametric Statistics. Wiley (1971)
More informationDeep Learning Architecture for Univariate Time Series Forecasting
CS229,Technical Report, 2014 Deep Learning Architecture for Univariate Time Series Forecasting Dmitry Vengertsev 1 Abstract This paper studies the problem of applying machine learning with deep architecture
More informationA new method for short-term load forecasting based on chaotic time series and neural network
A new method for short-term load forecasting based on chaotic time series and neural network Sajjad Kouhi*, Navid Taghizadegan Electrical Engineering Department, Azarbaijan Shahid Madani University, Tabriz,
More informationMining Classification Knowledge
Mining Classification Knowledge Remarks on NonSymbolic Methods JERZY STEFANOWSKI Institute of Computing Sciences, Poznań University of Technology COST Doctoral School, Troina 2008 Outline 1. Bayesian classification
More informationPortugaliae Electrochimica Acta 26/4 (2008)
Portugaliae Electrochimica Acta 6/4 (008) 6-68 PORTUGALIAE ELECTROCHIMICA ACTA Comparison of Regression Model and Artificial Neural Network Model for the Prediction of Volume Percent of Diamond Deposition
More informationComparative analysis of neural network and regression based condition monitoring approaches for wind turbine fault detection
Downloaded from orbit.dtu.dk on: Oct, 218 Comparative analysis of neural network and regression based condition monitoring approaches for wind turbine fault detection Schlechtingen, Meik; Santos, Ilmar
More informationRadial-Basis Function Networks
Radial-Basis Function etworks A function is radial () if its output depends on (is a nonincreasing function of) the distance of the input from a given stored vector. s represent local receptors, as illustrated
More informationIntroduction to Natural Computation. Lecture 9. Multilayer Perceptrons and Backpropagation. Peter Lewis
Introduction to Natural Computation Lecture 9 Multilayer Perceptrons and Backpropagation Peter Lewis 1 / 25 Overview of the Lecture Why multilayer perceptrons? Some applications of multilayer perceptrons.
More informationFunctional Preprocessing for Multilayer Perceptrons
Functional Preprocessing for Multilayer Perceptrons Fabrice Rossi and Brieuc Conan-Guez Projet AxIS, INRIA, Domaine de Voluceau, Rocquencourt, B.P. 105 78153 Le Chesnay Cedex, France CEREMADE, UMR CNRS
More informationSupervised Learning Part I
Supervised Learning Part I http://www.lps.ens.fr/~nadal/cours/mva Jean-Pierre Nadal CNRS & EHESS Laboratoire de Physique Statistique (LPS, UMR 8550 CNRS - ENS UPMC Univ. Paris Diderot) Ecole Normale Supérieure
More informationA. Pelliccioni (*), R. Cotroneo (*), F. Pungì (*) (*)ISPESL-DIPIA, Via Fontana Candida 1, 00040, Monteporzio Catone (RM), Italy.
Application of Neural Net Models to classify and to forecast the observed precipitation type at the ground using the Artificial Intelligence Competition data set. A. Pelliccioni (*), R. Cotroneo (*), F.
More informationArtificial neural networks
Artificial neural networks Chapter 8, Section 7 Artificial Intelligence, spring 203, Peter Ljunglöf; based on AIMA Slides c Stuart Russel and Peter Norvig, 2004 Chapter 8, Section 7 Outline Brains Neural
More informationMACHINE LEARNING. Methods for feature extraction and reduction of dimensionality: Probabilistic PCA and kernel PCA
1 MACHINE LEARNING Methods for feature extraction and reduction of dimensionality: Probabilistic PCA and kernel PCA 2 Practicals Next Week Next Week, Practical Session on Computer Takes Place in Room GR
More informationARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92
ARTIFICIAL NEURAL NETWORKS گروه مطالعاتي 17 بهار 92 BIOLOGICAL INSPIRATIONS Some numbers The human brain contains about 10 billion nerve cells (neurons) Each neuron is connected to the others through 10000
More informationUnderstanding Travel Time to Airports in New York City Sierra Gentry Dominik Schunack
Understanding Travel Time to Airports in New York City Sierra Gentry Dominik Schunack 1 Introduction Even with the rising competition of rideshare services, many in New York City still utilize taxis for
More informationScale-Invariance of Support Vector Machines based on the Triangular Kernel. Abstract
Scale-Invariance of Support Vector Machines based on the Triangular Kernel François Fleuret Hichem Sahbi IMEDIA Research Group INRIA Domaine de Voluceau 78150 Le Chesnay, France Abstract This paper focuses
More informationApproximation of Functions by Multivariable Hermite Basis: A Hybrid Method
Approximation of Functions by Multivariable Hermite Basis: A Hybrid Method Bartlomiej Beliczynski Warsaw University of Technology, Institute of Control and Industrial Electronics, ul. Koszykowa 75, -66
More informationFeature Selection Optimization Solar Insolation Prediction Using Artificial Neural Network: Perspective Bangladesh
American Journal of Engineering Research (AJER) 2016 American Journal of Engineering Research (AJER) e-issn: 2320-0847 p-issn : 2320-0936 Volume-5, Issue-8, pp-261-265 www.ajer.org Research Paper Open
More informationLecture 4: Perceptrons and Multilayer Perceptrons
Lecture 4: Perceptrons and Multilayer Perceptrons Cognitive Systems II - Machine Learning SS 2005 Part I: Basic Approaches of Concept Learning Perceptrons, Artificial Neuronal Networks Lecture 4: Perceptrons
More informationDiscussion About Nonlinear Time Series Prediction Using Least Squares Support Vector Machine
Commun. Theor. Phys. (Beijing, China) 43 (2005) pp. 1056 1060 c International Academic Publishers Vol. 43, No. 6, June 15, 2005 Discussion About Nonlinear Time Series Prediction Using Least Squares Support
More informationWEATHER DEPENENT ELECTRICITY MARKET FORECASTING WITH NEURAL NETWORKS, WAVELET AND DATA MINING TECHNIQUES. Z.Y. Dong X. Li Z. Xu K. L.
WEATHER DEPENENT ELECTRICITY MARKET FORECASTING WITH NEURAL NETWORKS, WAVELET AND DATA MINING TECHNIQUES Abstract Z.Y. Dong X. Li Z. Xu K. L. Teo School of Information Technology and Electrical Engineering
More informationArtificial Neural Networks
Artificial Neural Networks 鮑興國 Ph.D. National Taiwan University of Science and Technology Outline Perceptrons Gradient descent Multi-layer networks Backpropagation Hidden layer representations Examples
More informationMulti-layer Neural Networks
Multi-layer Neural Networks Steve Renals Informatics 2B Learning and Data Lecture 13 8 March 2011 Informatics 2B: Learning and Data Lecture 13 Multi-layer Neural Networks 1 Overview Multi-layer neural
More information(Feed-Forward) Neural Networks Dr. Hajira Jabeen, Prof. Jens Lehmann
(Feed-Forward) Neural Networks 2016-12-06 Dr. Hajira Jabeen, Prof. Jens Lehmann Outline In the previous lectures we have learned about tensors and factorization methods. RESCAL is a bilinear model for
More informationRECURRENT NEURAL NETWORKS WITH FLEXIBLE GATES USING KERNEL ACTIVATION FUNCTIONS
2018 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING, SEPT. 17 20, 2018, AALBORG, DENMARK RECURRENT NEURAL NETWORKS WITH FLEXIBLE GATES USING KERNEL ACTIVATION FUNCTIONS Simone Scardapane,
More informationShort-term wind forecasting using artificial neural networks (ANNs)
Energy and Sustainability II 197 Short-term wind forecasting using artificial neural networks (ANNs) M. G. De Giorgi, A. Ficarella & M. G. Russo Department of Engineering Innovation, Centro Ricerche Energia
More informationAn assessment of time changes of the health risk of PM10 based on GRIMM analyzer data and respiratory deposition model
J. Keder / Landbauforschung Völkenrode Special Issue 38 57 An assessment of time changes of the health risk of PM based on GRIMM analyzer data and respiratory deposition model J. Keder Abstract PM particles
More informationDESIGN AND DEVELOPMENT OF ARTIFICIAL INTELLIGENCE SYSTEM FOR WEATHER FORECASTING USING SOFT COMPUTING TECHNIQUES
DESIGN AND DEVELOPMENT OF ARTIFICIAL INTELLIGENCE SYSTEM FOR WEATHER FORECASTING USING SOFT COMPUTING TECHNIQUES Polaiah Bojja and Nagendram Sanam Department of Electronics and Communication Engineering,
More informationA Novel Activity Detection Method
A Novel Activity Detection Method Gismy George P.G. Student, Department of ECE, Ilahia College of,muvattupuzha, Kerala, India ABSTRACT: This paper presents an approach for activity state recognition of
More informationAn Improved Conjugate Gradient Scheme to the Solution of Least Squares SVM
An Improved Conjugate Gradient Scheme to the Solution of Least Squares SVM Wei Chu Chong Jin Ong chuwei@gatsby.ucl.ac.uk mpeongcj@nus.edu.sg S. Sathiya Keerthi mpessk@nus.edu.sg Control Division, Department
More informationIn the Name of God. Lectures 15&16: Radial Basis Function Networks
1 In the Name of God Lectures 15&16: Radial Basis Function Networks Some Historical Notes Learning is equivalent to finding a surface in a multidimensional space that provides a best fit to the training
More informationPrediction of Hourly Solar Radiation in Amman-Jordan by Using Artificial Neural Networks
Int. J. of Thermal & Environmental Engineering Volume 14, No. 2 (2017) 103-108 Prediction of Hourly Solar Radiation in Amman-Jordan by Using Artificial Neural Networks M. A. Hamdan a*, E. Abdelhafez b
More informationEE-588 ADVANCED TOPICS IN NEURAL NETWORK
CUKUROVA UNIVERSITY DEPARTMENT OF ELECTRICAL&ELECTRONICS ENGINEERING EE-588 ADVANCED TOPICS IN NEURAL NETWORK THE PROJECT PROPOSAL AN APPLICATION OF NEURAL NETWORKS FOR WEATHER TEMPERATURE FORECASTING
More informationNON-FIXED AND ASYMMETRICAL MARGIN APPROACH TO STOCK MARKET PREDICTION USING SUPPORT VECTOR REGRESSION. Haiqin Yang, Irwin King and Laiwan Chan
In The Proceedings of ICONIP 2002, Singapore, 2002. NON-FIXED AND ASYMMETRICAL MARGIN APPROACH TO STOCK MARKET PREDICTION USING SUPPORT VECTOR REGRESSION Haiqin Yang, Irwin King and Laiwan Chan Department
More informationFrom CDF to PDF A Density Estimation Method for High Dimensional Data
From CDF to PDF A Density Estimation Method for High Dimensional Data Shengdong Zhang Simon Fraser University sza75@sfu.ca arxiv:1804.05316v1 [stat.ml] 15 Apr 2018 April 17, 2018 1 Introduction Probability
More informationLast updated: Oct 22, 2012 LINEAR CLASSIFIERS. J. Elder CSE 4404/5327 Introduction to Machine Learning and Pattern Recognition
Last updated: Oct 22, 2012 LINEAR CLASSIFIERS Problems 2 Please do Problem 8.3 in the textbook. We will discuss this in class. Classification: Problem Statement 3 In regression, we are modeling the relationship
More informationHigh Wind and Energy Specific Models for Global. Production Forecast
High Wind and Energy Specific Models for Global Production Forecast Carlos Alaíz, Álvaro Barbero, Ángela Fernández, José R. Dorronsoro Dpto. de Ingeniería Informática and Instituto de Ingeniería del Conocimiento
More informationHopfield Neural Network
Lecture 4 Hopfield Neural Network Hopfield Neural Network A Hopfield net is a form of recurrent artificial neural network invented by John Hopfield. Hopfield nets serve as content-addressable memory systems
More informationCS 540: Machine Learning Lecture 1: Introduction
CS 540: Machine Learning Lecture 1: Introduction AD January 2008 AD () January 2008 1 / 41 Acknowledgments Thanks to Nando de Freitas Kevin Murphy AD () January 2008 2 / 41 Administrivia & Announcement
More information