A New Look at Nonlinear Time Series Prediction with NARX Recurrent Neural Network. José Maria P. Menezes Jr. and Guilherme A. Barreto
1 A New Look at Nonlinear Time Series Prediction with NARX Recurrent Neural Network. José Maria P. Menezes Jr. and Guilherme A. Barreto, Department of Teleinformatics Engineering, Federal University of Ceará, Centro de Tecnologia, Fortaleza-CE, Brazil. October 23-27, 2006
2 Contents
3 Motivation Objectives Theoretical Foundations Time Series Prediction (TSP) Tasks Recurrent Neural Networks (RNN) NARX Architecture Simulations VBR Video Traffic Laser Time Series Conclusion
4 Motivation 1. Long-term dependence occurs very often in real-world time series (e.g. traffic series). 2. The theory of dynamical systems provides the theoretical basis for analyzing nonlinear systems with chaotic behavior. 3. Recurrent neural networks are capable of representing arbitrary nonlinear dynamical mappings, such as those commonly found in nonlinear time series prediction. 4. The NARX model is a recurrent neural network capable of efficiently modeling time series with long-term dependencies.
5 Objectives of this Work 1. To evaluate the performance of standard dynamic neural networks in difficult time series prediction tasks. 2. To propose a new field of application for NARX networks: prediction of univariate time series with long-range dependencies.
6 Theoretical Foundations
7 Time Series Prediction (TSP) Tasks One-step-ahead prediction: Neural network models are commonly used to estimate only the next value of a time series. Multi-step-ahead prediction: If the user is interested in a wider prediction horizon, the model's output should be fed back to the input regressor for a fixed but finite number of time steps. Dynamic modeling: If the prediction horizon tends to infinity, the neural network acts as an autonomous system, modeling the long-term dynamics of the system that generated the observed time series.
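The multi-step-ahead scheme described above can be sketched in a few lines (a minimal sketch; `model` stands for any trained one-step predictor and the function name is ours, not from the paper):

```python
import numpy as np

def multi_step_forecast(model, history, horizon, d_e):
    """Iterated multi-step-ahead prediction: at each step the model's own
    output is fed back into the input regressor (sliding window)."""
    window = list(history[-d_e:])          # last d_e observed values
    forecasts = []
    for _ in range(horizon):
        y_hat = model(np.array(window))    # one-step-ahead estimate
        forecasts.append(y_hat)
        window = window[1:] + [y_hat]      # slide window, feed estimate back
    return np.array(forecasts)

# Toy stand-in for a trained one-step predictor: the window mean.
mean_model = lambda w: float(w.mean())
preds = multi_step_forecast(mean_model, np.array([1.0, 1.0, 1.0]),
                            horizon=5, d_e=3)
```

As the horizon grows, prediction errors compound through the fed-back window, which is what makes multi-step-ahead prediction and dynamic modeling hard.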
8 Recurrent Neural Networks (RNN) Feedforward MLP-like networks can be easily adapted to process time series through an input tapped delay line (e.g. the FTDNN model). Recurrent neural networks (RNN) have local and/or global feedback loops in their structure (e.g. the Elman, Jordan and NARX models) [1]. RNNs are capable of representing arbitrary nonlinear dynamical mappings, such as those commonly found in nonlinear time series prediction tasks.
9 Recurrent Neural Networks (RNN) Takens Embedding Theorem Takens [3] has shown that the state of a deterministic dynamical system can be accurately reconstructed by a time window of finite length sliding over the observed time series as follows: x_1(n) = [x(n), x(n - τ), ..., x(n - (d_E - 1)τ)]^T, where x(n) is the value of the time series at time n, d_E is the embedding dimension and τ is the embedding delay.
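The sliding-window reconstruction above can be illustrated with a short helper (a sketch; the name `delay_embed` is ours, not from the paper):

```python
import numpy as np

def delay_embed(x, d_e, tau):
    """Build the Takens regressors x1(n) = [x(n), x(n-tau), ...,
    x(n-(d_e-1)*tau)]^T for every admissible n of the scalar series x."""
    start = (d_e - 1) * tau                # first n with a full window
    return np.array([[x[n - k * tau] for k in range(d_e)]
                     for n in range(start, len(x))])

x = np.arange(10.0)
E = delay_embed(x, d_e=3, tau=2)
# first regressor: [x(4), x(2), x(0)] = [4., 2., 0.]
```

Each row of `E` is one reconstructed state vector; these rows are exactly the input regressors fed to the networks.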
10 Recurrent Neural Networks (RNN) FTDNN Focused Time Delay Neural Network
11 Recurrent Neural Networks (RNN) Elman Network
12 NARX Architecture NARX Model in System Identification Nonlinear Autoregressive with eXogenous input (NARX) [2]: y(n + 1) = f[y(n), ..., y(n - d_y + 1); u(n), u(n - 1), ..., u(n - d_u + 1)] = f[y(n); u(n)], where u(n) and y(n) denote, respectively, the input and the output of the model at discrete time n. The memory delays satisfy d_u ≥ 1, d_y ≥ 1 and d_u ≤ d_y.
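Assembling the regressor vector on which f operates might look as follows (illustrative only; the nonlinear map f itself would be the trained network, and the helper name is ours):

```python
import numpy as np

def narx_regressor(y, u, n, d_y, d_u):
    """Concatenate the output regressor [y(n), ..., y(n-d_y+1)] with the
    input regressor [u(n), ..., u(n-d_u+1)]; the network f maps this
    vector to y(n+1)."""
    y_part = y[n - d_y + 1 : n + 1][::-1]  # newest value first
    u_part = u[n - d_u + 1 : n + 1][::-1]
    return np.concatenate([y_part, u_part])

y = np.arange(10.0)
u = 10.0 + np.arange(10.0)
r = narx_regressor(y, u, n=5, d_y=3, d_u=2)
# r = [y(5), y(4), y(3), u(5), u(4)]
```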
13 NARX Architecture NARX Neural Network Architecture
14 NARX Architecture NARX Network in Nonlinear Time Series Prediction Using Takens' theorem to build the input regressor: u(n) = [x(n), x(n - τ), ..., x(n - (d_E - 1)τ)]^T, where we set d_u = d_E. The output regressor y(n) can be written in two different ways, depending on the training mode of the NARX network: y_P(n) = [x̂(n), ..., x̂(n - d_y + 1)], y_SP(n) = [x(n), ..., x(n - d_y + 1)], where the P-mode regressor contains d_y past values of the estimated time series, while the SP-mode regressor contains d_y past values of the actual time series.
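The difference between the two modes amounts to which series feeds the output regressor, e.g. (a sketch with hypothetical helper and variable names):

```python
import numpy as np

def output_regressor(x_actual, x_estimated, n, d_y, mode):
    """SP-mode fills the output regressor with d_y past values of the
    actual series; P-mode uses the network's own past estimates."""
    src = x_actual if mode == "SP" else x_estimated
    return src[n - d_y + 1 : n + 1][::-1]  # newest value first

x     = np.array([0.0, 1.0, 2.0, 3.0])    # actual series
x_hat = np.array([0.0, 0.9, 2.1, 2.8])    # network's own estimates
sp = output_regressor(x, x_hat, n=3, d_y=2, mode="SP")  # [3.0, 2.0]
p  = output_regressor(x, x_hat, n=3, d_y=2, mode="P")   # [2.8, 2.1]
```

At test time, in a multi-step-ahead setting, only the estimates are available, so both variants must eventually run in P-mode; the modes differ in what is fed back during training.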
15 NARX Architecture Parallel Mode (NARX-P)
16 NARX Architecture Series-Parallel Mode (NARX-SP)
17 Simulations
18 Evaluated Networks NARX-P, NARX-SP, FTDNN and Elman networks. All networks have two hidden layers and one output neuron. All neurons use the hyperbolic tangent activation function. The standard backpropagation algorithm is used to train the networks.
19 Summary Table - Training Parameters

         1st hidden layer (N_h,1)   2nd hidden layer (N_h,2)   learning rate   epochs
TASK 1   2d_E + 1                   √N_h,1                     -               -
TASK 2   2d_E + 1                   √N_h,1                     -               -
20 Performance Evaluation Metric The networks are evaluated in multi-step-ahead prediction tasks. Quantitatively, we compute the Normalized Mean Squared Error (NMSE): NMSE(N) = (1 / (N σ_x^2)) Σ_{n=1}^{N} e^2(n), where N is the prediction horizon, σ_x^2 is the sample variance of the actual time series and e(n) = y(n) - ŷ(n) is the prediction error at time n.
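The NMSE above can be computed directly, e.g. (a minimal sketch; `nmse` is our name for the helper):

```python
import numpy as np

def nmse(actual, predicted):
    """NMSE(N) = (1 / (N * var(x))) * sum_n e(n)^2,
    with e(n) = actual(n) - predicted(n)."""
    e = np.asarray(actual) - np.asarray(predicted)
    return float(np.sum(e ** 2) / (e.size * np.var(actual)))

x = np.array([1.0, 2.0, 3.0, 4.0])
# Predicting the sample mean gives NMSE = 1 by construction,
# so NMSE < 1 means "better than the trivial mean predictor".
score = nmse(x, np.full(4, x.mean()))
```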
21 Task 1: Long-term prediction of VBR video traffic Variable bit rate (VBR) video traffic (Jurassic Park) [4]. This video traffic trace was encoded with MPEG-I. VBR video traffic typically exhibits burstiness over multiple time scales [5], [6]. The sample points were rescaled to the range [-1, 1]; the series was split into a training set and a 500-sample test set.
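The rescaling to [-1, 1] mentioned above is a simple transform, e.g. (a sketch, assuming plain min-max scaling; the function name is ours):

```python
import numpy as np

def rescale(x, lo=-1.0, hi=1.0):
    """Min-max rescale a series to the interval [lo, hi]."""
    x = np.asarray(x, dtype=float)
    return lo + (hi - lo) * (x - x.min()) / (x.max() - x.min())

s = rescale(np.array([0.0, 5.0, 10.0]))
# -> [-1.0, 0.0, 1.0]
```

Rescaling keeps the inputs inside the saturation-free region of the hyperbolic tangent activations used by all the evaluated networks.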
22 VBR Video Traffic Empirical Sensitivity Analysis - 1: NMSE vs. embedding dimension (order) for the FTDNN, Elman, NARX-P and NARX-SP networks. [figure]
23 VBR Video Traffic Empirical Sensitivity Analysis - 2: NMSE vs. number of training epochs for the FTDNN, Elman, NARX-P and NARX-SP networks. [figure]
24 VBR Video Traffic Multi-Step-Ahead Predictions - 1: FTDNN, predicted vs. original bits per frame. [figure]
25 VBR Video Traffic Multi-Step-Ahead Predictions - 2: Elman network, predicted vs. original bits per frame. [figure]
26 VBR Video Traffic Multi-Step-Ahead Predictions - 3: NARX-SP, predicted vs. original bits per frame. [figure]
27 Task 2: Long-term prediction of chaotic laser intensities Chaotic laser time series: comprises measurements of the intensity pulsations of a single-mode far-infrared NH3 laser in a chaotic state [7]. Made widely available through a TSP competition organized by the Santa Fe Institute [8]. The sample points were rescaled to the range [-1, 1]; the series was split into a training set and a 500-sample test set.
28 Laser Time Series Dynamic Modeling - 1: FTDNN and Elman network, predicted vs. original intensity over time. [figure]
29 Laser Time Series Dynamic Modeling - 2: NARX-SP network, predicted vs. original intensity over time. [figure]
30 Laser Time Series Sensitivity Analysis: ARV vs. length of the prediction horizon (N) for the FTDNN, Elman, NARX-P and NARX-SP networks. [figure]
31 Laser Time Series Recurrence plots: original series, NARX-SP, FTDNN, Elman. [figure]
32 Conclusion
33 Conclusion The results have shown that the NARX network can be successfully applied to complex univariate time series modeling and prediction tasks. The proposed approach consistently outperforms standard neural-network-based predictors, such as the FTDNN and Elman architectures.
34 References
[1] J. F. Kolen and S. C. Kremer, A Field Guide to Dynamical Recurrent Networks, Wiley-IEEE Press, 2001.
[2] T. Lin, B. G. Horne, P. Tino, and C. L. Giles, "Learning long-term dependencies in NARX recurrent neural networks," IEEE Transactions on Neural Networks, vol. 7, no. 6, pp. 1329-1338, 1996.
[3] F. Takens, "Detecting strange attractors in turbulence," in Dynamical Systems and Turbulence, D. A. Rand and L.-S. Young, Eds., vol. 898 of Lecture Notes in Mathematics, pp. 366-381, Springer, 1981.
[4] O. Rose, "Statistical properties of MPEG video traffic and their impact on traffic modeling in ATM systems," in Proceedings of the 20th Annual IEEE Conference on Local Computer Networks, 1995.
[5] J. Beran, R. Sherman, M. S. Taqqu, and W. Willinger, "Long-range dependence in variable-bit-rate video traffic," IEEE Transactions on Communications, vol. 43, no. 2/3/4, 1995.
[6] D. Heyman and T. Lakshman, "What are the implications of long-range dependence for VBR video traffic engineering?" IEEE/ACM Transactions on Networking, vol. 4, no. 3, 1996.
[7] U. Huebner, N. B. Abraham, and C. O. Weiss, "Dimensions and entropies of chaotic intensity pulsations in a single-mode far-infrared NH3 laser," Physical Review A, vol. 40, no. 11, 1989.
[8] A. Weigend and N. Gershenfeld, Time Series Prediction: Forecasting the Future and Understanding the Past, Addison-Wesley, Reading, MA, 1994.
Power and Limits of Recurrent Neural Networks for Symbolic Sequences Processing Matej Makula Institute of Applied Informatics Faculty of Informatics and Information Technologies Slovak University of Technology
More informationApplication of an Artificial Neural Network Based Tool for Prediction of Pavement Performance
0 0 0 0 Application of an Artificial Neural Network Based Tool for Prediction of Pavement Performance Adelino Ferreira, Rodrigo Cavalcante Pavement Mechanics Laboratory, Research Center for Territory,
More informationRecurrent neural networks
12-1: Recurrent neural networks Prof. J.C. Kao, UCLA Recurrent neural networks Motivation Network unrollwing Backpropagation through time Vanishing and exploding gradients LSTMs GRUs 12-2: Recurrent neural
More informationRobust Learning of Chaotic Attractors
published in: Advances in Neural Information Processing Systems 12, S.A. Solla, T.K. Leen, K.-R. Müller (eds.), MIT Press, 2000, pp. 879--885. Robust Learning of Chaotic Attractors Rembrandt Bakker* Jaap
More informationNeural Network Ensemble-based Solar Power Generation Short-Term Forecasting
World Academy of Science, Engineering and Technology 5 9 eural etwork Ensemble-based Solar Power Generation Short-Term Forecasting A. Chaouachi, R. M. Kamel, R. Ichikawa, H. Hayashi, and K. agasaka Abstract
More informationApplication of NARX based FFNN, SVR and ANN Fitting models for long term industrial load forecasting and their comparison
Application of NARX based FFNN, SVR and ANN Fitting models for long term industrial load forecasting and their comparison *Shahid M. Awan 1, 3, Member, IEEE, Zubair. A. Khan 1, 2, M. Aslam 3, Waqar Mahmood
More informationAutomatic Structure and Parameter Training Methods for Modeling of Mechanical System by Recurrent Neural Networks
Automatic Structure and Parameter Training Methods for Modeling of Mechanical System by Recurrent Neural Networks C. James Li and Tung-Yung Huang Department of Mechanical Engineering, Aeronautical Engineering
More informationError Entropy Criterion in Echo State Network Training
Error Entropy Criterion in Echo State Network Training Levy Boccato 1, Daniel G. Silva 1, Denis Fantinato 1, Kenji Nose Filho 1, Rafael Ferrari 1, Romis Attux 1, Aline Neves 2, Jugurta Montalvão 3 and
More informationNARX Time Series Model for Remaining Useful Life Estimation of Gas Turbine Engines
NARX Series Model for Remaining Useful Life Estimation of Gas Turbine Engines Oguz Bektas, Jeffrey A. Jones 2,2 Warwick Manufacturing Group, University of Warwick, Coventry, UK O.Bektas@warwick.ac.uk J.A.Jones@warwick.ac.uk
More informationDo we need Experts for Time Series Forecasting?
Do we need Experts for Time Series Forecasting? Christiane Lemke and Bogdan Gabrys Bournemouth University - School of Design, Engineering and Computing Poole House, Talbot Campus, Poole, BH12 5BB - United
More informationLong-Short Term Memory and Other Gated RNNs
Long-Short Term Memory and Other Gated RNNs Sargur Srihari srihari@buffalo.edu This is part of lecture slides on Deep Learning: http://www.cedar.buffalo.edu/~srihari/cse676 1 Topics in Sequence Modeling
More information