Neural Network Identification of Non Linear Systems Using State Space Techniques.

Joan Codina, J. Carlos Aguado, Josep M. Fuertes
Automatic Control and Computer Engineering Department
Universitat Politècnica de Catalunya
C/ Pau Gargallo, Barcelona - Spain

Abstract

This paper focuses on the identification of non linear systems by means of discrete-time Artificial Neural Networks (ANN). A new ANN architecture, with the structure of the state space description of systems, is compared to the input-output ANN structure. Through this comparison, the results obtained by our neural network are contrasted with the results presented by Narendra in [1]. Our purpose is to demonstrate the viability of the proposed model and to evaluate the characteristics and possibilities of each method.

I. INTRODUCTION

System identification is a fundamental question in systems theory. ANN can be used to identify dynamic systems in order to design an ANN controller based on the system model. The problems of using ANN in this field arise from the general properties of neural networks: the distribution of information inside the network, which hinders the study of the model, and the lack of methodologies for the application of neural networks (number of layers or neurons, learning rate). For this reason, the identification of systems using ANN is not a self-contained subject, but a tool needed by other applications of ANN, such as the design of non linear controllers.

The use of ANN in the identification of dynamic systems has been carried out mainly in two ways: either using classical dynamic neural networks or using feed-forward neural networks with external delays. The use of standard dynamic neural networks such as those of Jordan [2] or Elman [3] presents some problems: an example is given when some of the system states are not connected directly to the output signals, for instance in systems with physical delays. In Jordan nets we also need the number of outputs to be at least equal to the system order, a condition that can sometimes be unattainable.

Research in the field of system identification has developed models based on feed-forward neural networks in which the dynamic behavior is obtained through external delays of the inputs and of the previous outputs (Fig. 1):

y(k+1) = F[ u(k), u(k-1), u(k-2), ..., y(k), y(k-1), y(k-2), ... ]   (1)

Fig. 1. Neural network based on the input-output description of systems. The dynamic behavior is obtained with external delays (z^-1).

One of the most extensive studies with this methodology, in which different configurations are examined, was carried out by Narendra [1]. To show the viability of the model, five different systems were used as examples, indicating the signals used to train and test the system together with the number of learning iterations. Sigmoids were used as the activation functions of the neurons.

This network is compared here to a neural network with a structure inspired by the state space representation of a system. Such a structure could allow the application of modern control theory to single-input single-output and to multivariable systems. In this network we use sines and cosines as activation functions, in order to allow further studies on the application capabilities and methodologies of the ANN.
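
To make the external-delay structure of eq. (1) concrete, the sketch below builds the delayed regressor vector [u(k), u(k-1), y(k), y(k-1)] and fits a small one-hidden-layer sigmoid network by plain gradient descent, in the spirit of the input-output models discussed above. It is only an illustrative sketch, not the configuration used by Narendra or in this paper: the number of delays, the hidden-layer size, the learning rate, and the reuse of the Example 3 plant (introduced later in the text) as a data source are all assumptions made here.

```python
import numpy as np

# Toy data: the Example 3 plant from the text, y(k+1) = y(k)/(1+y(k)^2) + u(k)^3,
# driven by a bounded random input (random excitation is an assumption here).
rng = np.random.default_rng(0)
N = 2000
u = rng.uniform(-1.0, 1.0, N)
y = np.zeros(N + 1)
for k in range(N):
    y[k + 1] = y[k] / (1.0 + y[k] ** 2) + u[k] ** 3

# Regressor of eq. (1): the network only sees delayed inputs and delayed outputs.
nu, ny = 2, 2                     # number of input and output delays (a choice, not from the paper)
X = np.array([np.r_[u[k - nu + 1:k + 1][::-1], y[k - ny + 1:k + 1][::-1]]
              for k in range(max(nu, ny) - 1, N)])
T = y[max(nu, ny):N + 1]          # one-step-ahead targets y(k+1)

# One-hidden-layer sigmoid network trained with plain batch gradient descent.
H = 20
W1 = rng.normal(0, 0.5, (X.shape[1], H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, H);               b2 = 0.0
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.05
for epoch in range(200):
    h = sig(X @ W1 + b1)                  # hidden activations
    p = h @ W2 + b2                       # linear output neuron
    e = p - T                             # one-step prediction error
    gW2 = h.T @ e / len(T); gb2 = e.mean()
    dh = np.outer(e, W2) * h * (1 - h)    # backpropagated error at the hidden layer
    gW1 = X.T @ dh / len(T); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

print("training MSE:", np.mean((sig(X @ W1 + b1) @ W2 + b2 - T) ** 2))
```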

NEURAL NETWORK STRUCTURE

From our point of view, an ANN that efficiently learns the system dynamics should also allow further study of the learned system. Taking this approach, the model can be validated and used in combination with techniques other than ANN. For this reason, we began working in this direction by modeling linear discrete systems. We first obtained a neural network, Codina [4], able to learn the matrices of the system state representation from pairs of input-output signals. The neural network structure has three layers: input, state and output. The state and the output layers are connected to the input layer, which is composed of the current inputs and state of the system (Fig. 2). The back-propagation through time learning algorithm [5] has been used, where the error is back-propagated from the current state to the previous state. With this methodology, and using linear neurons, we can obtain the matrices A, B, C and D of the system:

x(k+1) = A x(k) + B u(k)   (2)
y(k)   = C x(k) + D u(k)

Fig. 2. Neural network structure based on the system state representation.

The main interest of the application of neural networks to dynamic systems arises from their ability to learn and map non linear functions. We have expanded our previously stated model to deal with non linear systems, although expanding the linear model using sigmoids impedes further study of the resulting model. To overcome this drawback we have used a structured neural network based on Fourier theory, which approximates any non linear function within a bounded interval by means of a weighted sum of sines and cosines. A non linear discrete system can be expressed by the following difference equations:

x(k+1) = F( x(k), u(k) )   (3)
y(k)   = G( x(k), u(k) )

where F and G can be approximated in a bounded interval by a Fourier series, provided that both functions satisfy the Dirichlet conditions (which is a usual restriction). For one input and one state:

x(k+1) = Σ_{n=0}^{∞} Σ_{m=0}^{∞} [ A^F_{n,m} cos( n w_n x(k) + m w_m u(k) ) + B^F_{n,m} sin( n w_n x(k) + m w_m u(k) ) ]   (4)

y(k)   = Σ_{n=0}^{∞} Σ_{m=0}^{∞} [ A^G_{n,m} cos( n w_n x(k) + m w_m u(k) ) + B^G_{n,m} sin( n w_n x(k) + m w_m u(k) ) ]   (5)

The coefficients can thus be calculated by classical methods, and the results can be compared with those obtained by the ANN; this work is carried out in [6]. A linear coefficient is added in order to improve convergence and to reduce the number of Fourier coefficients needed.
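
The sketch below shows, under stated assumptions, how the forward pass of such a structured state-space network can be organized for one state and one input: the next state and the output are truncated weighted sums of sines and cosines of the state and input, as in eqs. (4)-(5), plus the linear term mentioned above. The class name, the truncation orders N and M, the fundamental frequencies (written w_x and w_u here, playing the role of w_n and w_m), the random initialization and the way the linear term enters are illustrative choices; the back-propagation-through-time training of [5] is not shown.

```python
import numpy as np

class FourierStateModel:
    """Truncated Fourier state-space model: a sketch of eqs. (4)-(5) plus a linear term."""

    def __init__(self, N, M, w_x, w_u, rng):
        self.N, self.M, self.w_x, self.w_u = N, M, w_x, w_u
        shape = (N + 1, M + 1)
        # Cosine/sine coefficients of the state (F) and output (G) expansions.
        self.A_F = rng.normal(0, 0.1, shape); self.B_F = rng.normal(0, 0.1, shape)
        self.A_G = rng.normal(0, 0.1, shape); self.B_G = rng.normal(0, 0.1, shape)
        self.lin_F = np.zeros(2)   # linear term in (x, u) for the state equation
        self.lin_G = np.zeros(2)   # linear term in (x, u) for the output equation

    def _phase(self, x, u):
        # Matrix of phases n*w_x*x(k) + m*w_u*u(k), for n = 0..N and m = 0..M.
        n = np.arange(self.N + 1)[:, None]
        m = np.arange(self.M + 1)[None, :]
        return n * self.w_x * x + m * self.w_u * u

    def step(self, x, u):
        ph = self._phase(x, u)
        x_next = (self.A_F * np.cos(ph) + self.B_F * np.sin(ph)).sum() \
                 + self.lin_F @ np.array([x, u])
        y = (self.A_G * np.cos(ph) + self.B_G * np.sin(ph)).sum() \
            + self.lin_G @ np.array([x, u])
        return x_next, y

# Parallel-model rollout: the network evolves on its own learned state and never
# sees the plant's internal state, as discussed in the learning procedure below.
rng = np.random.default_rng(1)
model = FourierStateModel(N=3, M=3, w_x=np.pi, w_u=np.pi, rng=rng)
u = np.sin(2 * np.pi * np.arange(100) / 25)
x, ys = 0.0, []
for uk in u:
    x, yk = model.step(x, uk)
    ys.append(yk)
print("first outputs:", np.round(ys[:5], 3))
```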

IDENTIFICATION PROCESS

To demonstrate the viability of our ANN architecture we have tested it with the five system models that Narendra used in [1]. Each of the five models has been simulated, and a neural network has been trained to learn the system behavior. After training, the ANN is fed with a test signal in order to evaluate the learning capabilities of the proposed structure.

Learning procedure

The learning procedure used in this case differs from the one presented by Narendra, who used a series-parallel model. In that model, the previous outputs of the real system are used instead of those from the simulated system. This approach is not feasible with a state space structure, because neither the state of the real system nor the internal state representation adopted by the ANN is known. Using the parallel model increases the number of training examples, and these are highly sensitive to the initial conditions.

The methodology used has the following steps:

1) A first approximation is obtained by a linear network, with a fixed training signal.
2) The model is then expanded with the Fourier terms, and the network is trained to learn the coefficients for the same training signal.
3) The network is trained with different signals in order to make the system evolve through the whole working region of the state space.
4) If the number of Fourier coefficients is too small, the third step results in an important increase of the MSE, and we return to step 2.

The system models

In his paper, Narendra presented six experiments in model identification, but the sixth one was only a small variation of the first one. Thus we have replicated exactly his first five examples.

Example 1. The nonlinear plant to be identified was:

y(k+1) = 0.3 y(k) + 0.6 y(k-1) + u^3(k) + 0.3 u^2(k) - 0.4 u(k)   (6)

and the test input chosen:

u(k) = sin(2πk/250) + sin(2πk/25)   (7)

Example 2. The next nonlinear plant was of second order:

y(k+1) = [ y(k) y(k-1) ( y(k) + 2.5 ) ] / [ 1 + y^2(k) + y^2(k-1) ] + u(k)   (8)

and the test input tried was:

u(k) = sin(2πk/25)   (9)

Example 3. The third nonlinear plant to be identified had the form:

y(k+1) = y(k) / ( 1 + y^2(k) ) + u^3(k)   (10)

and was tested with the input function:

u(k) = sin(2πk/25) + sin(2πk/10)   (11)

Example 4. The next nonlinear plant assumed was:

y(k+1) = f[ y(k), y(k-1), y(k-2), u(k), u(k-1) ]   (12)

f(x1, x2, x3, x4, x5) = [ x1 x2 x3 x5 (x3 - 1) + x4 ] / ( 1 + x2^2 + x3^2 )

and the test input provided:

u(k) = sin(2πk/250)                           if k ≤ 500
u(k) = 0.8 sin(2πk/250) + 0.2 sin(2πk/25)     if k > 500   (13)

Example 5. The last nonlinear plant tried was the multivariable system:

[ y1(k+1) ]   [ y1(k) / ( 1 + y2^2(k) )       ]   [ u1(k) ]
[ y2(k+1) ] = [ y1(k) y2(k) / ( 1 + y2^2(k) ) ] + [ u2(k) ]   (14)

where the inputs used to test the trained network were:

[ u1(k), u2(k) ]^T = [ sin(2πk/25), cos(2πk/25) ]^T   (15)

Simulation results

In all five cases, the normalized MSE between the desired output and the output of the network decreases during learning to values of the order of 10^-4. If, when tested, the error increased significantly, more coefficients were added and new training signals were used. The test signals are the same ones used by Narendra. In Fig. 3 to Fig. 8 the outputs of the real system (continuous trace) and the outputs of the trained ANN (dashed trace) are compared.
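
As an illustration of how the training and test data for these experiments can be generated, the sketch below simulates the Example 2 plant of eq. (8) under a random training input and under the test input of eq. (9). The plant and the test signal are taken from the text; the amplitude and length of the random training signal and the zero initial conditions are assumptions made here.

```python
import numpy as np

def example2_plant(u, y0=(0.0, 0.0)):
    """Example 2 plant, eq. (8): y(k+1) = y(k)y(k-1)(y(k)+2.5) / (1+y^2(k)+y^2(k-1)) + u(k)."""
    y = np.zeros(len(u) + 1)
    y[0], y_prev = y0[0], y0[1]            # y(0) and y(-1)
    for k in range(len(u)):
        num = y[k] * y_prev * (y[k] + 2.5)
        den = 1.0 + y[k] ** 2 + y_prev ** 2
        y_prev, y[k + 1] = y[k], num / den + u[k]
    return y[1:]

# Training signal: a bounded random input so the state explores the working region
# (uniform noise in [-2, 2] is an assumption, not a value taken from the paper).
rng = np.random.default_rng(0)
u_train = rng.uniform(-2.0, 2.0, 1000)
y_train = example2_plant(u_train)

# Test signal from eq. (9): u(k) = sin(2*pi*k/25).
k = np.arange(500)
u_test = np.sin(2 * np.pi * k / 25)
y_test = example2_plant(u_test)

print("training output range:", y_train.min().round(2), y_train.max().round(2))
print("test output range:    ", y_test.min().round(2), y_test.max().round(2))
```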

Fig. 3. Example 1. Output of the plant and identification model.
Fig. 4. Example 2. Output of the plant and identification model.
Fig. 5. Example 3. Output of the plant and identification model.
Fig. 6. Example 4. Output of the plant and identification model.
Fig. 7. Example 5. First output of the plant and of the identification model.
Fig. 8. Example 5. Second output of the plant and second output of the identification model.

CONCLUSIONS

Using structured ANN with sine, cosine and linear activation functions allows us to extract information about the system from the network. Input-output models with sigmoids as activation functions allow the use of neural networks for identification purposes, but no information can be obtained from the resulting structure. In addition, the use of a network structure related to the state space description of systems allows us to obtain the system equations. With these equations we can apply many of the classical methodologies for dealing with non linear systems, which are usually related to the state space description.

The state space representation has as drawbacks the difficulty of knowing the initial state, and the fact that the neural network learns its own state representation, which may not match the real system states. On the other hand, if we use input-output models for systems with multiple inputs and multiple outputs, we obtain a different equation for every output.

The use of structured neural networks has proven to be as useful for the identification of non linear dynamic systems as the other approaches considered. Like classical methodologies, our structure allows the extraction of a mathematical model of the system, but it adds the ability of neural networks to learn and generalize. Our model can also be used to train neural network controllers.

REFERENCES

[1] Narendra K.S., Parthasarathy K. Identification and Control of Dynamical Systems Using Neural Networks. IEEE Transactions on Neural Networks, Vol. 1, No. 1, March 1990.
[2] Jordan, M.I. Serial Order: A Parallel Distributed Processing Approach. Institute for Cognitive Science Report, University of California, San Diego, 1986.

[3] Elman, J.L. Finding Structure in Time. Center for Research in Language Report, University of California, San Diego.
[4] Codina J., Morcego B., Fuertes J.M., Català A. A Novel Neural Network Structure for Control. IEEE Int. Conf. on Systems, Man and Cybernetics, Chicago, 1992.
[5] Werbos P.J. Backpropagation Through Time: What It Does and How to Do It. Proceedings of the IEEE, Vol. 78, No. 10, October 1990.
[6] Codina J., Aguado J.C., Fuertes J.M. Capabilities of a Structured Neural Network. Learning and Comparison with Classical Techniques. Unpublished.
