MODELING NONLINEAR DYNAMICS WITH NEURAL NETWORKS: EXAMPLES IN TIME SERIES PREDICTION

Eric A. Wan

Stanford University, Department of Electrical Engineering, Stanford, CA


Abstract

A neural network architecture is discussed which uses Finite Impulse Response (FIR) linear filters to provide dynamic interconnectivity between processing units. The network is applied to a variety of chaotic time series prediction tasks. Phase-space plots of the network dynamics are given to illustrate the reconstruction of underlying chaotic attractors. An example taken from the Santa Fe Institute Time Series Prediction Competition is also presented.

(We wish to acknowledge the support of NSF under grant NSF IRI 9-253, of ONR under contract no. N4-92-J-787, of EPRI under contract RP:8-3, and of the Department of the Army Belvoir R D & E Center under contract #DAAK7-92-K-3.)

I. Introduction

The goal of time series prediction can be stated succinctly as follows: given a finite sequence y(1), y(2), ..., y(N), find the continuation y(N+1), y(N+2), ... The series may arise from the sampling of a continuous time system, and be either stochastic or deterministic in origin. Applications of prediction range from modeling turbulence, to differential pulse code modulation schemes for telecommunication, to stock market portfolio management.

The standard prediction approach involves constructing an underlying model which gives rise to the observed sequence. In the oldest and most studied method, a linear autoregression (AR) is fit to the data:

    y(k) = \sum_{n=1}^{T} a(n) y(k-n) + e(k) = \hat{y}(k) + e(k).   (1)

This AR model forms y(k) as a weighted sum of past values of the sequence. The single step prediction for y(k) is given by \hat{y}(k).

Neural networks may be used to extend the linear model to form a nonlinear prediction scheme. The basic form y(k) = \hat{y}(k) + e(k) is retained; however, the estimate \hat{y}(k) is taken as the output of a neural network N driven by past values of the sequence:

    y(k) = N[y(k-1), y(k-2), ..., y(k-T)] + e(k).   (2)

Note that this model is equally applicable for both scalar and vector sequences.

The use of this nonlinear autoregression can be motivated as follows. First, Takens' Theorem (Takens 1981) implies that for a wide class of deterministic systems there exists a diffeomorphism (one-to-one differentiable mapping) between a finite window of the time series [y(k-1), y(k-2), ..., y(k-T)] and the underlying state of the dynamic system which gives rise to the time series. This implies that there exists, in theory, a nonlinear autoregression of the form y(k) = g[y(k-1), y(k-2), ..., y(k-T)], which models the series exactly (assuming no noise). The neural network thus forms an approximation to the ideal function g(). Furthermore, it has been shown (Hornik et al. 1989; Cybenko 1989; Irie and Miyake 1988) that a feedforward neural network with an arbitrary number of neurons and two or more layers is capable of approximating any uniformly continuous function. These arguments provide the basic motivation for the use of neural networks in time series prediction.

II. Network Architecture and Prediction Configuration

The use of neural networks for time series prediction is not new; previous work includes (Werbos 1974, 1988; Lapedes and Farber 1987; Weigend et al. 1990), to cite just a few. In this paper, we focus on a method for achieving the nonlinear autoregression by use of a Finite Impulse Response (FIR) network (Wan 1990, 1993). The network resembles a standard feedforward network in which each synapse is replaced with an adaptive FIR linear filter, as illustrated in Fig. 1. The FIR filter forms a weighted sum of past values of its input. The neuron receives the filtered inputs and then passes the sum through a nonlinear squashing function. Neurons are arranged in layers to form a network in which all connections are made with the synaptic filters. Training the network is accomplished through a modification of the backpropagation algorithm (Rumelhart et al. 1986) called temporal backpropagation, in which error terms are symmetrically filtered backward through the network. A complete description of the architecture along with the training algorithm can be found in (Wan 1993).
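To ground the linear baseline of Eq. (1): the AR coefficients a(n) can be estimated by ordinary least squares over the training series. The following minimal NumPy sketch is illustrative only; the paper does not specify a fitting procedure:

```python
import numpy as np

def fit_ar(y, T):
    """Least-squares fit of the AR(T) model of Eq. (1).

    Builds the regression y(k) ~ sum_n a(n) y(k-n) for k = T..N-1
    and solves for the coefficient vector a."""
    y = np.asarray(y, dtype=float)
    # Column n-1 holds the lagged values y(k-n), n = 1..T.
    X = np.column_stack([y[T - n:len(y) - n] for n in range(1, T + 1)])
    a, *_ = np.linalg.lstsq(X, y[T:], rcond=None)
    return a  # a[n-1] multiplies y(k-n)

def ar_predict(y_past, a):
    """One-step prediction y_hat(k) from the T most recent values
    (y_past[-1] is y(k-1))."""
    T = len(a)
    return float(np.dot(a, np.asarray(y_past)[-1:-T - 1:-1]))
```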

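The FIR-synapse structure itself can also be sketched in a few lines. This is a minimal illustration of the layer just described, not the author's implementation; training by temporal backpropagation is omitted:

```python
import numpy as np

def fir_layer(x_hist, W):
    """Forward pass of one FIR synaptic layer.

    x_hist : (T, n_in) buffer of the layer's last T input vectors;
             x_hist[0] is the current input, x_hist[t] the input t steps ago.
    W      : (n_out, n_in, T) FIR filter coefficients, one filter per synapse.

    Each synapse forms a weighted sum of present and past values of its
    input; the neuron then passes the summed filter outputs through a
    tanh squashing function."""
    s = np.einsum('jit,ti->j', W, x_hist)  # s[j] = sum_i sum_t W[j,i,t] x_hist[t,i]
    return np.tanh(s)
```

Stacking such layers, each with its own tap-delay buffers, yields the full FIR network of Fig. 1.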
Fig. 1: FIR network architecture (q^{-1} represents a time-domain unit delay operator).

Fig. 2: Network prediction configuration: (a) training; (b) iterated prediction.

Fig. 2a illustrates the basic predictor training configuration. At each time step, the input to the FIR network is the known value y(k-1), and the output \hat{y}(k) = N_q[y(k-1)] is the single step estimate of the true series value y(k). Our model construct is thus: y(k) = N_q[y(k-1)] + e(k). Since the FIR network has only a finite memory of past samples, N_q[y(k-1)] is equivalent to a finite nonlinear regression on y(k). During training, the squared error e(k)^2 = (y(k) - \hat{y}(k))^2 is minimized by using the temporal backpropagation algorithm to adapt the network. Note that we are performing open-loop adaptation; both the input and the desired response are provided from the known training series.

Once the network is trained, long-term iterated prediction is achieved by taking the estimate \hat{y}(k) and feeding it back as input to the network: \hat{y}(k) = N_q[\hat{y}(k-1)]. This closed-loop system is illustrated in Fig. 2b. The system can be iterated forward in time to achieve predictions as far into the future as desired.
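The open-loop/closed-loop distinction of Fig. 2 comes down to where the network's input originates: the known series during training, or the network's own fed-back estimates during iterated prediction. A hypothetical sketch of the closed loop, assuming a trained one-step predictor `net` that maps the last T values to \hat{y}(k):

```python
import numpy as np

def iterated_prediction(net, history, T, n_steps):
    """Closed-loop (iterated) prediction as in Fig. 2b.

    net      : trained one-step predictor; maps the last T values to y_hat(k)
    history  : observed series up to the prediction start
    n_steps  : how far into the future to iterate

    Each estimate is fed back as input for the next step, so errors
    compound; for chaotic series the forecast eventually diverges."""
    window = list(history[-T:])
    preds = []
    for _ in range(n_steps):
        y_hat = net(np.asarray(window))
        preds.append(y_hat)
        window.pop(0)          # slide the window: drop the oldest value,
        window.append(y_hat)   # append the fed-back estimate
    return np.asarray(preds)
```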

III. Examples of chaos prediction and attractor reconstruction

In the examples that follow, networks of various dimensions are trained on time series generated by chaotic processes. The long-term iterated predictions of the networks are then analyzed.

A. Results of the SFI Competition

During the fall of 1991, the Santa Fe Institute Time Series Prediction and Analysis Competition was established as a means for evaluating and benchmarking new and existing techniques in time series prediction (Weigend and Gershenfeld 1993). The plot in Fig. 3 shows the chaotic intensity pulsations of an NH3 laser distributed as part of the competition. ("Measurements were made on an 81.5-micron 14NH3 cw (FIR) laser, pumped optically by the P(13) line of an N2O laser via the vibrational aQ(8,7) NH3 transition." - Huebner 1989.) Contestants were given only 1000 points of data and then invited to send in solutions predicting the next 100 points. During the course of the competition, the physical background of the data set, as well as the 100 point continuation, was withheld to avoid biasing the final prediction results.

Fig. 3: 1000 points of laser data (chaotic intensity pulsations in a single-mode far-infrared NH3 laser).

The 100 step prediction achieved by using an FIR network (dimensions: 1x12x12x1 nodes with 25:5:5 order filters) is shown in Fig. 4 along with the actual series continuation for comparison. It is important to emphasize that this prediction was made based on only the past 1000 samples. True values of the series for times past 1000 were not provided and were not available when the predictions were submitted. As can be seen, the prediction is remarkably accurate, with only a slight eventual phase degradation. A prediction based on a 25th order linear autoregression is also shown to emphasize the differences from traditional linear methods.

Fig. 4: (a) Network prediction (solid line) and series continuation (dashed line); (b) linear AR prediction.

Other submissions to the competition included methods of k-d trees, piecewise linear interpolation, low-pass embedding, SVD, nearest neighbors, Wiener filters, as well as standard recurrent and feedforward neural networks. As reported by the Santa Fe Institute, the FIR network outperformed all other methods on this data set.

B. Mackey-Glass

For the next example we consider the Mackey-Glass delay-differential equation (Mackey and Glass 1977):

    dx(t)/dt = -0.1 x(t) + 0.2 x(t - \Delta) / (1 + x^{10}(t - \Delta))   (3)

with delay parameters \Delta = 17 and 30, initial condition x(t) = 0.9 for 0 <= t <= \Delta, and sampling rate \tau = 6. These parameters were chosen to facilitate comparisons with prior work (Farmer and Sidorowich 1987; Lapedes and Farber 1987; Casdagli 1989). The time series for \Delta = 30 is shown in Fig. 5a.

Fig. 5: (a) Mackey-Glass(30) time series; (b) iterated prediction (solid line) and series continuation (dashed line).
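Equation (3) is easy to reproduce numerically, which is useful for replicating this benchmark. A minimal Euler sketch with a history buffer for the delayed term (the integration step h is an assumption; the paper specifies only the delay, initial condition, and sampling rate):

```python
import numpy as np

def mackey_glass(n_samples, delta=30, tau=6, h=0.1, x0=0.9):
    """Euler integration of the Mackey-Glass delay-differential
    equation (3), sampled every `tau` time units.

    delta : time delay (17 or 30 in the text)
    tau   : sampling period of the returned series
    h     : integration step (assumed; must divide delta and tau evenly)"""
    lag = int(delta / h)          # steps back to the delayed term x(t - delta)
    per_sample = int(tau / h)     # integration steps per output sample
    hist = [x0] * (lag + 1)       # initial condition x(t) = 0.9 for t <= 0
    x = x0
    out = []
    for i in range(n_samples * per_sample):
        x_del = hist[-lag - 1]    # x(t - delta)
        x = x + h * (-0.1 * x + 0.2 * x_del / (1.0 + x_del ** 10))
        hist.append(x)
        if (i + 1) % per_sample == 0:
            out.append(x)
    return np.asarray(out)
```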

Table 1: Comparison of log normalized single step prediction errors on MG(17) and MG(30) for linear, polynomial, rational, local linear, RBF, standard neural network, and FIR network predictors (smaller numbers mean a better prediction).

An FIR network with 15 nodes and 8:2 taps in each layer was trained on only the first 500 points of the series. The resulting log normalized single step prediction errors for the subsequent 500 points are given in Table 1, along with the results of other methods as summarized in (Casdagli 1989). While the FIR network shows a slight improvement over existing methods, the single step prediction task is really not that difficult. A more challenging problem is the iterated prediction shown in Fig. 5b. The network receives no new inputs past the point 500. The iterated prediction is remarkably accurate.

C. Henon Map

Consider the rather benign looking Henon equations (Henon 1976):

    x_{n+1} = 1 - a x_n^2 + y_n,   y_{n+1} = b x_n,

with a = 1.4 and b = 0.3. The iterated time series x_n is shown in Fig. 6.

Fig. 6: Henon series.

The phase-plot of x versus y reveals a remarkable structure called a strange attractor (see Fig. 7). Increased magnification of the attractor would reveal ever finer detail in a fractal-like geometry.

Fig. 7: Henon attractor.

An FIR network was trained on the series using single step predictions (dimensions: 22 nodes with 2:1:1 order FIR filters in each layer). The iterated prediction versus the true series is shown in Fig. 8. As can be seen, the prediction is exceptionally accurate starting out, but then diverges after around 50 time steps. This divergence is unavoidable and reveals one of the fundamental tenets of chaos theory. The network, however, can still be iterated thousands of time steps into the future and then used to construct its corresponding attractor. As seen in Fig. 9, the original Henon attractor emerges, indicating that the network has indeed captured the underlying dynamics.

Fig. 8: Iterated prediction (prediction versus actual).

Fig. 9: Predicted attractor.

While a smaller network with a lower embedding dimension would have worked for this problem, in general the actual dimension of the system is not known in advance. Determining appropriate embedding dimensions for the network is a rich topic of research and beyond the scope of this paper.
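For readers who wish to reproduce the Henon experiments, the map can be iterated directly. A short sketch (illustrative only; the initial conditions are assumptions):

```python
import numpy as np

def henon_series(n, a=1.4, b=0.3, x0=0.0, y0=0.0):
    """Iterate the Henon map x_{n+1} = 1 - a x_n^2 + y_n, y_{n+1} = b x_n.

    Returns the scalar x series (used for training) and the (x, y)
    phase points whose scatter plot traces the strange attractor."""
    xs = np.empty(n)
    ys = np.empty(n)
    x, y = x0, y0
    for i in range(n):
        # Simultaneous update: the right-hand side uses the old (x, y).
        x, y = 1.0 - a * x * x + y, b * x
        xs[i], ys[i] = x, y
    return xs, ys
```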

D. Ikeda Map

A more complicated equation, corresponding to the plane-wave interaction in an optical ring laser, is the Ikeda map (Hammel et al. 1985):

    z_{n+1} = a + R z_n \exp\{ i [\phi - p / (1 + |z_n|^2)] \},

with a = 1.0, R = 0.9, \phi = 0.4, and p = 6. The phase-plot of the real versus imaginary components is shown in Fig. 10. To make the problem even more difficult for the network, only the imaginary sequence is used to train the network. The network must make a prediction for both the next imaginary value and the next real value. The network is thus performing a state estimation, as it must learn the interrelation between the real and imaginary sequences. After training, the network (with 5:2:2 taps) was iterated forward in time to form the image shown in Fig. 11. Again it is clear that the network has captured much of the underlying dynamics.

Fig. 10: Ikeda attractor (Im[z] versus Re[z]).

Fig. 11: Network predicted attractor.

E. Lorenz Dynamics

A Lorenz system (Lorenz 1963) is described by the solution of the three simultaneous differential equations:

    dx/dt = -\sigma x + \sigma y   (4)
    dy/dt = -xz + rx - y           (5)
    dz/dt = xy - bz                (6)

A projection of the trajectory in the x-z plane for parameter values \sigma = 10, r = 28, and b = 8/3 is shown in Fig. 12a. A network (dimensions: 22 nodes with 2:5:5 taps) was trained on observations of only the x state sampled at a period of 0.05 seconds. The output of the network was a prediction of both x and z at the next time step. This is again a state estimation problem. In Fig. 12b the iterated network prediction \hat{x}(t) versus \hat{z}(t) is shown. The ability of the network to capture the underlying dynamics is evident.

Fig. 12: (a) Lorenz attractor; (b) network attractor.
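The Lorenz observations for such an experiment can be generated by direct numerical integration of Eqs. (4)-(6). A minimal fourth-order Runge-Kutta sketch (the integration step and initial state are assumptions; the sampling period follows the text):

```python
import numpy as np

def lorenz_deriv(s, sigma=10.0, r=28.0, b=8.0 / 3.0):
    """Right-hand side of the Lorenz equations (4)-(6)."""
    x, y, z = s
    return np.array([sigma * (y - x), -x * z + r * x - y, x * y - b * z])

def lorenz_series(n_samples, dt=0.05, s0=(1.0, 1.0, 1.0)):
    """Integrate the Lorenz equations with 4th-order Runge-Kutta and
    return the sampled (x, z) pairs: x is the observed input, and the
    network is asked to estimate both x and z at the next step."""
    s = np.array(s0, dtype=float)
    out = np.empty((n_samples, 2))
    for i in range(n_samples):
        k1 = lorenz_deriv(s)
        k2 = lorenz_deriv(s + 0.5 * dt * k1)
        k3 = lorenz_deriv(s + 0.5 * dt * k2)
        k4 = lorenz_deriv(s + dt * k3)
        s = s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        out[i] = s[0], s[2]
    return out
```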

IV. Conclusion

In this paper we have provided several examples illustrating the potential of using neural networks for time series prediction and modeling. While we have focused on autonomous deterministic scalar series, it should be clear that the basic principles apply directly to many problems of interest in dynamic modeling, system identification, and control.

V. References

Casdagli, M. 1989. "Nonlinear prediction of chaotic time series", Physica D 35, 335-356.

Cybenko, G. 1989. "Approximation by superpositions of a sigmoidal function", Mathematics of Control, Signals, and Systems, vol. 2, no. 4, 303-314.

Farmer, J.D. and Sidorowich, J.J. 1987. "Predicting chaotic time series", Phys. Rev. Lett. 59, 845.

Hammel, S., Jones, C.K.R.T., and Moloney, J.V. 1985. "Global dynamical behavior of the optical field in a ring cavity", J. Opt. Soc. Am. B 2, 552.

Henon, M. 1976. "A two-dimensional mapping with a strange attractor", Commun. Math. Phys. 50, 69-77.

Hornik, K., Stinchcombe, M., and White, H. 1989. "Multilayer feedforward networks are universal approximators", Neural Networks, vol. 2, 359-366.

Huebner, U., Abraham, N.B., and Weiss, C.O. 1989. "Dimensions and entropies of chaotic intensity pulsations in a single-mode far-infrared NH3 laser", Phys. Rev. A 40, 6354.

Irie, B. and Miyake, S. 1988. "Capabilities of three-layered perceptrons", in Proceedings of the IEEE Second International Conference on Neural Networks (San Diego, CA, July), Vol. I.

Lapedes, A. and Farber, R. 1987. "Nonlinear signal processing using neural networks: prediction and system modeling", Technical Report LA-UR-87-2662, Los Alamos National Laboratory.

Lorenz, E.N. 1963. "Deterministic nonperiodic flow", J. Atmos. Sci. 20, 130-141.

Mackey, M. and Glass, L. 1977. "Oscillation and chaos in physiological control systems", Science 197, 287-289.

Rumelhart, D.E., McClelland, J.L., and the PDP Research Group. 1986. Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1. Cambridge, MA: MIT Press.

Takens, F. 1981. "Detecting strange attractors in fluid turbulence", in Dynamical Systems and Turbulence, D. Rand and L.S. Young, eds. Springer, Berlin.

Wan, E. 1990. "Temporal backpropagation for FIR neural networks", in International Joint Conference on Neural Networks (San Diego, 1990), Vol. I.

Wan, E. 1993. "Time series prediction using a neural network with distributed time delays", in Proceedings of the NATO Advanced Research Workshop on Time Series Prediction and Analysis (Santa Fe, New Mexico, May 1992), A. Weigend and N. Gershenfeld, eds. Addison-Wesley.

Weigend, A., Huberman, B., and Rumelhart, D. 1990. "Predicting the future: a connectionist approach", International Journal of Neural Systems, vol. 1, no. 3, 193-209.

Weigend, A. and Gershenfeld, N., eds. 1993. Proceedings of the NATO Advanced Research Workshop on Time Series Prediction and Analysis (Santa Fe, New Mexico, May 1992). Addison-Wesley.

Werbos, P. 1974. "Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences", Ph.D. thesis, Harvard University, Cambridge, MA.

Werbos, P. 1988. "Generalization of backpropagation with application to a recurrent gas market model", Neural Networks, vol. 1, 339-356.
