Multiperiodicity and Exponential Attractivity Evoked by Periodic External Inputs in Delayed Cellular Neural Networks

LETTER Communicated by Richard Hahnloser

Multiperiodicity and Exponential Attractivity Evoked by Periodic External Inputs in Delayed Cellular Neural Networks

Zhigang Zeng (zhigangzeng@163.com), School of Automation, Wuhan University of Technology, Wuhan, Hubei, China

Jun Wang (jwang@acae.cuhk.edu.hk), Department of Automation and Computer-Aided Engineering, Chinese University of Hong Kong, Shatin, New Territories, Hong Kong

Neural Computation 18, 848-870 (2006). © 2006 Massachusetts Institute of Technology.

We show that an n-neuron cellular neural network with time-varying delay can have 2^n periodic orbits located in saturation regions and that these periodic orbits are locally exponentially attractive. In addition, we give conditions for ascertaining periodic orbits to be locally or globally exponentially attractive and for locating them in any designated region. As a special case of exponential periodicity, exponential stability of delayed cellular neural networks is also characterized. These conditions improve and extend the existing results in the literature. To illustrate and compare the results, simulation results are discussed in three numerical examples.

1 Introduction

Cellular neural networks (CNNs) and delayed cellular neural networks (DCNNs) are arrays of dynamical cells that are suitable for solving many complex computational problems. In recent years, both have been extensively studied and successfully applied to signal processing and to solving nonlinear algebraic equations. As dynamic systems with a special structure, CNNs and DCNNs have many interesting properties that deserve theoretical study. In general, there are two interesting nonlinear neurodynamic properties in CNNs and DCNNs: stability and periodic oscillation. The stability of a CNN or a DCNN at an equilibrium point means that for a given activation function and a constant input vector, an equilibrium of the network exists and any state in its neighborhood converges to the equilibrium. The stability of neuron activation states at an equilibrium is a prerequisite for most applications. Some neurodynamics have multiple (two) stable equilibria and may be stable at any equilibrium depending on the initial state, which is called multistability (bistability).

For stability, either an equilibrium or a set of equilibria is the attractor. Besides stability, an activation state may oscillate periodically around an orbit. In this case, the attractor is a limit set. Periodic oscillation in recurrent neural networks is an interesting dynamic behavior, as many biological and cognitive activities (e.g., heartbeat, respiration, mastication, locomotion, and memorization) require repetition. Persistent oscillation, such as a limit cycle, represents a common feature of neural firing patterns produced by the dynamic interplay between cellular and synaptic mechanisms. Stimulus-evoked oscillatory synchronization has been observed in many biological neural systems, including the cerebral cortex of mammals and the brain of insects. It is also known that time delays can cause oscillations in neurodynamics (Gopalsamy & Leung, 1996; Belair, Campbell, & van den Driessche, 1996). In addition, periodic oscillations in recurrent neural networks have found many applications, such as associative memories (Nishikawa, Lai, & Hoppensteadt, 2004), pattern recognition (Wang, 1995; Chen, Wang, & Liu, 2000), machine learning (Ruiz, Owens, & Townley, 1998; Townley et al., 2000), and robot motion control (Jin & Zacksenhouse, 2003). The analysis of periodic oscillation of neural networks is more general than stability analysis, since an equilibrium point can be viewed as a special case of oscillation with an arbitrary period.

The stability of CNNs and DCNNs has been widely investigated (e.g., Chua & Roska, 1990, 1992; Civalleri, Gilli, & Pandolfi, 1993; Liao, Wu, & Yu, 1999; Roska, Wu, Balsi, & Chua, 1992; Roska, Wu, & Chua, 1993; Setti, Thiran, & Serpico, 1998; Takahashi, 2000; Zeng, Wang, & Liao, 2003). The existence of periodic orbits together with global exponential stability of CNNs and DCNNs is studied in Yi, Heng, and Vadakkepat (2002) and Wang and Zou (2004). Most existing studies (Berns, Moiola, & Chen, 1998; Jiang & Teng, 2004; Kanamaru & Sekine, 2004; Liao & Wang, 2003; Liu, Chen, Cao, & Huang, 2003; Wang & Zou, 2004) are based on the assumption that the equilibrium point of CNNs or DCNNs is globally stable or that the periodic orbit of CNNs or DCNNs is globally attractive; hence, CNNs or DCNNs have only one equilibrium point or one periodic orbit. However, in many applications, CNNs or DCNNs are required to exhibit more than one stable equilibrium point (e.g., Yi, Tan, & Lee, 2003; Zeng, Wang, & Liao, 2004), or more than one exponentially attractive periodic orbit, instead of a single globally stable equilibrium point.

In this letter, we investigate the multiperiodicity and multistability of DCNNs. We show that an n-neuron DCNN can have 2^n periodic orbits that are locally exponentially attractive. Moreover, we present estimates of the attractive domains of such 2^n locally exponentially attractive periodic orbits. In addition, we give conditions for periodic orbits to be locally or globally exponentially attractive when the periodic orbits are located in a designated position. All of these conditions are easy to verify.

The remaining part of this letter consists of six sections. In section 2, relevant background information is given. The main results are stated in sections 3, 4, and 5. In section 6, three illustrative examples are provided with simulation results. Finally, concluding remarks are given in section 7.

2 Preliminaries

Consider the DCNN model governed by the following normalized dynamic equations:

dx_i(t)/dt = -x_i(t) + Σ_{j=1}^{n} a_ij f(x_j(t)) + Σ_{j=1}^{n} b_ij f(x_j(t - τ_ij(t))) + u_i(t), i = 1, ..., n,  (2.1)

where x = (x_1, ..., x_n)^T ∈ R^n is the state vector, A = (a_ij) and B = (b_ij) are connection weight matrices that are not assumed to be symmetric, u(t) = (u_1(t), ..., u_n(t))^T ∈ R^n is a periodic input vector with period ω (i.e., there exists a constant ω > 0 such that u_i(t + ω) = u_i(t) for all t and all i ∈ {1, 2, ..., n}), τ_ij(t) is the time-varying delay that satisfies 0 ≤ τ_ij(t) ≤ τ (τ is a constant), and f(·) is the piecewise linear activation function defined by f(v) = (|v + 1| - |v - 1|)/2. In particular, when b_ij ≡ 0 (for all i, j = 1, 2, ..., n), the DCNN degenerates into a CNN.

Let C([t_0 - τ, t_0], D) be the space of continuous functions mapping [t_0 - τ, t_0] into D ⊂ R^n with the norm defined by ‖φ‖_{t_0} = max_{1≤i≤n} { sup_{u ∈ [t_0-τ, t_0]} |φ_i(u)| }, where φ(s) = (φ_1(s), φ_2(s), ..., φ_n(s))^T. Denote ‖x‖ = max_{1≤i≤n} {|x_i|} as the norm of the vector x = (x_1, ..., x_n)^T. For φ, ϕ ∈ C([t_0 - τ, t_0], D), where φ(s) = (φ_1(s), ..., φ_n(s))^T and ϕ(s) = (ϕ_1(s), ..., ϕ_n(s))^T, denote ‖φ, ϕ‖_{t_0} = max_{1≤i≤n} { sup_{t_0-τ ≤ s ≤ t_0} |φ_i(s) - ϕ_i(s)| } as a measure of distance in C([t_0 - τ, t_0], D).

The initial condition of DCNN model (2.1) is assumed to be φ(ϑ) = (φ_1(ϑ), φ_2(ϑ), ..., φ_n(ϑ))^T, where φ ∈ C([t_0 - τ, t_0], R^n). Denote by x(t; t_0, φ) the state of DCNN model (2.1) with initial condition (t_0, φ); that is, x(t; t_0, φ) is continuous, satisfies equation (2.1), and x(s; t_0, φ) = φ(s) for s ∈ [t_0 - τ, t_0].

Denote (-∞, -1) = (-∞, -1)^1 [-1, 1]^0 (1, +∞)^0; [-1, 1] = (-∞, -1)^0 [-1, 1]^1 (1, +∞)^0; (1, +∞) = (-∞, -1)^0 [-1, 1]^0 (1, +∞)^1; and R = (-∞, +∞) = (-∞, -1) ∪ [-1, 1] ∪ (1, +∞). Then (-∞, +∞)^n can be divided into 3^n subspaces:

Δ = { Π_{i=1}^{n} (-∞, -1)^{δ_1^(i)} [-1, 1]^{δ_2^(i)} (1, +∞)^{δ_3^(i)} : (δ_1^(i), δ_2^(i), δ_3^(i)) = (1, 0, 0) or (0, 1, 0) or (0, 0, 1), i = 1, ..., n },  (2.2)

and Δ can be divided into three subsets:

Δ_1 = { [-1, 1]^n },
Δ_2 = { Π_{i=1}^{n} (-∞, -1)^{δ^(i)} (1, +∞)^{1-δ^(i)} : δ^(i) = 1 or 0, i = 1, ..., n },
Δ_3 = Δ - Δ_1 - Δ_2.

Hence, Δ_1 is composed of one region, Δ_2 is composed of 2^n regions, and Δ_3 is composed of 3^n - 2^n - 1 regions.

Definition 1. A periodic orbit x*(t) is said to be a limit cycle of a DCNN if x*(t) is an isolated periodic orbit of the DCNN; that is, there exists ω > 0 such that for all t ≥ t_0, x*(t + ω) = x*(t), and there exists δ > 0 such that the set {x ∈ R^n : 0 < ‖x - x*(t)‖ < δ, t ≥ t_0} contains no point of any other periodic orbit of the DCNN.

Definition 2. A periodic orbit x*(t) of a DCNN is said to be locally exponentially attractive in a region Ω if there exist constants α > 0, β > 0 such that for all t ≥ t_0, ‖x(t; t_0, φ) - x*(t)‖ ≤ β ‖φ‖_{t_0} exp{-α(t - t_0)}, where x(t; t_0, φ) is the state of the DCNN with any initial condition (t_0, φ), φ ∈ C([t_0 - τ, t_0], Ω), and Ω is said to be a locally exponentially attractive set of the periodic orbit x*(t). When Ω = R^n, x*(t) is said to be globally exponentially attractive. In particular, if x*(t) is a fixed point x*, then the DCNN is said to be globally exponentially stable.

Lemma 1 (Kosaku, 1978). Let D be a compact set in R^n, and let H be a mapping on the complete metric space (C([t_0 - τ, t_0], D), ‖·, ·‖_{t_0}). If H(C([t_0 - τ, t_0], D)) ⊂ C([t_0 - τ, t_0], D), and there exists a constant α < 1 such that for all φ, ϕ ∈ C([t_0 - τ, t_0], D), ‖H(φ), H(ϕ)‖_{t_0} ≤ α ‖φ, ϕ‖_{t_0}, then there exists φ* ∈ C([t_0 - τ, t_0], D) such that H(φ*) = φ*.

Consider the following coupled system:

dx(t)/dt = -x(t) + A y(t) + B y(t - τ(t)),  (2.3)
dy(t)/dt = h(t, y(t), y(t - τ(t))),  (2.4)

where x(t) ∈ R^n, y(t) ∈ R^m, A and B are n × m matrices, h ∈ C(R × R^m × R^m, R^m), and C(R × R^m × R^m, R^m) is the space of continuous functions mapping R × R^m × R^m into R^m.
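For readers who want to experiment with model (2.1) numerically, the following is a minimal simulation sketch (not part of the original letter). It assumes a forward-Euler discretization with a fixed step, a history buffer for the time-varying delays, and user-supplied callables u(t) and τ(t); the names f and simulate_dcnn are ours, and the assumption that the delay bound τ is attained at t_0 is ours as well.

    import numpy as np

    def f(v):
        # Piecewise-linear activation from section 2: f(v) = (|v + 1| - |v - 1|) / 2.
        return (np.abs(v + 1.0) - np.abs(v - 1.0)) / 2.0

    def simulate_dcnn(A, B, u, tau, phi, t0=0.0, t_end=30.0, dt=0.01):
        """Forward-Euler integration of model (2.1):
           dx_i/dt = -x_i + sum_j a_ij f(x_j(t)) + sum_j b_ij f(x_j(t - tau_ij(t))) + u_i(t).
           A, B: (n, n) arrays; u(t) -> (n,); tau(t) -> (n, n); phi(s) -> (n,) initial function."""
        n = A.shape[0]
        tau_max = float(np.max(tau(t0)))              # assumption: delay bound attained at t0
        hist_len = int(np.ceil(tau_max / dt)) + 1
        hist = np.array([phi(t0 - k * dt) for k in range(hist_len)])   # hist[k] ~ x(t - k*dt)
        x = hist[0].copy()
        ts = np.arange(t0, t_end, dt)
        traj = np.empty((ts.size, n))
        cols = np.arange(n)[None, :]
        for m, t in enumerate(ts):
            lags = np.minimum((tau(t) / dt).astype(int), hist_len - 1)  # delay indices tau_ij(t)/dt
            xd = hist[lags, cols]                     # xd[i, j] = x_j(t - tau_ij(t))
            dx = -x + A @ f(x) + np.sum(B * f(xd), axis=1) + u(t)
            x = x + dt * dx
            hist = np.vstack((x[None, :], hist[:-1])) # shift the delay buffer
            traj[m] = x
        return ts, traj

This sketch trades accuracy for brevity (fixed step, nearest-sample delay lookup); a production implementation would use a delay-differential-equation solver instead.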

Lemma 2. If system (2.4) is globally exponentially stable, then system (2.3) is also globally exponentially stable.

Proof. By the variation-of-constants formula, the solution x(t) of equation (2.3) can be expressed as

x(t) = exp{-(t - t_0)} x(t_0) + ∫_{t_0}^{t} exp{-(t - s)} (A y(s) + B y(s - τ(s))) ds.

Since equation (2.4) is globally exponentially stable, there exist constants α, β > 0 such that ‖y(s)‖ ≤ β exp{-α(s - t_0)}. Hence, ‖A y(s) + B y(s - τ(s))‖ ≤ β̄ exp{-α(s - t_0)}, where β̄ = (‖A‖ + ‖B‖ exp{ατ}) β. Then, when α = 1, for all t ≥ t_0,

‖x(t)‖ ≤ ‖x(t_0)‖ exp{-(t - t_0)} + β̄ (t - t_0) exp{-(t - t_0)};

when α ≠ 1,

‖x(t)‖ ≤ ‖x(t_0)‖ exp{-(t - t_0)} + β̄ (exp{-(t - t_0)} + exp{-α(t - t_0)}) / |1 - α|;

that is, equation (2.3) is also globally exponentially stable.

Throughout this article, we assume that N_1 ∪ N_2 ∪ N_3 = {1, 2, ..., n} and that N_1 ∩ N_2, N_1 ∩ N_3, and N_2 ∩ N_3 are empty. Denote

D_1 = { x ∈ R^n : x_i ∈ (-∞, -1), i ∈ N_1; x_i ∈ (1, +∞), i ∈ N_2; x_i ∈ [-1, 1], i ∈ N_3 }.

Note that D_1 ∈ Δ, where Δ is defined in equation (2.2). If N_3 is empty, then denote

D_2 = { x ∈ R^n : x_i ∈ (-∞, -1), i ∈ N_1; x_i ∈ (1, +∞), i ∈ N_2 }.

3 Locally Exponentially Attractive Multiperiodicity in a Saturation Region

In this section, we show that an n-neuron delayed cellular neural network can have 2^n periodic orbits located in saturation regions and that these periodic orbits are locally exponentially attractive.

Theorem 1. If for all i ∈ {1, 2, ..., n} and all t ≥ t_0,

|u_i(t)| < a_ii - 1 - Σ_{j≠i} |a_ij| - Σ_{j=1}^{n} |b_ij|,  (3.1)

then DCNN (2.1) has 2^n locally exponentially attractive limit cycles.

Proof. If for all s ∈ [t - τ, t], x(s) ∈ D_2, then from equation (2.1), for i = 1, 2, ..., n,

dx_i(t)/dt = -x_i(t) - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) + u_i(t).  (3.2)

When i ∈ N_2 and x_i(t) = 1, from equations (3.1) and (3.2),

dx_i(t)/dt = -1 - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) + u_i(t) > 0.  (3.3)

When i ∈ N_1 and x_i(t) = -1, from equations (3.1) and (3.2),

dx_i(t)/dt = 1 - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) + u_i(t) < 0.  (3.4)

Equations (3.3) and (3.4) imply that if φ ∈ C([t_0 - τ, t_0], D_2), then x(t; t_0, φ) remains in D_2, and D_2 is an invariant set of DCNN (2.1). So for all t ≥ t_0 - τ, x(t) ∈ D_2. Hence, DCNN (2.1) can be rewritten as equation (3.2). Let x(t; t_0, φ) and x(t; t_0, ϕ) be two states of DCNN (2.1) with initial conditions (t_0, φ) and (t_0, ϕ), where φ, ϕ ∈ C([t_0 - τ, t_0], D_2). From equations (2.1) and (3.2), for all i ∈ {1, 2, ..., n} and t ≥ t_0,

d(x_i(t; t_0, φ) - x_i(t; t_0, ϕ))/dt = -(x_i(t; t_0, φ) - x_i(t; t_0, ϕ)).  (3.5)

Hence, for i = 1, 2, ..., n and t ≥ t_0,

|x_i(t; t_0, φ) - x_i(t; t_0, ϕ)| ≤ ‖φ, ϕ‖_{t_0} exp{-(t - t_0)}.  (3.6)

Define x^(t)_φ(θ) = x(t + θ; t_0, φ), θ ∈ [t_0 - τ, t_0]. Then from equations (3.3) and (3.4), for all φ ∈ C([t_0 - τ, t_0], D_2), x^(t)_φ ∈ C([t_0 - τ, t_0], D_2). Define a mapping H : C([t_0 - τ, t_0], D_2) → C([t_0 - τ, t_0], D_2) by H(φ) = x^(ω)_φ. Then H(C([t_0 - τ, t_0], D_2)) ⊂ C([t_0 - τ, t_0], D_2), and H^m(φ) = x^(mω)_φ. We can choose a positive integer m such that exp{-(mω - τ)} ≤ α < 1. Hence, from equation (3.6),

‖H^m(φ), H^m(ϕ)‖_{t_0} ≤ max_{1≤i≤n} { sup_{θ∈[t_0-τ, t_0]} |x_i(mω + θ; t_0, φ) - x_i(mω + θ; t_0, ϕ)| } ≤ ‖φ, ϕ‖_{t_0} exp{-(mω - τ)} ≤ α ‖φ, ϕ‖_{t_0}.

Based on lemma 1, there exists a unique fixed point φ* ∈ C([t_0 - τ, t_0], D_2) such that H^m(φ*) = φ*. In addition, H^m(H(φ*)) = H(H^m(φ*)) = H(φ*). This shows that H(φ*) is also a fixed point of H^m. Hence, by the uniqueness of the fixed

point of the mapping H^m, H(φ*) = φ*; that is, x^(ω)_{φ*} = φ*. Let x(t; t_0, φ*) be a state of DCNN (2.1) with initial condition (t_0, φ*). Then from equation (2.1), for i = 1, 2, ..., n and t ≥ t_0,

dx_i(t; t_0, φ*)/dt = -x_i(t; t_0, φ*) - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) + u_i(t).

Hence, for i = 1, 2, ..., n and t + ω ≥ t_0,

dx_i(t + ω; t_0, φ*)/dt = -x_i(t + ω; t_0, φ*) - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) + u_i(t + ω)
= -x_i(t + ω; t_0, φ*) - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) + u_i(t).

This implies that x(t + ω; t_0, φ*) is also a state of DCNN (2.1) with initial condition (t_0, φ*). x^(ω)_{φ*} = φ* implies that for all t ≥ t_0, x(t + ω; t_0, φ*) = x(t; t_0, x^(ω)_{φ*}) = x(t; t_0, φ*). Hence, x(t; t_0, φ*) is a periodic orbit of DCNN (2.1) with period ω. From equation (3.5), it is easy to see that any state of DCNN (2.1) with initial condition (t_0, φ), φ ∈ C([t_0 - τ, t_0], D_2), converges to this periodic orbit exponentially as t → +∞. Hence, the isolated periodic orbit x(t; t_0, φ*) located in D_2 is locally exponentially attractive, and D_2 is a locally exponentially attractive set of x(t; t_0, φ*). Since there exist 2^n elements in Δ_2, there exist 2^n isolated periodic orbits in Δ_2, and such 2^n isolated periodic orbits are locally exponentially attractive.

When the periodic external input u(t) degenerates into a constant vector, we have the following corollary:

Corollary 1. If for all i ∈ {1, 2, ..., n} and t ≥ t_0, u_i(t) ≡ u_i (a constant), and

|u_i| < a_ii - 1 - Σ_{j≠i} |a_ij| - Σ_{j=1}^{n} |b_ij|,

then DCNN (2.1) has 2^n locally exponentially stable equilibrium points.

Proof. Since u_i(t) ≡ u_i (a constant), for an arbitrary constant ν > 0, u_i(t + ν) ≡ u_i ≡ u_i(t). According to theorem 1, DCNN (2.1) has 2^n locally exponentially attractive limit cycles with period ν. The arbitrariness of the constant ν implies that such limit cycles are fixed points. Hence, DCNN (2.1) has 2^n locally exponentially attractive equilibrium points.

Remark 1. In theorem 1 and corollary 1, it is necessary for a_ii to be dominant in the sense that a_ii > 1 + Σ_{j≠i} |a_ij| + Σ_{j=1}^{n} |b_ij|.

Remark 2. A main objective in designing associative memories is to store a large number of patterns as stable equilibria or limit cycles such that stored patterns can be retrieved when the initial probes contain sufficient information about the patterns. CNNs and DCNNs are also suitable for very large-scale integration (VLSI) implementations of associative memories. It is also expected that they can be applied to associative memories by storing patterns as periodic limit cycles. According to theorem 1 and corollary 1, the n-neuron DCNN model (2.1) can store up to 2^n patterns as locally exponentially attractive limit cycles or equilibria, which can be retrieved when the input vector satisfies condition (3.1). This implies that the external stimuli also play a major role in encoding and decoding patterns in DCNN associative memories, in contrast with the zero input vector in bidirectional associative memories and in autoassociative memories based on the Hopfield network.

4 Locally Exponentially Attractive Periodicity in a Designated Region

As the limit cycles are stimulus driven (nonautonomous), some information can be encoded in the phases of the oscillating states x_i relative to the inputs u_i. Hence, it is necessary to find conditions on the inputs u_i when the periodic orbit x(t) is desired to be located in a designated region. In this section, we give conditions that allow a periodic orbit to be locally exponentially attractive and located in any designated region.

Theorem 2. If for all t ≥ t_0,

u_i(t) < -1 + Σ_{j∈N_1} (a_ij + b_ij) - Σ_{j∈N_2} (a_ij + b_ij) - Σ_{j∈N_3} (|a_ij| + |b_ij|), i ∈ N_1,  (4.1)

u_i(t) > 1 + Σ_{j∈N_1} (a_ij + b_ij) - Σ_{j∈N_2} (a_ij + b_ij) + Σ_{j∈N_3} (|a_ij| + |b_ij|), i ∈ N_2,  (4.2)

u_i(t) < 1 - a_ii - Σ_{j∈N_3, j≠i} |a_ij| - Σ_{j∈N_3} |b_ij| + Σ_{j∈N_1} (a_ij + b_ij) - Σ_{j∈N_2} (a_ij + b_ij), i ∈ N_3,  (4.3)

u_i(t) > a_ii - 1 + Σ_{j∈N_3, j≠i} |a_ij| + Σ_{j∈N_3} |b_ij| + Σ_{j∈N_1} (a_ij + b_ij) - Σ_{j∈N_2} (a_ij + b_ij), i ∈ N_3,  (4.4)

and for all i ∈ {1, 2, ..., n} and j ∈ N_3, τ_ij(t) = τ_ij(t + ω), then DCNN (2.1) has only one limit cycle located in D_1, and it is locally exponentially attractive in D_1.

Proof. If for all s ∈ [t - τ, t], x(s) ∈ D_1, then from equation (2.1), for s ∈ [t_0, t] and i = 1, 2, ..., n,

dx_i(s)/ds = -x_i(s) - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) + Σ_{j∈N_3} a_ij x_j(s) + Σ_{j∈N_3} b_ij x_j(s - τ_ij(s)) + u_i(s).  (4.5)

When i ∈ N_1 and x_i(t) = -1, from equations (4.1) and (4.5),

dx_i(t)/dt ≤ 1 - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) + Σ_{j∈N_3} (|a_ij| + |b_ij|) + u_i(t) < 0.  (4.6)

When i ∈ N_2 and x_i(t) = 1, from equations (4.2) and (4.5),

dx_i(t)/dt ≥ -1 - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) - Σ_{j∈N_3} (|a_ij| + |b_ij|) + u_i(t) > 0.  (4.7)

When i ∈ N_3 and x_i(t) = 1, from equations (4.3) and (4.5),

dx_i(t)/dt ≤ -1 - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) + a_ii + Σ_{j∈N_3, j≠i} |a_ij| + Σ_{j∈N_3} |b_ij| + u_i(t) < 0.  (4.8)

When i ∈ N_3 and x_i(t) = -1, from equations (4.4) and (4.5),

dx_i(t)/dt ≥ 1 - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) - a_ii - Σ_{j∈N_3, j≠i} |a_ij| - Σ_{j∈N_3} |b_ij| + u_i(t) > 0.  (4.9)

Equations (4.6) to (4.9) imply that if φ(s) ∈ D_1 for all s ∈ [t_0 - τ, t_0], then x(t; t_0, φ) remains in D_1, and D_1 is an invariant set of DCNN (2.1). So for all t ≥ t_0 - τ, x(t) ∈ D_1. Hence, for all t ≥ t_0, DCNN (2.1) can be rewritten as

dx_i(t)/dt = -x_i(t) - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) + Σ_{j∈N_3} a_ij x_j(t) + Σ_{j∈N_3} b_ij x_j(t - τ_ij(t)) + u_i(t), i = 1, 2, ..., n.  (4.10)

Let x(t; t_0, φ) and x(t; t_0, ϕ) be two states of DCNN (2.1) with initial conditions (t_0, φ) and (t_0, ϕ), where φ, ϕ ∈ C([t_0 - τ, t_0], D_1). From equation (4.10), for i = 1, 2, ..., n and t ≥ t_0,

d(x_i(t; t_0, φ) - x_i(t; t_0, ϕ))/dt = -(x_i(t; t_0, φ) - x_i(t; t_0, ϕ)) + Σ_{j∈N_3} ( a_ij (x_j(t; t_0, φ) - x_j(t; t_0, ϕ)) + b_ij (x_j(t - τ_ij(t); t_0, φ) - x_j(t - τ_ij(t); t_0, ϕ)) ).  (4.11)

Let y_i(t) = x_i(t; t_0, φ) - x_i(t; t_0, ϕ). Then from equation (4.11), for i = 1, ..., n and t ≥ t_0,

dy_i(t)/dt = -y_i(t) + Σ_{j∈N_3} a_ij y_j(t) + Σ_{j∈N_3} b_ij y_j(t - τ_ij(t)).  (4.12)

From equations (4.3) and (4.4), for i ∈ N_3,

a_ii + |b_ii| + Σ_{j∈N_3, j≠i} (|a_ij| + |b_ij|) + | Σ_{j∈N_1} (a_ij + b_ij) - Σ_{j∈N_2} (a_ij + b_ij) - u_i(t) | < 1.

Hence, there exists ϑ > 0 such that

1 - a_ii ≥ Σ_{j∈N_3, j≠i} |a_ij| + Σ_{j∈N_3} |b_ij| exp{ϑτ} + ϑ, i ∈ N_3.  (4.13)

Consider the following subsystem of equation (4.12):

dy_i(t)/dt = -y_i(t) + Σ_{j∈N_3} a_ij y_j(t) + Σ_{j∈N_3} b_ij y_j(t - τ_ij(t)), t ≥ t_0, i ∈ N_3.  (4.14)

Denote ‖ȳ‖_{t_0} = max_{t_0-τ ≤ s ≤ t_0} {‖y(s)‖}. Then for i ∈ N_3 and t ≥ t_0, |y_i(t)| ≤ ‖ȳ‖_{t_0} exp{-ϑ(t - t_0)}. Otherwise, one of the following two cases holds:

Case i. There exist t_2 > t_1 ≥ t_0, k ∈ N_3, and a sufficiently small ε_1 > 0 such that y_k(t_1) - ‖ȳ‖_{t_0} exp{-ϑ(t_1 - t_0)} = 0, y_k(t_2) - ‖ȳ‖_{t_0} exp{-ϑ(t_2 - t_0)} = ε_1, and when s ∈ [t_0 - τ, t_2], for all i ∈ N_3, y_i(s) - ‖ȳ‖_{t_0} exp{-ϑ(s - t_0)} ≤ ε_1, and

dy_k(t)/dt |_{t=t_1} + ϑ ‖ȳ‖_{t_0} exp{-ϑ(t_1 - t_0)} ≥ 0,  dy_k(t)/dt |_{t=t_2} + ϑ ‖ȳ‖_{t_0} exp{-ϑ(t_2 - t_0)} > 0.  (4.15)

Case ii. There exist t_4 > t_3 ≥ t_0, j ∈ N_3, and a sufficiently small ε_2 > 0 such that y_j(t_3) + ‖ȳ‖_{t_0} exp{-ϑ(t_3 - t_0)} = 0, y_j(t_4) + ‖ȳ‖_{t_0} exp{-ϑ(t_4 - t_0)} = -ε_2, and when s ∈ [t_0 - τ, t_4], for all i ∈ N_3, y_i(s) + ‖ȳ‖_{t_0} exp{-ϑ(s - t_0)} ≥ -ε_2, and

dy_j(t)/dt |_{t=t_3} - ϑ ‖ȳ‖_{t_0} exp{-ϑ(t_3 - t_0)} ≤ 0,  dy_j(t)/dt |_{t=t_4} - ϑ ‖ȳ‖_{t_0} exp{-ϑ(t_4 - t_0)} < 0.  (4.16)

It follows from equations (4.13) and (4.14) that for k ∈ N_3,

dy_k(t)/dt |_{t=t_2} = -y_k(t_2) + Σ_{j∈N_3} ( a_kj y_j(t_2) + b_kj y_j(t_2 - τ_kj(t_2)) )
≤ ‖ȳ‖_{t_0} exp{-ϑ(t_2 - t_0)} [ -1 + a_kk + Σ_{j∈N_3, j≠k} |a_kj| + Σ_{j∈N_3} |b_kj| exp{ϑτ} + ϑ ] - ϑ ‖ȳ‖_{t_0} exp{-ϑ(t_2 - t_0)} + ε_1 [ -1 + a_kk + Σ_{j∈N_3, j≠k} |a_kj| + Σ_{j∈N_3} |b_kj| ]
≤ -ϑ ‖ȳ‖_{t_0} exp{-ϑ(t_2 - t_0)}.

This contradicts equation (4.15). Similarly, it follows from equations (4.13) and (4.14) that

dy_j(t)/dt |_{t=t_4} ≥ ϑ ‖ȳ‖_{t_0} exp{-ϑ(t_4 - t_0)}.

This contradicts equation (4.16). The two contradictions show that for i ∈ N_3 and t ≥ t_0, |y_i(t)| ≤ ‖ȳ‖_{t_0} exp{-ϑ(t - t_0)}. Hence, according to lemma 2, there exists ϑ̄ > 0 such that for i = 1, 2, ..., n and t ≥ t_0,

|x_i(t; t_0, φ) - x_i(t; t_0, ϕ)| ≤ ‖φ, ϕ‖_{t_0} exp{-ϑ̄(t - t_0)}.  (4.17)

Define x^(t)_φ(θ) = x(t + θ; t_0, φ), θ ∈ [t_0 - τ, t_0]. From equations (4.6) to (4.9), if φ ∈ C([t_0 - τ, t_0], D_1), then x^(t)_φ ∈ C([t_0 - τ, t_0], D_1). Define a mapping H : C([t_0 - τ, t_0], D_1) → C([t_0 - τ, t_0], D_1) by H(φ) = x^(ω)_φ; then H(C([t_0 - τ, t_0], D_1)) ⊂ C([t_0 - τ, t_0], D_1), and H^m(φ) = x^(mω)_φ. Similar to the proof of theorem 1, there exists a periodic orbit x(t; t_0, φ*) of DCNN (2.1) with period ω such that for all t ≥ t_0, x(t; t_0, φ*) ∈ D_1, and all other states of DCNN (2.1) with initial conditions (t_0, φ), φ ∈ C([t_0 - τ, t_0], D_1), converge to this periodic orbit exponentially as t → +∞. Hence, the isolated periodic orbit x(t; t_0, φ*) located in D_1 is locally exponentially attractive, and D_1 is a locally exponentially attractive set of x(t; t_0, φ*).

Remark 3. From equations (4.1) to (4.4), we can see that the input vector u(t) can control the location of a limit cycle that represents a memory pattern in a designated region. Specifically, when condition (4.1) holds, the corresponding coordinate of the limit cycle is located in (-∞, -1); when condition (4.2) holds, the corresponding coordinate of the limit cycle is located in (1, +∞); when conditions (4.3) and (4.4) hold, the corresponding coordinate of the limit cycle is located in [-1, 1].

When N_3 is empty, we have the following corollary:

Corollary 2. Let N_1 ∪ N_2 = {1, 2, ..., n}, and let N_1 ∩ N_2 be empty. If for all t ≥ t_0,

u_i(t) < Σ_{j∈N_1} (a_ij + b_ij) - Σ_{j∈N_2} (a_ij + b_ij) - 1, i ∈ N_1,  (4.18)

u_i(t) > Σ_{j∈N_1} (a_ij + b_ij) - Σ_{j∈N_2} (a_ij + b_ij) + 1, i ∈ N_2,  (4.19)

then DCNN (2.1) has exactly one limit cycle located in D_2, and such a limit cycle is locally exponentially attractive.

Proof. Let N_3 in theorem 2 be an empty set. According to theorem 2, corollary 2 holds.

When the periodic external input u(t) degenerates into a constant, we have the following corollary:

Corollary 3. If for all t ≥ t_0, u_i(t) ≡ u_i (a constant), and

u_i < -1 + Σ_{j∈N_1} (a_ij + b_ij) - Σ_{j∈N_2} (a_ij + b_ij) - Σ_{j∈N_3} (|a_ij| + |b_ij|), i ∈ N_1,

u_i > 1 + Σ_{j∈N_1} (a_ij + b_ij) - Σ_{j∈N_2} (a_ij + b_ij) + Σ_{j∈N_3} (|a_ij| + |b_ij|), i ∈ N_2,

u_i < 1 - a_ii - Σ_{j∈N_3, j≠i} |a_ij| - Σ_{j∈N_3} |b_ij| + Σ_{j∈N_1} (a_ij + b_ij) - Σ_{j∈N_2} (a_ij + b_ij), i ∈ N_3,

u_i > a_ii - 1 + Σ_{j∈N_3, j≠i} |a_ij| + Σ_{j∈N_3} |b_ij| + Σ_{j∈N_1} (a_ij + b_ij) - Σ_{j∈N_2} (a_ij + b_ij), i ∈ N_3,

and for all i ∈ {1, 2, ..., n} and j ∈ N_3, τ_ij(t) ≡ τ_ij (a constant), then DCNN (2.1) has only one equilibrium point located in D_1, and it is locally exponentially stable.

Proof. Since u_i(t) ≡ u_i (a constant), for an arbitrary constant ν > 0, u_i(t + ν) ≡ u_i ≡ u_i(t). According to theorem 2, DCNN (2.1) has only one limit cycle located in D_1, which is locally exponentially attractive in D_1. The arbitrariness of the constant ν implies that such a limit cycle is a fixed point. Hence, DCNN (2.1) has only one equilibrium point located in D_1, and it is locally exponentially stable.

5 Globally Exponentially Attractive Periodicity in a Designated Region

In order to obtain optimal spatiotemporal coding in the periodic orbit and to reduce computational time, it is desirable for a neural network to be globally exponentially attractive to a periodic orbit in a designated region. In this section, we give conditions that allow a periodic orbit to be globally exponentially attractive and located in any designated region.

Theorem 3. If for all t ≥ t_0,

u_i(t) < -1 - Σ_{j=1}^{n} (|a_ij| + |b_ij|), i ∈ N_1,  (5.1)

u_i(t) > 1 + Σ_{j=1}^{n} (|a_ij| + |b_ij|), i ∈ N_2,  (5.2)

|u_i(t)| < 1 - a_ii - Σ_{j≠i} |a_ij| - Σ_{j=1}^{n} |b_ij|, i ∈ N_3,  (5.3)

and for all i ∈ {1, 2, ..., n} and j ∈ N_3, τ_ij(t) = τ_ij(t + ω), then DCNN (2.1) has a unique limit cycle located in D_1, and such a limit cycle is globally exponentially attractive.

Proof. When i ∈ N_1 and x_i(t) ≥ -1, from equations (2.1) and (5.1),

dx_i(t)/dt ≤ 1 + Σ_{j=1}^{n} (|a_ij| + |b_ij|) + u_i(t) < 0.  (5.4)

When i ∈ N_2 and x_i(t) ≤ 1, from equations (2.1) and (5.2),

dx_i(t)/dt ≥ -1 - Σ_{j=1}^{n} (|a_ij| + |b_ij|) + u_i(t) > 0.  (5.5)

When i ∈ N_3 and x_i(t) ≤ -1, from equations (2.1) and (5.3),

dx_i(t)/dt ≥ 1 - a_ii - Σ_{j≠i} |a_ij| - Σ_{j=1}^{n} |b_ij| + u_i(t) > 0.  (5.6)

When i ∈ N_3 and x_i(t) ≥ 1, from equations (2.1) and (5.3),

dx_i(t)/dt ≤ -1 + a_ii + Σ_{j≠i} |a_ij| + Σ_{j=1}^{n} |b_ij| + u_i(t) < 0.  (5.7)

Equations (5.4) to (5.7) imply that x(t; t_0, φ) will enter and remain in D_1 for any φ ∈ C([t_0 - τ, t_0], R^n). So there exists T ≥ 0 such that for all t ≥ T, x(t) ∈ D_1. Hence, for all t ≥ T + τ, DCNN (2.1) can be rewritten as

dx_i(t)/dt = -x_i(t) - Σ_{j∈N_1} (a_ij + b_ij) + Σ_{j∈N_2} (a_ij + b_ij) + Σ_{j∈N_3} a_ij x_j(t) + Σ_{j∈N_3} b_ij x_j(t - τ_ij(t)) + u_i(t), i = 1, 2, ..., n.

Similar to the proof of theorem 2, DCNN (2.1) has a unique limit cycle located in D_1, and such a limit cycle is globally exponentially attractive.

Remark 4. By comparison, we can see that if conditions (5.1) to (5.3) hold, then conditions (4.1) to (4.4) also hold, but not vice versa, as will be shown in examples 2 and 3. In other words, the conditions in theorem 3 are stronger than those in theorem 2.

When N_1 ∪ N_2 is empty, we have the following corollary:

Corollary 4. If for all i, j ∈ {1, 2, ..., n}, τ_ij(t) = τ_ij(t + ω), and for all t ≥ t_0,

|u_i(t)| < 1 - a_ii - Σ_{j≠i} |a_ij| - Σ_{j=1}^{n} |b_ij|,

then DCNN (2.1) has a unique limit cycle located in [-1, 1]^n, which is globally exponentially attractive.

Proof. Choose N_3 = {1, 2, ..., n} in theorem 3. According to theorem 3, the corollary holds.

When N_3 is empty, we have the following corollary:

Corollary 5. Let N_3 be empty. If for all t ≥ t_0,

u_i(t) < -1 - Σ_{j=1}^{n} (|a_ij| + |b_ij|), i ∈ N_1,  (5.8)

u_i(t) > 1 + Σ_{j=1}^{n} (|a_ij| + |b_ij|), i ∈ N_2,  (5.9)

then DCNN (2.1) has a unique limit cycle located in D_2. Moreover, such a limit cycle is globally exponentially attractive.

Proof. Since N_3 is empty, according to theorem 3, corollary 5 holds.

Remark 5. Since -1 - Σ_{j=1}^{n} (|a_ij| + |b_ij|) ≤ -1 + Σ_{j∈N_1} (a_ij + b_ij) - Σ_{j∈N_2} (a_ij + b_ij), if condition (5.8) holds, then condition (4.18) also holds, but not vice versa. Similarly, if condition (5.9) holds, then condition (4.19) also holds, but not vice versa. This implies that the conditions in corollary 5 are stronger than those in corollary 2. In addition, corollary 5 shows that a DCNN has a globally exponentially attractive limit cycle if its periodic external stimulus is sufficiently strong.
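Since conditions (3.1), (4.1) to (4.4), and (5.1) to (5.3) are explicit inequalities in A, B, and bounds on the inputs, they can be verified mechanically. The following is a sketch of such a check (not from the letter); the function names and the use of precomputed lower and upper bounds u_lo, u_hi on each u_i(t) over one period are our assumptions.

    import numpy as np

    def check_theorem1(A, B, u_abs_max):
        """Condition (3.1): |u_i(t)| < a_ii - 1 - sum_{j != i}|a_ij| - sum_j|b_ij| for every i."""
        off_diag = np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))
        bound = np.diag(A) - 1.0 - off_diag - np.sum(np.abs(B), axis=1)
        return bool(np.all(u_abs_max < bound))

    def check_theorem2(A, B, u_lo, u_hi, N1, N2, N3):
        """Conditions (4.1)-(4.4): a locally attractive limit cycle in the designated region D_1."""
        S = A + B
        s1 = S[:, N1].sum(axis=1)                     # sum_{j in N1} (a_ij + b_ij)
        s2 = S[:, N2].sum(axis=1)
        s3 = (np.abs(A[:, N3]) + np.abs(B[:, N3])).sum(axis=1)
        ok = True
        for i in N1:                                  # condition (4.1)
            ok &= u_hi[i] < -1.0 + s1[i] - s2[i] - s3[i]
        for i in N2:                                  # condition (4.2)
            ok &= u_lo[i] > 1.0 + s1[i] - s2[i] + s3[i]
        for i in N3:                                  # conditions (4.3) and (4.4)
            r3 = np.abs(A[i, N3]).sum() - np.abs(A[i, i]) + np.abs(B[i, N3]).sum()
            ok &= u_hi[i] < 1.0 - A[i, i] - r3 + s1[i] - s2[i]
            ok &= u_lo[i] > A[i, i] - 1.0 + r3 + s1[i] - s2[i]
        return bool(ok)

    def check_theorem3(A, B, u_lo, u_hi, N1, N2, N3):
        """Conditions (5.1)-(5.3): a globally exponentially attractive limit cycle in D_1."""
        row = np.sum(np.abs(A) + np.abs(B), axis=1)
        ok = all(u_hi[i] < -1.0 - row[i] for i in N1)                 # condition (5.1)
        ok &= all(u_lo[i] > 1.0 + row[i] for i in N2)                 # condition (5.2)
        for i in N3:                                                  # condition (5.3)
            bnd = 1.0 - A[i, i] - (np.abs(A[i]).sum() - np.abs(A[i, i])) - np.abs(B[i]).sum()
            ok &= max(abs(u_lo[i]), abs(u_hi[i])) < bnd
        return bool(ok)

Here the index sets N1, N2, N3 are Python lists of 0-based indices, so they correspond to N_1, N_2, N_3 shifted by one relative to the text.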

When the periodic external input u(t) degenerates into a constant vector, we have the following corollary:

Corollary 6. If for all t ≥ t_0, u_i(t) ≡ u_i (a constant), and

u_i < -1 - Σ_{j=1}^{n} (|a_ij| + |b_ij|), i ∈ N_1,

u_i > 1 + Σ_{j=1}^{n} (|a_ij| + |b_ij|), i ∈ N_2,

|u_i| < 1 - a_ii - Σ_{j≠i} |a_ij| - Σ_{j=1}^{n} |b_ij|, i ∈ N_3,

and for all i ∈ {1, 2, ..., n} and j ∈ N_3, τ_ij(t) ≡ τ_ij (a constant), then DCNN (2.1) has a unique equilibrium point located in D_1 and is globally exponentially stable at such an equilibrium point.

Proof. Since u_i(t) ≡ u_i (a constant), for an arbitrary constant ν > 0, u_i(t + ν) ≡ u_i(t) ≡ u_i. According to theorem 3, DCNN (2.1) has a unique limit cycle located in D_1, which is globally exponentially attractive. The arbitrariness of the constant ν implies that such a limit cycle is a fixed point. Hence, DCNN (2.1) has a unique equilibrium point located in D_1, and such an equilibrium point is globally exponentially stable.

6 Illustrative Examples

In this section, we give three numerical examples to illustrate the new results.

6.1 Example 1. Consider a CNN, where

...5sin(t) A =... u(t) =.6 cos(t)...6..(sin(t) + cos(t))

According to theorem 1, this CNN has 2^3 = 8 limit cycles, which are locally exponentially attractive. Simulation results with 136 random initial states are depicted in Figures 1 to 3.
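A rough way to reproduce the flavor of example 1 with the simulation sketch from section 2 is shown below. The weight matrix, input amplitudes, step size, and number of trials are our stand-in choices, picked only so that condition (3.1) holds; they are not the entries printed in example 1, and simulate_dcnn is the hypothetical helper defined earlier.

    import numpy as np

    # Stand-in 3-neuron CNN satisfying (3.1): a_ii - 1 - sum_{j!=i}|a_ij| = 1.6 > max_t |u_i(t)|.
    A = np.array([[3.0, 0.2, 0.2],
                  [0.2, 3.0, 0.2],
                  [0.2, 0.2, 3.0]])
    B = np.zeros((3, 3))
    u = lambda t: np.array([0.5 * np.sin(t), 0.6 * np.cos(t), 0.4 * (np.sin(t) + np.cos(t))])
    tau = lambda t: np.zeros((3, 3))        # B = 0, so the delays play no role here

    rng = np.random.default_rng(0)
    patterns = set()
    for _ in range(50):                     # many random initial states
        x0 = rng.uniform(-3.0, 3.0, size=3)
        ts, traj = simulate_dcnn(A, B, u, tau, phi=lambda s: x0, t_end=30.0)
        # Record which saturation region the trajectory settles in (sign pattern of x).
        patterns.add(tuple(np.sign(traj[-1]).astype(int)))
    print(len(patterns), "distinct sign patterns reached (at most 2^3 = 8)")

With these stand-in values, different initial states settle into limit cycles with different sign patterns, which is the multistable behavior that theorem 1 predicts.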

Figure 1: Transient behavior of x_1, x_2, x_3 in Example 1.

Figure 2: Transient behavior of (x_1, x_3) in Example 1.

Figure 3: Transient behavior of (x_1, x_2, x_3) in Example 1.

6.2 Example 2. Consider a CNN, where

...5sin(t) A =.5..5 u(t) =.5 cos(t). 1...8.5(sin(t) + cos(t))

Choose N_1 = {1}, N_2 = {3}, N_3 = {2}. Since

u_1(t) < -1 + a_11 - a_13 - |a_12|,
a_22 + |a_21 - a_23 - u_2(t)| < 1,
u_3(t) > 1 + a_31 - a_33 + |a_32|,

according to theorem 2, this CNN has a limit cycle located in D_1 = {x : x_1 < -1, |x_2| ≤ 1, x_3 > 1}, which is locally exponentially attractive in D_1.

Choose N_1 = {3}, N_2 = {1}, N_3 = {2}. Since

u_1(t) > 1 + a_13 - a_11 + |a_12|,
a_22 + |a_23 - a_21 - u_2(t)| < 1,
u_3(t) < -1 + a_33 - a_31 - |a_32|,

according to theorem 2, this CNN also has a limit cycle located in D_1 = {x : x_3 < -1, |x_2| ≤ 1, x_1 > 1} for this choice of N_1, N_2, N_3, which is locally exponentially attractive in that region. However, u_1(t) does not satisfy condition (5.1), so this CNN does not satisfy the conditions in theorem 3. Simulation results with 136 random initial states are depicted in Figures 4 and 5.
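As a companion to example 2, the condition checkers sketched after remark 5 can be used to confirm numerically that a designated-region condition of theorem 2 may hold while the global condition of theorem 3 fails. The matrix and input bounds below are illustrative stand-ins (not the entries of example 2), and check_theorem2 and check_theorem3 are the hypothetical helpers defined earlier.

    import numpy as np

    A = np.array([[ 2.0, 0.2, -0.5],
                  [ 0.5, 0.2,  0.5],
                  [-1.0, 0.2,  1.8]])
    B = np.zeros((3, 3))
    # Bounds of u(t) = (1.2 sin t, 0.5 cos t, 0.3 (sin t + cos t)) over one period:
    u_lo = np.array([-1.2, -0.5, -0.3 * np.sqrt(2.0)])
    u_hi = np.array([ 1.2,  0.5,  0.3 * np.sqrt(2.0)])
    N1, N2, N3 = [0], [2], [1]      # designated region: x_1 < -1, |x_2| <= 1, x_3 > 1

    print(check_theorem2(A, B, u_lo, u_hi, N1, N2, N3))   # True: theorem 2 applies
    print(check_theorem3(A, B, u_lo, u_hi, N1, N2, N3))   # False: the stronger theorem 3 does not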

Figure 4: Transient behavior of x_1, x_2, x_3 in Example 2.

Figure 5: Transient behavior of (x_1, x_2, x_3) in Example 2.

Figure 6: Transient behavior of x_1, x_2, x_3 in Example 3.

6.3 Example 3. Consider a CNN, where

....8sin(t).6 A =..6 u(t) =.8 cos(t).....8(sin(t) + cos(t)) +.8

Choose N_1 = {1}, N_2 = {3}, N_3 = {2}. Since

u_1(t) < -1 - Σ_{j=1}^{3} |a_1j|,
|u_2(t)| < 1 - a_22 - Σ_{j≠2} |a_2j|,
u_3(t) > 1 + Σ_{j=1}^{3} |a_3j|,

according to theorem 3, this CNN has a limit cycle located in D_1 = {x : x_1 < -1, |x_2| ≤ 1, x_3 > 1}, which is globally exponentially attractive. Since

u_1(t) < -1 + a_11 - |a_12| - a_13,
u_2(t) < 1 - a_22 + a_21 - a_23,  u_2(t) > a_22 - 1 + a_21 - a_23,
u_3(t) > 1 + a_31 + |a_32| - a_33,

conditions (4.1) to (4.4) also hold. According to theorem 2, this CNN has a limit cycle located in D_1, which is also locally exponentially attractive. However, since a_11 > 0 and a_33 > 0, the conditions in Yi et al. (2003) cannot be used to ascertain the complete stability of this CNN. Simulation results with 136 random initial states are depicted in Figures 6 and 7.

Figure 7: Transient behavior of (x_1, x_2, x_3) in Example 3.

7 Concluding Remarks

Rhythmicity represents one of the most striking manifestations of dynamic behavior in biological systems. CNNs and DCNNs, which have been shown to be capable of operating in a pacemaker or pattern-generator mode, are studied here as oscillatory mechanisms in response to periodic external stimuli. Some information can be encoded in the oscillating activation states relative to the external inputs, and these relative phases change as a function of the chosen limit cycle. In this article, we show that the number of locally exponentially attractive periodic orbits located in saturation regions of a DCNN is exponential in the number of neurons. In view of the fact that neural information is often desired to be encoded in a designated region, we also give conditions that allow a globally exponentially attractive periodic orbit to be located in any designated region. The theoretical results are supplemented by simulation results in three illustrative examples.

Acknowledgments

This work was supported by the Hong Kong Research Grants Council under grant CUHK165/3E, the Natural Science Foundation of China

under grant 65, and the China Postdoctoral Science Foundation under grant 35579.

References

Belair, J., Campbell, S., & van den Driessche, P. (1996). Frustration, stability and delay-induced oscillation in a neural network model. SIAM J. Applied Math., 56, 245-255.

Berns, D. W., Moiola, J. L., & Chen, G. (1998). Predicting period-doubling bifurcations and multiple oscillations in nonlinear time-delayed feedback systems. IEEE Trans. Circuits Syst. I, 45(7), 759-763.

Chen, K., Wang, D. L., & Liu, X. (2000). Weight adaptation and oscillatory correlation for image segmentation. IEEE Trans. Neural Networks, 11, 1106-1123.

Chua, L. O., & Roska, T. (1990). Stability of a class of nonreciprocal cellular neural networks. IEEE Trans. Circuits Syst., 37, 1520-1527.

Chua, L. O., & Roska, T. (1992). Cellular neural networks with nonlinear and delay-type template elements and non-uniform grids. Int. J. Circuit Theory and Applicat., 20, 9-51.

Civalleri, P. P., Gilli, M., & Pandolfi, L. (1993). On stability of cellular neural networks with delay. IEEE Trans. Circuits Syst. I, 40, 157-16.

Gopalsamy, K., & Leung, I. (1996). Delay induced periodicity in a neural netlet of excitation and inhibition. Physica D, 89, 395-426.

Jiang, H., & Teng, Z. (2004). Boundedness and stability for nonautonomous bidirectional associative neural networks with delay. IEEE Trans. Circuits Syst. II, 51(4), 174-180.

Jin, H. L., & Zacksenhouse, M. (2003). Oscillatory neural networks for robotic yo-yo control. IEEE Trans. Neural Networks, 14(2), 317-325.

Kanamaru, T., & Sekine, M. (2004). An analysis of globally connected active rotators with excitatory and inhibitory connections having different time constants using the nonlinear Fokker-Planck equations. IEEE Trans. Neural Networks, 15(5), 1009-1017.

Kosaku, Y. (1978). Functional analysis. Berlin: Springer-Verlag.

Liao, X. X., & Wang, J. (2003). Algebraic criteria for global exponential stability of cellular neural networks with multiple time delays. IEEE Trans. Circuits and Syst. I, 50(2), 268-275.

Liao, X., Wu, Z., & Yu, J. (1999). Stability switches and bifurcation analysis of a neural network with continuous delay. IEEE Trans. Systems, Man and Cybernetics, Part A, 29(6), 692-696.

Liu, Z., Chen, A., Cao, J., & Huang, L. (2003). Existence and global exponential stability of periodic solution for BAM neural networks with periodic coefficients and time-varying delays. IEEE Trans. Circuits Syst. I, 50(9), 1162-1173.

Nishikawa, T., Lai, Y. C., & Hoppensteadt, F. C. (2004). Capacity of oscillatory associative-memory networks with error-free retrieval. Physical Review Letters, 92(10), 108101.

Roska, T., Wu, C. W., Balsi, M., & Chua, L. O. (1992). Stability and dynamics of delay-type general and cellular neural networks. IEEE Trans. Circuits Syst. I, 39, 487-490.

Roska, T., Wu, C. W., & Chua, L. O. (1993). Stability of cellular neural networks with dominant nonlinear and delay-type templates. IEEE Trans. Circuits Syst. I, 40, 270-272.

Ruiz, A., Owens, D. H., & Townley, S. (1998). Existence, learning, and replication of periodic motion in recurrent neural networks. IEEE Trans. Neural Networks, 9(4), 651-661.

Setti, G., Thiran, P., & Serpico, C. (1998). An approach to information propagation in 1-D cellular neural networks, part II: Global propagation. IEEE Trans. Circuits and Syst. I, 45(8), 790-811.

Takahashi, N. (2000). A new sufficient condition for complete stability of cellular neural networks with delay. IEEE Trans. Circuits Syst. I, 47, 793-799.

Townley, S., Ilchmann, A., Weiss, M. G., McClements, W., Ruiz, A. C., Owens, D. H., & Pratzel-Wolters, D. (2000). Existence and learning of oscillations in recurrent neural networks. IEEE Trans. Neural Networks, 11, 205-214.

Wang, D. L. (1995). Emergent synchrony in locally coupled neural oscillators. IEEE Trans. Neural Networks, 6, 941-948.

Wang, L., & Zou, X. (2004). Capacity of stable periodic solutions in discrete-time bidirectional associative memory neural networks. IEEE Trans. Circuits Syst. II, 51(6), 315-319.

Yi, Z., Heng, P. A., & Vadakkepat, P. (2002). Absolute periodicity and absolute stability of delayed neural networks. IEEE Trans. Circuits Syst. I, 49(2), 256-261.

Yi, Z., Tan, K. K., & Lee, T. H. (2003). Multistability analysis for recurrent neural networks with unsaturating piecewise linear transfer functions. Neural Computation, 15(3), 639-662.

Zeng, Z., Wang, J., & Liao, X. X. (2003). Global exponential stability of a general class of recurrent neural networks with time-varying delays. IEEE Trans. Circuits and Syst. I, 50(10), 1353-1358.

Zeng, Z., Wang, J., & Liao, X. X. (2004). Stability analysis of delayed cellular neural networks described using cloning templates. IEEE Trans. Circuits and Syst. I, 51(11), 2313-2324.

Received November 3, 2004; accepted June 8, 2005.