Storage capacity of hierarchically coupled associative memories

Rogério M. Gomes, CEFET/MG - LSI, Av. Amazonas, 7675, Belo Horizonte, MG, Brasil, rogerio@lsi.cefetmg.br
Antônio P. Braga, PPGEE-UFMG - LITC, Av. Antônio Carlos, 6627, Belo Horizonte, MG, Brasil, apbraga@cpdee.ufmg.br
Henrique E. Borges, CEFET/MG - LSI, Av. Amazonas, 7675, Belo Horizonte, MG, Brasil, henrique@lsi.cefetmg.br

Abstract

This paper, taking as inspiration the ideas proposed by the TNGS (Theory of Neuronal Group Selection), presents a study of the convergence capacity of two-level associative memories based on coupled Generalized-Brain-State-in-a-Box (GBSB) neural networks. In this model, the memory processes are described as being organized functionally in hierarchical levels, where the higher levels coordinate sets of functions of the lower levels. Simulations were carried out to illustrate the behaviour of the capacity of the system for a wide range of the system parameters, considering linearly independent (LI) and orthogonal vectors. The results obtained show the relations amongst convergence, intensity and density of coupling.

1. Introduction

The brain-state-in-a-box (BSB) neural model was proposed by Anderson and collaborators in 1977 [2] and may be viewed as a version of Hopfield's model [7] with continuous and synchronous updating of the neurons. Hui and Zak [8] extended the BSB neural model with the inclusion of a bias field. Their model is referred to as the generalized-brain-state-in-a-box (GBSB) neural network model. The BSB and GBSB models can be used in the implementation of associative memories, where each stored prototype pattern, i.e., a memory, is an asymptotically stable equilibrium point. Thus, when the system is initialized in a pattern close enough to a stored pattern, such that it lies within the basin of attraction of the memorized pattern, the state of the system will evolve in time towards that memorized pattern. The design of artificial neural network associative memories has been dealt with over the last two decades, and several methods have been proposed, e.g., in [7], [12], [13], [9], [11]. Despite the fact that associative memories have been intensively studied, they have been analyzed only as single systems and have not been designed as parts of a hierarchical or coupled system. Therefore, taking as inspiration the Theory of Neuronal Group Selection (TNGS) proposed by Edelman [4], [3], a multi-level associative memory based on coupled GBSB neural networks was proposed and analyzed in [6], [5]. The TNGS establishes that the synapses of localized neural cells in the cortical area of the brain generate cluster units denoted as: neuronal groups (clusters of neural cells), local maps (reentrant clusters of neuronal groups) and global maps (reentrant clusters of local maps). In accordance with this theory, a neuronal group is the most basic unit in the cortical area of the brain, where memory processes arise. It is formed not by a single neuron, but by a cluster of neural cells. Each of these clusters (neuronal groups) is a set of localized, tightly coupled neurons, firing and oscillating in synchrony, thus forming the building blocks of memory. These neuronal groups are our first-level memories. Some neurons located in a cluster, however, have synaptic connections with neurons belonging to other clusters, generating a second-level physical structure denoted in the TNGS as a Local Map.
Each of these connection arrangements amongst clusters within a given Local Map results in a certain inter-cluster activity, yielding a second-level memory. This process of grouping and connecting smaller structures to generate a larger one, through synaptic interconnections between neurons of different neuronal groups, can be repeated recursively. Consequently, new hierarchical levels of memories would emerge through selected correlations of the lower-level memories [4]. In this paper, the generalized-brain-state-in-a-box (GBSB) neural model is used to create a first-level associative memory in a two-level system. The paper is organized as follows. In Section 2 we present the model of hierarchically coupled GBSB neural networks and show how multi-level memories may emerge from it. Section 3 presents an analysis of the storage capacity of hierarchically coupled associative memories. Section 4 illustrates the analysis with experiments, showing the probability of convergence of the system into global patterns, taking into consideration orthogonal and linearly independent vectors. Finally, Section 5 concludes the paper and presents some relevant extensions of this work.

2. Two-level memories

The GBSB (Generalized-Brain-State-in-a-Box) model [8] is described by:

x^{k+1} = \varphi\big( (I_n + \beta W)\, x^k + \beta f \big)   (1)

where I_n is the n x n identity matrix, \beta > 0 is a step size, W \in R^{n \times n} is the weight matrix, which need not be symmetric, f \in R^n is the bias field, which allows better control over the extension of the basins of attraction of the fixed points of the system, and \varphi is a linear saturating function [8].

In our two-level memories, each GBSB neural network plays the role of a first-level memory, or a neuronal group (TNGS). In order to build a second-level memory we couple any number of GBSB networks through bidirectional synapses. These new structures play the part of a second-level memory, in which global patterns can emerge as a selected coupling of the first-level stored patterns (Local Maps in the TNGS). Fig. 1 illustrates a two-level hierarchical memory built via the coupled GBSB model, where each of the networks A, B and C is a GBSB network. In a given network, each neuron has synaptic connections with every other neuron, i.e., the GBSB is a fully connected, non-symmetric neural network. Beyond this, some selected neurons in a network are bidirectionally connected with some selected neurons in other networks [14]. These inter-network connections can be represented by an inter-network weight matrix W_cor, which accounts for the contribution of one network to another due to the coupling. An analogous procedure could be carried out in order to establish higher levels in the hierarchy [4], [14].

Figure 1. Coupled neural network design (second-level memories emerging from the inter-network connections W_cor between the first-level GBSB networks A, B and C).

In order to account for the effect on a given GBSB network due to the coupling with the remaining GBSB networks, one should extend (1) by adding to it a term which represents the inter-network coupling. Consequently, our multi-level associative memory model can be defined by [6]:

x_a^{k+1} = \varphi\Big( (I_n + \beta_a W_a)\, x_a^k + \beta_a f_a + \gamma \sum_{b=1,\, b \neq a}^{N_r} W_{cor}\, x_b^k \Big)   (2)

where x_a^k is the state vector of the a-th network at time k, \beta_a > 0 is the step size, f_a is the bias field of the a-th network, W_a is the synaptic weight matrix of the a-th network, N_r is the number of networks, W_cor is the inter-network weight matrix, \gamma is the intensity of coupling of the synapses between the a-th network and the b-th network, and x_b^k is the state vector of the b-th network at time k. To sum up, the first three terms account for the uncoupled GBSB network, whilst the fourth term of (2) represents the inter-group connections.
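To make the dynamics in (1)-(2) concrete, the following minimal NumPy sketch performs one synchronous update of a set of coupled GBSB networks. It is an illustrative reading of the equations, not the authors' original code; the array shapes, the dictionary layout of W_cor and the piecewise-linear saturation are assumptions consistent with the definitions above.

import numpy as np

def phi(v):
    # Linear saturating activation: identity inside [-1, 1], clipped outside.
    return np.clip(v, -1.0, 1.0)

def coupled_gbsb_step(x, W, f, beta, gamma, W_cor):
    """One synchronous iteration of equation (2).

    x     : list of N_r state vectors, x[a] has shape (n,)
    W     : list of N_r intra-network weight matrices, shape (n, n)
    f     : list of N_r bias vectors, shape (n,)
    beta  : list of N_r step sizes
    gamma : coupling intensity
    W_cor : dict mapping (a, b) to the inter-network matrix from network b to a
    """
    n_r = len(x)
    x_new = []
    for a in range(n_r):
        # Uncoupled GBSB terms of equations (1)/(2): x + beta*(W x + f).
        u = x[a] + beta[a] * (W[a] @ x[a] + f[a])
        # Inter-network coupling term of equation (2).
        for b in range(n_r):
            if b != a:
                u = u + gamma * (W_cor[(a, b)] @ x[b])
        x_new.append(phi(u))
    return x_new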
3. Storage capacity and stability analysis of the coupled model

In our coupled model, the first-level memories must be stored as asymptotically stable equilibrium points; moreover, it must be guaranteed that some of these stored patterns in each network form specific combinations, or globally stable emergent patterns, yielding a second-level memory. The weight matrix of each individual network was carefully designed following the algorithm proposed in [15]. This algorithm ensures that the negatives of the desired patterns are not automatically stored as asymptotically stable equilibrium points of the network, besides minimizing the number of spurious states. The weight matrix W_a proposed by Zak and collaborators [15] is given by:

W_a = (D_a V_a - B_a)\, V_a^{\dagger} + \Lambda_a (I_n - V_a V_a^{\dagger})   (3)

where D_a \in R^{n \times n} is a strongly row diagonally dominant matrix, V_a = [v^1, v^2, \ldots, v^r] \in \{-1, 1\}^{n \times r} is the matrix of stored patterns, B_a = [b, b, \ldots, b] \in R^{n \times r} is the bias field matrix consisting of the column vector b repeated r times, V_a^{\dagger} is the pseudo-inverse of the matrix of stored patterns, I_n is the n x n identity matrix, and \Lambda_a \in R^{n \times n} is a matrix satisfying:

\lambda_{(i,a)(i,a)} < -\sum_{j=1,\, j \neq i}^{N_n} \big|\lambda_{(i,a)(j,a)}\big| - \big|b_{(i,a)}\big|   (4)

In order to measure the storage capacity of the system, our two-level coupled network is initialized at time k = 0: one of the networks, chosen at random, is initialized in one of the first-level memories that compose a second-level memory, while the other networks are initialized, also randomly, in one of the possible patterns. The storage capacity is then investigated in three analyses:

1. The storage capacity of the networks which are initialized in one of the first-level memories that compose a second-level memory;
2. The storage capacity of the networks which are initialized in one of their first-level memories, but which do not compose a second-level memory;
3. The storage capacity of the networks which are initialized in one of the possible patterns which do not belong to a first-level memory.
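Before turning to the analyses, the sketch below assembles the intra-network weight matrix of equation (3) from a set of stored bipolar patterns, using the Moore-Penrose pseudo-inverse. The particular choices of D_a and \Lambda_a (a diagonal, strongly dominant D_a and a diagonal \Lambda_a meant to satisfy condition (4)) are simple assumptions for demonstration only, not the authors' exact parameters.

import numpy as np

def design_gbsb_weights(V, b, d_diag=2.0, lam_diag=-2.0):
    """Weight matrix of equation (3): W = (D V - B) V^+ + Lambda (I - V V^+).

    V        : (n, r) matrix whose columns are the stored bipolar patterns.
    b        : (n,) bias vector; B repeats b in every column.
    d_diag   : assumed diagonal of D (strongly row diagonally dominant choice).
    lam_diag : assumed negative diagonal of Lambda, chosen so that (4) holds.
    """
    n, r = V.shape
    D = d_diag * np.eye(n)
    Lam = lam_diag * np.eye(n)
    B = np.tile(b.reshape(-1, 1), (1, r))
    V_pinv = np.linalg.pinv(V)      # pseudo-inverse of the pattern matrix
    W = (D @ V - B) @ V_pinv + Lam @ (np.eye(n) - V @ V_pinv)
    return W

# Usage sketch: 12 neurons, 6 random bipolar patterns, small constant bias.
rng = np.random.default_rng(0)
V = rng.choice([-1.0, 1.0], size=(12, 6))
b = 0.1 * np.ones(12)
W = design_gbsb_weights(V, b)
# With linearly independent patterns, W V = D V - B, as in equation (5) below.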

Analysis 1: The storage capacity of the networks which are initialized in one of the first-level memories that compose a second-level memory.

First of all, it will be assumed that V_a^{\dagger} V_a = I_r, and from [10] we find:

W_a V_a = (D_a V_a - B_a)\, V_a^{\dagger} V_a + \Lambda_a (I_n - V_a V_a^{\dagger})\, V_a = D_a V_a - B_a   (5)

so that, for each stored pattern, W_a v_a^z = D_a v_a^z - f_a. Now, since in this case the network is initialized in a global pattern, we must verify the conditions under which this pattern remains a stable equilibrium point. Therefore, replacing (5) into (2) and performing the operation L that represents one iteration of the GBSB algorithm results in:

(L(v_a^z))_i = \varphi\Big[ \big(v_a^z + \beta_a D_a v_a^z\big)_i + \gamma \sum_{b=1,\, b \neq a}^{N_r} \sum_{m=1}^{N_n} w_{cor(i,a)(m,b)}\, x_{(m,b)} \Big]
= \varphi\Big[ v^z_{(i,a)} + \beta_a \sum_{j=1}^{N_n} d_{(i,a)(j,a)}\, v^z_{(j,a)} + \gamma \sum_{b=1,\, b \neq a}^{N_r} \sum_{m=1}^{N_n} w_{cor(i,a)(m,b)}\, x_{(m,b)} \Big]   (6)

where v_a^z is the z-th stored pattern of the a-th network, N_r is the number of networks, N_n is the number of neurons of each individual network, and N_p is the number of patterns chosen to function as both first- and second-level memories. From the former equation, for simplification we define the terms

Desc = \beta_a \sum_{j=1}^{N_n} d_{(i,a)(j,a)}\, v^z_{(j,a)}, \qquad Corr = \gamma \sum_{b=1,\, b \neq a}^{N_r} \sum_{m=1}^{N_n} w_{cor(i,a)(m,b)}\, x_{(m,b)}.

Since D_a is strongly row diagonally dominant, Desc has the same sign as v^z_{(i,a)}; hence, to produce instability, Corr and Desc in equation (6) must have opposite signs and |Corr| must be greater than |Desc|. This can occur in the following situations: when v^z_{(i,a)} = -1 and (Corr + Desc) > 0, or when v^z_{(i,a)} = +1 and (Corr + Desc) < 0. Consequently, the probability of error of neuron v_{(i,a)} can be characterized as:

P^1_{erro} = P(v_{(i,a)} = -1)\, P\{(Corr + Desc) > 0\} + P(v_{(i,a)} = +1)\, P\{(Corr + Desc) < 0\}   (7)

Considering that the vectors v belong to the set of global patterns chosen at random, P(v^z_{(i,a)} = -1) = P(v^z_{(i,a)} = +1) = 1/2. Thus, equation (7) can be expressed as:

P^1_{erro} = \tfrac{1}{2} P\{(Corr + Desc) > 0\} + \tfrac{1}{2} P\{(Corr + Desc) < 0\}   (8)

Therefore, it is necessary to determine the probability distribution of (Corr + Desc), considering that the term Desc represents only a displacement. Regarding U as the number of components whose value is +1 among the terms summed in Corr, and D as the density of inter-network coupling (i.e., the fraction of the inter-network connections that are actually present), we assume:

Corr = \gamma D (N_r - 1)\, \big[ U - (N_n N_p (N_r - 1) - U) \big] = \gamma D (N_r - 1)\, \big[ 2U - N_n N_p (N_r - 1) \big]   (9)

Taking into account the fact that the stored vectors are chosen at random, P(U) can be defined by the following binomial distribution:

P(U) = \binom{N_n N_p (N_r - 1)}{U} \Big(\tfrac{1}{2}\Big)^{N_n N_p (N_r - 1)}   (10)

The binomial distribution defined in (10) can be approximated by a normal distribution with mean E[U] = N_n N_p (N_r - 1)/2 and variance \sigma^2(U) = N_n N_p (N_r - 1)/4. The mean and variance of the term Corr can then be obtained from equation (9), E[U] and \sigma^2(U):

E[Corr] = \gamma D (N_r - 1)\, E\big[ 2U - N_n N_p (N_r - 1) \big] = 0   (11)

\sigma^2_{Corr} = E\Big[ \gamma^2 D^2 (N_r - 1)^2 \big( 2U - N_n N_p (N_r - 1) \big)^2 \Big] = \gamma^2 D^2 (N_r - 1)^2 \Big( 4E[U^2] - 4 N_n N_p (N_r - 1) E[U] + \big( N_n N_p (N_r - 1) \big)^2 \Big)   (12)

Since E[U^2] = \sigma^2[U] + E^2[U] = N_n N_p (N_r - 1)/4 + (N_n N_p (N_r - 1))^2/4, letting K = N_n N_p (N_r - 1) we have:

\sigma^2_{Corr} = \gamma^2 D^2 (N_r - 1)^2 \big( K + K^2 - 2K^2 + K^2 \big) = \gamma^2 D^2 (N_r - 1)^2 K   (13)

\sigma^2_{Corr} = \gamma^2 D^2\, N_n N_p (N_r - 1)^3   (14)

Finally, it can be verified that Corr is a random variable distributed, approximately, according to a normal distribution with mean 0 and variance \sigma^2_{Corr}. Furthermore, a normal distribution is symmetric about its mean, so that P(Corr > 0) = P(Corr < 0). As a result, equation (8) can be rewritten in the form presented in (15), where the integral is taken over the normal probability density function with mean E[Corr] and variance \sigma^2_{Corr}, and the term Desc represents the modulus of the displacement:

P^1_{erro} = \int_{|Desc|}^{+\infty} \frac{1}{\sqrt{2\pi}\, \sigma_{Corr}}\, e^{-\frac{(u - E[Corr])^2}{2 \sigma^2_{Corr}}}\, du   (15)

Analysis 2: The storage capacity of the networks which are initialized in one of their first-level memories, but which do not compose a second-level memory.

This analysis follows the same procedure as Analysis 1. However, differently from the previous analysis, the system is expected to be unstable under this condition; in other words, the probability of error defined in (15) becomes a probability of correct behaviour. In this case, the probability of error is the complement of equation (15) and can be defined by:

P^2_{erro} = \int_{-\infty}^{|Desc|} \frac{1}{\sqrt{2\pi}\, \sigma_{Corr}}\, e^{-\frac{(u - E[Corr])^2}{2 \sigma^2_{Corr}}}\, du   (16)

Analysis 3: The storage capacity of the networks which are initialized in one of the possible patterns which do not belong to a first-level memory.

Lillo and collaborators [10] added the term \Lambda_a (I_n - V_a V_a^{\dagger}) to the right-hand side of (3), where (I_n - V_a V_a^{\dagger}) represents the orthogonal projection onto the null space of V_a^{\dagger}. As a result, for a vector y_a lying in that null space, the weight matrix of the individual network yields:

W_a y_a = (D_a V_a - B_a)\, V_a^{\dagger} y_a + \Lambda_a (I_n - V_a V_a^{\dagger})\, y_a = \Lambda_a (I_n - V_a V_a^{\dagger})\, y_a = \Lambda_a y_a   (17)

Then, by substituting (17) into equation (2) and carrying out the L transformation, which represents one iteration of the GBSB algorithm, one can verify under which conditions the network does not keep evolving towards the initialization vector, i.e., under which conditions the network is unstable for this vector, which was not stored and does not belong to a global pattern:

(L(y_a))_i = \varphi\Big[ \big( y_a + \beta_a (\Lambda_a y_a + b_a) \big)_i + \gamma \sum_{b=1,\, b \neq a}^{N_r} \sum_{m=1}^{N_n} w_{cor(i,a)(m,b)}\, x_{(m,b)} \Big]
= \varphi\Big[ y_{(i,a)} + \beta_a \Big( \sum_{j=1}^{N_n} \lambda_{(i,a)(j,a)}\, y_{(j,a)} + b_{(i,a)} \Big) + \gamma \sum_{b=1,\, b \neq a}^{N_r} \sum_{m=1}^{N_n} w_{cor(i,a)(m,b)}\, x_{(m,b)} \Big]   (18)

As defined in Analysis 1, we have:

Desc = \beta_a \Big( \sum_{j=1}^{N_n} \lambda_{(i,a)(j,a)}\, y_{(j,a)} + b_{(i,a)} \Big), \qquad Corr = \gamma \sum_{b=1,\, b \neq a}^{N_r} \sum_{m=1}^{N_n} w_{cor(i,a)(m,b)}\, x_{(m,b)}.
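The error probabilities in (15), (16) and (21) are tail areas of a zero-mean Gaussian, so they can be evaluated with the complementary error function. The small sketch below computes them from the variance given in (14); the numerical argument values in the example are arbitrary and only illustrate the call, they are not taken from the paper's experiments.

import math

def sigma_corr(gamma, D, N_n, N_p, N_r):
    # Standard deviation of Corr according to equation (14).
    return gamma * D * math.sqrt(N_n * N_p * (N_r - 1) ** 3)

def p_erro_tail(desc, sigma):
    # Equations (15)/(21): mass of a N(0, sigma^2) variable beyond |desc|.
    return 0.5 * math.erfc(abs(desc) / (sigma * math.sqrt(2.0)))

def p_erro_complement(desc, sigma):
    # Equation (16): the complementary event of (15).
    return 1.0 - p_erro_tail(desc, sigma)

# Illustrative call with assumed parameter values.
sigma = sigma_corr(gamma=0.2, D=0.6, N_n=12, N_p=2, N_r=3)
print(p_erro_tail(desc=1.5, sigma=sigma), p_erro_complement(desc=1.5, sigma=sigma))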

Given that, due to condition (4), Desc has the opposite sign to y_{(i,a)}, for stability to occur, which is not desirable for patterns that have not been stored, it is necessary that Corr and Desc in (18) have opposite signs and that |Corr| be greater than |Desc|. This can occur in the following situations: when y_{(i,a)} = -1 and (Corr + Desc) < 0, or when y_{(i,a)} = +1 and (Corr + Desc) > 0. This way, the probability of error (undesired stability) of neuron y_{(i,a)} can be described generically by:

P^3_{erro} = P(y_{(i,a)} = -1)\, P\{(Corr + Desc) < 0\} + P(y_{(i,a)} = +1)\, P\{(Corr + Desc) > 0\}   (19)

Considering that the vectors y are chosen at random, P(y_{(i,a)} = -1) = P(y_{(i,a)} = +1) = 1/2. Thus, equation (19) can be expressed as:

P^3_{erro} = \tfrac{1}{2} P\{(Corr + Desc) < 0\} + \tfrac{1}{2} P\{(Corr + Desc) > 0\}   (20)

Therefore, it is necessary to determine the probability distribution of (Corr + Desc), considering that the term Desc represents only a displacement. The term Corr can again be expressed by (9), obtained in Analysis 1. At last, repeating the procedure developed in equations (10) to (14), we find that Corr is a random variable distributed according to a normal distribution with mean 0 and variance \sigma^2_{Corr}. Furthermore, a normal distribution is symmetric about its mean, so that P(Corr > 0) = P(Corr < 0). As a result, equation (20) can be rewritten in the form presented in (21), where the integral is taken over the normal probability density function with mean E[Corr] and variance \sigma^2_{Corr}, and the term Desc represents the modulus of the displacement:

P^3_{erro} = \int_{|Desc|}^{+\infty} \frac{1}{\sqrt{2\pi}\, \sigma_{Corr}}\, e^{-\frac{(u - E[Corr])^2}{2 \sigma^2_{Corr}}}\, du   (21)

To sum up, the total probability of convergence P_conver of the coupled system can be defined as the product of the complements of the probabilities of error obtained in the previous analyses:

P_{conver} = (1 - P^1_{erro})\, (1 - P^2_{erro})\, (1 - P^3_{erro})

4. Simulation results

The storage capacity of the system was measured by taking into consideration three GBSB networks connected as shown in Fig. 1. In our simulations each network contains 12 neurons, and 6 out of the 4096 (2^12) possible patterns are selected to be stored as our first-level memories. The weight matrix of the individual networks was carefully designed following the algorithm proposed in [10]. The set of 6 patterns stored as first-level memories was chosen randomly, considering LI or orthogonal vectors. In addition, we have chosen randomly, amongst the 6^3 = 216 possible combinations of the 3 sets of first-level memories, the combinations that act as our second-level memories. The selected patterns extracted from the first-level memories to form a global pattern determine the inter-network weight matrix W_cor(a,b) through a generalized Hebb rule, or outer product method. Network A was initialized at time k = 0 in one of the two possible first-level memories which compose a second-level memory. Network B was initialized in one of the other 5 patterns which were stored as first-level memories but which do not compose second-level memories. Network C, on the other hand, was initialized randomly in one of the remaining patterns (4090) which do not belong to a first-level memory. Then, we measured the probability of convergence of the coupled system considering a density of coupling amongst the inter-network neurons of 20%, 40%, 60% and 100%. The neurons that took part in the inter-network connections were chosen randomly.
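The inter-network matrices and the convergence measurement described above can be sketched as follows. The outer-product construction of W_cor(a,b) and the Monte Carlo loop are written from the textual description only; step_fn is expected to wrap the coupled update sketched after Section 2, init_fn and is_global_fn are hypothetical helpers, and the number of iterations per trial is an assumption.

import numpy as np

def hebbian_wcor(global_patterns_a, global_patterns_b, density, rng):
    """Outer-product inter-network matrix from network b to network a.

    global_patterns_* : (N_p, n) arrays holding the first-level patterns that
                        compose the second-level (global) memories.
    density           : fraction of inter-network connections actually kept.
    """
    W = global_patterns_a.T @ global_patterns_b     # generalized Hebb rule
    mask = rng.random(W.shape) < density            # random sparse coupling
    return W * mask

def estimate_convergence(step_fn, init_fn, is_global_fn, trials=1000, iters=50):
    """Fraction of trials in which the coupled system reaches a global pattern."""
    hits = 0
    for _ in range(trials):
        x = init_fn()                 # random initialization as in Section 4
        for _ in range(iters):        # iterate the coupled GBSB dynamics
            x = step_fn(x)
        hits += int(is_global_fn(x))  # converged to a second-level memory?
    return hits / trials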
Each point in our experiments was averaged over 1000 trials for given values of γ (intensity of coupling) and β (intra-network step size). The results for LI and orthogonal vectors can be seen in Figs. 2 and 3, which show that even when only 20% of the inter-network neurons were connected our model presented a relevant probability of convergence. In addition, the differences between orthogonal and LI vectors were close to 10%. We have also analyzed the relation between intensity (γ) and density (D) of coupling, and observed that when the value of D decreases, an increase in γ is necessary in order to sustain the probability of convergence.

Figure 2. Probability of convergence as a function of γ for a density of coupling amongst the inter-network neurons of 20%, 40%, 60% and 100% (LI vectors).

Figure 3. Probability of convergence as a function of γ for a density of coupling amongst the inter-network neurons of 20%, 40%, 60% and 100% (orthogonal vectors).

5. Conclusions

In this paper, we have presented a proposal for the evaluation of the capacity of a model of multi-level associative memories based on the TNGS [4], implemented through artificial neural networks. We derived a set of equations that evaluate the probability of convergence of these coupled systems. Simulations were carried out on a two-level memory system, and the relations between convergence, intensity and density of coupling were shown considering linearly independent and orthogonal vectors.

The storage capacity proved to be significant for both LI and orthogonal vectors, and it could also be noted that the probability of convergence achieved with orthogonal vectors exceeded that of LI vectors by about 10%, showing that it is possible to build multi-level memories where higher levels coordinate sets of memories of the lower levels. This work is currently being generalized in order to compare the capacity and convergence of multi-level memories with two-level memories which present the same number of first-level memories.

References

[1] I. Aleksander. What is thought? Nature, 429(6993):701-702, 2004.
[2] J. A. Anderson, J. W. Silverstein, S. A. Ritz, and R. S. Jones. Distinctive features, categorical perception, and probability learning: some applications of a neural model. MIT Press, Cambridge, Massachusetts, 1977.
[3] W. J. Clancey. Situated Cognition: On Human Knowledge and Computer Representations. Learning in Doing. Cambridge University Press, Cambridge, U.K.; New York, NY, USA, 1997.
[4] G. M. Edelman. Neural Darwinism: The Theory of Neuronal Group Selection. Basic Books, New York, 1987.
[5] R. M. Gomes, A. P. Braga, and H. E. Borges. Energy analysis of hierarchically coupled generalized-brain-state-in-a-box (GBSB) neural networks. In Proceedings of the V Encontro Nacional de Inteligência Artificial (ENIA 2005), São Leopoldo, Brazil, July 2005.
[6] R. M. Gomes, A. P. Braga, and H. E. Borges. A model for hierarchical associative memories via dynamically coupled GBSB neural networks. In Proceedings of the International Conference on Artificial Neural Networks (ICANN 2005), Warsaw, Poland, September 2005. Springer-Verlag (to be published).
[7] J. J. Hopfield. Neurons with graded response have collective computational properties like those of two-state neurons. Proceedings of the National Academy of Sciences U.S.A., 81, May 1984.
[8] S. Hui and S. H. Zak. Dynamical analysis of the brain-state-in-a-box (BSB) neural models. IEEE Transactions on Neural Networks, 3(5):86-94, 1992.
[9] J. Li, A. N. Michel, and W. Porod. Analysis and synthesis of a class of neural networks: variable structure systems with infinite gains. IEEE Transactions on Circuits and Systems, 36, May 1989.
[10] W. E. Lillo, D. C. Miller, S. Hui, and S. H. Zak. Synthesis of brain-state-in-a-box (BSB) based associative memories. IEEE Transactions on Neural Networks, 5(5), September 1994.
[11] A. N. Michel, J. A. Farrell, and W. Porod. Qualitative analysis of neural networks. IEEE Transactions on Circuits and Systems, 36:229-243, 1989.
[12] L. Personnaz, I. Guyon, and G. Dreyfus. Information storage and retrieval in spin-glass-like neural networks. Journal de Physique Lettres (Paris), 46, 1985.
[13] L. Personnaz, I. Guyon, and G. Dreyfus. Collective computational properties of neural networks: new learning mechanisms. Physical Review A, 34:4217-4228, 1986.
[14] J. P. Sutton, J. S. Beis, and L. E. H. Trainor. A hierarchical model of neocortical synaptic organization. Mathematical and Computer Modelling, 11:346-350, 1988.
[15] S. H. Zak, W. E. Lillo, and S. Hui. Learning and forgetting in generalized brain-state-in-a-box (BSB) neural associative memories. Neural Networks, 9(5):845-854, 1996.
