Taejon, Korea, vol. 2 of 2, pp. 779–784, Nov. 2000.

Capacity Analysis of Bidirectional Associative Memory

Toshiyuki Tanaka†, Shinsuke Kakiya†, and Yoshiyuki Kabashima‡
†Graduate School of Engineering, Tokyo Metropolitan University, Hachioji, Tokyo, Japan
‡Interdisciplinary Graduate School of Science and Engineering, Tokyo Institute of Technology, Yokohama, Japan

Abstract

Macroscopic properties of the bidirectional associative memory (BAM) are studied in the framework of statistical physics. The relative capacity, the relative number of pattern pairs that can be memorized and retrieved by the BAM when a finite fraction of retrieval error is allowed, is evaluated using the replica method. The capacity obtained for a system with N units in each of the two layers is 0.1998N, which can be regarded as the BAM counterpart of the result 0.138N for the autocorrelation associative memory, or the Hopfield model, evaluated by Amit, Gutfreund, and Sompolinsky.

1 Introduction

The bidirectional associative memory (BAM) [1] is a variation of associative memory neural networks. The principal function of associative memories is to store and retrieve multiple patterns in a distributed manner. The storage capacity, which represents how many patterns can be stored in a network, is one of the fundamental measures for evaluating the ability of such a network. Two distinct definitions of the storage capacity appear in the literature. One is the so-called absolute capacity, the upper limit on the number of memorized patterns when perfect recall is required, i.e., no retrieval error is allowed. The other is the so-called relative capacity, for which a finite retrieval error rate is allowed. The objective of this paper is to evaluate the relative capacity of BAM. The autocorrelation associative memory (AAM), sometimes termed the Hopfield model, is a basic model of associative memory neural networks, and its storage capacity has been investigated extensively.
The absolute capacity of AAM is known to scale as O(N/log N), where N is the size of the model [2, 3, 4]. There have been two approaches to the relative capacity of AAM. One comes from statistical physics [5] and evaluates the capacity as 0.138N, while a different value, 0.16N, is given by the other approach, which uses statistical neurodynamics [4]. (Refinements have been made to each of these approaches: see [6, 7, 8] for the statistical physics approach and [9] for the statistical neurodynamics approach.)

BAM can be regarded as a kind of AAM from which connections have been systematically removed [1]. This observation suggests that methods for analyzing the properties of AAM can also be applied to BAM. In fact, the absolute capacity of BAM has been analyzed along this line, and the same order of capacity, O(N/log N), has been reported by Haines and Hecht-Nielsen [11]. As for the relative capacity, however, few results have been reported in the literature, to our knowledge. Amari [12] derived macroscopic time-evolution equations for BAM based on statistical neurodynamics, but he did not discuss the relative capacity issue. Yanai et al. [10] mentioned a value of the relative capacity of BAM (0.22N) evaluated by following Amari's argument. Leung et al. [13] evaluated a lower bound on the relative capacity of BAM based on a large-deviation-type argument. As far as we know, this is the first study on the relative capacity of BAM based on the statistical physics approach.

2 Model

In order to apply the statistical physics approach to the capacity analysis of BAM, we consider a version of BAM in which each unit is updated stochastically and asynchronously, just as in the case of AAM. BAM consists of two layers of neural units (see Figure 1). Let cN and c̃N be the numbers of units in these layers. The state of each unit is a binary value, and the state of BAM is given by the pair {s, s̃}, where s ∈ {−1, 1}^{cN} and s̃ ∈ {−1, 1}^{c̃N} are the states of the two layers.
Let w_ij be the connection weight between unit i in the first layer and unit j in the second layer. Unit i in the first layer changes its state s_i based on the input h_i fed into it, according to the following stochastic state-updating rule:

  Prob[s_i := 1] = e^{βh_i} / (e^{βh_i} + e^{−βh_i}),  (1)

[Figure 1: Structure of BAM. Two layers of units, s_1, s_2, s_3, … and s̃_1, s̃_2, s̃_3, …, connected by the weight matrix W = (w_ij).]

where β denotes the "inverse temperature," which controls the degree of stochasticity of the updating. The limit β → ∞ corresponds to deterministic updating of states. The input h_i is the weighted sum of the states of the second layer, i.e.,

  h_i = Σ_{j=1}^{c̃N} w_ij s̃_j.  (2)

Similar rules are applied to unit j in the second layer as well, i.e.,

  Prob[s̃_j := 1] = e^{βh̃_j} / (e^{βh̃_j} + e^{−βh̃_j}),  (3)

and

  h̃_j = Σ_{i=1}^{cN} w_ij s_i.  (4)

We assume that each unit is updated asynchronously with the others, although it seems common in most previous studies to assume synchronous updating over each layer, alternating between the layers. The system described so far has the following equilibrium distribution of the state:

  p(s, s̃) = Z^{−1} e^{−βH(s, s̃)},  (5)

where the Hamiltonian H(s, s̃) of the system is defined as

  H(s, s̃) = −Σ_{i,j} w_ij s_i s̃_j,  (6)

and Z is the partition function normalizing p(s, s̃).

We consider a situation in which p pattern pairs {(ξ^μ, ξ̃^μ)}, μ = 1, …, p, are memorized in BAM. Here, ξ^μ ∈ {−1, 1}^{cN} and ξ̃^μ ∈ {−1, 1}^{c̃N} represent binary random patterns whose components ξ_i^μ and ξ̃_j^μ are realizations of independent and identically distributed (iid) binary variables following Prob[ξ_i^μ = 1] = Prob[ξ̃_j^μ = 1] = 1/2. Hereafter, we focus on Hebbian learning, so that the synaptic weight w_ij is given by

  w_ij = (1/N) Σ_{μ=1}^{p} ξ_i^μ ξ̃_j^μ.  (7)

Since we follow the statistical physics approach, we consider properties of the model in the "thermodynamic" limit N → ∞ while keeping c, c̃ = O(1). Just as for AAM, we can expect the relative capacity of BAM to be of the same order as N, so we consider the case where the memory rate α = p/N is of order O(1) in the limit N → ∞. The principal interest lies in whether or not there is an equilibrium state corresponding to the retrieval of a pattern pair.
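As a concrete illustration (ours, not part of the paper), the model defined by equations (1)–(7) can be simulated directly for c = c̃ = 1: build the Hebbian weights, start the network at a stored pair, and relax it with stochastic asynchronous updates at low temperature. A minimal Python sketch, with all function names illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(xi, xi_t, N):
    """Hebbian weights w_ij = (1/N) sum_mu xi_i^mu xit_j^mu, Eq. (7).
    xi: (p, N) layer-1 patterns; xi_t: (p, N) layer-2 patterns (c = c~ = 1)."""
    return xi.T @ xi_t / N

def hamiltonian(W, s, s_t):
    """H(s, s~) = -sum_{ij} w_ij s_i s~_j, Eq. (6)."""
    return -s @ W @ s_t

def update_unit(state, i, h, beta):
    """One stochastic update, Eq. (1), rewritten as Prob[state_i := 1] = 1/(1 + e^{-2 beta h})."""
    state[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * beta * h)) else -1

def sweep(s, s_t, W, beta):
    """One asynchronous sweep of updates over both layers, Eqs. (1)-(4)."""
    for i in range(len(s)):
        update_unit(s, i, W[i] @ s_t, beta)     # input h_i, Eq. (2)
    for j in range(len(s_t)):
        update_unit(s_t, j, s @ W[:, j], beta)  # input h~_j, Eq. (4)

# Demo: store p pattern pairs and relax from the first pair at low temperature.
N, p, beta = 64, 4, 20.0
xi = rng.choice([-1, 1], size=(p, N))
xi_t = rng.choice([-1, 1], size=(p, N))
W = hebbian_weights(xi, xi_t, N)
s, s_t = xi[0].copy(), xi_t[0].copy()
for _ in range(5):
    sweep(s, s_t, W, beta)
m = xi[0] @ s / N  # overlap with the nominated pattern, cf. Eq. (8)
```

At β = 20 the updates are nearly deterministic, so a stored pair well below capacity should remain essentially fixed, with overlap m close to 1; lowering β (raising the temperature) increases the retrieval error.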
We assume that only one pattern pair is being retrieved and, for the sake of simplicity, that the first pattern pair (ξ^1, ξ̃^1) is nominated for retrieval. The overlaps

  m = (1/(cN)) Σ_{i=1}^{cN} ξ_i^1 s_i,  m̃ = (1/(c̃N)) Σ_{j=1}^{c̃N} ξ̃_j^1 s̃_j  (8)

serve as macroscopic measures of how well the nominated pattern pair is retrieved. |m|, |m̃| ≤ 1 holds, and if s and s̃ are close to ξ^1 and ξ̃^1, then m and m̃ become close to 1, respectively. An equilibrium state with nonvanishing m and m̃ is the retrieval state of the model. We follow the statistical physics approach to explore the existence of the retrieval state.

3 Replica Analysis

We have followed the conventional replica approach to analyze the equilibrium states of BAM. For details of the approach, see, for example, [14]. We make the following two assumptions:

1. The macroscopic properties of the model are self-averaging with respect to the randomness of the patterns in the thermodynamic limit; i.e., we can analyze the properties of the model by evaluating the pattern-averaged free energy ⟨⟨log Z⟩⟩ instead of the (sample) free energy log Z, the latter of which is the relevant quantity in describing the properties of a single sample system.

2. The macroscopic quantities satisfy the replica-symmetric (RS) ansatz. In the replica approach, the pattern-averaged free energy ⟨⟨log Z⟩⟩ is evaluated in the thermodynamic limit by the saddle-point approximation, and the RS ansatz means that we place certain restrictions on the form of the saddle-point solutions; see [14] for details.
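The self-averaging assumption can be illustrated numerically (our illustration, not an experiment from the paper): at the stored state (ξ^1, ξ̃^1) with c = c̃ = 1, the aligned input h_i ξ_i^1 decomposes into a signal term equal to 1 plus crosstalk from the other p − 1 patterns, and its empirical statistics concentrate around the predicted values, mean 1 and standard deviation √α, as N grows at fixed α = p/N.

```python
import numpy as np

def aligned_input_stats(N, p, seed=0):
    """Mean and std over units of h_i * xi_i^1 at the stored state (c = c~ = 1).

    With Hebbian weights, Eq. (7), and the second layer clamped to xit^1,
    h_i = xi_i^1 + crosstalk, so self-averaging predicts
    mean -> 1 and std -> sqrt(alpha), where alpha = p/N."""
    rng = np.random.default_rng(seed)
    xi = rng.choice([-1, 1], size=(p, N))    # layer-1 patterns
    xi_t = rng.choice([-1, 1], size=(p, N))  # layer-2 patterns
    W = xi.T @ xi_t / N                      # Hebbian weights, Eq. (7)
    h = W @ xi_t[0]                          # inputs to layer 1, Eq. (2)
    aligned = h * xi[0]                      # signal (= 1) plus crosstalk noise
    return aligned.mean(), aligned.std()
```

Running this for increasing N at fixed α = 0.2 (e.g., N = 500, 2000, 8000) shows the sample standard deviation settling near √0.2 ≈ 0.447 with shrinking sample-to-sample scatter, which is the behavior assumption 1 relies on.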

The first assumption is clearly supported by experimental observation, in which one finds that fluctuations of macroscopic quantities, such as the overlaps, vanish as the system size N is increased. The second assumption, the RS ansatz, cannot be checked in such a direct way. For AAM, the validity of the RS ansatz has been established except in a region of very low temperature (large β). More rigorous treatments considering replica-symmetry breaking (RSB) [6, 7] have shown, however, that the results need only very small corrections even where the RS ansatz is not valid [14]. We can therefore expect the RS ansatz to be a reasonable assumption for BAM as well. Under these assumptions, we obtained the following result.

Proposition. In the thermodynamic limit, the overlaps m and m̃ of the model at thermal equilibrium at inverse temperature β satisfy the following saddle-point equations:

  m = ∫ Dz tanh β(√r z + c̃ m̃),
  m̃ = ∫ Dz tanh β(√r̃ z + c m),
  q = ∫ Dz tanh² β(√r z + c̃ m̃),
  q̃ = ∫ Dz tanh² β(√r̃ z + c m),
  r = α c̃ [q̃ + β² c c̃ q (1 − q̃)²] / [1 − β² c c̃ (1 − q)(1 − q̃)]²,
  r̃ = α c [q + β² c c̃ (1 − q)² q̃] / [1 − β² c c̃ (1 − q)(1 − q̃)]²,  (9)

where {q, q̃, r, r̃} are variables to be determined simultaneously with m and m̃ from these equations, and Dz ≡ (dz/√(2π)) e^{−z²/2} is the Gaussian measure. This is the main result of this paper.

The physical interpretations of the variables {m, m̃, q, q̃, r, r̃} are as follows: m and m̃ are the overlaps (8) with the nominated pattern pair; q and q̃ are the Edwards–Anderson order parameters of the two layers, q = (1/(cN)) Σ_{i=1}^{cN} ⟨s_i⟩² and q̃ = (1/(c̃N)) Σ_{j=1}^{c̃N} ⟨s̃_j⟩²; and r and r̃ measure the crosstalk noise from the non-nominated patterns μ > 1, expressed through the partial overlaps

  m^μ = (1/(cN)) Σ_{i=1}^{cN} ξ_i^μ s_i,  (10)
  m̃^μ = (1/(c̃N)) Σ_{j=1}^{c̃N} ξ̃_j^μ s̃_j.  (11)

Here ⟨·⟩ denotes the average over the stochasticity arising from the state-updating rule (equations (1) and (3)), while ⟨⟨·⟩⟩ represents the average over the memorized pattern pairs {ξ^μ, ξ̃^μ; μ = 1, …, p}.

[Figure 2: Phase diagram for BAM with c = c̃ = 1, in the (α, T) plane with α = p/N and T = 1/β. The curve α_c(T) is the relative capacity, with α_c(0) = 0.1998; T_sg marks the spin-glass transition.]

The resulting saddle-point equations (9) have a strong similarity to those obtained for AAM by Amit, Gutfreund, and Sompolinsky [5].
The latter are

  m = ∫ Dz tanh β(√r z + m),
  q = ∫ Dz tanh² β(√r z + m),
  r = α q / [1 − β(1 − q)]².  (12)

The main difference lies in r and r̃, which represent the variances of the noise components in the inputs h_i and h̃_j. This may affect the quantitative properties of the models, including the relative capacity.

4 Numerical Results and Discussion

One can solve the saddle-point equations (9) numerically with respect to the variables {m, m̃, q, q̃, r, r̃}. A solution with m m̃ ≠ 0 corresponds to the retrieval state. We investigated the conditions under which the retrieval state exists. Figure 2 shows the result for the case c = c̃ = 1. The curve α = α_c(T) represents the relative capacity, the maximum memory rate at which the retrieval state exists at temperature T = 1/β. The retrieval state exists in the lower-left, shaded region. At zero temperature

(β → +∞), the relative capacity of BAM with c = c̃ = 1 is evaluated as α_c(0) = 0.1998, i.e., a capacity of 0.1998N. We have also performed a series of numerical experiments on BAM at zero temperature with varying system size N, and have found, via finite-size-scaling analysis [8], that the relative capacity of BAM with c = c̃ = 1 is 0.2N. The analytically obtained value 0.1998N is in good agreement with, but slightly lower than, the experimentally obtained result 0.2N. This small discrepancy can be ascribed to failure of the RS ansatz, as discussed below. For AAM at zero temperature, one notices a similar discrepancy between the analytically obtained relative capacity (0.138N [5]) and the experimentally obtained one (0.14N [8]). It is known for AAM that the retrieval state at zero temperature does not satisfy the RS ansatz [5]. In order to investigate this possibility for the current system, we have checked the de Almeida–Thouless instability [15], which signals the first bifurcation from the RS solution to solutions without replica symmetry. Evaluating the eigenvalues of the Hessian of the replicated free energy around the RS solution, we found that only the so-called "replicon" modes can become unstable, as long as we focus on the attractor states of the iteration dynamics used to solve the saddle-point equations (9). In contrast with the case of AAM, the eigenvalues are not degenerate but split into two values for BAM, owing to the inter-layer interaction. From the stability condition for these modes, two critical temperatures are obtained for each value of α. However, only one of them is physically relevant, because the other is always negative. The relevant temperature corresponds to the conventional AT line, below which the RS solution becomes unstable. The obtained AT line is shown in Figure 3.
From this figure, one finds that the RS solution is unstable, and therefore the RS ansatz is no longer valid, at sufficiently low temperatures in the vicinity of the capacity α = p/N ≈ 0.1998. We can point out further evidence supporting the observed discrepancy, based on the similarity of the RS solutions of AAM and BAM. In Figures 2 and 3 one can observe that, at very low temperature, the capacity of BAM slightly increases as the temperature rises, i.e., dα_c(T)/dT > 0. This "re-entrant" phenomenon is also observed in AAM [6]. For AAM, it is known that the re-entrant phenomenon is an artifact arising from the RS ansatz, and it tends to disappear in analyses taking RSB into consideration, resulting in a slight increase of the relative capacity [6, 7]. We therefore expect that the situation should be the same for BAM, because of the formal similarity between BAM and AAM. This implies that the theoretical prediction of the relative capacity is slightly larger than 0.1998N, which is in excellent agreement with the experimental result.

[Figure 3: AT line for BAM with c = c̃ = 1, shown together with α_c(T) in the (α, T) plane.]

[Figure 4: Overlap m of the retrieval state for BAM with c = c̃ = 1 and T = 0, plotted against α.]

Another curve shown in Figure 2 is the transition temperature T = T_sg, at which the solution bifurcates from q = q̃ = 0 (above the curve; the so-called "paramagnetic" phase) to q, q̃ ≠ 0 (below; the "spin-glass" phase). In both of these phases m and m̃ remain zero, so the model fails to retrieve a pattern pair. The value of the overlap m (= m̃) in the solution at zero temperature with c = c̃ = 1 is shown in Figure 4. The upper, solid curve shows the stable solution corresponding to the retrieval state. The lower, dashed curve shows the unstable solution. At the critical capacity α = 0.1998, these two curves meet, signaling the bifurcation at which the retrieval state disappears. The overlap at the critical capacity is m(α = 0.1998) = 0.93.
This value is smaller than that for AAM, which is

[Figure 5: Zero-temperature capacity α_c and half the loading ratio ρ/2 of BAM with c̃ = 1/c, plotted against c.]

m(α = 0.138) = 0.967 [5]. This means that the error-correcting ability at the critical capacity is lower for BAM than for AAM.

The results described so far are for BAM with c = c̃ = 1, i.e., with the two layers having the same number of units. The relative capacity changes when we assume different numbers of units for the two layers of BAM. Figure 5 shows the relative capacity α = p/N of BAM with c̃ = 1/c at zero temperature, plotted against c. The condition c̃ = 1/c corresponds to fixing the total number of connections to N². The result shows that the relative capacity decreases as the numbers of units in the two layers become different. On the other hand, we can take the "loading ratio [11]" as an alternative measure of performance. The loading ratio is defined as the amount of information loaded per connection. In the case considered here, it is given by

  ρ = p(c + 1/c)N / N² = α(c + 1/c).  (13)

ρ/2 = α holds when c = 1 but, since c + 1/c ≥ 2, in general ρ/2 ≥ α. Figure 5 also shows half of the loading ratio, ρ/2, versus c. Measured by the loading ratio, moving c away from 1 yields a small improvement in memorizing ability.

Finally, we mention that the results change if we assume different statistics for the memorized patterns. For example, consider the case where c = c̃ = 1 and ξ_i^μ follows Prob[ξ_i^μ = 1] = 1/2 while ξ̃_i^μ = ξ_i^μ. In this case BAM functions as an autoassociative memory. Under the RS ansatz, the saddle-point equations for this case are

  m = ∫ Dz tanh β(√r z + m),
  q = ∫ Dz tanh² β(√r z + m),
  r = α q / {[1 − β(1 − q)]² − β²(1 − q)²},  (14)

which are different from those given in the Proposition, reflecting the different statistics assumed for the memorized patterns. It should be noted that these equations (14) are also different from the saddle-point equations (12) for AAM, although they coincide in the zero-temperature limit β → +∞.
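The numerical procedure used above can be sketched as a damped fixed-point iteration, computing the Gaussian averages ∫Dz(·) by Gauss–Hermite quadrature. The sketch below (ours, not the authors' code) iterates the three-variable AAM system m = ∫Dz tanh β(√r z + m), q = ∫Dz tanh² β(√r z + m), r = αq/[1 − β(1 − q)]²; the six-variable BAM system (9) is handled in exactly the same way.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

# Gauss-Hermite rule: integral of f(z) under Dz ~ sum_k w_k f(sqrt(2) x_k) / sqrt(pi)
_x, _w = hermgauss(60)
_z = np.sqrt(2.0) * _x
_wn = _w / np.sqrt(np.pi)

def gauss_avg(f):
    """Average of f(z) under the Gaussian measure Dz."""
    return float(np.sum(_wn * f(_z)))

def solve_aam(alpha, beta, n_iter=500, damping=0.5):
    """Damped fixed-point iteration of the AAM saddle-point equations.
    Returns (m, q, r); starting from m = q = 1 targets the retrieval branch."""
    m, q, r = 1.0, 1.0, alpha
    for _ in range(n_iter):
        sq = np.sqrt(max(r, 0.0))
        m_new = gauss_avg(lambda z: np.tanh(beta * (sq * z + m)))
        q_new = gauss_avg(lambda z: np.tanh(beta * (sq * z + m)) ** 2)
        r_new = alpha * q_new / (1.0 - beta * (1.0 - q_new)) ** 2
        m = (1.0 - damping) * m + damping * m_new
        q = (1.0 - damping) * q + damping * q_new
        r = (1.0 - damping) * r + damping * r_new
    return m, q, r
```

Well inside the retrieval region (e.g., α = 0.05, T = 0.1) the iteration converges to a solution with m close to 1; pushing α past the capacity makes the retrieval branch disappear and the iteration collapses away from m ≈ 1, which is how a boundary such as α_c(T) is traced numerically.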
5 Conclusions

In this study, replica analysis was performed on the equilibrium state of BAM, and the relative capacity was evaluated by numerically solving the resulting equations (9). The relative capacity of BAM with c = c̃ = 1 at zero temperature was evaluated to be 0.1998N, which is in excellent agreement with the experimentally obtained result, 0.2N. The small discrepancy can be ascribed to the failure of the RS ansatz at sufficiently low temperatures, as in the case of AAM. The capacity when c and c̃ deviate from 1 was also shown.

Acknowledgments

T. T. would like to thank Dr. H. Yanai of Ibaraki University, Japan, for his helpful comments.

References

[1] B. Kosko, "Bidirectional associative memories," IEEE Trans. Systems, Man, Cybern., vol. 18, pp. 49–60, 1988.
[2] G. Weisbuch and F. Fogelman-Soulié, "Scaling laws for the attractors of Hopfield networks," J. Physique Lett., vol. 46, no. 14, pp. L-623–L-630, 1985.
[3] R. J. McEliece, E. C. Posner, E. R. Rodemich, and S. S. Venkatesh, "The capacity of the Hopfield associative memory," IEEE Trans. Inform. Theory, vol. IT-33, no. 4, pp. 461–482, 1987.
[4] S. Amari and K. Maginu, "Statistical neurodynamics of associative memory," Neural Networks, vol. 1, pp. 63–73, 1988.
[5] D. J. Amit, H. Gutfreund, and H. Sompolinsky, "Storing infinite numbers of patterns in a spin-glass model of neural networks," Phys. Rev. Lett., vol. 55, pp. 1530–1533, 1985.
[6] A. Crisanti, D. J. Amit, and H. Gutfreund, "Saturation level of the Hopfield model for neural network," Europhys. Lett., vol. 2, no. 4, pp. 337–341, 1986.
[7] K. Tokita, "The replica-symmetry-breaking solution of the Hopfield model at zero temperature: critical storage capacity and frozen field distribution," J. Phys. A: Math. Gen., vol. 27, no. 13, pp. 4413–4424, 1994.

[8] T. Stiefvater, K.-R. Müller, and R. Kühn, "Averaging and finite-size analysis for disorder: the Hopfield model," Physica A, vol. 232, nos. 1–2, pp. 61–73, 1996.
[9] M. Okada, "A hierarchy of macrodynamical equations for associative memory," Neural Networks, vol. 8, no. 6, pp. 833–838, 1995.
[10] H.-F. Yanai, Y. Sawada, and S. Yoshizawa, "Dynamics of an auto-associative neural network model with arbitrary connectivity and noise in the threshold," Network, vol. 2, pp. 295–314, 1991.
[11] K. Haines and R. Hecht-Nielsen, "A BAM with increased information storage capacity," Proc. IEEE Conf. Neural Networks, San Diego, vol. 1, pp. I-181–I-190, 1988.
[12] S. Amari, "Statistical neurodynamics of various versions of correlation associative memory," Proc. IEEE Conf. Neural Networks, San Diego, vol. 1, pp. I-633–I-640, 1988.
[13] C. S. Leung, L. W. Chan, and J. Sum, "Attraction basin of bidirectional associative memories," Int. J. Neural Syst., vol. 7, no. 6, pp. 715–725, 1996.
[14] J. A. Hertz, A. S. Krogh, and R. G. Palmer, Introduction to the Theory of Neural Computation, Addison-Wesley, 1991.
[15] J. R. L. de Almeida and D. J. Thouless, "Stability of the Sherrington–Kirkpatrick solution of a spin glass model," J. Phys. A: Math. Gen., vol. 11, pp. 983–990, 1978.
[16] V. Dotsenko, An Introduction to the Theory of Spin Glasses and Neural Networks, World Scientific, 1994.


More information

3.3 Discrete Hopfield Net An iterative autoassociative net similar to the nets described in the previous sections has been developed by Hopfield

3.3 Discrete Hopfield Net An iterative autoassociative net similar to the nets described in the previous sections has been developed by Hopfield 3.3 Discrete Hopfield Net An iterative autoassociative net similar to the nets described in the previous sections has been developed by Hopfield (1982, 1984). - The net is a fully interconnected neural

More information

Hopfield Networks. (Excerpt from a Basic Course at IK 2008) Herbert Jaeger. Jacobs University Bremen

Hopfield Networks. (Excerpt from a Basic Course at IK 2008) Herbert Jaeger. Jacobs University Bremen Hopfield Networks (Excerpt from a Basic Course at IK 2008) Herbert Jaeger Jacobs University Bremen Building a model of associative memory should be simple enough... Our brain is a neural network Individual

More information

Neural networks that use three-state neurons

Neural networks that use three-state neurons J. Phys. A: Math. Gen. 22 (1989) 2265-2273. Printed in the UK Neural networks that use three-state neurons Jonathan S Yedidia Department of Physics, Jadwin Hall, Princeton University, Princeton, NJ 08544,

More information

Symmetry breaking in non-monotonic neural networks

Symmetry breaking in non-monotonic neural networks J. Phys. A Math. Gen. 26 (1993) L507-L513. Printed in'the UK LETTER TO THE EDITOR Symmetry breaking in non-monotonic neural networks G Boffettat, R Mmassonl: and R Zecchinas t Dip. di Pisica Generale e

More information

patterns (\attractors"). For instance, an associative memory may have tocomplete (or correct) an incomplete

patterns (\attractors). For instance, an associative memory may have tocomplete (or correct) an incomplete Chapter 6 Associative Models Association is the task of mapping input patterns to target patterns (\attractors"). For instance, an associative memory may have tocomplete (or correct) an incomplete (or

More information

Stochastic Oscillator Death in Globally Coupled Neural Systems

Stochastic Oscillator Death in Globally Coupled Neural Systems Journal of the Korean Physical Society, Vol. 52, No. 6, June 2008, pp. 19131917 Stochastic Oscillator Death in Globally Coupled Neural Systems Woochang Lim and Sang-Yoon Kim y Department of Physics, Kangwon

More information

A variational approach to Ising spin glasses in finite dimensions

A variational approach to Ising spin glasses in finite dimensions . Phys. A: Math. Gen. 31 1998) 4127 4140. Printed in the UK PII: S0305-447098)89176-2 A variational approach to Ising spin glasses in finite dimensions R Baviera, M Pasquini and M Serva Dipartimento di

More information

In biological terms, memory refers to the ability of neural systems to store activity patterns and later recall them when required.

In biological terms, memory refers to the ability of neural systems to store activity patterns and later recall them when required. In biological terms, memory refers to the ability of neural systems to store activity patterns and later recall them when required. In humans, association is known to be a prominent feature of memory.

More information

Information Capacity of Binary Weights Associative. Memories. Mathematical Sciences. University at Bualo. Bualo NY

Information Capacity of Binary Weights Associative. Memories. Mathematical Sciences. University at Bualo. Bualo NY Information Capacity of Binary Weights Associative Memories Arun Jagota Department of Computer Science University of California, Santa Cruz CA 95064 jagota@cse.ucsc.edu Kenneth W. Regan University at Bualo

More information

F.P. Battaglia 1. Istituto di Fisica. Universita di Roma, La Sapienza, Ple Aldo Moro, Roma and. S. Fusi

F.P. Battaglia 1. Istituto di Fisica. Universita di Roma, La Sapienza, Ple Aldo Moro, Roma and. S. Fusi partially structured synaptic transitions F.P. Battaglia 1 Istituto di Fisica Universita di Roma, La Sapienza, Ple Aldo oro, Roma and S. Fusi INFN, Sezione dell'istituto Superiore di Sanita, Viale Regina

More information

Statistical mechanics and capacity-approaching error-correctingcodes

Statistical mechanics and capacity-approaching error-correctingcodes Physica A 302 (2001) 14 21 www.elsevier.com/locate/physa Statistical mechanics and capacity-approaching error-correctingcodes Nicolas Sourlas Laboratoire de Physique Theorique de l, UMR 8549, Unite Mixte

More information

Parallel dynamics of fully connected Q-Ising neural networks

Parallel dynamics of fully connected Q-Ising neural networks arxiv:cond-mat/9704089v1 [cond-mat.dis-nn] 10 Apr 1997 Parallel dynamics of fully connected Q-Ising neural networks D. Bollé 1 2 and G. Jongen 1 3 Instituut voor Theoretische Fysica, K.U. Leuven B-3001

More information

High-conductance states in a mean-eld cortical network model

High-conductance states in a mean-eld cortical network model Neurocomputing 58 60 (2004) 935 940 www.elsevier.com/locate/neucom High-conductance states in a mean-eld cortical network model Alexander Lerchner a;, Mandana Ahmadi b, John Hertz b a Oersted-DTU, Technical

More information

Iterative Autoassociative Net: Bidirectional Associative Memory

Iterative Autoassociative Net: Bidirectional Associative Memory POLYTECHNIC UNIVERSITY Department of Computer and Information Science Iterative Autoassociative Net: Bidirectional Associative Memory K. Ming Leung Abstract: Iterative associative neural networks are introduced.

More information

Brazilian Journal of Physics, vol. 27, no. 4, december, with Aperiodic Interactions. Instituto de Fsica, Universidade de S~ao Paulo

Brazilian Journal of Physics, vol. 27, no. 4, december, with Aperiodic Interactions. Instituto de Fsica, Universidade de S~ao Paulo Brazilian Journal of Physics, vol. 27, no. 4, december, 1997 567 Critical Behavior of an Ising Model with periodic Interactions S. T. R. Pinho, T.. S. Haddad, S. R. Salinas Instituto de Fsica, Universidade

More information

Analytical and numerical study of internal representations in multilayer neural networks with binary weights

Analytical and numerical study of internal representations in multilayer neural networks with binary weights PHYSICAL REVIEW E VOLUME 54, NUMBER 1 JULY 1996 Analytical and numerical study of internal representations in multilayer neural networks with binary weights Simona Cocco,* Rémi Monasson, and Riccardo Zecchina

More information

Brazilian Journal of Physics ISSN: Sociedade Brasileira de Física Brasil

Brazilian Journal of Physics ISSN: Sociedade Brasileira de Física Brasil Brazilian Journal of Physics ISSN: 0103-9733 luizno.bjp@gmail.com Sociedade Brasileira de Física Brasil Almeida, J.R.L. de On the Entropy of the Viana-Bray Model Brazilian Journal of Physics, vol. 33,

More information

Quadratic replica coupling in the Sherrington-Kirkpatrick mean field spin glass model

Quadratic replica coupling in the Sherrington-Kirkpatrick mean field spin glass model arxiv:cond-mat/0201091v2 [cond-mat.dis-nn] 5 Mar 2002 Quadratic replica coupling in the Sherrington-Kirkpatrick mean field spin glass model Francesco Guerra Dipartimento di Fisica, Università di Roma La

More information

Nonmonotonic Networks. a. IRST, I Povo (Trento) Italy, b. Univ. of Trento, Physics Dept., I Povo (Trento) Italy

Nonmonotonic Networks. a. IRST, I Povo (Trento) Italy, b. Univ. of Trento, Physics Dept., I Povo (Trento) Italy Storage Capacity and Dynaics of Nononotonic Networks Bruno Crespi a and Ignazio Lazzizzera b a. IRST, I-38050 Povo (Trento) Italy, b. Univ. of Trento, Physics Dept., I-38050 Povo (Trento) Italy INFN Gruppo

More information

Igor A. Khovanov,Vadim S. Anishchenko, Astrakhanskaya str. 83, Saratov, Russia

Igor A. Khovanov,Vadim S. Anishchenko, Astrakhanskaya str. 83, Saratov, Russia Noise Induced Escape from Dierent Types of Chaotic Attractor Igor A. Khovanov,Vadim S. Anishchenko, Dmitri G. Luchinsky y,andpeter V.E. McClintock y Department of Physics, Saratov State University, Astrakhanskaya

More information

THE PHYSICS OF COUNTING AND SAMPLING ON RANDOM INSTANCES. Lenka Zdeborová

THE PHYSICS OF COUNTING AND SAMPLING ON RANDOM INSTANCES. Lenka Zdeborová THE PHYSICS OF COUNTING AND SAMPLING ON RANDOM INSTANCES Lenka Zdeborová (CEA Saclay and CNRS, France) MAIN CONTRIBUTORS TO THE PHYSICS UNDERSTANDING OF RANDOM INSTANCES Braunstein, Franz, Kabashima, Kirkpatrick,

More information

arxiv:cond-mat/ v1 [cond-mat.dis-nn] 12 Jun 2003

arxiv:cond-mat/ v1 [cond-mat.dis-nn] 12 Jun 2003 arxiv:cond-mat/0306326v1 [cond-mat.dis-nn] 12 Jun 2003 CONVEX REPLICA SIMMETRY BREAKING FROM POSITIVITY AND THERMODYNAMIC LIMIT Pierluigi Contucci, Sandro Graffi Dipartimento di Matematica Università di

More information

Effects of refractory periods in the dynamics of a diluted neural network

Effects of refractory periods in the dynamics of a diluted neural network Effects of refractory periods in the dynamics of a diluted neural network F. A. Tamarit, 1, * D. A. Stariolo, 2, * S. A. Cannas, 2, *, and P. Serra 2, 1 Facultad de Matemática, Astronomía yfísica, Universidad

More information

Capacity of neural networks with discrete synaptic couplings

Capacity of neural networks with discrete synaptic couplings J. Phys. A: Math. Gen. 23 (1990) 2613-2630. Printed in the UK Capacity of neural networks with discrete synaptic couplings H Gutfreund and Y Stein The Racah Institute of Physics, The Hebrew University

More information

In: Proc. BENELEARN-98, 8th Belgian-Dutch Conference on Machine Learning, pp 9-46, 998 Linear Quadratic Regulation using Reinforcement Learning Stephan ten Hagen? and Ben Krose Department of Mathematics,

More information

maximally charged black holes and Hideki Ishihara Department ofphysics, Tokyo Institute of Technology, Oh-okayama, Meguro, Tokyo 152, Japan

maximally charged black holes and Hideki Ishihara Department ofphysics, Tokyo Institute of Technology, Oh-okayama, Meguro, Tokyo 152, Japan Quasinormal modes of maximally charged black holes Hisashi Onozawa y,takashi Mishima z,takashi Okamura, and Hideki Ishihara Department ofphysics, Tokyo Institute of Technology, Oh-okayama, Meguro, Tokyo

More information

Analog Neural Nets with Gaussian or other Common. Noise Distributions cannot Recognize Arbitrary. Regular Languages.

Analog Neural Nets with Gaussian or other Common. Noise Distributions cannot Recognize Arbitrary. Regular Languages. Analog Neural Nets with Gaussian or other Common Noise Distributions cannot Recognize Arbitrary Regular Languages Wolfgang Maass Inst. for Theoretical Computer Science, Technische Universitat Graz Klosterwiesgasse

More information

in a Chaotic Neural Network distributed randomness of the input in each neuron or the weight in the

in a Chaotic Neural Network distributed randomness of the input in each neuron or the weight in the Heterogeneity Enhanced Order in a Chaotic Neural Network Shin Mizutani and Katsunori Shimohara NTT Communication Science Laboratories, 2-4 Hikaridai, Seika-cho, Soraku-gun, Kyoto, 69-237 Japan shin@cslab.kecl.ntt.co.jp

More information

Neural networks with fast time-variation of synapses

Neural networks with fast time-variation of synapses J. Phys. A: Math. Gen. 30 (1997) 7801 7816. Printed in the UK PII: S0305-4470(97)79319-3 Neural networks with fast time-variation of synapses J J Torres, P L Garrido and J Marro Instituto Carlos I de Física

More information

Chaotic Balanced State in a Model of Cortical Circuits C. van Vreeswijk and H. Sompolinsky Racah Institute of Physics and Center for Neural Computatio

Chaotic Balanced State in a Model of Cortical Circuits C. van Vreeswijk and H. Sompolinsky Racah Institute of Physics and Center for Neural Computatio Chaotic Balanced State in a Model of Cortical Circuits C. van Vreeswij and H. Sompolinsy Racah Institute of Physics and Center for Neural Computation Hebrew University Jerusalem, 91904 Israel 10 March

More information

Learning with Ensembles: How. over-tting can be useful. Anders Krogh Copenhagen, Denmark. Abstract

Learning with Ensembles: How. over-tting can be useful. Anders Krogh Copenhagen, Denmark. Abstract Published in: Advances in Neural Information Processing Systems 8, D S Touretzky, M C Mozer, and M E Hasselmo (eds.), MIT Press, Cambridge, MA, pages 190-196, 1996. Learning with Ensembles: How over-tting

More information

The Last Survivor: a Spin Glass Phase in an External Magnetic Field.

The Last Survivor: a Spin Glass Phase in an External Magnetic Field. The Last Survivor: a Spin Glass Phase in an External Magnetic Field. J. J. Ruiz-Lorenzo Dep. Física, Universidad de Extremadura Instituto de Biocomputación y Física de los Sistemas Complejos (UZ) http://www.eweb.unex.es/eweb/fisteor/juan

More information

Spin glasses and Stein s method

Spin glasses and Stein s method UC Berkeley Spin glasses Magnetic materials with strange behavior. ot explained by ferromagnetic models like the Ising model. Theoretically studied since the 70 s. An important example: Sherrington-Kirkpatrick

More information

CSE 5526: Introduction to Neural Networks Hopfield Network for Associative Memory

CSE 5526: Introduction to Neural Networks Hopfield Network for Associative Memory CSE 5526: Introduction to Neural Networks Hopfield Network for Associative Memory Part VII 1 The basic task Store a set of fundamental memories {ξξ 1, ξξ 2,, ξξ MM } so that, when presented a new pattern

More information

Statistical mechanics of complex neural systems and high dimensional data

Statistical mechanics of complex neural systems and high dimensional data Statistical mechanics of complex neural systems and high dimensional data Madhu Advani, Subhaneil Lahiri, and Surya Ganguli Dept. of Applied Physics, Stanford University, Stanford, CA E-mail: sganguli@stanford.edu

More information

Batch-mode, on-line, cyclic, and almost cyclic learning 1 1 Introduction In most neural-network applications, learning plays an essential role. Throug

Batch-mode, on-line, cyclic, and almost cyclic learning 1 1 Introduction In most neural-network applications, learning plays an essential role. Throug A theoretical comparison of batch-mode, on-line, cyclic, and almost cyclic learning Tom Heskes and Wim Wiegerinck RWC 1 Novel Functions SNN 2 Laboratory, Department of Medical hysics and Biophysics, University

More information

Phase-Space Learning Fu-Sheng Tsung Chung Tai Ch'an Temple 56, Yuon-fon Road, Yi-hsin Li, Pu-li Nan-tou County, Taiwan 545 Republic of China Garrison

Phase-Space Learning Fu-Sheng Tsung Chung Tai Ch'an Temple 56, Yuon-fon Road, Yi-hsin Li, Pu-li Nan-tou County, Taiwan 545 Republic of China Garrison Phase-Space Learning Fu-Sheng Tsung Chung Tai Ch'an Temple 56, Yuon-fon Road, Yi-hsin Li, Pu-li Nan-tou County, Taiwan 545 Republic of China Garrison W. Cottrell Institute for Neural Computation Computer

More information

Brazilian Journal of Physics, vol. 29, no. 4, December, Caixa Postal 676, 13564{200, S~ao Carlos, SP, Brazil

Brazilian Journal of Physics, vol. 29, no. 4, December, Caixa Postal 676, 13564{200, S~ao Carlos, SP, Brazil Brazilian Journal of Physics, vol. 9, no. 4, December, 1999 685 Evolution of Dynamic Localization Regimes in Coupled Minibands P. H.Rivera 1 and P. A.Schulz 1 Departamento de Fsica, Universidade Federal

More information

Emergent phenomena in large interacting communities

Emergent phenomena in large interacting communities Emergent phenomena in large interacting communities Giulio Biroli Institute for Theoretical Physics, CEA Saclay, France Statistical Physics Lab- ENS Paris Joint works with G. Bunin, C. Cammarota, V. Ros,

More information

Associative Memories (I) Hopfield Networks

Associative Memories (I) Hopfield Networks Associative Memories (I) Davide Bacciu Dipartimento di Informatica Università di Pisa bacciu@di.unipi.it Applied Brain Science - Computational Neuroscience (CNS) A Pun Associative Memories Introduction

More information

Logic Learning in Hopfield Networks

Logic Learning in Hopfield Networks Logic Learning in Hopfield Networks Saratha Sathasivam (Corresponding author) School of Mathematical Sciences, University of Science Malaysia, Penang, Malaysia E-mail: saratha@cs.usm.my Wan Ahmad Tajuddin

More information

Ensemble equivalence for non-extensive thermostatistics

Ensemble equivalence for non-extensive thermostatistics Physica A 305 (2002) 52 57 www.elsevier.com/locate/physa Ensemble equivalence for non-extensive thermostatistics Raul Toral a;, Rafael Salazar b a Instituto Mediterraneo de Estudios Avanzados (IMEDEA),

More information

Finite-temperature magnetism of ultrathin lms and nanoclusters PhD Thesis Booklet. Levente Rózsa Supervisor: László Udvardi

Finite-temperature magnetism of ultrathin lms and nanoclusters PhD Thesis Booklet. Levente Rózsa Supervisor: László Udvardi Finite-temperature magnetism of ultrathin lms and nanoclusters PhD Thesis Booklet Levente Rózsa Supervisor: László Udvardi BME 2016 Background of the research Magnetic materials continue to play an ever

More information

Motivation. Lecture 23. The Stochastic Neuron ( ) Properties of Logistic Sigmoid. Logistic Sigmoid With Varying T. Logistic Sigmoid T = 0.

Motivation. Lecture 23. The Stochastic Neuron ( ) Properties of Logistic Sigmoid. Logistic Sigmoid With Varying T. Logistic Sigmoid T = 0. Motivation Lecture 23 Idea: with low probability, go against the local field move up the energy surface make the wrong microdecision Potential value for optimization: escape from local optima Potential

More information

Homoclinic bifurcations in Chua s circuit

Homoclinic bifurcations in Chua s circuit Physica A 262 (1999) 144 152 Homoclinic bifurcations in Chua s circuit Sandra Kahan, Anibal C. Sicardi-Schino Instituto de Fsica, Universidad de la Republica, C.C. 30, C.P. 11 000, Montevideo, Uruguay

More information

Learning in Boltzmann Trees. Lawrence Saul and Michael Jordan. Massachusetts Institute of Technology. Cambridge, MA January 31, 1995.

Learning in Boltzmann Trees. Lawrence Saul and Michael Jordan. Massachusetts Institute of Technology. Cambridge, MA January 31, 1995. Learning in Boltzmann Trees Lawrence Saul and Michael Jordan Center for Biological and Computational Learning Massachusetts Institute of Technology 79 Amherst Street, E10-243 Cambridge, MA 02139 January

More information

arxiv:cond-mat/ v1 3 Mar 1997

arxiv:cond-mat/ v1 3 Mar 1997 On-Line AdaTron Learning of Unlearnable Rules Jun-ichi Inoue and Hidetoshi Nishimori Department of Physics, Tokyo Institute of Technology, Oh-okayama, Meguro-ku, Tokyo 152, Japan arxiv:cond-mat/9703019v1

More information

Neural Networks. Hopfield Nets and Auto Associators Fall 2017

Neural Networks. Hopfield Nets and Auto Associators Fall 2017 Neural Networks Hopfield Nets and Auto Associators Fall 2017 1 Story so far Neural networks for computation All feedforward structures But what about.. 2 Loopy network Θ z = ቊ +1 if z > 0 1 if z 0 y i

More information

Constrained Leja points and the numerical solution of the constrained energy problem

Constrained Leja points and the numerical solution of the constrained energy problem Journal of Computational and Applied Mathematics 131 (2001) 427 444 www.elsevier.nl/locate/cam Constrained Leja points and the numerical solution of the constrained energy problem Dan I. Coroian, Peter

More information

John P.F.Sum and Peter K.S.Tam. Hong Kong Polytechnic University, Hung Hom, Kowloon.

John P.F.Sum and Peter K.S.Tam. Hong Kong Polytechnic University, Hung Hom, Kowloon. Note on the Maxnet Dynamics John P.F.Sum and Peter K.S.Tam Department of Electronic Engineering, Hong Kong Polytechnic University, Hung Hom, Kowloon. April 7, 996 Abstract A simple method is presented

More information

Energy-Decreasing Dynamics in Mean-Field Spin Models

Energy-Decreasing Dynamics in Mean-Field Spin Models arxiv:cond-mat/0210545 v1 24 Oct 2002 Energy-Decreasing Dynamics in Mean-Field Spin Models L. Bussolari, P. Contucci, M. Degli Esposti, C. Giardinà Dipartimento di Matematica dell Università di Bologna,

More information

2 Dissipative particle dynamics with energy conservation: Heat conduction that is dissipated due to the velocity dependent forces is transformed into

2 Dissipative particle dynamics with energy conservation: Heat conduction that is dissipated due to the velocity dependent forces is transformed into International Journal of Modern Physics C, fc World Scientic Publishing Company DISSIPATIVE PARTICLE DYNAMICS WITH ENERGY CONSERVATION: HEAT CONDUCTION M. RIPOLL, P. ESPA ~NOL Departamento de Fsica Fundamental,

More information

of the dynamics. There is a competition between the capacity of the network and the stability of the

of the dynamics. There is a competition between the capacity of the network and the stability of the Special Issue on the Role and Control of Random Events in Biological Systems c World Scientic Publishing Company LEARNING SYNFIRE CHAINS: TURNING NOISE INTO SIGNAL JOHN HERTZ and ADAM PRUGEL-BENNETT y

More information

Dynamical Systems and Deep Learning: Overview. Abbas Edalat

Dynamical Systems and Deep Learning: Overview. Abbas Edalat Dynamical Systems and Deep Learning: Overview Abbas Edalat Dynamical Systems The notion of a dynamical system includes the following: A phase or state space, which may be continuous, e.g. the real line,

More information

Statistical mechanics of low-density parity check error-correcting codes over Galois fields

Statistical mechanics of low-density parity check error-correcting codes over Galois fields EUROPHYSICS LETTERS 15 November 2001 Europhys. Lett., 56 (4), pp. 610 616 (2001) Statistical mechanics of low-density parity check error-correcting codes over Galois fields K. Nakamura 1 ( ), Y. Kabashima

More information

Institute for Theoretical Physics. State University of New York. Abstract

Institute for Theoretical Physics. State University of New York. Abstract ITP-SB-95-9 April, 1995 Zeros of the Partition Function for Higher{Spin 2D Ising Models PostScript processed by the SLAC/DESY Libraries on 10 May 1995. Victor Matveev and Robert Shrock Institute for Theoretical

More information

How can ideas from quantum computing improve or speed up neuromorphic models of computation?

How can ideas from quantum computing improve or speed up neuromorphic models of computation? Neuromorphic Computation: Architectures, Models, Applications Associative Memory Models with Adiabatic Quantum Optimization Kathleen Hamilton, Alexander McCaskey, Jonathan Schrock, Neena Imam and Travis

More information

Week 4: Hopfield Network

Week 4: Hopfield Network Week 4: Hopfield Network Phong Le, Willem Zuidema November 20, 2013 Last week we studied multi-layer perceptron, a neural network in which information is only allowed to transmit in one direction (from

More information

COMP9444 Neural Networks and Deep Learning 11. Boltzmann Machines. COMP9444 c Alan Blair, 2017

COMP9444 Neural Networks and Deep Learning 11. Boltzmann Machines. COMP9444 c Alan Blair, 2017 COMP9444 Neural Networks and Deep Learning 11. Boltzmann Machines COMP9444 17s2 Boltzmann Machines 1 Outline Content Addressable Memory Hopfield Network Generative Models Boltzmann Machine Restricted Boltzmann

More information

Through Kaleidoscope Eyes Spin Glasses Experimental Results and Theoretical Concepts

Through Kaleidoscope Eyes Spin Glasses Experimental Results and Theoretical Concepts Through Kaleidoscope Eyes Spin Glasses Experimental Results and Theoretical Concepts Benjamin Hsu December 11, 2007 Abstract A spin glass describes a system of spins on a lattice (or a crystal) where the

More information

Asynchronous updating of threshold-coupled chaotic neurons

Asynchronous updating of threshold-coupled chaotic neurons PRAMANA c Indian Academy of Sciences Vol. 70, No. 6 journal of June 2008 physics pp. 1127 1134 Asynchronous updating of threshold-coupled chaotic neurons MANISH DEV SHRIMALI 1,2,3,, SUDESHNA SINHA 4 and

More information