III. Recurrent Neural Networks
A. The Hopfield Network

Typical Artificial Neuron
[Figure: a typical artificial neuron, labeling the inputs, connection weights, linear combination, net input (local field), threshold, activation function, and output.]

Equations
Net input: $h_i = \sum_{j=1}^{n} w_{ij} s_j - \theta_i$, or in vector form $\mathbf{h} = W\mathbf{s} - \boldsymbol{\theta}$
New neural state: $s'_i = \sigma(h_i)$, or in vector form $\mathbf{s}' = \sigma(\mathbf{h})$

Hopfield Network
- Symmetric weights: $w_{ij} = w_{ji}$
- No self-action: $w_{ii} = 0$
- Zero threshold: $\theta = 0$
- Bipolar states: $s_i \in \{-1, +1\}$
- Discontinuous bipolar activation function: $\sigma(h) = \operatorname{sgn}(h) = \begin{cases} -1, & h < 0 \\ +1, & h > 0 \end{cases}$
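As an illustration (not from the original slides), these equations take only a few lines of NumPy; the symmetric, zero-diagonal weight matrix below is an arbitrary example:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 4

    # Symmetric weights with zero diagonal, as the Hopfield conditions require.
    A = rng.normal(size=(n, n))
    W = (A + A.T) / 2
    np.fill_diagonal(W, 0.0)

    s = rng.choice([-1, 1], size=n)   # bipolar state
    theta = np.zeros(n)               # zero thresholds

    h = W @ s - theta                 # net input (local field)
    s_new = np.where(h >= 0, 1, -1)   # sigma(h) = sgn(h); ties resolved to +1 here
    print(h, s_new)

How to resolve $h = 0$ is the subject of the next slide.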

What to Do About h = 0?
There are several options:
- $\sigma(0) = +1$
- $\sigma(0) = -1$
- $\sigma(0) = -1$ or $+1$ with equal probability
- $h_i = 0 \Rightarrow$ no state change ($s'_i = s_i$)
Not much difference, but be consistent. The last option is slightly preferable, since it is symmetric.

Positive Coupling
- Positive sense (sign), large strength

Negative Coupling
- Negative sense (sign), large strength

Weak Coupling
- Either sense (sign), little strength

[Figures: state $s_i = -1$ with local field $h < 0$ (state and field agree, so the state persists); state $s_i = -1$ with local field $h > 0$ (state and field disagree).]
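A sketch of one asynchronous update using the preferred tie rule; the function name update_neuron is illustrative, not from the slides:

    import numpy as np

    def update_neuron(W, s, i):
        """Asynchronously update neuron i of bipolar state s (in place).
        Tie rule: if h_i == 0, no state change (the symmetric option above)."""
        h_i = W[i] @ s          # net input; zero threshold assumed
        if h_i > 0:
            s[i] = 1
        elif h_i < 0:
            s[i] = -1
        return s

    # Example: one update of neuron 0
    W = np.array([[0., 1.], [1., 0.]])
    s = np.array([1, -1])
    print(update_neuron(W, s, 0))   # h_0 = -1, so s becomes [-1, -1]

With Hebbian weights (multiples of 1/n, introduced below) the field $h_i$ can be exactly zero, so the tie rule does matter in practice.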

[Figures: state $s_i = +1$ with local field $h > 0$ (state persists); state $s_i = +1$ with local field $h < 0$ (state reverses). Together with the previous pair: a state persists when it agrees in sign with its local field, and reverses when it disagrees.]

NetLogo Demonstration of Hopfield State Updating
Run Hopfield-update.nlogo

Hopfield Net as Soft Constraint Satisfaction System
- States of neurons act as yes/no decisions
- Weights represent soft constraints between decisions
  - hard constraints must be respected
  - soft constraints have degrees of importance
- Decisions change to better respect constraints
- Is there an optimal set of decisions that best respects all constraints?

Demonstration of Hopfield Net Dynamics I
Run Hopfield-dynamics.nlogo

Convergence
- Does such a system converge to a stable state?
- Under what conditions does it converge?
- There is a sense in which each step relaxes the "tension" in the system
- But could a relaxation of one neuron lead to greater tension in other places?

Quantifying Tension
- If $w_{ij} > 0$, then $s_i$ and $s_j$ want to have the same sign ($s_i s_j = +1$)
- If $w_{ij} < 0$, then $s_i$ and $s_j$ want to have opposite signs ($s_i s_j = -1$)
- If $w_{ij} = 0$, their signs are independent
- The strength of the interaction varies with $|w_{ij}|$
- Define the disharmony ("tension") $D_{ij}$ between neurons i and j: $D_{ij} = -s_i w_{ij} s_j$
- $D_{ij} < 0$: they are "happy"; $D_{ij} > 0$: they are "unhappy"

Total Energy of System
The energy of the system is the total tension (disharmony) in it:
$E\{\mathbf{s}\} = \sum_{i<j} D_{ij} = -\sum_{i<j} s_i w_{ij} s_j = -\tfrac{1}{2} \sum_i \sum_{j \ne i} s_i w_{ij} s_j = -\tfrac{1}{2}\, \mathbf{s}^T W \mathbf{s}$

Review of Some Vector Notation
- Column vectors: $\mathbf{x} = (x_1, \dots, x_n)^T$
- Inner product: $\mathbf{x}^T \mathbf{y} = \sum_{i=1}^n x_i y_i = \mathbf{x} \cdot \mathbf{y}$
- Outer product: $\mathbf{x}\mathbf{y}^T$ is the $m \times n$ matrix with entries $(\mathbf{x}\mathbf{y}^T)_{ij} = x_i y_j$
- Quadratic form: $\mathbf{x}^T M \mathbf{y} = \sum_{i=1}^m \sum_{j=1}^n x_i M_{ij} y_j$

Another View of Energy
The energy measures the disharmony of the neurons' states with their local fields (i.e., how much states and fields are of opposite sign):
$E\{\mathbf{s}\} = -\tfrac{1}{2} \sum_i s_i \sum_{j \ne i} w_{ij} s_j = -\tfrac{1}{2} \sum_i s_i h_i = -\tfrac{1}{2}\, \mathbf{s}^T \mathbf{h}$
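A direct transcription of these definitions into NumPy (the helper names energy and disharmony are illustrative):

    import numpy as np

    def disharmony(W, s, i, j):
        """D_ij = -s_i w_ij s_j; negative when the pair (i, j) is 'happy'."""
        return -s[i] * W[i, j] * s[j]

    def energy(W, s):
        """E{s} = -1/2 s^T W s (assumes symmetric W with zero diagonal).
        Equivalently -1/2 s . h, with h = W s the vector of local fields."""
        return -0.5 * s @ W @ s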

Do State Changes Decrease Energy?
Suppose that neuron k changes state. Since only the terms involving k change, the change of energy is:
$\Delta E = E\{\mathbf{s}'\} - E\{\mathbf{s}\} = -\sum_{j \ne k} s'_k w_{kj} s_j + \sum_{j \ne k} s_k w_{kj} s_j = -(s'_k - s_k) \sum_{j \ne k} w_{kj} s_j = -\Delta s_k\, h_k < 0$
(The last inequality holds because the neuron only changes if its new state $s'_k = \operatorname{sgn}(h_k)$ disagrees with $s_k$, so $\Delta s_k$ has the same sign as $h_k$ and $\Delta s_k h_k > 0$.)

Energy Does Not Increase
- In each step in which a neuron is considered for update: $E\{\mathbf{s}(t+1)\} - E\{\mathbf{s}(t)\} \le 0$
- Energy cannot increase
- Energy decreases if any neuron changes
- Must it stop?

Proof of Convergence in Finite Time
- There is a minimum possible energy:
  - the number of possible states $\mathbf{s} \in \{-1, +1\}^n$ is finite
  - hence $E_{\min} = \min \{ E(\mathbf{s}) : \mathbf{s} \in \{\pm 1\}^n \}$ exists
- The net must reach a stable state in a finite number of steps, because there is only a finite number of states and each change strictly decreases the energy

Conclusion
- If we do asynchronous updating, the Hopfield net must reach a stable, minimum-energy state in a finite number of updates
- This does not imply that it is a global minimum

Lyapunov Functions
- A way of showing the convergence of discrete- or continuous-time dynamical systems
- For a discrete-time system, we need a Lyapunov function E (an "energy" of the state) such that:
  - E is bounded below ($E\{\mathbf{s}\} > E_{\min}$)
  - $\Delta E \le (\Delta E)_{\max} < 0$ (the energy decreases by at least a certain minimum amount each step)
- Then the system will converge in finite time
- Problem: finding a suitable Lyapunov function

Example Limit Cycle with Synchronous Updating
[Figure: two neurons coupled by $w > 0$; with synchronous updating the pair of states can flip back and forth forever.]
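The convergence argument can be checked numerically. A sketch (not from the slides) that performs asynchronous sweeps over random symmetric weights and asserts that every state change lowers the energy:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 20
    A = rng.normal(size=(n, n))
    W = (A + A.T) / 2                       # symmetric weights
    np.fill_diagonal(W, 0.0)                # no self-action
    s = rng.choice([-1.0, 1.0], size=n)

    def energy(W, s):
        return -0.5 * s @ W @ s

    E = energy(W, s)
    stable = False
    while not stable:
        stable = True
        for i in rng.permutation(n):        # one asynchronous sweep
            h_i = W[i] @ s
            new = 1.0 if h_i > 0 else (-1.0 if h_i < 0 else s[i])  # h = 0: no change
            if new != s[i]:
                s[i] = new
                E_new = energy(W, s)
                assert E_new <= E           # each flip lowers the energy
                E, stable = E_new, False
    print("stable state reached, E =", E)

    # With synchronous updating this can fail: two neurons coupled by w > 0,
    # started at (+1, -1), swap signs every step (a period-2 limit cycle).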

The Hopfield Energy Function is Even
- A function f is odd if $f(-x) = -f(x)$ for all x
- A function f is even if $f(-x) = f(x)$ for all x
- Observe: $E\{-\mathbf{s}\} = -\tfrac{1}{2} (-\mathbf{s})^T W (-\mathbf{s}) = -\tfrac{1}{2}\, \mathbf{s}^T W \mathbf{s} = E\{\mathbf{s}\}$

[Figures: a conceptual picture of descent on the energy surface (from Solé & Goodwin); an energy surface, the same surface with flow lines, and the flow lines alone showing basins of attraction (from Haykin, Neural Networks); the bipolar state space.]
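A quick numerical check of the evenness property (illustrative):

    import numpy as np

    rng = np.random.default_rng(3)
    n = 10
    A = rng.normal(size=(n, n))
    W = (A + A.T) / 2
    np.fill_diagonal(W, 0.0)
    s = rng.choice([-1, 1], size=n)

    E = lambda s: -0.5 * s @ W @ s
    assert np.isclose(E(s), E(-s))   # flipping every state leaves E unchanged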

Basins in Bipolar State Space
[Figure: energy-decreasing paths leading into basins of attraction in the bipolar state space.]

Demonstration of Hopfield Net Dynamics II
Run initialized Hopfield.nlogo

Storing Memories as Attractors
[Figure from Solé & Goodwin.]

Restoration, Completion, Association
[Figures: three sequences of examples showing pattern restoration, pattern completion, and pattern association (from Arbib 1995).]

Applications of Hopfield Memory
- pattern restoration
- pattern completion
- pattern generalization
- pattern association

Hopfield Net for Optimization and for Associative Memory
- For optimization:
  - we know the weights (couplings)
  - we want to know the minima (solutions)
- For associative memory:
  - we know the minima (retrieval states)
  - we want to know the weights

Hebb's Rule
"When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased." (Donald Hebb, The Organization of Behavior, 1949, p. 62)
"Neurons that fire together, wire together."

Hebbian Learning
[Figures: an imprinted pattern, and partial reconstruction of the pattern from a degraded input.]

Mathematical Model of Hebbian Learning for One Pattern
Let $W_{ij} = \begin{cases} x_i x_j, & i \ne j \\ 0, & i = j \end{cases}$
Since $x_i x_i = x_i^2 = 1$, this is $W = \mathbf{x}\mathbf{x}^T - I$.
For simplicity, we will include self-coupling: $W = \mathbf{x}\mathbf{x}^T$.
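A sketch of single-pattern Hebbian imprinting as just defined (illustrative):

    import numpy as np

    rng = np.random.default_rng(4)
    n = 50
    x = rng.choice([-1, 1], size=n)

    W = np.outer(x, x)                       # W = x x^T (self-coupling included)
    # Without self-coupling: W = np.outer(x, x) - np.eye(n)

    h = W @ x                                # = n * x, as shown on the next slide
    assert np.array_equal(np.sign(h), x)     # the imprinted pattern is a fixed point
    assert np.array_equal(np.sign(W @ -x), -x)   # and so is its complement -x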

A Single Imprinted Pattern is a Stable State
- Suppose $W = \mathbf{x}\mathbf{x}^T$
- Then $\mathbf{h} = W\mathbf{x} = \mathbf{x}\mathbf{x}^T\mathbf{x} = n\mathbf{x}$, since $\mathbf{x}^T\mathbf{x} = \sum_{i=1}^n x_i^2 = \sum_{i=1}^n (\pm 1)^2 = n$
- Hence, if the initial state is $\mathbf{s} = \mathbf{x}$, then the new state is $\mathbf{s}' = \operatorname{sgn}(n\mathbf{x}) = \mathbf{x}$
- For this reason, we scale W by 1/n
- There may be other stable states (e.g., $-\mathbf{x}$)

Questions
- How big is the basin of attraction of the imprinted pattern?
- How many patterns can be imprinted?
- Are there unneeded spurious stable states?
- These issues will be addressed in the context of multiple imprinted patterns

Imprinting Multiple Patterns
Let $\mathbf{x}^1, \mathbf{x}^2, \dots, \mathbf{x}^p$ be the patterns to be imprinted. Define the sum-of-outer-products matrix:
$W_{ij} = \frac{1}{n} \sum_{k=1}^p x_i^k x_j^k$, i.e. $W = \frac{1}{n} \sum_{k=1}^p \mathbf{x}^k (\mathbf{x}^k)^T$

Definition of Covariance
Consider samples $(x_1, y_1), (x_2, y_2), \dots, (x_N, y_N)$. Let $\bar{x} = \langle x_k \rangle$ and $\bar{y} = \langle y_k \rangle$. The covariance of the x and y values is:
$C_{xy} = \langle (x_k - \bar{x})(y_k - \bar{y}) \rangle = \langle x_k y_k - \bar{x} y_k - x_k \bar{y} + \bar{x}\bar{y} \rangle = \langle x_k y_k \rangle - \bar{x}\langle y_k \rangle - \langle x_k \rangle \bar{y} + \bar{x}\bar{y} = \langle x_k y_k \rangle - \bar{x}\bar{y} - \bar{x}\bar{y} + \bar{x}\bar{y}$
$C_{xy} = \langle x_k y_k \rangle - \bar{x}\bar{y}$

Weights & the Covariance Matrix
- Sample pattern vectors: $\mathbf{x}^1, \mathbf{x}^2, \dots, \mathbf{x}^p$
- Covariance of the i-th and j-th components: $C_{ij} = \langle x_i^k x_j^k \rangle - \langle x_i \rangle \langle x_j \rangle$
- If $\forall i: \langle x_i \rangle = 0$ (±1 equally likely in all positions): $C_{ij} = \langle x_i^k x_j^k \rangle = \frac{1}{p} \sum_{k=1}^p x_i^k x_j^k$
- Hence $nW = pC$

Characteristics of Hopfield Memory
- Distributed ("holographic"):
  - every pattern is stored in every location (weight)
- Robust:
  - correct retrieval in spite of noise or errors in the patterns
  - correct operation in spite of considerable weight damage or noise
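A sketch of the sum-of-outer-products rule, together with a numerical check of the relation nW = pC for (approximately) zero-mean random patterns (illustrative):

    import numpy as np

    rng = np.random.default_rng(5)
    n, p = 100, 5
    X = rng.choice([-1, 1], size=(p, n))    # rows are the patterns x^1..x^p

    W = (X.T @ X) / n                       # W = (1/n) sum_k x^k (x^k)^T

    # Covariance of components i, j across the sample of patterns:
    M = X.mean(axis=0)                      # <x_i>, near 0 for random bipolar patterns
    C = (X.T @ X) / p - np.outer(M, M)      # C_ij = <x_i x_j> - <x_i><x_j>
    print(np.max(np.abs(n * W - p * C)))    # small: nW = pC when the means vanish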

Demonstration of Hopfield Net
Run Malasri Hopfield Demo

Stability of Imprinted Memories
Suppose the state is one of the imprinted patterns $\mathbf{x}^m$. Then:
$\mathbf{h} = W\mathbf{x}^m = \frac{1}{n} \sum_k \mathbf{x}^k (\mathbf{x}^k)^T \mathbf{x}^m = \frac{1}{n} \mathbf{x}^m (\mathbf{x}^m)^T \mathbf{x}^m + \frac{1}{n} \sum_{k \ne m} \mathbf{x}^k (\mathbf{x}^k)^T \mathbf{x}^m = \mathbf{x}^m + \frac{1}{n} \sum_{k \ne m} (\mathbf{x}^k \cdot \mathbf{x}^m)\, \mathbf{x}^k$

Interpretation of Inner Products
- $\mathbf{x}^k \cdot \mathbf{x}^m = n$ if they are identical: highly correlated
- $\mathbf{x}^k \cdot \mathbf{x}^m = -n$ if they are complementary: highly correlated (reversed)
- $\mathbf{x}^k \cdot \mathbf{x}^m = 0$ if they are orthogonal: largely uncorrelated
- $\mathbf{x}^k \cdot \mathbf{x}^m$ measures the crosstalk between patterns k and m

Cosines and Inner Products
- $\mathbf{u} \cdot \mathbf{v} = \lVert\mathbf{u}\rVert\, \lVert\mathbf{v}\rVert \cos\theta_{uv}$
- If $\mathbf{u}$ is bipolar, then $\lVert\mathbf{u}\rVert^2 = \mathbf{u} \cdot \mathbf{u} = n$
- Hence $\mathbf{u} \cdot \mathbf{v} = \sqrt{n}\sqrt{n}\cos\theta_{uv} = n\cos\theta_{uv}$
- Hence $\mathbf{h} = \mathbf{x}^m + \sum_{k \ne m} \mathbf{x}^k \cos\theta_{km}$

Conditions for Stability
Stability of the entire pattern:
$\mathbf{x}^m = \operatorname{sgn}\!\left( \mathbf{x}^m + \sum_{k \ne m} \mathbf{x}^k \cos\theta_{km} \right)$
Stability of a single bit:
$x_i^m = \operatorname{sgn}\!\left( x_i^m + \sum_{k \ne m} x_i^k \cos\theta_{km} \right)$

Sufficient Conditions for Instability (Case 1)
Suppose $x_i^m = -1$. Then the bit is unstable if:
$(-1) + \sum_{k \ne m} x_i^k \cos\theta_{km} > 0$, i.e. $\sum_{k \ne m} x_i^k \cos\theta_{km} > 1$
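The signal-plus-crosstalk decomposition can be verified directly; a sketch (illustrative, with pattern index m = 0):

    import numpy as np

    rng = np.random.default_rng(6)
    n, p, m = 100, 5, 0
    X = rng.choice([-1, 1], size=(p, n))
    W = (X.T @ X) / n                            # sum-of-outer-products weights

    h = W @ X[m]
    crosstalk = sum((X[k] @ X[m]) * X[k] for k in range(p) if k != m) / n
    assert np.allclose(h, X[m] + crosstalk)      # signal term plus crosstalk

    # The pattern is stable iff the crosstalk never flips a bit:
    print("pattern stable:", np.array_equal(np.sign(h), X[m]))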

Sufficient Conditions for Instability (Case 2)
Suppose $x_i^m = +1$. Then the bit is unstable if:
$(+1) + \sum_{k \ne m} x_i^k \cos\theta_{km} < 0$, i.e. $\sum_{k \ne m} x_i^k \cos\theta_{km} < -1$

Sufficient Conditions for Stability
$\left| \sum_{k \ne m} x_i^k \cos\theta_{km} \right| \le 1$
The crosstalk with the sought pattern must be sufficiently small.

Capacity of Hopfield Memory
- Depends on the patterns imprinted
- If orthogonal, $p_{\max} = n$:
  - but then the weight matrix is the identity matrix
  - hence every state is stable: trivial basins
- So $p_{\max} < n$
- Let the load parameter be $\alpha = p / n$

Single Bit Stability Analysis
- For simplicity, suppose the $\mathbf{x}^k$ are random
- Then the $\mathbf{x}^k \cdot \mathbf{x}^m$ are sums of n random ±1s:
  - binomial distribution (approximately Gaussian)
  - in the range $-n, \dots, +n$
  - with mean $\mu = 0$ and variance $\sigma^2 = n$
- Probability that such a sum exceeds t: $\tfrac{1}{2}\!\left[ 1 - \operatorname{erf}\!\left( \tfrac{t}{\sqrt{2n}} \right) \right]$
[See "Review of Gaussian (Normal) Distributions" on the course website.]

Approximation of Probability
Let the crosstalk be $C_i^m = \frac{1}{n} \sum_{k \ne m} x_i^k\, (\mathbf{x}^k \cdot \mathbf{x}^m)$. We want $\Pr\{ C_i^m > 1 \} = \Pr\{ n C_i^m > n \}$.
Note: $n C_i^m = \sum_{k \ne m} \sum_{j=1}^n x_i^k x_j^k x_j^m$, a sum of $n(p-1) \approx np$ random ±1s, with variance $\sigma^2 = np$.

Probability of Bit Instability
$\Pr\{ n C_i^m > n \} = \tfrac{1}{2}\!\left[ 1 - \operatorname{erf}\!\left( \tfrac{n}{\sqrt{2np}} \right) \right] = \tfrac{1}{2}\!\left[ 1 - \operatorname{erf}\!\left( \sqrt{n/2p} \right) \right] = \tfrac{1}{2}\!\left[ 1 - \operatorname{erf}\!\left( 1/\sqrt{2\alpha} \right) \right]$
[Figure from Hertz & al., Introduction to the Theory of Neural Computation.]
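The closed form is easy to tabulate; a sketch using math.erf (the sample loads are the ones tabulated on the next slide):

    from math import erf, sqrt

    def p_bit_error(alpha):
        """P_error = 1/2 [1 - erf(1/sqrt(2 alpha))] for load alpha = p/n."""
        return 0.5 * (1.0 - erf(1.0 / sqrt(2.0 * alpha)))

    for alpha in (0.105, 0.138, 0.185, 0.37, 0.61):
        print(f"alpha = {alpha:5.3f}   P_error = {p_bit_error(alpha):.4f}")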

Tabulated Probability of Single-Bit Instability

    P_error   alpha = p/n
    0.1%      0.105
    0.36%     0.138
    1%        0.185
    5%        0.37
    10%       0.61

(table from Hertz & al., Introduction to the Theory of Neural Computation)

Orthogonality of Random Bipolar Vectors of High Dimension
- 99.99% of the probability mass lies within 4σ of the mean
- $|\mathbf{u} \cdot \mathbf{v}| < 4\sigma$ iff $\lVert\mathbf{u}\rVert \lVert\mathbf{v}\rVert |\cos\theta| < 4\sqrt{n}$ iff $n|\cos\theta| < 4\sqrt{n}$ iff $|\cos\theta| < 4/\sqrt{n}$
- So it is 99.99% probable that random n-dimensional bipolar vectors will be within $\epsilon = 4/\sqrt{n}$ of orthogonal
- $\epsilon = 4\%$ for n = 10,000
- The probability of being less orthogonal than ε decreases exponentially with n:
  $\Pr\{ |\cos\theta| > \epsilon \} = \operatorname{erfc}\!\left( \epsilon\sqrt{n/2} \right) \approx \tfrac{1}{6} \exp(-\epsilon^2 n / 2) + \tfrac{1}{2} \exp(-2\epsilon^2 n / 3)$
- The brain gets approximate orthogonality by assigning random high-dimensional vectors

Spurious Attractors
- Mixture states:
  - sums or differences of odd numbers of retrieval states
  - their number increases combinatorially with p
  - shallower, smaller basins
  - basins of mixtures swamp basins of retrieval states: overload
  - useful as combinatorial generalizations?
  - self-coupling generates additional spurious attractors
- Spin-glass states:
  - not correlated with any finite number of imprinted patterns
  - occur beyond overload, because the weights are effectively random

Basins of Mixture States
[Figure: a mixture state $\mathbf{x}^{\mathrm{mix}}$ between three retrieval states $\mathbf{x}^{k_1}, \mathbf{x}^{k_2}, \mathbf{x}^{k_3}$, with $x_i^{\mathrm{mix}} = \operatorname{sgn}\!\left( x_i^{k_1} + x_i^{k_2} + x_i^{k_3} \right)$.]

Demonstration
Run Hopfield-Capacity Test

Fraction of Unstable Imprints (n = 100)
[Figure from Bar-Yam.]
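A Monte Carlo sketch of this kind of experiment (n = 100, in the spirit of the Bar-Yam figures; here a pattern counts as unstable if any bit flips on one synchronous update):

    import numpy as np

    rng = np.random.default_rng(7)

    def fraction_unstable(n=100, p=10, trials=20):
        """Fraction of imprinted random patterns that are not fixed points
        of a single synchronous update (no self-coupling)."""
        unstable = 0
        for _ in range(trials):
            X = rng.choice([-1, 1], size=(p, n))
            W = (X.T @ X) / n
            np.fill_diagonal(W, 0.0)
            for x in X:
                if not np.array_equal(np.sign(W @ x), x):
                    unstable += 1
        return unstable / (trials * p)

    for p in (5, 10, 15, 20, 30):
        print(f"p = {p:2d} (alpha = {p/100:.2f}): {fraction_unstable(p=p):.2f}")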

Number of Stable Imprints (n = 100)
[Figure from Bar-Yam.]

Number of Imprints with Basins of Indicated Size (n = 100)
[Figure from Bar-Yam.]

Summary of Capacity Results
- Absolute limit: $p_{\max} < \alpha_c n = 0.138\, n$
- If a small number of errors in each pattern is permitted: $p_{\max} \propto n$
- If all or most patterns must be recalled perfectly: $p_{\max} \propto n / \log n$
- Recall: all this analysis is based on random patterns
- Unrealistic, but sometimes it can be arranged

(Continued in Part III B.)
