Hopfield networks. Lluís A. Belanche Soft Computing Research Group


1 Lluís A. Belanche Soft Computing Research Group Dept. de Llenguatges i Sistemes Informàtics (Software department) Universitat Politècnica de Catalunya

2 Introduction Content-addressable autoassociative memory: deals with incomplete/erroneous keys, e.g. Julia xxxxázar: Rxxxela. [Figure: an energy function H, with the attractors (memories) and their basins of attraction.] A mechanical system tends to minimum-energy states (e.g. a pole); symmetric Hopfield networks have an analogous behaviour: low-energy states = attractors = memories

3 Introduction A Hopfield network is a single-layer recurrent neural network, where the only layer has $n$ neurons (as many as the input dimension)

4 Introduction The data sample is a list of keys (aka patterns) $L = \{\xi_i \mid 1 \le i \le l\}$, with $\xi_i \in \{0,1\}^n$, that we want to associate with themselves. What is the meaning of autoassociativity? When a new pattern $\Psi$ is shown to the network, the output is the key in $L$ closest (in Hamming distance) to $\Psi$. Evidently, this can be achieved by direct programming (see the sketch below); yet in hardware it is quite complex (and the Hopfield net does it quite differently). We therefore set up two goals: 1. Content-addressable autoassociative memory (not easy). 2. Tolerance to small errors, which will be corrected (rather hard)
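For reference, the "direct programming" baseline is just a nearest-neighbour search in Hamming distance. A minimal sketch in Python; the keys and the probe are made-up examples:

```python
import numpy as np

def hamming_recall(keys, psi):
    """Return the stored key closest to psi in Hamming distance."""
    distances = [int(np.sum(xi != psi)) for xi in keys]
    return keys[int(np.argmin(distances))]

# Hypothetical 8-bit keys and a probe with one corrupted bit
keys = [np.array([0, 1, 1, 0, 1, 0, 0, 1]),
        np.array([1, 1, 0, 0, 0, 1, 1, 0])]
psi = np.array([0, 1, 1, 0, 1, 0, 1, 1])   # first key with bit 6 flipped
print(hamming_recall(keys, psi))           # -> [0 1 1 0 1 0 0 1]
```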

5 Intuitive analysis The state of the network is the vector of activations $e$ of the $n$ neurons. The state space is $\{0,1\}^n$. [Figure: the state space drawn as a hypercube, with stored keys $\xi_1, \xi_2, \xi_3$ at some of its corners.] When the network evolves, it traverses the hypercube

6 Formal analysis (I) For convenience, we change from $\{0,1\}$ to $\{-1,+1\}$, by $s = 2e - 1$. Dynamics of a neuron:
$$s_i := \mathrm{sgn}\left(\sum_{j=1}^{n} w_{ij}s_j - \theta_i\right), \quad\text{with}\quad \mathrm{sgn}(z) = \begin{cases} +1 & z \ge 0 \\ -1 & z < 0 \end{cases}$$
Let us simplify the analysis by using $\theta_i = 0$, and uncorrelated keys $\xi_i$ (from a uniform distribution).

7 Formal analysis (II) How are neurons chosen to update their activations? 1. Synchronous: all at once. 2. Asynchronous: there is a fixed sequential order, or a fair probability distribution over the neurons. The stored memories do not change with this choice; the final attractor (the recovered memory) may. We will assume asynchronous updating (see the sketch below). When does the process stop? When there are no more changes in the network state (we are in a stable state)
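A minimal sketch of asynchronous recall under the conventions above (states in $\{-1,+1\}$, $\theta_i = 0$, fair random order); the stopping test is a full sweep with no changes:

```python
import numpy as np

def recall_async(W, s, max_sweeps=100, rng=None):
    """Asynchronously update s until it is a fixed point of s_i := sgn(W[i] @ s)."""
    rng = rng or np.random.default_rng()
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):      # fair random update order
            new = 1 if W[i] @ s >= 0 else -1   # sgn with the convention sgn(0) = +1
            if new != s[i]:
                s[i], changed = new, True
        if not changed:                        # stable state: no neuron wants to flip
            return s
    return s
```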

8 Formal analysis (III) PROCEDURE: 1. Find a formula for setting the weights $w_{ij}$ so as to store the keys $\xi_i \in L$. 2. Sanity check: the keys $\xi_i \in L$ are stable (they are autoassociated with themselves). 3. Check that small perturbations of the keys $\xi_i \in L$ are corrected (they are associated with the closest pattern in $L$)

9 Formal analysis (IV) Case $l = 1$ (only one pattern $\xi$): present $\xi$ to the net: $s_i := \xi_i$, $1 \le i \le n$. Stability: $s_i := \mathrm{sgn}\left(\sum_{j=1}^{n} w_{ij}s_j\right) = \mathrm{sgn}\left(\sum_{j=1}^{n} w_{ij}\xi_j\right) = \ldots$ should be $\ldots = \xi_i$. One way to get this is $w_{ij} = \alpha\,\xi_i\xi_j$, with $\alpha \in \mathbb{R}$, $\alpha > 0$ (called Hebbian learning):
$$\mathrm{sgn}\left(\sum_{j=1}^{n} w_{ij}\xi_j\right) = \mathrm{sgn}\left(\sum_{j=1}^{n} \alpha\,\xi_i\xi_j\xi_j\right) = \mathrm{sgn}\left(\alpha\,\xi_i\sum_{j=1}^{n}1\right) = \mathrm{sgn}(\alpha n\,\xi_i) = \mathrm{sgn}(\xi_i) = \xi_i$$

10 Formal analysis (V) Take $\alpha = 1/n$. This has a normalizing effect: $\|w_i\|_2 = 1/\sqrt{n}$. Therefore $W_{n\times n} = (w_{ij}) = \frac{1}{n}\,\xi\xi^t$, i.e. $w_{ij} = \frac{1}{n}\xi_i\xi_j$. If the perturbation consists of $b$ flipped bits, where $b \le \frac{n-1}{2}$, then the sign of $\sum_{j=1}^{n} w_{ij}s_j$ does not change! $\Rightarrow$ if the number of correct bits exceeds that of incorrect bits, the network is able to correct the erroneous bits

11 Formal analysis (VI) Case $l \ge 1$: Superposition:
$$w_{ij} = \frac{1}{n}\sum_{k=1}^{l}\xi_{ki}\xi_{kj} \quad\text{and}\quad W_{n\times n} = \frac{1}{n}\sum_{k=1}^{l}\xi_k\xi_k^t$$
Defining $E_{n\times l} = [\xi_1, \ldots, \xi_l]$, we get $W_{n\times n} = \frac{1}{n}EE^t$. Note $W_{n\times n}$ is a symmetric matrix
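In matrix form the superposition rule is one line. A sketch with the patterns stacked as the rows of an array $P$, so that $E = P^t$ and $W = \frac{1}{n}EE^t = \frac{1}{n}P^tP$:

```python
import numpy as np

def hebbian_weights(patterns):
    """Superposition rule W = (1/n) sum_k xi_k xi_k^t, for patterns in {-1,+1}^n."""
    P = np.asarray(patterns)        # shape (l, n): one pattern per row
    n = P.shape[1]
    return P.T @ P / n              # equals (1/n) E E^t with E = P.T

rng = np.random.default_rng(0)
P = rng.choice([-1, 1], size=(3, 16))   # l = 3 random patterns, n = 16 neurons
W = hebbian_weights(P)
assert np.allclose(W, W.T)              # W is symmetric
```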

12 Formal analysis (VII) Stability of a pattern $\xi_v \in L$: initialize the net to $s_i := \xi_{vi}$, $1 \le i \le n$. Stability requires $s_i := \mathrm{sgn}(h_{vi}) = \xi_{vi}$, where
$$h_{vi} = \sum_{j=1}^{n} w_{ij}\xi_{vj} = \frac{1}{n}\sum_{j=1}^{n}\sum_{k=1}^{l}\xi_{ki}\xi_{kj}\xi_{vj} = \frac{1}{n}\sum_{j=1}^{n}\Big(\sum_{k=1,k\ne v}^{l}\xi_{ki}\xi_{kj}\xi_{vj} + \xi_{vi}\underbrace{\xi_{vj}\xi_{vj}}_{+1}\Big) = \xi_{vi} + \underbrace{\frac{1}{n}\sum_{j=1}^{n}\sum_{k=1,k\ne v}^{l}\xi_{ki}\xi_{kj}\xi_{vj}}_{\mathrm{crosstalk}(v,i)}$$
If $|\mathrm{crosstalk}(v, i)| < 1$ then $\mathrm{sgn}(h_{vi}) = \mathrm{sgn}(\xi_{vi} + \mathrm{crosstalk}(v, i)) = \mathrm{sgn}(\xi_{vi}) = \xi_{vi}$
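The crosstalk term can be measured directly from these definitions. A small sketch with random patterns, showing how the worst case grows with $l$:

```python
import numpy as np

def max_crosstalk(P):
    """Largest |crosstalk(v, i)| over all stored patterns v and units i."""
    n = P.shape[1]
    W = P.T @ P / n                 # full Hebbian weights (diagonal included)
    H = P @ W                       # row v holds h_v = W xi_v (W is symmetric)
    return np.abs(H - P).max()      # h_vi = xi_vi + crosstalk(v, i)

rng = np.random.default_rng(1)
n = 200
for l in (5, 20, 60):
    P = rng.choice([-1, 1], size=(l, n))
    print(l, round(float(max_crosstalk(P)), 2))   # grows as l approaches n
```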

13 Formal analysis (and VIII) Analogously to the case $l = 1$: if the number of correct bits of a pattern $\xi_v \in L$ exceeds that of incorrect bits, the network is able to correct the bits in error (the pattern $\xi_v \in L$ is indeed a memory). When is $|\mathrm{crosstalk}(v, i)| < 1$? If $l \ll n$ nothing goes wrong, but as $l$ gets closer to $n$, the crosstalk term becomes quite strong compared to $\xi_v$: there is no stability. The million euro question: given a network of $n$ neurons, what is the maximum value for $l$? What is the capacity of the network?

14 Capacity analysis For random uncorrelated patterns (for convenience):
$$l_{\max} = \begin{cases} \dfrac{n}{2\ln(n) + \ln\ln(n)} & \text{perfect recall} \\ 0.138\,n & \text{small errors } (\approx 1.6\,\%) \end{cases}$$
Realistic patterns will not be random, though some coding can make them more so. An alternative setting is to have negatively correlated patterns: $\xi_v \cdot \xi_u \le 0$, $v \ne u$. This condition is equivalent to stating that the number of different bits is at least the number of equal bits
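A quick numeric reading of these formulas (a sketch; the constants are those quoted on the slide):

```python
import math

def l_max_perfect(n):
    """Perfect-recall capacity: n / (2 ln n + ln ln n)."""
    return n / (2 * math.log(n) + math.log(math.log(n)))

for n in (100, 1000, 10000):
    print(n, round(l_max_perfect(n), 1), round(0.138 * n, 1))
# n = 1000: about 63 patterns with perfect recall, about 138 tolerating small errors
```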

15 Spurious states When some pattern $\xi$ is stored, the reversed pattern $-\xi$ is also stored. In consequence, all initial configurations of $\xi$ where the majority of bits are incorrect will end up in $-\xi$! Hardly surprising: from the point of view of $-\xi$, there are more correct bits than incorrect bits. The network can be made more stable (fewer spurious states) if we set $w_{ii} = 0$ for all $i$:
$$W_{n\times n} = \frac{1}{n}\sum_{k=1}^{l}\xi_k\xi_k^t - \frac{l}{n}I_{n\times n} = \frac{1}{n}(EE^t - l\,I_{n\times n})$$

16 Example $\xi_1 = (+1, -1, -1)^t$, $\xi_2 = (-1, +1, +1)^t$, so
$$E_{3\times 2} = \begin{pmatrix} +1 & -1 \\ -1 & +1 \\ -1 & +1 \end{pmatrix}, \qquad W = \frac{1}{3}(EE^t - 2I) = \frac{2}{3}\begin{pmatrix} 0 & -1 & -1 \\ -1 & 0 & +1 \\ -1 & +1 & 0 \end{pmatrix}$$
Presenting any one-bit perturbation, e.g. $(+1, +1, -1) \mapsto (+1, -1, -1) = \xi_1$ and $(+1, +1, +1) \mapsto (-1, +1, +1) = \xi_2$: the network is able to correct 1 erroneous bit
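A sketch reproducing this example numerically; note that the concrete pattern values above are reconstructed from the garbled original (cross-checked against the energy-function exercise below), so treat them as an assumption:

```python
import numpy as np

xi1 = np.array([+1, -1, -1])        # reconstructed stored patterns (an assumption)
xi2 = -xi1
E = np.column_stack([xi1, xi2])     # E_{3x2}
n, l = E.shape
W = (E @ E.T - l * np.eye(n)) / n   # W = (1/n)(E E^t - l I): zero diagonal

def recall_async(W, s, max_sweeps=10):
    """Sequential asynchronous updates until a full sweep changes nothing."""
    s = s.copy()
    for _ in range(max_sweeps):
        old = s.copy()
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
        if np.array_equal(s, old):
            return s
    return s

print(recall_async(W, np.array([+1, +1, -1])))   # one wrong bit -> recovers xi1
```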

17 Practical applications (I) A Hopfield network of 120 nodes (12×10) that stores and recovers several digits. [Figure: the set of eight stored patterns, and the output sequence for a distorted input "3" over a number of iterations. Taken from Redes Neuronales Artificiales, Facultat d'Informàtica, UPV, 18 December 2001.]

18 Practical applications (II) A Hopfield network that stores fighter aircraft shapes is able to reconstruct them. The network operates on the video images supplied by a camera on board a USAF bomber

19 The energy function (I) If the weights are symmetric, there exists a so-called energy function:
$$H(s) = -\sum_{i=1}^{n}\sum_{j>i} w_{ij}s_is_j$$
which is a function of the state of the system. The central property of $H$ is that it always decreases (or remains constant) as the system evolves. Therefore the attractors (the stored memories) have to be the local minima of the energy function
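A direct transcription of $H$, plus a check of the monotonicity claim along an asynchronous trajectory; for zero-diagonal symmetric $W$, the pairwise sum equals $-\frac{1}{2}s^tWs$ (random instance, a sketch):

```python
import numpy as np

def energy(W, s):
    """H(s) = -sum_{i<j} w_ij s_i s_j  (= -0.5 s^t W s for zero-diagonal W)."""
    return -0.5 * s @ W @ s

rng = np.random.default_rng(2)
P = rng.choice([-1, 1], size=(3, 12))
W = P.T @ P / 12
np.fill_diagonal(W, 0)
s = rng.choice([-1, 1], size=12)
for _ in range(20):                       # single asynchronous updates
    i = rng.integers(12)
    s[i] = 1 if W[i] @ s >= 0 else -1
    print(round(float(energy(W, s)), 3))  # never increases
```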

20 The energy function (II) Proposition 1: a change of state in one neuron $i$ yields a change $\Delta H < 0$. Let $h_i = \sum_{j=1}^{n} w_{ij}s_j$. If neuron $i$ changes, it is because $s_i \ne \mathrm{sgn}(h_i)$. Therefore $\Delta H = -(-s_i)h_i - (-s_ih_i) = 2s_ih_i < 0$. Proposition 2: $H(s) \ge -\sum_{i=1}^{n}\sum_{j=1}^{n}|w_{ij}|$ (the energy function is lower-bounded)
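A numeric check of the identity behind Proposition 1: flipping any single neuron changes the energy by exactly $2s_ih_i$, which is negative precisely for neurons that disagree with $\mathrm{sgn}(h_i)$. A sketch on a random symmetric instance:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10
W = rng.normal(size=(n, n)); W = (W + W.T) / 2   # symmetric weights
np.fill_diagonal(W, 0)
s = rng.choice([-1, 1], size=n)
H = lambda x: -0.5 * x @ W @ x

for i in range(n):
    s2 = s.copy(); s2[i] = -s2[i]                # flip neuron i
    assert np.isclose(H(s2) - H(s), 2 * s[i] * (W[i] @ s))
# for a misaligned neuron (s_i != sgn(h_i)) this quantity is < 0
```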

21 The energy function (III) The energy function $H$ is also useful to derive the proposed weights. Consider the case of only one pattern $\xi$: we want $H$ to be minimum when the correlation between the current state $s$ and $\xi$ is maximum, equal to $\langle\xi, s\rangle = \langle\xi, \xi\rangle = \|\xi\|^2 = n$. Therefore we choose $H(s) = -K\frac{1}{n}\langle\xi, s\rangle^2$, where $K > 0$ is a suitable constant. For the many-pattern case, we can try to make each of the $\xi_v$ into local minima of $H$:
$$H(s) = -K\frac{1}{n}\sum_{v=1}^{l}\langle\xi_v, s\rangle^2 = -K\frac{1}{n}\sum_{v=1}^{l}\left(\sum_{i=1}^{n}\xi_{vi}s_i\right)\left(\sum_{j=1}^{n}\xi_{vj}s_j\right) = -K\sum_{i=1}^{n}\sum_{j=1}^{n}\left(\frac{1}{n}\sum_{v=1}^{l}\xi_{vi}\xi_{vj}\right)s_is_j = -K\sum_{i=1}^{n}\sum_{j=1}^{n}w_{ij}s_is_j = -\sum_{i=1}^{n}\sum_{j>i}w_{ij}s_is_j$$
taking $K = 1/2$ (by symmetry of $W$; the diagonal terms only contribute a constant)

22 The energy function (and IV) Exercises: 1. In the previous example, a) check that $H(s) = \frac{2}{3}(s_1s_2 - s_2s_3 + s_1s_3)$; b) show that $H$ decreases as the network evolves towards the stored patterns. 2. Prove Proposition 2

23 Spurious states (continued) We have not shown that the explicitly stored memories are the only attractors of the system. We have seen that the reversed patterns $-\xi$ are also stored (they have the same energy as $\xi$). There are also stable mixture states, linear combinations of an odd number of patterns, e.g.:
$$\xi_i^{(mix)} = \mathrm{sgn}\left(\sum_{k=1}^{\alpha}\pm\xi_{v_k,i}\right), \quad 1 \le i \le n, \quad \alpha \text{ odd}$$
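A sketch that builds a 3-mixture of random stored patterns and checks that it is a fixed point of the dynamics (for large $n$ and low loading this is typically the case):

```python
import numpy as np

rng = np.random.default_rng(4)
n, l = 500, 3
P = rng.choice([-1, 1], size=(l, n))
W = P.T @ P / n
np.fill_diagonal(W, 0)

mix = np.sign(P.sum(axis=0))        # sgn(xi_1 + xi_2 + xi_3): alpha = 3, all signs +
stable = np.array_equal(np.where(W @ mix >= 0, 1, -1), mix)
print(stable)                       # typically True: the mixture is a spurious attractor
```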

24 Dynamics (continued) Let us change the dynamics of a neuron by using an activation with memory:
$$s_i(t+1) = \begin{cases} +1 & \text{if } h_i(t+1) > 0 \\ s_i(t) & \text{if } h_i(t+1) = 0 \\ -1 & \text{if } h_i(t+1) < 0 \end{cases}$$
where $h_i(t+1) = \sum_{j=1}^{n} w_{ij}s_j(t)$
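A one-line transcription of this rule as a synchronous step (ties keep the previous state):

```python
import numpy as np

def step_with_memory(W, s):
    """s_i(t+1) = sgn(h_i) if h_i != 0, else the previous s_i(t)."""
    h = W @ s
    return np.where(h > 0, 1, np.where(h < 0, -1, s))
```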

25 Application to optimization problems Problem: find the optimum (or something close to it) of a scalar function subject to a set of constraints, e.g. the traveling-salesman problem (TSP) or graph bipartitioning (GBP); often we deal with NP-complete problems. TSP(n) = tour $n$ cities with the minimum total distance; TSP(60) $\approx 6.9 \cdot 10^{79}$ possible valid tours. In general, TSP(n) has $\frac{n!}{2n}$ tours. GBP(n) = given a graph with an even number $n$ of nodes, partition it into two subgraphs of $n/2$ nodes each such that the number of crossing edges is minimum. Idea: set up the problem as a function to be minimized, choosing the weights so that the obtained function is the energy function of a Hopfield network

26 Application to graph bipartitioning Representation: $n$ graph nodes = $n$ neurons, with
$$S_i = \begin{cases} +1 & \text{node } i \text{ in left part} \\ -1 & \text{node } i \text{ in right part} \end{cases}$$
Connectivity matrix: $c_{ij} = 1$ indicates a connection between nodes $i$ and $j$.

27 Application to graph bipartitioning
$$H(S) = -\underbrace{\sum_{i=1}^{n}\sum_{j>i}c_{ij}S_iS_j}_{\text{connections between parts}} + \frac{\alpha}{2}\underbrace{\Big(\sum_{i=1}^{n}S_i\Big)^2}_{\text{bipartition}} = -\sum_{i=1}^{n}\sum_{j>i}(c_{ij}-\alpha)S_iS_j + \text{const}$$
Identifying this with $-\sum_{i=1}^{n}\sum_{j>i}w_{ij}S_iS_j + \sum_{i=1}^{n}\theta_iS_i$ gives $w_{ij} = c_{ij} - \alpha$, $\theta_i = 0$ (general idea: $w_{ij}$ is minus the coefficient of $S_iS_j$ in the energy function)
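A sketch of this mapping for a small made-up graph; the balance penalty $\alpha$ is a constant to be tuned (arbitrarily 1 here):

```python
import numpy as np

def gbp_weights(C, alpha=1.0):
    """Graph bipartitioning as a Hopfield net: w_ij = c_ij - alpha, theta_i = 0."""
    W = C - alpha
    np.fill_diagonal(W, 0)          # no self-connections
    return W

# Tiny example: two triangles (nodes 0-2 and 3-5) joined by the edge (2, 3)
C = np.zeros((6, 6))
for i, j in [(0,1), (0,2), (1,2), (3,4), (3,5), (4,5), (2,3)]:
    C[i, j] = C[j, i] = 1
W = gbp_weights(C)
# Asynchronous descent from a random state tends to S = (+,+,+,-,-,-) or its reverse,
# the balanced cut with a single crossing edge
```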

28 Application to the traveling-salesman problem Representation: 1 city $\to$ $n$ neurons, coding for position; hence $n$ cities $\to$ $n^2$ neurons. $S_{n\times n} = (S_{ij})$ (network states); $D_{n\times n}$ (distances between cities, given); with $S_{ij} = +1$ if city $i$ is visited in position $j$ and $-1$ otherwise. Restrictions on $S$:
1. A city cannot be visited more than once $\Rightarrow$ every row $i$ must have only one $+1$: $R_1(S) = \sum_{i=1}^{n}\sum_{j=1}^{n}\sum_{k>j}S_{ij}S_{ik} = 0$
2. A city can only be visited in sequence (no gift of ubiquity) $\Rightarrow$ every column $j$ must have only one $+1$: $R_1(S^t) = 0$
3. Every city must be visited $\Rightarrow$ there must be $n$ $+1$s overall: $R_2(S) = \left(\sum_{i=1}^{n}\sum_{j=1}^{n}S_{ij} - n\right)^2 = 0$

29 Application to the traveling-salesman problem 4. The tour has to be the shortest possible $\Rightarrow$ minimize the total sum of distances in the tour: $R_3(S, D) = \sum_{k=1}^{n}\sum_{i=1}^{n}\sum_{j>i}D_{ij}S_{ik}(S_{j,k-1} + S_{j,k+1})$. We then require
$$H(S) = -\sum_{i=1}^{n}\sum_{j=1}^{n}\sum_{k>i}\sum_{l>j}w_{ij,kl}S_{ij}S_{kl} + \sum_{i=1}^{n}\sum_{j=1}^{n}\theta_{ij}S_{ij} \equiv \alpha R_1(S) + \beta R_1(S^t) + \gamma R_2(S) + \epsilon R_3(S, D)$$
$$\Rightarrow\ w_{ij,kl} = -\alpha\delta_{ik}(1-\delta_{jl}) - \beta\delta_{jl}(1-\delta_{ik}) - \gamma - \epsilon D_{ik}(\delta_{j,l+1} + \delta_{j,l-1}) \quad\text{and}\quad \theta_{ij} = \gamma n$$
This is a continuous Hopfield network: $S_{ij} \in (-1, 1)$, since $S_{ij} := \tanh(\chi(h_{ij} - \theta_{ij}))$. Problem: set up good values for $\alpha, \beta, \gamma, \epsilon > 0$, $\chi > 0$
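A minimal sketch of the continuous update rule; the weight tensor is left abstract, and $\chi$ (like the penalty constants behind it) is an assumption to be tuned:

```python
import numpy as np

def continuous_step(W4, theta, S, chi=1.0):
    """One sweep of S_ij := tanh(chi * (h_ij - theta_ij)) for an n x n state matrix.

    W4 is the weight tensor w_{ij,kl} with shape (n, n, n, n);
    theta is the n x n threshold matrix (gamma * n everywhere in the TSP mapping).
    """
    h = np.einsum('ijkl,kl->ij', W4, S)     # h_ij = sum_kl w_{ij,kl} S_kl
    return np.tanh(chi * (h - theta))
```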

30 Application to the traveling-salesman problem Example for $n = 5$: 120 valid tours (12 different), 25 neurons $\Rightarrow 2^{25}$ potential attractors; actual number of them: 1740 (among them, the 120 valid ones). $1740 - 120 = 1620$ spurious attractors (invalid tours). The valid tours correspond to the states of lower energy (correct design). But... we have a network with far more invalid attractors than valid ones (1620 vs. 120): a random initialization will most likely end up in an invalid tour. Moreover, we aim for the best tour! A possible workaround is to use stochastic units with simulated annealing
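A sketch of the stochastic-unit idea: Glauber-style probabilistic updates with a decreasing temperature (the geometric cooling schedule and its constants are assumptions):

```python
import numpy as np

def anneal(W, s, T0=2.0, cooling=0.95, sweeps=100, rng=None):
    """Stochastic Hopfield dynamics: flip probabilities follow a sigmoid in h_i / T."""
    rng = rng or np.random.default_rng()
    s = s.copy()
    T = T0
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            p = 1.0 / (1.0 + np.exp(-2.0 * (W[i] @ s) / T))   # P(s_i = +1)
            s[i] = 1 if rng.random() < p else -1
        T *= cooling               # cool down: the dynamics becomes deterministic
    return s
```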
