Artificial Intelligence: Hopfield Networks
Andrea Torsello

Network Topologies
- Single-layer recurrent network
- Bidirectional, symmetric connections
- Binary / continuous units
- Applications: associative memory, optimization problems

Hopfield Model: Discrete Case

A recurrent neural network that uses McCulloch and Pitts (binary) neurons; the update rule is stochastic. Each neuron has two states, V_i^L and V_i^H, with V_i^L = -1, V_i^H = 1 (usually V_i^L = 0, V_i^H = 1). The input to neuron i is

u_i = Σ_j w_ij V_j + I_i

where:
- w_ij = strength of the connection from j to i
- V_j = state (or output) of neuron j
- I_i = external input to neuron i

Hopfield Model: Discrete Case

Each neuron updates its state asynchronously, using the following rule:

V_i ← +1 if u_i = Σ_j w_ij V_j + I_i ≥ 0, and V_i ← -1 otherwise.

The updating of states is a stochastic process. To select the neurons to be updated we can proceed in either of two ways:
- At each time step, select at random a unit i to be updated (useful for simulation).
- Let each unit independently choose to update itself with some constant probability per unit time (useful for modeling and hardware implementation).
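As a concrete illustration, here is a minimal Python sketch of these asynchronous dynamics under the ±1 convention with sign(0) = +1; the function names and the random-sweep selection scheme are illustrative choices, not part of the original slides:

```python
import numpy as np

def async_update(W, V, I, i):
    """Threshold rule: V_i <- +1 if sum_j w_ij V_j + I_i >= 0, else -1."""
    u = W[i] @ V + I[i]            # input u_i to neuron i
    V[i] = 1.0 if u >= 0 else -1.0
    return V

def run_to_fixed_point(W, V, I, rng, max_sweeps=100):
    """Update randomly chosen units until a full sweep changes nothing."""
    n = len(V)
    for _ in range(max_sweeps):
        old = V.copy()
        for i in rng.permutation(n):   # one sweep, units in random order
            async_update(W, V, I, i)
        if np.array_equal(V, old):     # fixed point reached: V(t+1) = V(t)
            break
    return V
```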

Dynamics of the Hopfield Model

In contrast to feed-forward networks (which are static), Hopfield networks are dynamical systems. The network starts from an initial state V(0) = (V_1(0), ..., V_n(0))^T and evolves in state space following a trajectory until it reaches a fixed point: V(t+1) = V(t).

Dynamics of Hopfield Networks

What is the dynamical behavior of a Hopfield network? Does it converge? Does it produce cycles?

To study the dynamical behavior of Hopfield networks we make the following assumption:

w_ij = w_ji for all i, j = 1...n

In other words, if W = (w_ij) is the weight matrix, we assume W = W^T. In this case the network always converges to a fixed point: the system possesses a Lyapunov (or energy) function that is minimized as the process evolves.

The Energy Function: Discrete Case

Consider the following real function:

E = -(1/2) Σ_i Σ_j w_ij V_i V_j - Σ_i I_i V_i

and let ΔE = E(t+1) - E(t). Assuming that neuron h has changed its state, we have

ΔE = -ΔV_h ( Σ_j w_hj V_j + I_h ) = -ΔV_h H_h

But H_h and ΔV_h have the same sign. Hence ΔE ≤ 0 (provided that W = W^T).
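The monotone decrease of E can be checked numerically. Below is a minimal sketch, assuming ±1 units, a symmetric zero-diagonal weight matrix, and the sign(0) = +1 convention; the random test setup is illustrative:

```python
import numpy as np

def energy(W, V, I):
    """E = -1/2 * sum_ij w_ij V_i V_j - sum_i I_i V_i."""
    return -0.5 * V @ W @ V - I @ V

rng = np.random.default_rng(0)
n = 50
W = rng.standard_normal((n, n))
W = (W + W.T) / 2                 # enforce the assumption W = W^T
np.fill_diagonal(W, 0)
V = rng.choice([-1.0, 1.0], size=n)
I = np.zeros(n)

E = [energy(W, V, I)]
for _ in range(500):              # random asynchronous updates
    i = rng.integers(n)
    V[i] = 1.0 if W[i] @ V + I[i] >= 0 else -1.0
    E.append(energy(W, V, I))

# E is non-increasing along the trajectory (up to floating-point error)
assert all(e1 <= e0 + 1e-9 for e0, e1 in zip(E, E[1:]))
```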

[Figure: schematic configuration space model with three attractors]

Hopfield Net as Associative Memory

Store a set of p patterns x^µ, µ = 1,...,p, in such a way that, when presented with a new pattern x, the network responds by producing whichever stored pattern most closely resembles x.

- N binary units, with outputs s_1,...,s_N
- Stored patterns and test patterns are binary (0/1 or ±1)
- Connection weights given by the Hebb rule: w_ij = (1/N) Σ_µ x_i^µ x_j^µ. Hebb suggested changes in synaptic strength proportional to the correlation between the firing of the pre- and post-synaptic neurons.
- Recall mechanism: synchronous / asynchronous updating
- Pattern information is stored in the equilibrium states of the network
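A minimal sketch of Hebb-rule storage and asynchronous recall, assuming ±1 patterns and no external input; the names store and recall are illustrative:

```python
import numpy as np

def store(patterns):
    """Hebb rule: w_ij = (1/N) * sum_mu x_i^mu x_j^mu, with w_ii = 0."""
    X = np.asarray(patterns, dtype=float)   # shape (p, N), entries ±1
    N = X.shape[1]
    W = X.T @ X / N
    np.fill_diagonal(W, 0)
    return W

def recall(W, x, sweeps=20, rng=None):
    """Asynchronous recall starting from a (possibly corrupted) probe x."""
    if rng is None:
        rng = np.random.default_rng()
    V = np.array(x, dtype=float)
    for _ in range(sweeps):
        for i in rng.permutation(len(V)):
            V[i] = 1.0 if W[i] @ V >= 0 else -1.0
    return V
```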

Example with Two Patterns

Two patterns are stored; computing the weights with the Hebb rule gives the weight matrix W. Recall:
- Input (-1,-1,-1,+1) → (-1,-1,-1,+1)   (stable)
- Input (-1,-1,-1,-1) → (+1,+1,+1,+1)   (stable)
- Input (-1,+1,+1,+1) → (-1,-1,-1,-1)   (spurious)
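The slide's stored patterns and weight matrix are not reproduced in the transcription, so the two 8-unit patterns below are purely hypothetical stand-ins; using the store/recall sketch above, they show the same qualitative behavior:

```python
x1 = np.array([ 1,  1,  1,  1,  1,  1,  1,  1])
x2 = np.array([ 1, -1,  1, -1,  1, -1,  1, -1])
W = store([x1, x2])

print(recall(W, [ 1, -1,  1,  1,  1,  1,  1,  1]))  # one corrupted bit: recovers x1
print(recall(W, [ 1,  1,  1, -1,  1, -1,  1, -1]))  # one corrupted bit: recovers x2
```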

An example of the behavior of a Hopfield net when used as a content-addressable memory. A 120-node net was trained using the eight exemplars shown in (A). The pattern for the digit 3 was corrupted by randomly reversing each bit with a probability of 0.25 and then applied to the net at time zero. Outputs at time zero and after the first seven iterations are shown in (B).

Example of how an associative memory can reconstruct images. These are binary images with 130 x 180 pixels. The images on the right were recalled by the memory after presentation of the corrupted images shown on the left. The middle column shows some intermediate states. A sparsely connected Hopfield network with seven stored images was used.

Storage Capacity of Hopfield Networks

There is a maximum limit on the number of random patterns that a Hopfield network can store: p_max ≈ 0.15N. If p < 0.15N, recall is almost perfect. If the memory patterns are orthogonal vectors instead of random patterns, more patterns can be stored; however, this is not useful in practice. Caveats:
- The evoked memory is not necessarily the stored pattern most similar to the input pattern.
- Not all patterns are remembered with equal emphasis; some are evoked inappropriately often.
- Sometimes the network evokes spurious states.
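The ≈ 0.15N figure can be probed empirically. A rough sketch reusing store and recall from above; the network size, pattern counts, and sweep budget are arbitrary choices:

```python
rng = np.random.default_rng(1)
N = 100
for p in (5, 10, 15, 20, 25):
    X = rng.choice([-1.0, 1.0], size=(p, N))    # p random patterns
    W = store(X)
    exact = sum(np.array_equal(recall(W, x, sweeps=5, rng=rng), x) for x in X)
    print(f"p/N = {p/N:.2f}: {exact}/{p} patterns recalled exactly")
# Recall is near-perfect well below p/N ~ 0.15 and degrades above it.
```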