Hopfield Neural Network


Lecture 4: Hopfield Neural Network

A Hopfield net is a form of recurrent artificial neural network invented by John Hopfield. Hopfield nets serve as content-addressable memory systems with binary threshold units, and they can also be used to solve optimization problems.

Hopfield Neural Network

A Hopfield network computes, for each neuron i,

u_i(n) = \sum_j w_{ij}\, y_j(n-1) + x_i(n), \qquad y_i(n) = f(u_i(n)),

where each feedback connection passes through a unit delay (DL). In a Hopfield network the synaptic weights are symmetric, w_{ij} = w_{ji}, and there is no self-feedback, w_{ii} = 0.

(Figure: a three-neuron Hopfield network with inputs x_1, x_2, x_3, outputs y_1, y_2, y_3, and delay (DL) elements in the feedback paths.)

A binary Hopfield network computes u_i(n) = \sum_j w_{ij}\, y_j(n-1) + x_i(n) and applies the threshold activation

y_i(n) = \begin{cases} 1 & u_i(n) > 0 \\ y_i(n-1) & u_i(n) = 0 \\ 0 & u_i(n) < 0 \end{cases}
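A minimal sketch of one update step in Python (the weight values are illustrative and `binary_update` is a name chosen here, not from the lecture):

```python
import numpy as np

def binary_update(W, y_prev, x):
    """One synchronous step of the binary Hopfield dynamics:
    u_i = sum_j w_ij * y_j(n-1) + x_i, thresholded by the three-case rule."""
    u = W @ y_prev + x
    y = y_prev.copy()
    y[u > 0] = 1   # u_i > 0: neuron switches on
    y[u < 0] = 0   # u_i < 0: neuron switches off; at u_i == 0 the previous output is kept
    return y

# Illustrative weights: symmetric with a zero diagonal, as required.
W = np.array([[ 0.0, -0.4, 0.2],
              [-0.4,  0.0, 0.5],
              [ 0.2,  0.5, 0.0]])
y0 = np.array([1, 1, 0])
x = np.zeros(3)
print(binary_update(W, y0, x))   # -> [0 0 1]
```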

Hopfield Neural Network

The energy E of the whole network can be determined from the energy function

E = -\frac{1}{2} \sum_i \sum_j w_{ij}\, y_i y_j - \sum_i x_i y_i.

When neuron i changes its output by \Delta y_i, the energy changes by

\Delta E = -\Big( \sum_j w_{ij}\, y_j + x_i \Big) \Delta y_i.

\Delta y_i is positive when the term in brackets is positive, and \Delta y_i is negative in the other case. Therefore the energy increment \Delta E for the whole network is never positive: the energy always decreases (or stays the same), however the inputs change.

So the following two statements can be made:
1. The energy function E is a Lyapunov function.
2. The Hopfield network is stable, in accordance with Lyapunov's theorem.

The ability to minimize the energy function within a very short convergence time makes the Hopfield network described above very useful for solving problems whose solutions are obtained by minimizing a cost function. Such a cost function can be rewritten in the form of the energy function E if the synaptic weights w_{ij} and the external inputs x_i can be determined in advance.
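The Lyapunov property can be watched directly. The sketch below (same illustrative weights as above) prints the energy after each asynchronous update; the printed sequence never increases:

```python
import numpy as np

def energy(W, y, x):
    """E = -(1/2) * sum_ij w_ij y_i y_j - sum_i x_i y_i."""
    return -0.5 * y @ W @ y - x @ y

W = np.array([[ 0.0, -0.4, 0.2],
              [-0.4,  0.0, 0.5],
              [ 0.2,  0.5, 0.0]])
x = np.zeros(3)
y = np.array([1.0, 1.0, 0.0])
rng = np.random.default_rng(0)

print(energy(W, y, x))
for _ in range(6):
    i = rng.integers(3)                  # pick one neuron at random (asynchronous mode)
    u = W[i] @ y + x[i]
    y[i] = 1.0 if u > 0 else (y[i] if u == 0 else 0.0)   # three-case threshold rule
    print(energy(W, y, x))               # non-increasing sequence
```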

Hopfield Neural Network

Hopfield networks can be operated in two modes:
- Synchronous mode: all neurons fire at the same time.
- Asynchronous mode: the neurons fire one at a time, at random.

Example: consider a Hopfield network with three neurons and weight matrix

W = \begin{bmatrix} 0 & -0.4 & 0.2 \\ -0.4 & 0 & 0.5 \\ 0.2 & 0.5 & 0 \end{bmatrix}

Let the state of the network be y(0) = [1, 1, 0].

Synchronous mode:
y(0) = [1, 1, 0]
y(1) = [0, 1, 0]
y(2) = [0, 0, 0]
y = y(3) = [0, 0, 0]

Asynchronous mode (the successor state depends on which neuron fires):
y(1) = [0, 1, 0] if N1 fires
y(1) = [1, 0, 0] if N2 fires
y(1) = [1, 1, 1] if N3 fires
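The asynchronous successors in this example can be checked with a few lines (a sketch; the handling of the u = 0 case is chosen to match the state sequences shown above, switching the neuron off):

```python
import numpy as np

W = np.array([[ 0.0, -0.4, 0.2],
              [-0.4,  0.0, 0.5],
              [ 0.2,  0.5, 0.0]])

def fire(y, i):
    """Asynchronous update of neuron i: output 1 iff its activation is positive."""
    y = y.copy()
    y[i] = 1 if W[i] @ y > 0 else 0
    return y

y0 = np.array([1, 1, 0])
for i in range(3):
    print(f"N{i+1} fires:", fire(y0, i))
# N1 fires: [0 1 0], N2 fires: [1 0 0], N3 fires: [1 1 1]
```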

State Table of the Hopfield N.N.

A Hopfield net with n neurons has 2^n possible states, assuming that each neuron output takes the two values 0 and 1. The state table for the example Hopfield network with 3 neurons lists, for every initial state, the next state when each single neuron fires:

Init. state   if N1 fires   if N2 fires   if N3 fires
000           000           000           000
001           101           011           000
010           010           000           011
011           011           011           011
100           000           100           101
101           101           111           101
110           010           100           111
111           011           111           111

Hopfield N.N. as BAM

Hopfield networks are used as content-addressable memory or Bidirectional Associative Memory (BAM). A content-addressable memory is a device that returns a stored pattern when given a noisy or incomplete version of it. In this sense a content-addressable memory is error-correcting, as it can override inconsistent information in the cue it is given.

The discrete Hopfield network as a memory device operates in two phases: a storage phase and a retrieval phase. During the storage phase the network learns its weights from the presented training examples. The training examples for this case of automated learning are binary vectors, also called fundamental memories. The weight matrix can be learned using the Widrow-Hoff rule. According to this rule, when an input pattern is passed to the network and the estimated network output does not match the given target, the corresponding weights are modified by a small amount. The difference from the single-layer perceptron is that no error is computed; rather, the target is taken directly for weight updating.
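The whole state table can be enumerated mechanically, as in this sketch (same illustrative weights and the same threshold convention as in the worked example):

```python
import numpy as np
from itertools import product

W = np.array([[ 0.0, -0.4, 0.2],
              [-0.4,  0.0, 0.5],
              [ 0.2,  0.5, 0.0]])

for state in product([0, 1], repeat=3):
    row = ["".join(map(str, state))]
    for i in range(3):                 # successor state if neuron i fires
        y = np.array(state)
        y[i] = 1 if W[i] @ y > 0 else 0
        row.append("".join(map(str, y)))
    print(*row)
```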

Widrow-Hoff Learning

Learning: the Widrow-Hoff learning rule computes the summation block of the i-th neuron,

u_i(n) = \sum_j w_{ij}\, y_j(n-1) + x_i(n),

and there are two cases to consider:

- if u_i(n) \ge 0 while the target output is y_i(n) = 0, the weights are decreased: w_{ij} \leftarrow w_{ij} - 1/n;
- if u_i(n) < 0 while the target output is y_i(n) = 1, the weights are increased: w_{ij} \leftarrow w_{ij} + 1/n;

where n denotes the number of neurons.

Outer Product Learning

Learning: suppose that we wish to store a set of N-dimensional vectors (binary words), denoted by \{\xi_m,\ m = 1, 2, \ldots, M\}. We call these M vectors fundamental memories, representing the patterns to be memorized by the network. The outer product learning rule, that is, the generalization of Hebb's learning rule, is

W = \frac{1}{N} \sum_{m=1}^{M} \xi_m \xi_m^{T} - \frac{M}{N} I_N.

From this defining equation of the synaptic weight matrix, we note the following:
- The output of each neuron in the network is fed back to all other neurons.
- There is no self-feedback in the network (i.e., w_{ii} = 0).
- The weight matrix of the network is symmetric (i.e., W^T = W).
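In matrix form the outer product rule is a one-liner, sketched here with the two memories used later in this lecture (`outer_product_weights` is an illustrative name):

```python
import numpy as np

def outer_product_weights(memories):
    """W = (1/N) * sum_m xi_m xi_m^T - (M/N) * I for bipolar fundamental memories."""
    X = np.asarray(memories, dtype=float)    # M x N matrix, one memory per row
    M, N = X.shape
    return (X.T @ X - M * np.eye(N)) / N     # symmetric, zero diagonal

W = outer_product_weights([[1, -1, 1], [-1, 1, -1]])
print(W)   # (1/3) * [[0, -2, 2], [-2, 0, -2], [2, -2, 0]]
```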

Learning Algorithm

Initialization: let the testing vector become the initial state x(0).
Repeat: update the components of the state x(t) asynchronously, and continue this updating until the state remains unchanged. Each update applies

u_i(n) = \sum_j w_{ij}\, y_j(n-1) + x_i(n) \ge 0 \;\Rightarrow\; y_i(n) = 1,
u_i(n) = \sum_j w_{ij}\, y_j(n-1) + x_i(n) < 0 \;\Rightarrow\; y_i(n) = 0.

Generate output: return the stable state (fixed point) as the result. The network finally produces a time-invariant state vector y which satisfies the stability condition

y = \operatorname{sgn}(W y + b).

During the retrieval phase a testing vector called the probe is presented to the network; this initiates the computation of the neuron outputs and the development of the state. After the training input is sent to the recurrent network, its output changes for a number of steps until a stable state is reached. The selection of the next neuron to fire is asynchronous, while the modifications of the state are deterministic. Once the state evolves to a stable configuration, that is, once the state is no longer updated, the network produces a solution. This solution state can be envisioned as a fixed point of the dynamical network system; the solution is obtained by this adaptive relaxation process.
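A minimal sketch of this retrieval loop for bipolar units (the function name, the random sweep order, and the convention sgn(0) = +1 are assumptions of the sketch):

```python
import numpy as np

def retrieve(W, probe, b=None, max_sweeps=100, seed=0):
    """Update bipolar units asynchronously until y = sgn(W y + b) holds."""
    rng = np.random.default_rng(seed)
    y = np.asarray(probe, dtype=float).copy()
    b = np.zeros(len(y)) if b is None else np.asarray(b, dtype=float)
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(y)):          # neurons fire in random order
            new = 1.0 if W[i] @ y + b[i] >= 0 else -1.0
            if new != y[i]:
                y[i], changed = new, True
        if not changed:                            # fixed point: state unchanged
            return y
    return y                                       # state after max_sweeps sweeps
```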

Summary of Hopfield Model

The operational procedure for the Hopfield network may now be summarized as follows:

1. Storage (Learning). Let x_1, x_2, ..., x_M denote a known set of N-dimensional memories. Construct the network by using the Widrow-Hoff rule or the outer product rule (i.e., Hebb's postulate of learning) to compute the synaptic weights of the network. The elements of each vector x_m equal ±1. Once they are computed, the synaptic weights are kept fixed.

2. Initialization. Let x_probe denote an unknown N-dimensional input vector (probe) presented to the network. The algorithm is initialized by setting y_j(0) = x_{j,probe}, j = 1, ..., N, where y_j(0) is the state of neuron j at time n = 0.

3. Iteration until Convergence. Update the elements of the state vector y(n) asynchronously (i.e., randomly and one at a time) according to the rule y(n+1) = sgn(W y(n)). Repeat the iteration until the state vector remains unchanged.

4. Outputting. Let y_fixed denote the fixed point (stable state) computed at the end of step 3. The resulting output vector of the network is y = y_fixed.

Example: consider a Hopfield network with 3 neurons, in which we want to store the two vectors (1, -1, 1) and (-1, 1, -1):

W = \frac{1}{3}\left( \xi_1 \xi_1^T + \xi_2 \xi_2^T - 2I \right) = \frac{1}{3} \begin{bmatrix} 0 & -2 & 2 \\ -2 & 0 & -2 \\ 2 & -2 & 0 \end{bmatrix}

The threshold applied to each neuron is assumed to be zero, and the corresponding network has no external input.
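The four steps can be put together in a short end-to-end sketch (the choice of probe and the convention sgn(0) = +1 are assumptions):

```python
import numpy as np

# 1. Storage: outer product rule for the memories (1,-1,1) and (-1,1,-1).
X = np.array([[1, -1, 1], [-1, 1, -1]], dtype=float)
M, N = X.shape
W = (X.T @ X - M * np.eye(N)) / N

# 2. Initialization: probe the network with (1, 1, 1), a corrupted memory.
y = np.array([1.0, 1.0, 1.0])

# 3. Iteration until convergence: asynchronous updates y_i = sgn(W_i . y).
rng = np.random.default_rng(1)
stable = False
while not stable:
    stable = True
    for i in rng.permutation(N):
        new = 1.0 if W[i] @ y >= 0 else -1.0
        if new != y[i]:
            y[i], stable = new, False

# 4. Outputting: the fixed point.
print(y)   # -> [ 1. -1.  1.], the nearest fundamental memory
```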

Summary of Hopfield Model

With this weight matrix, each stored vector reproduces itself and is therefore a stable state:

y(0) = (1, -1, 1):   sgn(W y(0)) = sgn(4/3, -4/3, 4/3) = (1, -1, 1) = y(0)
y(0) = (-1, 1, -1):  sgn(W y(0)) = sgn(-4/3, 4/3, -4/3) = (-1, 1, -1) = y(0)

Therefore, the network has two fundamental memories. The remaining initial states y(0) evolve under the same update rule to one of these two fundamental memories.
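This check is easy to script, reusing the weight matrix just constructed:

```python
import numpy as np

W = np.array([[ 0, -2,  2],
              [-2,  0, -2],
              [ 2, -2,  0]]) / 3.0

for xi in ([1, -1, 1], [-1, 1, -1]):
    y = np.sign(W @ np.array(xi))
    print(xi, "->", y)   # each stored pattern maps to itself
```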

Literature Cited

The material of this lecture is based on:

[1] Simon Haykin, Neural Networks: A Comprehensive Foundation, Prentice Hall, 1998.
[2] http://homepages.gold.ac.uk/nikolaev/cis3.htm