Week 4: Hopfield Network

Phong Le, Willem Zuidema
November 20, 2013

Last week we studied the multi-layer perceptron, a neural network in which information is only allowed to flow in one direction (from input units to output units), used for supervised learning. This week we will study another famous network, the Hopfield net, in which information is also allowed to flow backward.

Required Library
We use the library png for reading and writing png files. Install it by typing install.packages("png").

Required R Code
At http://www.illc.uva.nl/laco/clas/fncm13/assignments/computerlab-week4/ you can find the R-files you need for this exercise.

1 Definition

A Hopfield network is a recurrent neural network in which every neuron is both an input and an output unit, each neuron i is a perceptron with the binary threshold activation function, and each pair of neurons (i, j) is connected by two weighted links w_{ij} and w_{ji}. Formally, a neuron is characterized by

    v_i(t) = \mathrm{sign}(u_i(t)) = \begin{cases} 1 & \text{if } u_i(t) \geq 0 \\ -1 & \text{otherwise} \end{cases}    (1)

where u_i(t), the total input at time t, is computed by

    u_i(t) = \sum_{j \neq i} v_j(t-1) \, w_{ji} + \theta_i

A neuron i is called stable at time t if v_i(t) = v_i(t-1) = \mathrm{sign}(u_i(t-1)). A Hopfield net is called stable if all of its neurons are stable. Hereafter, we drop the time t and assume that all biases are 0; i.e., we write u_i instead of u_i(t), and \theta_i = 0 for all i.

2 Activation

There are two ways to update a Hopfield net: asynchronous, where only one neuron, picked at random, is updated at a time, and synchronous, where all neurons are updated at the same time. It can be proved that, if the net is symmetric, i.e. w_{ij} = w_{ji} for all i, j, then it will converge to a stable point (see Theorem 2, page 51, Kröse and van der Smagt (1996)), which is a local minimum of the following energy function

    E = -\frac{1}{2} \sum_{i,j} w_{ij} v_i v_j = -\frac{1}{2} \mathbf{v}^T W \mathbf{v}
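The file hopfield.r implements this activation procedure for you. Purely to illustrate the definitions above, a minimal sketch of one asynchronous update step and of the energy function could look as follows (the names async.update and energy are made up for this illustration and are not part of hopfield.r):

async.update = function(W, v) {
  i = sample(length(v), 1)              # pick one neuron at random
  u = sum(W[-i, i] * v[-i])             # total input u_i = sum over j != i of v_j w_ji
  v[i] = ifelse(u >= 0, 1, -1)          # binary threshold activation
  v
}

energy = function(W, v) {
  -0.5 * as.numeric(t(v) %*% W %*% v)   # E = -1/2 v^T W v
}

For a symmetric weight matrix with zero diagonal, repeated calls to async.update can never increase energy(W, v), which is why the net settles into a local minimum.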

Exercise 1: In this exercise, you will study how a Hopfield net updates its state. Let's consider the network given in Figure 1.

Figure 1: A Hopfield net.

1. First of all, you need to load the file hopfield.r (by typing source("hopfield.r")).

2. Then, create a Hopfield net by creating a weight matrix and putting it in a list as follows

weights = matrix(c( 0,  1, -2,
                    1,  0,  1,
                   -2,  1,  0), 3, 3)
hopnet = list(weights = weights)

3. Set the initial state init.y = c(1,-1,1).

4. Next, activate the net by typing run.hopfield(hopnet, init.y, stepbystep=T, topo=c(3,1)). You should see the output below

weights =
     [,1] [,2] [,3]
[1,]    0    1   -2
[2,]    1    0    1
[3,]   -2    1    0
input pattern = 1 -1 1

and a black-and-white plot illustrating the current state of the network (black represents -1, white 1).

5. Press Enter; you should see something like this

iter 1
pick neuron 2
input to this neuron 2
output of this neuron 1
new state 1 1 1

This means that, in the first iteration, we pick neuron number 2. The total input to this neuron is 2, and hence its output is 1. (A hand computation of this step is sketched after this exercise.)

6. Press Enter three or four times to see what happens next.

7. Now, do all the steps above again with your own weight matrix and initial state. Report what you get (just copy what you see in your R console and paste it into your report).
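As a sanity check of step 5, you can reproduce the update of neuron 2 by hand, using nothing but the definition from Section 1 (this snippet is only an illustration, not part of hopfield.r):

weights = matrix(c( 0,  1, -2,
                    1,  0,  1,
                   -2,  1,  0), 3, 3)
state = c(1, -1, 1)                 # the initial state init.y
u2 = sum(weights[, 2] * state)      # input to neuron 2: 1*1 + 0*(-1) + 1*1 = 2
state[2] = ifelse(u2 >= 0, 1, -1)   # its output becomes 1
state                               # new state: 1 1 1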

Exercise 2: To see why symmetry is important for convergence, let's consider a 2-neuron Hopfield net with the weight matrix

    W = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}

Your task is to examine whether or not the net converges if the initial state is v = (1, -1)^T. To do so, load the R file hopfield.r (source("hopfield.r")) and create the corresponding Hopfield net by typing

hopnet = list(weights = matrix(c(0, 1, -1, 0), 2, 2))

Set the initial state init.y = c(1,-1) and activate the net by typing

run.hopfield(hopnet, init.y, maxit = 10, stepbystep=T, topo=c(2,1))

(note that maxit is the number of times we pick a neuron to activate). Can you conclude anything about the convergence?

Now set the weight matrix to the symmetric

    W = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}

and do all the steps above again. Does the net reach a stable state?

3 Learning with Hebb's rule

A Hopfield net is an associative memory in the sense that it can be used to store patterns, and a stored pattern can be retrieved from an incomplete input. The principle behind it is that the weight matrix is set such that the stored patterns correspond to stable points; therefore, given an incomplete input pattern, the net will iterate until it converges to a stable point, which corresponds to the stored pattern most similar to the input.

Given m patterns p_k = (p_{k,1}, ..., p_{k,n})^T, k = 1..m, the Hebbian learning rule can be used to set up the weights as follows

    w_{ij} = \begin{cases} \sum_{k=1}^{m} p_{k,i} p_{k,j} & \text{if } i \neq j \\ 0 & \text{otherwise} \end{cases}

which can be written compactly as

    W = \sum_{k=1}^{m} p_k p_k^T - mI

Unfortunately, the learning rule above is not perfect. In order to decide when to apply this rule and which net structure to use, let's take a stored pattern, say p_1, and see when the net converges to that pattern. Recall that the net is stable at p_1 when all neurons are stable, i.e. v = \mathrm{sign}(u) = \mathrm{sign}(W p_1) = p_1. Analysing u we have

    u = W p_1 = (p_1 p_1^T + ... + p_m p_m^T - mI) p_1 = p_1 p_1^T p_1 + ... + p_m p_m^T p_1 - mI p_1 = (n - m) p_1 + \sum_{j>1} \alpha_j p_j

where n is the number of neurons (note that p_1^T p_1 = n because every component is +1 or -1) and \alpha_j = p_j^T p_1 can be regarded as the similarity between p_1 and p_j. From this we can see that, in order to have \mathrm{sign}(u) = p_1, we need n > m and the \alpha_j to be small. Intuitively, the number of neurons must be greater than the number of patterns we want to store, and those patterns must be distinguishable.
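Exercise 3 below builds W step by step with R's outer-product operator %o%. Just to connect the formula W = \sum_k p_k p_k^T - mI to code, a minimal sketch (with made-up example patterns, not the code used in the exercises) could be:

store.patterns = function(patterns) {          # patterns: a list of +1/-1 vectors
  n = length(patterns[[1]])
  W = matrix(0, n, n)
  for (p in patterns) W = W + p %o% p          # add the outer product p p^T
  W - length(patterns) * diag(n)               # subtract mI to zero the diagonal
}

p1 = rep(c(1, -1), 10)                         # two example patterns of length 20
p2 = rep(c(1, 1, -1, -1), 5)
W  = store.patterns(list(p1, p2))
all(sign(W %*% p1) == p1)                      # TRUE: p1 is a stable state of the net

In this example p1 and p2 happen to be orthogonal (\alpha_2 = 0), so W p_1 = (n - m) p_1 exactly, in line with the analysis above.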

Exercise 3: In this exercise, you will study how to use the Hebbian learning rule to update a Hopfield net's weights, and what its effect is.

1. Let's create two 10x10 images, namely pattern1 and pattern2, and the weight matrix weights as follows

pattern1 = rep(1, 10*10)
pattern2 = rep(-1, 10*10)
weights = matrix(rep(0, 10*10*10*10), 10*10, 10*10)

2. Then, use the Hebbian learning rule to update the weight matrix

weights = weights + pattern1 %o% pattern1 + pattern2 %o% pattern2 - 2*diag(10*10)

3. Examine whether the net with that weight matrix stores those two patterns by executing

run.hopfield(list(weights=weights), pattern1, stepbystep=T, topo=c(10,10), replace=F)

and

run.hopfield(list(weights=weights), pattern2, stepbystep=T, topo=c(10,10), replace=F)

4. Create two other patterns, which are copies of pattern1 in which some rows are filled with -1

pattern3 = matrix(pattern1, 10, 10)
for (i in 1:3) pattern3[i,] = -1
dim(pattern3) = NULL

pattern4 = matrix(pattern1, 10, 10)
for (i in 1:5) pattern4[i,] = -1
dim(pattern4) = NULL

If these patterns are set as the initial state of the net, which pattern will the net retrieve in each case? (The overlap sketch after this exercise may help you reason about the answer.)

5. Now, use the Hebbian learning rule to store pattern3, by updating the weight matrix as follows

weights = weights + pattern3 %o% pattern3 - diag(10*10)

Does the net now remember this pattern?

6. If pattern4 is set as the initial state of the net, which pattern will the net retrieve?

7. One interesting property of the Hebbian learning rule is that we can use its reverse (i.e., addition becomes subtraction and vice versa) to erase a pattern from memory. To make the net forget pattern3, execute

weights = weights - pattern3 %o% pattern3 + diag(10*10)

Now check whether the net still remembers pattern3. Also check: if pattern4 is set as the initial state of the net, which pattern will the net retrieve?
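One way to reason about questions 4 and 6 is to measure how similar each probe is to the stored patterns: the net tends to settle into the stored pattern with the largest overlap (the \alpha_j of Section 3). A small illustrative snippet, not part of the provided code:

overlap = function(a, b) sum(a * b)       # dot product between two +1/-1 patterns

pattern1 = rep(1, 100)                    # the two stored patterns from step 1
pattern2 = rep(-1, 100)
probe = pattern1; probe[1:30] = -1        # like pattern3: 30 of the 100 pixels flipped
c(overlap(probe, pattern1), overlap(probe, pattern2))   # 40 and -40

Because the probe still agrees with pattern1 on 70 of the 100 pixels, its overlap with pattern1 (70 - 30 = 40) is much larger than its overlap with pattern2 (-40), so it will be attracted to pattern1.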

Exercise 4: In this exercise, you will train a Hopfield net to store some digits. The files you need are hopfield.r for training a Hopfield net, hopfield_digit.r for the whole experiment, and 0.png, 1.png, ..., 9.png, which are images of the ten digits.

1. First of all, execute the file analyse_digit.r to examine the similarities between the digits. Which pair is the most similar? Which is the least similar? Which digit is the most distinguishable from the others? (Note that the digit 0 is indexed 10.)

2. Open hopfield_digit.r and set digits to the set of digits you want to store in a Hopfield net. First try all ten digits, then all odd digits, and finally all even digits. After executing the hopfield_digit.r file, you should see a plot in which the first row contains the input images, the second row the output images, and the third row the expected output images. What do you see? Can you explain it?

3. Find the largest set of digits that the net can store.

4. Now we consider only the even digits. The parameter noise_rate determines the probability that a pixel is flipped; e.g., noise_rate = 0.1 means that any pixel is flipped with probability 0.1. The higher noise_rate is, the noisier the input. Gradually increase the value of noise_rate from 0 to 1 and report the maximum value at which the network still correctly retrieves all inputs.

5. In some cases, the retrieved digit looks quite good, but not perfect. Can you explain why?

4 Submission

You have to submit a file named your_name.pdf. The deadline is 15:00, Monday 25 Nov. If you have any questions, contact Phong Le (p.le@uva.nl).

References

Kröse, B. and van der Smagt, P. (1996). An introduction to neural networks.