Hopfield Neural Network and Associative Memory

[Figure: Typical myelinated vertebrate motoneuron (Wikipedia)]

[Figure: Drawings by Ramon y Cajal, 1906 Nobel Prize in Physiology or Medicine]

[Figure: Action potential of an excited neuron]

Neuron Networks in the Brain

The human brain contains about $10^{11}$ neurons connected in tree-like networks. The cell body of a neuron has as many as $10^5$ dendrites, or tree-like fibers, on its surface. The neuron integrates the electrochemical signals received by its dendrites from other neurons. If the integrated signal exceeds a threshold, the neuron generates a spike-train of soliton-like signals that propagate along its axon.

The axon branches into as many as $10^4$ strands, each ending in a synapse with a dendrite of another neuron. The axonic spike-train triggers a flow of chemicals (neurotransmitters) across the synaptic junction. Because spikes have a characteristic fixed magnitude, profile, and speed, information is likely encoded in the length of a spike train and in the pattern of spikes as a function of time.

Physiology of Memory

The brain can store information and recall it at a later time. A sensory stimulus, for example seeing a face or hearing a voice, triggers a cascade of spike-trains in a large network of neurons, see Wikipedia: Neuroanatomy of memory. The information contained in the spike-train cascade modifies the chemical structure of a network of synaptic connections, thus encoding the information in short- or long-term memory, see Wikipedia: Physiology of memory. A different stimulus at a later time can trigger a spike-train cascade similar to that of the original stimulus, thus recalling the stored memory.

McCulloch-Pitts Neuron Model

An artificial neuron model was introduced by W. McCulloch and W. Pitts in 1943 to model the recall of information in a network of neural synapses. The action potential of the i-th neuron in a network is modeled by a simple binary voltage or state
$$V_i = \begin{cases} 1 & \text{firing (active)} \\ 0 & \text{not firing (resting)} \end{cases}$$
Synaptic information is represented by a matrix of weights $w$. The action potential of the i-th neuron is determined by the potentials of all the other neurons that make synaptic connections to it:
$$V_i = \Theta\Bigl(\mu_i + \sum_j w_{ij} V_j\Bigr)$$
where $w_{ij}$ is the synaptic weight or strength of the $j \to i$ synaptic connection, $\mu_i$ is a threshold bias factor for neuron $i$, and $\Theta$ is a binary transfer function taking values 0, 1, for example a simple Heaviside step function
$$\Theta(x) = \theta(x) = \begin{cases} 1 & \text{if } x > 0 \\ 0 & \text{otherwise} \end{cases}$$
The neuron is a binary threshold device that switches on and off in time.
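As a concrete illustration, here is a minimal sketch of this update rule in Python (the names theta, update_neuron, V, w, and mu are chosen here for illustration, not taken from the lecture):

```python
import numpy as np

def theta(x):
    """Heaviside step transfer function: 1 if x > 0, else 0."""
    return 1 if x > 0 else 0

def update_neuron(i, V, w, mu):
    """McCulloch-Pitts update: V_i = theta(mu_i + sum_j w_ij V_j).

    V  : binary state vector (0 = resting, 1 = firing)
    w  : weight matrix, w[i, j] = strength of the j -> i connection
    mu : threshold bias for each neuron
    """
    return theta(mu[i] + np.dot(w[i], V))
```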

Hebb's Rule and Hebbian Learning

How does the weight matrix $w$ get modified to store memories? In 1949 D. Hebb proposed a mechanism called Hebb's rule,
$$w_{ij} = \frac{1}{p} \sum_{k=1}^{p} V_i^{(k)} V_j^{(k)}$$
for a network excited by $p$ different memories with excitation patterns $V_i^{(k)}$.
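A minimal sketch of Hebb's rule, assuming the $p$ memories are the 0/1 rows of an array `patterns` (the function name hebb_weights is illustrative):

```python
import numpy as np

def hebb_weights(patterns):
    """w_ij = (1/p) sum_k V_i^(k) V_j^(k) over the p stored patterns."""
    V = np.asarray(patterns, dtype=float)   # shape (p, N), entries 0 or 1
    p = V.shape[0]
    w = V.T @ V / p                         # outer-product sum over memories
    np.fill_diagonal(w, 0.0)                # conventionally, no self-connections
    return w
```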

The Hopfield Neural Network Model

J.J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," Proc. Natl. Acad. Sci. USA 79, 2554-2558 (1982), introduced a neural network model for the storage and retrieval of memories in a binary network. He showed that the model can store multiple memories, and that one or more memories can be simultaneously retrieved by stimulating the network with partial or damaged versions of the memories at a later time. This classic article has been cited more than 12,000 times, and it is clearly written and relatively easy to understand.

The Model System

Consider $N$ binary McCulloch-Pitts neurons with action potentials
$$V_i(t) = \begin{cases} 1 & \text{firing} \\ 0 & \text{not firing} \end{cases}$$
Each neuron changes its state at random times, with a mean attempt rate $W$, by instantaneously sampling the potentials of all the other neurons:
$$V_i(t) = \theta\Bigl(\sum_j T_{ij} V_j(t)\Bigr)$$

The Information Storage Algorithm

The network stores memories in a learning phase. A memory is represented by a pattern of action potentials, i.e., by a binary vector $V$ with particular values for its $N$ neuronal components $V_i$, $i = 0, \ldots, N-1$. To store $P$ memories $V^{(k)}$, $k = 0, \ldots, P-1$, the synaptic matrix is set to
$$T_{ij} = \sum_{k=0}^{P-1} \bigl(2V_i^{(k)} - 1\bigr)\bigl(2V_j^{(k)} - 1\bigr), \quad i \neq j, \qquad T_{ii} = 0.$$

Information Retrieval

To retrieve memories, the model is evolved using the McCulloch-Pitts update algorithm starting from a given initial state. The network is operated by updating the neurons according to some protocol, for example by choosing neurons at random or sequentially (which is usually what is done in software networks), or by updating the whole network synchronously (which is more natural for a hardwired network controlled by a clock).
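A minimal sketch of the storage rule and of random sequential retrieval, assuming the memories are 0/1 vectors (the names store and recall are illustrative, not from the lecture):

```python
import numpy as np

def store(patterns):
    """T_ij = sum_k (2 V_i^(k) - 1)(2 V_j^(k) - 1), with T_ii = 0."""
    S = 2 * np.asarray(patterns, dtype=float) - 1   # map {0,1} -> {-1,+1}
    T = S.T @ S
    np.fill_diagonal(T, 0.0)
    return T

def recall(T, V0, sweeps=10, seed=None):
    """Evolve from initial state V0, updating randomly chosen neurons."""
    rng = np.random.default_rng(seed)
    V = np.array(V0, dtype=float)
    N = len(V)
    for _ in range(sweeps * N):
        i = rng.integers(N)                 # pick a neuron at random
        V[i] = 1.0 if np.dot(T[i], V) > 0 else 0.0
    return V
```

Starting recall from a damaged copy of a stored pattern should return the stored pattern, provided the damaged copy lies in its basin of attraction (discussed below).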

In general the state of the system tends to one of several equilibrium configurations, which represents the recalled memory.

Energy Function and Ising Model Equivalence

The behavior of the model can be analyzed by defining an energy function
$$E = -\frac{1}{2} \sum_{i,j} T_{ij} V_i V_j$$
which represents the energy of a random spin glass with spin variables $s_i = 1 - 2V_i = \pm 1$. If $V_i$ changes by $\Delta V_i$ in an evolution step, the energy of the system changes by
$$\Delta E = -\Delta V_i \sum_{j \neq i} T_{ij} V_j$$
Hopfield showed that the network dynamics decreases the energy of the network. This implies that if the network is started in an arbitrary state, then it will evolve to the nearest local energy minimum. The stored states are local minima of the energy function. So if the initial state happens to lie in the basin of attraction of one of the stored minima, then that pattern will be recalled!
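A short sketch of the energy function, which can be used to verify numerically that each single-neuron update of recall above lowers (or preserves) $E$:

```python
import numpy as np

def energy(T, V):
    """E = -(1/2) sum_ij T_ij V_i V_j for state vector V."""
    V = np.asarray(V, dtype=float)
    return -0.5 * V @ T @ V
```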

A network with $N$ neurons has a huge number, $2^N$, of states. The network works best if the stored memories partition the space of network states into well defined basins. The storage capacity of the network is found to be approximately $0.13N$. If too many memories are stored, then the minima are not well defined and memories may not be perfectly recalled.

The Hamming Distance

There are various possible criteria for deciding how closely a given pattern, which is an $N$-component binary vector, resembles another pattern. A commonly used measure of the distance between two binary vectors $\xi$ and $\zeta$ is the Hamming distance
$$D_H = \sum_{i=0}^{N-1} \bigl[\xi_i(1 - \zeta_i) + (1 - \xi_i)\zeta_i\bigr].$$

This distance varies between 0, if all $N$ bits are identical, and $N$, if none of the bit components are the same. A simple application of neural networks is the storage and reconstruction of associative memories, illustrated in the Hopfield Java Applet.
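A one-line sketch of the Hamming distance for two 0/1 pattern vectors (the name hamming is illustrative):

```python
import numpy as np

def hamming(xi, zeta):
    """D_H = sum_i [xi_i (1 - zeta_i) + (1 - xi_i) zeta_i]."""
    xi, zeta = np.asarray(xi), np.asarray(zeta)
    return int(np.sum(xi * (1 - zeta) + (1 - xi) * zeta))
```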