Neural Networks. Lecture: Associative memory

Associative memories

Massively parallel models of associative (content-addressable) memory have been developed. Some of these models are: the Kohonen, Grossberg, Hamming, and the widely known Hopfield model. The most interesting aspect of most of these models is that they specify a learning rule which can be used to train a network to associate input and output patterns.

The associative network is a computational model emphasizing local and synchronous or asynchronous control, high parallelism, and redundancy. Such a network is a connectionist architecture and shares some common features with Rosenblatt's Perceptron. However, it is much more powerful and flexible than the Perceptron.

The model has its origin in both the Hamming and Grossberg models. The network model is composed of three layers or slabs: an input layer, an intermediate layer, and an output layer. The intermediate layer is a modified, totally interconnected, memoryless Grossberg slab with recurrent shunting on-center off-surround subnets, whose purpose is to achieve a majority vote so that only one neuron from this level, the one with the highest input value, will send its output to the next layer.

Similarities to Grossberg's model: the interconnections between the input layer and the intermediate layer. Similarities to Hamming's model: the interconnections (feedback) in the intermediate layer.

The connections between the input layer and the intermediate layer contain all the information about one stored vector. The network implements the nearest-neighbor algorithm.
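The nearest-neighbor behaviour described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the lecture's exact network: the example patterns, the function name `match_scores`, and the normalization by the number of ones are choices made here for the demonstration.

```python
import numpy as np

# Three stored binary patterns (rows); values chosen for illustration only.
patterns = np.array([[1, 1, 0, 0, 1],
                     [0, 1, 1, 1, 0],
                     [1, 0, 1, 0, 1]])

def match_scores(x, patterns):
    """Normalized overlap of input x with each stored pattern (each in <0; 1>)."""
    b = patterns.sum(axis=1)      # number of non-zero elements per stored vector
    return (patterns @ x) / b

x = np.array([1, 1, 0, 0, 0])     # a partial/noisy version of pattern 0
winner = int(np.argmax(match_scores(x, patterns)))
print(winner)                     # -> 0: the nearest stored pattern wins
```

Selecting the largest normalized overlap is exactly a nearest-neighbor lookup over the stored set; the network realizes this selection with its intermediate slab rather than an explicit argmax.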
The number of elements in the intermediate layer defines the number of stored patterns. All feedback connections within the intermediate layer are based on the rule of lateral inhibition. The network performs a winner-takes-all operation.

The elements of the input signals (and stored vectors) are the binary values 0 and 1:

  X = [x1, x2, x3, ..., xn],  xi in {0, 1}

The input and output elements (neurons) are only nodes whose purpose is to connect the inputs and outputs, respectively, to the intermediate slab. The network can be programmed to function as an autoassociative content-addressable memory, or as a symbolic substitution system which yields an arbitrarily defined output for any input; this depends on the connections between the intermediate slab and the output layer.

[Figure: network structure - input layer, intermediate layer (positive weights, negative weights, threshold), output layer]

Programming the network

The interconnections (weights) between the input elements and each intermediate neuron are independent of each other. Each intermediate element has its weights programmed to one input signal, and these connections are left unchanged while the other neurons are programmed. Adding or removing a pattern does not influence the existing network structure and weights.

The connection weights between the elements of the input layer and the j-th element of the intermediate slab are:

  wij = 0      if the i-th element of the j-th input vector is equal to zero
  wij = 1/bj   if the i-th element of the j-th input vector is equal to one

where bj is the number of non-zero elements in the j-th input vector to be stored. This procedure normalizes the total input to each element of the intermediate slab to the interval <0; 1>, and takes into account the relative (rather than the absolute) number of stored elements equal to the input elements. This allows distinguishing between signals when one is included in another.
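The weight-programming rule wij = 1/bj can be sketched as follows. This is a minimal NumPy sketch; the helper name `program_weights` and the example patterns are assumptions, not from the lecture.

```python
import numpy as np

def program_weights(stored):
    """Input->intermediate weights: w[i, j] = 1/b_j if bit i of pattern j is 1,
    else 0, where b_j is the number of ones in pattern j."""
    stored = np.asarray(stored, dtype=float)
    b = stored.sum(axis=1)                 # ones per stored vector
    return (stored / b[:, None]).T         # shape (n_inputs, n_patterns)

# Pattern 2's ones are included in pattern 1's (illustrative values).
stored = [[1, 1, 1, 1, 0],
          [1, 1, 0, 0, 0]]
W = program_weights(stored)

x = np.array([1, 1, 0, 0, 0])              # input equal to pattern 2
print(x @ W)                               # -> [0.5 1. ]
```

Because of the normalization by bj, an input equal to pattern 2 drives its own neuron to exactly 1 while the neuron for the larger pattern 1 only reaches 0.5: this is the inclusion-distinguishing property described above.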
//5 Example: pattern pattern eights of intermediate slab element here the pattern is recorded i i 8. In the input signal is, the output from both elements is equal to one.. If the input signal is, the output signal from element is equal to.8 hence from element is equal to. The ambiguous output signal in the first case can be solved by the proper netor structure. 4 This learning procedure is repeated for each input vector, each time ith a ne intermediate neuron. The total number of different vectors that can be stored ith this prescription in the net ith n elements in the input layer is n n n 5 Each neuron in the intermediate slab is connected to all other neurons of this slab. The eight on the self feedbac loop is equal to one, and all the other values depend on the correlation beteen stored vectors. The eight beteen the output of th neuron and input of the th neuron is given by + cor(, (, ( M ) here cor(, is correlation (inner product) beteen th and th stored vectors. is one of the identical positive eight from the input slab to the th neuron, M is equal to the number of neurons in the intermediate slab ith non-zero inputs. 6 The denominator ensures that the total lateral inhibition for the element ith the greatest value is smaller that its input. This procedure realizes the rule inner-taes-all. The intermediate slab selects the maximum input, and drives all the other intermediate neurons to zero. If more then one intermediate neuron has the same maximum value, the slab ill select the one that is less correlated to the remaining stored vectors. The structure of connections in the intermediate slab is not symmetrical (, ) (, ), hence If to or more neurons ill have the same input signal, and the outputs may not be discriminated by the criterion, then the slab ill be unable to distinguish beteen them and the outputs ill be driven to zero or ill be a superposition of the tp or more outputs. 7 8
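The winner-takes-all effect of lateral inhibition can be sketched with a simplified scheme. Instead of the correlation-dependent weights, this sketch uses a single uniform inhibition strength eps kept below 1/(M - 1); that preserves the key property stated above (the total inhibition on the largest element stays smaller than its input, so the winner survives while the others are driven to zero). The function name and parameter choices are illustrative assumptions.

```python
import numpy as np

def winner_takes_all(g, eps=None, iters=50):
    """Iterate lateral inhibition on the intermediate slab: each neuron keeps
    its own value (self-feedback weight 1) and is inhibited by every other
    active neuron; negative values are clipped to zero."""
    g = np.asarray(g, dtype=float).copy()
    M = len(g)
    if eps is None:
        eps = 1.0 / M                 # safely below 1/(M - 1)
    for _ in range(iters):
        total = g.sum()
        # Subtract eps times the summed output of all the *other* neurons.
        g = np.maximum(0.0, g - eps * (total - g))
    return g

# Inputs 0.5, 1.0, 0.8: only the neuron with input 1.0 stays non-zero.
print(winner_takes_all([0.5, 1.0, 0.8]))
```

With unequal inputs the iteration converges in a handful of steps; the ties discussed above (equal maximal inputs) would survive here too, since uniform inhibition cannot break them.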
Retrieval of stored vectors

At the input layer the unknown signal is applied, and the network has to recognize it. If the stored vectors are orthogonal, any full or partial input corresponding to one stored vector would cause only one neuron in the intermediate slab to have a non-zero output in the first iteration. When the stored vectors are not orthogonal, a certain number of neurons will be excited.

Let f be the unknown input signal. The elements of the vector X define the total input to the elements of the intermediate layer:

  X = f * W

where W is the matrix of connections between the input layer and the intermediate layer (its columns are equal to the input weights of each stored vector).

The output of the first iteration is equal to

  G(1) = V * X^T

where V is the square matrix of connections between the elements of the intermediate slab. The iterative formula is

  G(t + 1) = V * G(t) = V^(t+1) * (f * W)^T

The matrix V has ones on its diagonal (the self-feedback loops) and the lateral-inhibition weights elsewhere:

  V = | 1        w(1, 2)  ...  w(1, n) |
      | w(2, 1)  1        ...  w(2, n) |
      | ...      ...      ...  ...     |
      | w(n, 1)  w(n, 2)  ...  1       |

The output values are calculated by the formula

  Y = Z * G

where Z is the matrix of connections between the intermediate slab and the output layer; for the autoassociative memory, the columns of Z are the stored vectors themselves.

[Worked example: stored vectors V1, V2 and their programmed weights (values such as wi = 0.5 and wi = 1/3 for selected indices i) - the concrete numbers were lost in extraction.]
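The retrieval steps can be strung together in a short sketch. All values here are hypothetical, and the iterative V-matrix dynamics are replaced by an idealized one-shot winner selection, so this shows only the data flow X = f * W, winner selection, Y = Z * G, not the exact iteration.

```python
import numpy as np

# Two stored binary patterns (illustrative values only).
stored = np.array([[1, 1, 1, 0, 0],
                   [0, 0, 1, 1, 1]], dtype=float)

W = (stored / stored.sum(axis=1, keepdims=True)).T  # input -> intermediate
Z = stored.T           # intermediate -> output (autoassociative: stored vectors)

f = np.array([1, 1, 0, 0, 0], dtype=float)          # partial/noisy input
X = f @ W                                           # first-iteration inputs
G = (X == X.max()).astype(float)                    # idealized winner-takes-all
Y = Z @ G                                           # retrieved stored vector
print(Y)                                            # -> [1. 1. 1. 0. 0.]
```

Even from a partial input (only two of the three ones of pattern 1 present), the winner neuron selects the correct stored vector and Z reproduces it in full, which is the content-addressable behaviour the retrieval procedure is designed for.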
[Worked example continued: the third stored vector V3, its weights, the resulting weight matrix, and the lateral-inhibition weights w(k, j) computed from the correlations - the numeric values were lost in extraction.]

[Figure: 2D patterns stored in the network (9 x 6 pixels); the input signal; the output signal after 4 iterations.]