Associative Memories

We now consider models for unsupervised learning problems, called auto-association problems. Association is the task of mapping patterns to patterns. In an associative memory, the stimulus of an incomplete or corrupted pattern leads to the response of a stored pattern that corresponds in some manner to the input pattern. The neural network model most commonly used for (auto-)association problems is the Hopfield network.
Example

(Figure: a corrupted image, an intermediate state of the Hopfield net, and the output.) States are bit maps; the attractors are the prototype patterns. Input: an arbitrary pattern (e.g. a picture with noise). Output: the best prototype for that pattern.
HOPFIELD NETWORKS

The Hopfield network implements a so-called content addressable memory. A collection of patterns called fundamental memories is stored in the network by means of weights. Each neuron represents a component of the input. The weight of the link between two neurons measures the correlation between the two corresponding components over the fundamental memories. If the weight is high, then the corresponding components are often equal in the fundamental memories.
ARCHITECTURE: recurrent

A multiple-loop feedback system with no self-feedback; each feedback loop contains a unit-delay operator z^-1. (Figure: recurrent network with unit-delay feedback loops.)
The discrete Hopfield network

Input vector values are in {-1, 1} (or {0, 1}). The number of neurons is equal to the input dimension. Every neuron has a link from every other neuron (recurrent architecture) except itself (no self-feedback). The neuron state at time n is its output value. The network state at time n is the vector of neuron states. The activation function used to update a neuron state is the sign function, but if the input of the activation function is 0, then the new output (state) of the neuron is equal to the old one. Weights are symmetric: w_ji = w_ij.
Notation

N: input dimension. M: number of fundamental memories. f_{μ,i}: i-th component of the μ-th fundamental memory. x_i(n): state of neuron i at time n.
Weights computation

1. Storage. Let f_1, f_2, ..., f_M denote a known set of N-dimensional fundamental memories. The synaptic weights of the network are:

    w_ji = (1/N) Σ_{μ=1..M} f_{μ,j} f_{μ,i}   for j ≠ i,
    w_jj = 0,

where w_ji is the weight from neuron i to neuron j. The elements of the vectors f_μ are in {-1, +1}. Once computed, the synaptic weights are fixed.
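The storage rule above can be sketched directly in Python (the function name and the two concrete memories are my own illustration, not taken from the slides):

```python
def store(memories):
    """Storage rule: w[j][i] = (1/N) * sum_mu f[mu][j] * f[mu][i], w[j][j] = 0."""
    N = len(memories[0])
    w = [[0.0] * N for _ in range(N)]
    for j in range(N):
        for i in range(N):
            if i != j:  # no self-feedback: w[j][j] stays 0
                w[j][i] = sum(f[j] * f[i] for f in memories) / N
    return w

w = store([[1, 1, -1, -1], [1, -1, 1, -1]])
# the weights come out symmetric, as the model requires
assert all(w[j][i] == w[i][j] for j in range(4) for i in range(4))
```

Since each memory contributes the outer product of the pattern with itself, symmetry w_ji = w_ij holds by construction.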
Execution

2. Initialisation. Let x_probe denote an input vector (probe) presented to the network. The algorithm is initialised by setting

    x_j(0) = x_{probe,j},   j = 1, ..., N,

where x_j(0) is the state of neuron j at time n = 0, and x_{probe,j} is the j-th element of the probe vector x_probe.

3. Iteration until convergence. Update the elements of the network state vector x(n) asynchronously (i.e. randomly and one at a time) according to the rule

    x_j(n+1) = sgn( Σ_{i=1..N} w_ji x_i(n) ).

Repeat the iteration until the state vector remains unchanged.

4. Outputting. Let x_fixed denote the fixed point (or stable state, i.e. such that x(n+1) = x(n)) computed at the end of step 3. The resulting output y of the network is y = x_fixed.
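Steps 2-4 can be sketched end to end; a minimal version, assuming the weights were built with the storage rule of the previous slide (the concrete memories and all names are invented for illustration):

```python
import random

# weights for two invented fundamental memories, via the storage rule
mems = [[1, 1, -1, -1], [1, -1, 1, -1]]
N = len(mems[0])
w = [[(sum(f[j] * f[i] for f in mems) / N if i != j else 0.0)
      for i in range(N)] for j in range(N)]

def recall(w, probe, max_sweeps=100):
    """Asynchronous updates (steps 2-4) until a fixed point is reached."""
    x = list(probe)                        # step 2: x(0) = probe
    n = len(x)
    for _ in range(max_sweeps):
        changed = False
        order = list(range(n))
        random.shuffle(order)              # step 3: one neuron at a time, random order
        for j in order:
            h = sum(w[j][i] * x[i] for i in range(n))
            new = x[j] if h == 0 else (1 if h > 0 else -1)  # sign, ties keep old state
            if new != x[j]:
                x[j], changed = new, True
        if not changed:                    # fixed point: x(n+1) = x(n)
            return x                       # step 4: y = x_fixed
    return x

print(recall(w, [1, 1, -1, -1]))  # a fundamental memory is itself a fixed point
```

Note the tie-breaking rule from the earlier slide: when the weighted sum is 0, the neuron keeps its old state.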
Example 1

(Figure: a three-neuron network with its weights; the eight states (±1, ±1, ±1) form a cube. Two of them are stable states, and every other state falls into attraction basin 1 or attraction basin 2.)
Example 2

Separation of patterns using the two fundamental memories (-1 -1 -1) and (1 1 1): find the weights that produce the following behavior. (Figure: a three-neuron network; states near (-1 -1 -1) should converge to (-1 -1 -1), states near (1 1 1) to (1 1 1).) w = ?
CONVERGENCE

Every stable state is at an energy minimum. (A state x is stable if x(n+1) = x(n).) What is energy? Energy is a function (a Lyapunov function) E: States → R such that every firing (change of output value) of a neuron decreases the value of E: x → x' implies E(x') < E(x).

    E(x) = -(1/2) Σ_j Σ_i w_ji x_j x_i = -(1/2) x^T W x
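As a sanity check, the energy of a stored pattern can be compared with that of a nearby corrupted state. A minimal sketch, with an invented three-neuron memory:

```python
def energy(w, x):
    """E(x) = -1/2 * sum_j sum_i w[j][i] * x[j] * x[i]"""
    N = len(x)
    return -0.5 * sum(w[j][i] * x[j] * x[i]
                      for j in range(N) for i in range(N))

# weights storing the single fundamental memory f = (1, 1, -1)
f = [1, 1, -1]
N = len(f)
w = [[(f[j] * f[i] / N if i != j else 0.0) for i in range(N)] for j in range(N)]

print(energy(w, f))          # the stored pattern sits at an energy minimum
print(energy(w, [1, 1, 1]))  # a state with one flipped bit has higher energy
```

This matches the claim on the next slide: the sum Σ w_ji x_j x_i is largest (and hence E smallest) when x agrees with the stored correlations.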
CONVERGENCE

A stored pattern establishes a correlation between pairs of neurons: neurons tend to have the same state or opposite states according to their values in the pattern. If w_ji is large, this expresses an expectation that neurons i and j are positively correlated; similarly, if it is small (very negative), this indicates a negative correlation. The sum Σ_j Σ_i w_ji x_j x_i will thus be large for a state x which is a stored pattern (since x_j x_i will be positive if w_ji > 0 and negative if w_ji < 0). The negative of the sum, i.e. the energy, will thus be small.
Energy decreases

Claim: firing a neuron decreases the energy E.

Proof: let k be the neuron that fires, and let x'_k denote its value after firing. Either:
1. x_k goes from -1 to 1, which implies Σ_i w_ki x_i > 0 (above threshold), or
2. x_k goes from 1 to -1, which implies Σ_i w_ki x_i < 0 (below threshold).

Thus in both cases: (x'_k - x_k) Σ_i w_ki x_i > 0.
Proof (continued)

Now we factor x_k out of the energy function E:

    E(x) = -(1/2) Σ_j Σ_i w_ji x_j x_i
         = -(1/2) Σ_{j≠k} Σ_{i≠k} w_ji x_j x_i - (1/2) x_k Σ_{i≠k} w_ki x_i - (1/2) x_k Σ_{j≠k} w_jk x_j
         = -(1/2) Σ_{j≠k} Σ_{i≠k} w_ji x_j x_i - x_k Σ_{i≠k} w_ki x_i        (since w_jk = w_kj)
         = -(1/2) Σ_{j≠k} Σ_{i≠k} w_ji x_j x_i - x_k Σ_i w_ki x_i

In the second step we pulled out the j = k and i = k terms; the i = k case can be added back to the last sum since w_kk = 0. The first term is independent of x_k.
Proof (continued)

How does E change when x_k changes? The first term is independent of x_k, so with E' the energy after the change and E the energy before:

    E' - E = -x'_k Σ_i w_ki x_i + x_k Σ_i w_ki x_i = -(x'_k - x_k) Σ_i w_ki x_i

We showed on the previous slide that (x'_k - x_k) Σ_i w_ki x_i > 0, so E' - E < 0 always.
Convergence result

We have shown that the energy E decreases with each neuron firing. The overall number of states is finite ({+1, -1}^N). Therefore the energy cannot decrease forever, so firing cannot continue forever: the network reaches a stable state.
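The monotone decrease of E can be observed empirically; a sketch with random symmetric weights and zero diagonal (these are my own test values, per the Hopfield model's assumptions):

```python
import random

def energy(w, x):
    N = len(x)
    return -0.5 * sum(w[j][i] * x[j] * x[i]
                      for j in range(N) for i in range(N))

random.seed(0)
N = 8
# random symmetric weights with zero diagonal
w = [[0.0] * N for _ in range(N)]
for j in range(N):
    for i in range(j):
        w[j][i] = w[i][j] = random.uniform(-1, 1)

x = [random.choice([-1, 1]) for _ in range(N)]
energies = [energy(w, x)]
while True:
    fired = False
    for j in range(N):  # asynchronous updates, one neuron at a time
        h = sum(w[j][i] * x[i] for i in range(N))
        new = x[j] if h == 0 else (1 if h > 0 else -1)
        if new != x[j]:
            x[j] = new
            energies.append(energy(w, x))
            fired = True
    if not fired:
        break  # fixed point: no neuron wants to fire

# every firing strictly decreased E, and the loop terminated
assert all(b < a for a, b in zip(energies, energies[1:]))
```

Termination of the while-loop is exactly the convergence result: E strictly decreases with each firing and there are only finitely many states.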
Computer experiment

We illustrate the behavior of the discrete Hopfield network as a content addressable memory: N = 120 neurons (N^2 - N = 14280 weights). The network is trained to retrieve 8 black and white patterns. Each pattern contains 120 pixels. The inputs of the net assume value +1 for black pixels and -1 for white pixels.

Retrieval 1: the fundamental memories are presented to the network to test its ability to recover them correctly from the information stored in the weights. In each case, the desired pattern was produced by the network after one iteration.
Computer experiment

(Figure: the eight patterns used as fundamental memories to create the weight matrix.)
Computer experiment

Retrieval 2: to test the error-correcting capability of the network, a pattern of interest is distorted by randomly and independently reversing each pixel with a probability of 0.25. The distorted pattern is then used as a probe. The average number of iterations needed for recall, averaged over the eight patterns, is about 3. The net behaves as expected.
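A scaled-down sketch of this error-correcting experiment, using a single stored pattern of 100 pixels instead of the eight 120-pixel digits (all constants here are my own choices):

```python
import random

random.seed(1)
N = 100
f = [random.choice([-1, 1]) for _ in range(N)]

# storage rule for a single fundamental memory (outer product, zero diagonal)
w = [[(f[j] * f[i] / N if i != j else 0.0) for i in range(N)] for j in range(N)]

# distort: flip each pixel independently with probability 0.25
x = [-v if random.random() < 0.25 else v for v in f]

# one asynchronous sweep over all neurons
for j in range(N):
    h = sum(w[j][i] * x[i] for i in range(N))
    if h != 0:
        x[j] = 1 if h > 0 else -1

assert x == f  # the stored memory is recovered from the corrupted probe
```

With a single stored memory, one sweep suffices whenever fewer than half the pixels are flipped; with several memories, as in the slides' experiment, more iterations are needed and interference between memories becomes possible.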
Computer experiment

(Two figure slides showing retrieval results from the experiment.)
Spurious attractor states

Problem of incorrect retrieval: for example, the network presented with a corrupted pattern 2 as input converges to 6. Spurious attractor states: the next slide shows 108 spurious attractors found in 43097 tests of randomly selected digits, corrupted with the probability of flipping a bit at 0.25.
Spurious attractor states

(Figure: a spurious attractor that is a combination of digit 1, digit 4 and digit 9.) Spurious attractors are local minima of E not correlated with any of the fundamental memories.
Storage capacity

Storage capacity: the quantity of information that can be stored in a network in such a way that it can be retrieved correctly. Two definitions of storage capacity:

    C = (number of fundamental patterns) / (number of neurons in the network)
    C = (number of fundamental patterns) / (number of weights in the network)
Storage capacity: bounds

Theorem: the maximum storage capacity of a discrete Hopfield network is bounded above by

    C = M/N = 1 / (4 ln N).

That is, if β is the probability that the i-th bit of the m-th fundamental memory is correctly retrieved, for all i = 1, ..., N and m = 1, ..., M, then β → 1 whenever M < N / (4 ln N).
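As a quick arithmetic check of the bound (taking ln as the natural logarithm, as in the usual statement of this result):

```python
import math

def max_memories(N):
    """Largest integer M satisfying M < N / (4 ln N)."""
    return int(N / (4 * math.log(N)))

# for the 120-neuron experiment of the earlier slides:
print(max_memories(120))  # prints 6
```

So 120 neurons support at most about 6 fundamental memories under this bound; the experiment stored 8, which is consistent with the spurious retrievals observed there.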
Hopfield networks for optimization

Optimization problems (like the Traveling Salesman Problem) can be encoded into Hopfield networks: the objective function corresponds to the energy of the network, and good solutions are stable states of the network.
TSP

Traveling Salesman Problem (TSP): given N cities with distances d_ij, what is the shortest tour?
Encoding

Construct a Hopfield network with N^2 nodes. Semantics: n_ia = 1 iff town i is visited at step a. Constraints:

    Σ_i n_ia = 1 for each a,   Σ_a n_ia = 1 for each i.

The town distances are encoded by the weights between nodes of consecutive steps, i.e. w_{ia,jb} = -d_ij.
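The semantics of this encoding can be made concrete; a sketch that checks the two constraints and decodes the tour length (function names and the small distance matrix are my own illustration):

```python
def is_valid_tour(n):
    """Check the constraints: each city visited once, one city per step."""
    N = len(n)
    rows_ok = all(sum(row) == 1 for row in n)                             # sum_a n[i][a] = 1
    cols_ok = all(sum(n[i][a] for i in range(N)) == 1 for a in range(N))  # sum_i n[i][a] = 1
    return rows_ok and cols_ok

def tour_length(n, d):
    """Length of the closed tour encoded by a valid matrix n, distances d[i][j]."""
    N = len(n)
    order = [next(i for i in range(N) if n[i][a] == 1) for a in range(N)]
    return sum(d[order[a]][order[(a + 1) % N]] for a in range(N))

d = [[0, 1, 2], [1, 0, 3], [2, 3, 0]]     # invented symmetric distances
tour = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # visit city 0, then 1, then 2
print(is_valid_tour(tour), tour_length(tour, d))  # prints: True 6
```

A valid configuration is exactly a permutation matrix over cities and steps, matching the allowed/not-allowed contrast on the next slide.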
Hopfield network for TSP

(Table: a 4x4 grid of cities A-D (rows) versus tour steps 1-4 (columns), each entry 0 or 1. A configuration is allowed iff each row and each column contains exactly one 1, i.e. iff it is a permutation matrix; the slide contrasts a configuration that is not allowed with one that is.)
Energy and weights

Nodes within each row (same city i, steps a ≠ b) are connected with weights w_{ia,ib} = -γ. Nodes within each column (same step a, cities i ≠ j) are connected with weights w_{ia,ja} = -γ. Each node is also connected to the nodes in the columns to its left and right (the previous and next steps) with weight w_{ia,jb} = -d_ij. The resulting energy E penalizes (via γ) configurations that violate the constraints and, on valid tours, measures the tour length.
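A hedged sketch of the resulting energy function (the grouping into a γ-weighted penalty plus a distance term is my reading of the slide, not a quoted formula):

```python
def tsp_energy(n, d, gamma):
    """Constraint penalties plus the tour-length term of the TSP encoding."""
    N = len(n)
    row_pen = sum(n[i][a] * n[i][b]                   # same city at two steps
                  for i in range(N) for a in range(N) for b in range(N) if a != b)
    col_pen = sum(n[i][a] * n[j][a]                   # two cities at one step
                  for a in range(N) for i in range(N) for j in range(N) if i != j)
    dist = sum(d[i][j] * n[i][a] * n[j][(a + 1) % N]  # consecutive-step distances
               for i in range(N) for j in range(N) if i != j for a in range(N))
    return gamma * (row_pen + col_pen) + dist

d = [[0, 1, 2], [1, 0, 3], [2, 3, 0]]     # invented symmetric distances
tour = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # a valid permutation matrix
bad = [[1, 1, 0], [0, 0, 0], [0, 0, 1]]   # city 0 visited twice, city 1 never
print(tsp_energy(tour, d, 10.0))  # penalties vanish; energy = tour length = 6
print(tsp_energy(bad, d, 10.0))   # constraint violations add gamma penalties
```

On a permutation matrix both penalty terms vanish, so minimizing E over valid configurations minimizes the tour length, as the encoding intends.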