Learning Objectives: Self-Organization Map (Learning without Examples)
1. Introduction
2. MAXNET
3. Clustering
4. Feature Map
5. Self-organizing Feature Map
6. Conclusion

Introduction
1. Learning without examples.
2. Data are input to the system one by one.
3. Mapping the data into a one- or more-dimensional space.
4. Competitive learning.

Hamming Distance (1/5)
1. Define HD (Hamming distance) between two binary codes A and B of the same length as the number of places in which a_i and b_i differ.
2. Ex: A = [1, -1, 1, 1], B = [-1, 1, -1, 1] => HD(A, B) = 3

Hamming Distance (2/5), (3/5)
[figures: further Hamming-distance examples]
Hamming Distance (4/5)
3. Or, HD can be defined as the lowest number of edges that must be traversed between the two relevant codes.
4. If A and B have bipolar binary components, then the scalar product of A and B is:

Hamming Distance (5/5)
        A.B = (n - HD(A, B)) - HD(A, B)
since the n - HD(A, B) bits that agree each contribute +1 and the HD(A, B) bits that differ each contribute -1. Hence
        (1/2) A.B = n/2 - HD(A, B)

MAXNET (1/30)
1. For a two-layer classifier of binary bipolar vectors with p classes and p output neurons, the strongest response of a neuron indicates the minimum HD value between the input and the category this neuron represents.

MAXNET (2/30)
[figure: Hamming network architecture]

MAXNET (3/30)
2. The MAXNET operates on the outputs of the Hamming network, suppressing every value except the largest one.
3. For the Hamming net in the next figure, we have input vector X and p classes => p output neurons, output vector Y = [y_1, ..., y_p].

MAXNET (4/30)
[figure: Hamming network followed by MAXNET]
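A minimal numeric check of the HD definition and of the scalar-product identity in Hamming Distance (5/5) above; `hamming_distance` is an illustrative helper, and A, B are the example pair from slide (1/5):

```python
import numpy as np

def hamming_distance(a, b):
    """Number of positions in which two equal-length codes differ."""
    a, b = np.asarray(a), np.asarray(b)
    return int(np.sum(a != b))

# Two bipolar (+1/-1) codes of length n = 4.
A = np.array([ 1, -1,  1,  1])
B = np.array([-1,  1, -1,  1])

n = len(A)
hd = hamming_distance(A, B)     # 3: positions 1, 2, 3 differ
dot = int(A @ B)                # agreeing bits add +1, differing bits add -1

print(hd)                       # 3
print(dot, n - 2 * hd)          # both -2: A.B = (n - HD) - HD
print(dot / 2, n / 2 - hd)      # both -1.0: (1/2) A.B = n/2 - HD(A, B)
```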
MAXNET (5/30)
4. For any output neuron m, m = 1, ..., p, let w_m = [w_m1, w_m2, ..., w_mn] be the weights between input X and output neuron m.
5. Also assume that for each class m one has the prototype vector S(m) as the standard to be matched.

MAXNET (6/30)
6. For classifying p classes, the m-th output should be strongest if and only if X = S(m), which happens only if w_m is matched to S(m). The outputs of the classifier are then X^T S(1), X^T S(2), ..., X^T S(p). So when X = S(m), the m-th output is n and the other outputs are smaller than n.

MAXNET (7/30)
7. X^T S(m) = (n - HD(X, S(m))) - HD(X, S(m)), so
        (1/2) X^T S(m) = n/2 - HD(X, S(m))
   So the weight matrix is
        W_H = (1/2) S = (1/2) [ s_1(1)  s_2(1)  ...  s_n(1)
                                s_1(2)  s_2(2)  ...  s_n(2)
                                ...
                                s_1(p)  s_2(p)  ...  s_n(p) ]

MAXNET (8/30)
8. By giving a fixed bias n/2 to the input,
        net_m = (1/2) X^T S(m) + n/2,  m = 1, ..., p
   or net_m = n - HD(X, S(m)).

MAXNET (9/30)
9. To scale net_m from the range 0~n down to 0~1, one can apply the transfer function
        f(net_m) = net_m / n,  m = 1, ..., p

MAXNET (10/30)
[figure: Hamming network with fixed bias inputs]
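A short sketch of the Hamming matching layer of points 7-9, assuming NumPy; `hamming_layer` and its demo prototypes are illustrative, not from the slides:

```python
import numpy as np

def hamming_layer(prototypes):
    """Build the Hamming matching layer from p bipolar prototype rows (p x n).

    W_H = (1/2) S plus a fixed bias n/2 gives
    net_m = (1/2) X^T S(m) + n/2 = n - HD(X, S(m)),
    and f(net_m) = net_m / n scales the response into [0, 1].
    """
    S = np.asarray(prototypes, dtype=float)
    n = S.shape[1]
    W_H = 0.5 * S
    return lambda x: (W_H @ x + n / 2) / n

S = np.array([[ 1, -1,  1, -1],
              [-1,  1, -1,  1]])
f = hamming_layer(S)
print(f(S[0]))   # [1. 0.]: zero HD to S(1), maximal HD (n) to S(2)
```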
MAXNET (11/30)
10. So the node with the highest output is the node m whose prototype has the smallest HD to the input; f(net_m) = 1 when X = S(m), and for the other nodes f(net_i) < 1.

MAXNET (12/30)
11. MAXNET is employed as a second layer only for the cases where an enhancement of the initial dominant response of the m-th node is required. I.e., the purpose of MAXNET is to let the largest of {y_1, ..., y_p} survive and to drive all the others to 0.

MAXNET (13/30)
[figure: MAXNET architecture]

MAXNET (14/30)
12. To achieve this, one can let
        y_m^(k+1) = f( y_m^(k) - ε Σ_{j≠m} y_j^(k) ),  m = 1, ..., p; j = 1, ..., p; j ≠ m
    where each initial y_m is bounded by 0 ≤ y_m ≤ 1 (the outputs of the Hamming net), and after convergence y_m can only be positive (the single winner) or 0.

MAXNET (15/30)
13. So ε is bounded by 0 < ε < 1/p (ε: the lateral interaction coefficient), and the recurrence uses the p x p matrix
        M = [  1  -ε  ...  -ε
              -ε   1  ...  -ε
              ...
              -ε  -ε  ...   1 ]

MAXNET (16/30)
    And
        net^(k) = M y^(k)
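A sketch of this recurrence, assuming NumPy; the stopping rule and the default ε = 1/(2p) are illustrative choices inside the required bound 0 < ε < 1/p:

```python
import numpy as np

def maxnet(y0, eps=None, max_iter=100):
    """Iterate y_m <- f(y_m - eps * sum_{j != m} y_j) until one node survives."""
    y = np.asarray(y0, dtype=float)
    p = len(y)
    if eps is None:
        eps = 0.5 / p                      # any value in (0, 1/p) works
    M = (1 + eps) * np.eye(p) - eps        # 1 on the diagonal, -eps elsewhere
    for _ in range(max_iter):
        y = np.maximum(M @ y, 0.0)         # f(net) = net if net >= 0, else 0
        if np.count_nonzero(y) <= 1:       # only the winner is still positive
            break
    return y

print(maxnet([0.333, 0.111, 0.777]))       # only node 3 remains positive
```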
MAXNET (17/30), (18/30)
14. So the transfer function is
        f(net) = net,  net ≥ 0
        f(net) = 0,    net < 0

MAXNET (19/30)
Ex: To have a Hamming net for classifying the characters C, I, T, given as 3x3 bipolar bitmaps flattened row by row:
        S(1) = [ 1  1  1   1 -1 -1   1  1  1 ]   (C)
        S(2) = [-1  1 -1  -1  1 -1  -1  1 -1 ]   (I)
        S(3) = [ 1  1  1  -1  1 -1  -1  1 -1 ]   (T)

MAXNET (20/30)
[figure: the characters C, I, T on a 3x3 pixel grid]

MAXNET (21/30)
So
        W_H = (1/2) [ S(1); S(2); S(3) ]
and, with n = 9,
        net_m = (1/2) X^T S(m) + 9/2 = 9 - HD(X, S(m))

MAXNET (22/30)
For the test input X of the figure, the three net values follow directly from the Hamming distances between X and the prototypes, and
        Y = f(net) = (1/9) [ net_1, net_2, net_3 ]^T
MAXNET (23/30)
Input this Y to the MAXNET and select ε with 0 < ε < 1/3 (= 1/p). So
        M = [  1  -ε  -ε
              -ε   1  -ε
              -ε  -ε   1 ]

MAXNET (24/30)
And iterate
        net^(k) = M y^(k),  y^(k+1) = f(net^(k))

MAXNET (25/30) to (29/30)
[numeric trace: the vectors y^(k) and net^(k) for k = 0, 1, 2, 3, ...; at every step the two losing outputs shrink and are clipped to 0, until only the winning node's output remains positive]
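Putting the two layers together, a runnable sketch that reproduces this kind of trace for the C, I, T example. The prototypes are the ones given above; the distorted test input (an "I" with one corrupted pixel) and ε = 0.2 are assumed values for illustration, not necessarily the ones used on the slides:

```python
import numpy as np

# 3x3 bipolar bitmaps for C, I, T, flattened row by row (from the example).
S = np.array([
    [ 1,  1,  1,   1, -1, -1,   1,  1,  1],   # C
    [-1,  1, -1,  -1,  1, -1,  -1,  1, -1],   # I
    [ 1,  1,  1,  -1,  1, -1,  -1,  1, -1],   # T
])
p, n = S.shape

# Hamming layer: net_m = n - HD(x, S(m)), scaled into [0, 1].
x = np.array([-1, 1, -1,  -1, 1, -1,  -1, 1, 1])   # assumed: distorted "I"
y = ((0.5 * S) @ x + n / 2) / n
print("k=0:", y.round(3))                          # [0.444 0.889 0.667]

# MAXNET: recurrent suppression with eps < 1/p.
eps = 0.2
M = (1 + eps) * np.eye(p) - eps
for k in range(1, 20):
    y_new = np.maximum(M @ y, 0.0)
    if np.allclose(y_new, y):                      # converged: winner only
        break
    y = y_new
    print(f"k={k}:", y.round(3))

print("winner:", "CIT"[int(np.argmax(y))])         # class with minimum HD
```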
MAXNET (30/30)
Summary: the Hamming network only tells which class is the most likely one; it does not restore the distorted pattern.

Clustering: Unsupervised Learning (1/20)
1. Introduction
   a. To categorize or cluster data.
   b. Grouping similar objects and separating dissimilar ones.
2. A pattern set {X_1, X_2, ..., X_N} is submitted, and the decision function required to identify possible clusters is determined => by similarity rules.

Clustering: Unsupervised Learning (2/20)
[figure: clusters in pattern space]

Clustering: Unsupervised Learning (3/20)
Euclidean distance:
        ||X_1 - X_2|| = sqrt( (X_1 - X_2)^T (X_1 - X_2) )
Cosine of the angle between X_1 and X_2:
        cos ψ = X_1^T X_2 / ( ||X_1|| ||X_2|| )

Clustering: Unsupervised Learning (4/20)
3. Winner-Take-All learning
   Input vectors are to be classified into one of a specific number p of categories, according to the clusters detected in the training set {X_1, X_2, ..., X_N}.

Clustering: Unsupervised Learning (5/20)
[figure: winner-take-all network]
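The two similarity measures of slide (3/20) as a small self-contained sketch:

```python
import numpy as np

def euclidean_distance(x1, x2):
    """||x1 - x2|| = sqrt((x1 - x2)^T (x1 - x2))"""
    d = np.asarray(x1, float) - np.asarray(x2, float)
    return float(np.sqrt(d @ d))

def cosine_similarity(x1, x2):
    """cos(psi) = x1^T x2 / (||x1|| ||x2||)"""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    return float((x1 @ x2) / (np.linalg.norm(x1) * np.linalg.norm(x2)))

print(euclidean_distance([1, 0], [0, 1]))   # 1.414...
print(cosine_similarity([1, 0], [0, 1]))    # 0.0: orthogonal directions
```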
Clustering: Unsupervised Learning (6/20)
Kohonen network: Y = f(W X), for an input vector of size n and p output neurons, with weight rows
        w_m = [w_m1, w_m2, ..., w_mn],  m = 1, ..., p

Clustering: Unsupervised Learning (7/20)
Prior to the learning, normalization of all weight vectors is required:
        ŵ_m = w_m / ||w_m||

Clustering: Unsupervised Learning (8/20)
The meaning of training is to find the ŵ_m such that
        ||X - ŵ_m|| = min_{i=1,...,p} ||X - ŵ_i||
i.e., the m-th neuron with vector ŵ_m is the closest approximation of the current X.

Clustering: Unsupervised Learning (9/20)
From
        ||X - ŵ_i||^2 = ||X||^2 - 2 ŵ_i^T X + ||ŵ_i||^2 = ||X||^2 - 2 ŵ_i^T X + 1
minimizing ||X - ŵ_i|| over i = 1, ..., p is equivalent to maximizing ŵ_i^T X over i = 1, ..., p.

Clustering: Unsupervised Learning (10/20)
So
        ||X - ŵ_m|| = min  <=>  ŵ_m^T X = max_{i=1,...,p} ( ŵ_i^T X )
The m-th neuron is the winning neuron, the one with the largest value of net_i, i = 1, ..., p.

Clustering: Unsupervised Learning (11/20)
Thus ŵ_m should be adjusted such that ||X - ŵ_m|| is reduced. ||X - ŵ_m||^2 can be reduced along the negative gradient direction:
        ∇_{ŵ_m} ||X - ŵ_m||^2 = -2 (X - ŵ_m)
i.e., increase ŵ_m in the (X - ŵ_m) direction.
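A quick numeric confirmation of the equivalence in slides (9/20)-(10/20): with unit-length weight rows, the neuron closest to X in Euclidean distance is exactly the one with the largest scalar product; the random sizes here are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 5, 3
W = rng.normal(size=(p, n))
W /= np.linalg.norm(W, axis=1, keepdims=True)    # normalize each weight vector
x = rng.normal(size=n)

by_distance = int(np.argmin(np.linalg.norm(x - W, axis=1)))
by_dot      = int(np.argmax(W @ x))
print(by_distance == by_dot)                     # True: the same winning neuron
```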
Clustering: Unsupervised Learning (12/20)
For any single X this is just one step, so we want only a fraction of (X - ŵ_m). Thus for the winning neuron m:
        Δŵ_m = α (X - ŵ_m),  0.1 < α < 0.7
and for the other neurons: Δŵ_i = 0.

Clustering: Unsupervised Learning (13/20)
In general:
        Δw_m = α (X - w_m)   where m: winning neuron, α: learning constant
        Δw_i = 0             for i ≠ m

Clustering: Unsupervised Learning (14/20)
4. Geometrical interpretation: for a normalized input vector X, the weight ŵ_m is the winner =>
        ŵ_m^T X = max_{i=1,...,p} ( ŵ_i^T X )

Clustering: Unsupervised Learning (15/20), (16/20)
So Δŵ_m = α (X - ŵ_m), and
        ŵ_m' = ŵ_m + Δŵ_m
is created.

Clustering: Unsupervised Learning (17/20)
[figure: ŵ_m' lies between ŵ_m and X on the unit sphere]

Clustering: Unsupervised Learning (18/20)
Thus the weight adjustment is mainly a rotation of the weight vector toward the input vector, without a significant length change. ŵ_m' is no longer normal, so in the next training stage it must be normalized again. Each ŵ_m ends up pointing to the center of gravity of its cluster on the unity sphere.
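One winner-take-all step as in slides (12/20)-(18/20), sketched in NumPy; `winner_take_all_step` is an illustrative name and α = 0.4 an arbitrary value inside the stated range:

```python
import numpy as np

def winner_take_all_step(W, x, alpha=0.4):
    """Rotate only the winning row of W toward x; W holds unit-length rows."""
    m = int(np.argmax(W @ x))        # winner: largest net = smallest ||x - w||
    W[m] += alpha * (x - W[m])       # delta_w_m = alpha (x - w_m); others unchanged
    W[m] /= np.linalg.norm(W[m])     # w_m' is no longer normal: normalize again
    return m
```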
Clustering: Unsupervised Learning (19/20)
5. In the case where some patterns are of known class:
        Δw_m = ± α (X - w_m),  α > 0 for the correct node, α < 0 otherwise
This will accelerate the learning process significantly.

Clustering: Unsupervised Learning (20/20)
6. Another modification is to adjust the weights of both winner and losers: leaky competitive learning.
7. Recall Y = f(W X).
8. Initialization of weights: randomly choose weights from U(0,1), or use the convex combination w_mi = 1/sqrt(n) for m = 1, ..., p and i = 1, ..., n (a sketch follows the Feature Map overview below).

Feature Map (1/6)
1. Transform from a high-dimensional pattern space to a low-dimensional feature space.
2. Feature extraction falls into two categories: natural structure vs. no natural structure, depending on whether the pattern is similar to human perception or not.

Feature Map (2/6)
[figure]

Feature Map (3/6)
3. Another important aspect is to represent the feature as naturally as possible.
4. A self-organizing neural array can map features X from the pattern space.

Feature Map (4/6)
5. Use a one-dimensional array or a two-dimensional array.
6. Example: X: input vector; m: neuron; w_mj: weight from input x_j to neuron m; y_m: output of neuron m.
[figure: neuron array]
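Returning to point 8 of the clustering slides: a sketch of the two initialization options. Reading the convex-combination start as "every component equals 1/sqrt(n)" follows the slide's formula; normalizing the uniform rows is added here because the winner-take-all rule above assumes unit-length weights:

```python
import numpy as np

def init_weights(p, n, method="uniform", rng=None):
    """Initial p x n weight matrix for a layer of p competitive neurons."""
    rng = rng or np.random.default_rng()
    if method == "uniform":
        W = rng.uniform(0.0, 1.0, size=(p, n))           # w_mi ~ U(0,1)
        return W / np.linalg.norm(W, axis=1, keepdims=True)
    return np.full((p, n), 1.0 / np.sqrt(n))             # convex combination start
```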
Feature Map (5/6)
7. So define
        y_m = f( S(X, w_m) )
where S(X, w_m) means the similarity between X and w_m.

Feature Map (6/6)
8. Thus, (b) is the result of (a).
[figures (a) and (b)]

Self-organizing Feature Map (1/6)
1. One-dimensional mapping: for a set of patterns X_i, i = 1, 2, ..., use a linear array of neurons m_1, m_2, m_3, ...; for an input X, the responding cell m is the one with
        y_m = max_{i=1,2,...} y_i(X)

Self-organizing Feature Map (2/6)
2. Thus, this learning is to find the best-matching neuron cells, which can activate their spatial neighbors to react to the same input X.
3. Or, to find c such that
        ||X - w_c|| = min_i ||X - w_i||

Self-organizing Feature Map (3/6)
4. In case of two winners, choose the lower index.
5. If c is the winning neuron, then define N_c as the neighborhood around c; N_c keeps changing as learning goes on.
6. Then
        Δw_j = α (X - w_j)   if j is in the neighborhood N_c
        Δw_j = 0             otherwise

Self-organizing Feature Map (4/6)
7. α = α_0 (1 - t/T), where t: current training iteration, T: total # of training steps to be done.
8. α starts from α_0 and is decreased until it reaches the value of 0.
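A single update of a linear SOFM array, combining points 3-8; `sofm_step_1d`, α_0 = 0.5, d_0 = 3 and the linear shrink schedule for the neighborhood radius are illustrative assumptions:

```python
import numpy as np

def sofm_step_1d(W, x, t, T, alpha0=0.5, d0=3):
    """One update of a linear (1-D) SOFM array; W is p x n, row j = cell j."""
    alpha = alpha0 * (1 - t / T)                       # alpha = alpha0 (1 - t/T)
    d = max(1, int(round(d0 * (1 - t / T))))           # radius of N_c, shrinking
    c = int(np.argmin(np.linalg.norm(x - W, axis=1)))  # winner; ties -> lower index
    near = np.abs(np.arange(len(W)) - c) <= d          # j in the neighborhood N_c
    W[near] += alpha * (x - W[near])                   # delta_w_j = alpha (x - w_j)
    return c
```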
Self-organizing Feature Map (5/6)
9. If a rectangular neighborhood is used, then it covers the neurons with grid coordinates
        x_c - d < x < x_c + d,  y_c - d < y < y_c + d
and as learning continues, d is decreased from its initial value d_0 (e.g. linearly, d = d_0 (1 - t/T)).

Self-organizing Feature Map: example
10. Example:
    a. Input patterns are chosen randomly from U[-1, 1].
    b. The data employed in the experiment comprised points distributed uniformly over the bipolar square [-1, 1] x [-1, 1].
    c. The points thus describe a geometrically square topology.
    d. Thus, initial weights can be plotted as in the next figure.
    e. A connection line connects two adjacent neurons in the competitive layer.

Self-organizing Feature Map: example (figures)
[figures: the weight vectors of the competitive layer plotted at successive training stages]
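A self-contained sketch of this experiment: a 10 x 10 competitive layer trained on points uniform over the bipolar square. The grid size, iteration count and decay schedules are assumptions, not values from the slides; plotting W with lines between grid-adjacent neurons at successive stages gives figures like the ones referenced above:

```python
import numpy as np

rng = np.random.default_rng(1)

side = 10                                            # 10 x 10 competitive layer
grid = np.array([(i, j) for i in range(side) for j in range(side)])
W = rng.uniform(-0.1, 0.1, size=(side * side, 2))    # small random initial weights

T = 20000
for t in range(T):
    x = rng.uniform(-1.0, 1.0, size=2)               # uniform over [-1,1] x [-1,1]
    alpha = 0.5 * (1 - t / T)                        # shrinking learning rate
    d = max(1, int(round(side / 2 * (1 - t / T))))   # shrinking rectangular N_c
    c = int(np.argmin(np.linalg.norm(x - W, axis=1)))
    near = np.all(np.abs(grid - grid[c]) <= d, axis=1)
    W[near] += alpha * (x - W[near])

# The trained weights spread out to cover the square almost uniformly.
print(W.min(axis=0).round(2), W.max(axis=0).round(2))
```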