UNIT 9 HOPFIELD NETWORKS

Structure
9.1 Introduction
    Objectives
9.2 Related Definitions
9.3 Hopfield Networks
9.4 Structure of Hopfield Networks
9.5 The Functionality of Hopfield Networks
9.6 Storage Capacity of Hopfield Networks
9.7 Summary
9.8 Solutions/Answers
9.9 Practical Assignments

9.1 INTRODUCTION

One of the most important functions of the human brain is the laying down and recall of memories. Moreover, studies in neurodynamics indicate the presence of short-term and long-term memory. Short-term memory contributes to processing routine tasks, while long-term memory stores the lessons of past experience. Our memories mostly function in a manner best described as associative or content-addressable. That is, a memory does not exist in some isolated fashion, located in a particular set of neurons; all memories are, in some sense, strings of memories. For example, we remember a place or a person by some event. Sometimes, more than the facial features of a person, it is the voice or the peculiarity of the name that impresses itself on our memory. Thus, memories are stored in association with one another.

Neural network researchers contend that there is a basic mismatch between the standard technology used by computers and the technology of the human brain for processing information. Feed-forward neural networks are well studied, for the obvious reasons of simplicity in understanding and modelling the computation. A feed-forward network is acyclic: the artificial neurons (perceptrons) in a feed-forward neural network model have no cyclic connections, so the input data is processed and passed towards the outputs but never in the reverse direction. Recurrent networks, in contrast, can have connections that go back from the output nodes and can also have arbitrary connections between any nodes; that is, a recurrent neural network has at least one feedback loop. For example, in a single-layer recurrent network each neuron may have a feedback connection to each of the other neurons, or it may even have a self-feedback loop.

Learning with perceptrons is a process of modifying the values of the synaptic weights and the threshold (or the activation function) using samples of training data: a group of connected perceptrons is trained on sample input-output pairs until it learns to compute the correct function. Once a feed-forward network is trained, its state is fixed and is not altered by any subsequent input data until the network is retrained. This indicates that a feed-forward neural network does not have memory. In a neurobiological context, memory refers to the relatively enduring neural alterations induced by the interaction of an organism with its environment. Learning tasks lead us to assume the existence of memory; the alterations indicate the presence of memory, and for its usefulness memory needs to be accessible. Associative memories bear a faint similarity to the human brain's ability to associate patterns. The architecture of a neural network with associative memory depends on its capability of association.

An associative memory is a storehouse of associated patterns encoded in some form. When this storehouse is triggered or incited with a pattern, the associated pattern pair is recalled or output. An associative memory network is designed by storing/recording several ideal patterns into the network's stable states. When a pattern (even a noisy or distorted representation of a stored pattern) is input to the storehouse, the network is expected to reach one of the stored patterns and retrieve it as an output. These neural network models are categorized as association models and are also known as content-addressable or autoassociative neural networks.

All the definitions will be discussed in Sec. 9.2. In Sec. 9.3 we shall discuss Hopfield networks. In Sec. 9.4 we shall discuss the structure of Hopfield networks in detail, and we shall extend our discussion to the functionality of Hopfield networks in Sec. 9.5. The storage capacity of this network is discussed in Sec. 9.6.

Objectives

After reading this unit you should be able to:
identify and design general association and recurrent models of neural networks;
model Hopfield networks;
train Hopfield networks for a given set of patterns;
run a trained Hopfield network on a noisy test pattern, which may require rigorous computation, especially if the dimension of the vectors is high.

9.2 RELATED DEFINITIONS

Let us recall the various definitions given below.

Recurrent Networks: The interconnection scheme of a recurrent network contains feedback connections or loops, as shown in Fig. 1.

Fig. 1: Recurrent network

Associative Memory: An associative memory is a brain-like distributed memory that learns by association. Association is one of the basic features of human memory and is prevalent in most models of cognition. Associative memory can be categorized as autoassociative memory and heteroassociative memory. These are defined in the following:

1) Autoassociation: In autoassociation a neural network is required to store a set of patterns (vectors) by repeatedly presenting them to the network. The network is subsequently presented with a partial description or noisy version of an original pattern stored in it, and the task is to recall, or more specifically retrieve, that particular pattern.

2) Heteroassociation: In heteroassociation an arbitrary set of input patterns is associated (paired) with another arbitrary set of output patterns. The task of retrieval of patterns, however, is similar, i.e. on input of a stored pattern or a distorted version of an already stored pattern, the original pattern coupled with the given input is recalled.

Let x_k be the key pattern (vector) applied to an associative memory and y_k be the memorized pattern (vector). The pattern association performed by the network is described by

    x_k -> y_k,   k = 1, 2, ..., q,

where q is the number of patterns stored in the network. The key pattern x_k acts as the stimulus that not only determines the storage location of the memorized pattern y_k, but also holds the key for its retrieval.

Fig. 2: Input-output relation of the pattern associator

Auto-associative networks consist of one layer of neural elements (units) that are all interconnected, although self-connections are usually disallowed. The neural elements are nonlinear, with states bounded between zero and one, the so-called two-state neurons. Any state, whether initial or desired, is represented by some pattern of zeros and ones over the units. Recurrence and nonlinearity lie at the heart of auto-associative network behaviour. Recurrence allows desired states to emerge via interactions among the units, and the nonlinearity prevents the whole network from running away, instead encouraging the network to settle into a stable state. The auto-associative network will, in most cases, relax from any initial state into the desired state closest to it.

The interconnections between units can be trained according to Hebbian rules (proposed in 1949): when one cell repeatedly assists in firing another, the axon of the first cell develops synaptic knobs (or enlarges them if they already exist) in contact with the soma of the second cell. Hebbian rules are local, in the sense that they depend only upon the activity of the neurons pre- and post-synaptic to any particular connection. The original rule proposed by Hebb specified an increase in synaptic strength whenever pre- and post-synaptic neurons were active together. Therefore, in an association net, if we compare two pattern components (vector elements or pixels) across many patterns and find that they are frequently in the same state, then the arc weight between the two perceptrons must be increased (should be positive), and if they are frequently in different states, then the arc weight between the two perceptrons must be decreased (should be negative).

There are two phases involved in the operation of an associative memory:

Storage phase: This refers to the training of the network.

Recall/Retrieval phase: This involves the retrieval of a memorized pattern in response to the presentation of a noisy or distorted version of a key pattern to the network.

Matrix Representation: For the computation, a matrix representation is a practical tool. Let X be the matrix of input patterns, where each ROW is a pattern, so that x_{k,i} is the i-th bit of the k-th pattern. Let Y be the matrix of output patterns, where each ROW is a pattern, so that y_{k,j} is the j-th bit of the k-th pattern. Then the average correlation between components i and j across all P patterns is

    w_{i,j} = (1/P) ( x_{1,i} y_{1,j} + x_{2,i} y_{2,j} + ... + x_{P,i} y_{P,j} )        (1)

All the weights for an associative memory network model can therefore be calculated at once by

    W = X^T Y                                                                            (2)

For a set of input patterns IP_1, IP_2, ..., IP_P and output patterns OP_1, OP_2, ..., OP_P, the rows of X are IP_1, ..., IP_P (so X is a P x N matrix with entries x_{k,i}), the columns of X^T are the same patterns written as column vectors, and the rows of Y are OP_1, ..., OP_P (with entries y_{k,j}). From this layout of X and Y it is obvious that Eqn. (1) is, up to the factor 1/P, the dot product of the i-th row of X^T (i.e. the i-th column of X) with the j-th column of Y. But since the output vector in the case of autoassociation is one of the earlier memorized vectors, it is one of the rows of X itself, and therefore the weight calculation for autoassociation networks is

    W = X^T X                                                                            (3)
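To make Eqn. (3) concrete, here is a minimal C sketch that builds the autoassociative weight matrix W = X^T X from a set of bipolar (+1/-1) patterns. The sizes P and N, the pattern values and the function name are illustrative assumptions, not taken from the unit's examples.

```c
#include <stdio.h>

#define P 3   /* number of stored patterns (assumed for illustration) */
#define N 4   /* dimension of each pattern  (assumed for illustration) */

/* Eqn. (3): W = X^T X for an autoassociative net.
 * X is P x N, each row a bipolar (+1/-1) pattern; W is N x N. */
void train_weights(const int X[P][N], int W[N][N])
{
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++) {
            W[i][j] = 0;
            for (int k = 0; k < P; k++)
                W[i][j] += X[k][i] * X[k][j];   /* correlation of bits i and j */
        }
}

int main(void)
{
    /* three illustrative 4-dimensional bipolar patterns (not the unit's IP1..IP3) */
    int X[P][N] = { { 1,  1, -1, -1 },
                    { 1, -1,  1, -1 },
                    {-1,  1,  1, -1 } };
    int W[N][N];

    train_weights(X, W);
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < N; j++) printf("%3d ", W[i][j]);
        printf("\n");
    }
    return 0;
}
```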

Consider the following procedure to explain the associative memory network model.

Input: A pattern (often noisy/corrupt).
Output: The corresponding pattern (complete and relatively noise-free).
Process:
1. Load the input pattern onto the highly interconnected neurons (a network trained on a given set of library patterns).
2. Run the neurons until they reach a steady state.
3. Read the output off the states of the neurons.

Fig. 3 and Fig. 4 give a graphical illustration of this procedure.

Fig. 3: Trained associative network

Fig. 4: Running an input pattern (a noisy input vector is mapped to the stored output vector)

Now, let us discuss Hopfield networks in the following section.

9.3 HOPFIELD NETWORKS

The Hopfield net, proposed by J.J. Hopfield in 1982, is a neural network that is much simpler to understand than the multi-layer perceptron. For a start, it has only one layer of nodes. Unlike the MLP, it does not make one of its outputs go high to show which of the stored patterns most closely matches the input. Instead it takes the input pattern, which is assumed to be one of the library patterns that has been changed or corrupted somehow, and tries to reconstruct the correct pattern from the input that you give it.

Hopfield networks are recurrent neural network models that possess the auto-associative property. The network is therefore fully interconnected, with the exception that no neuron has any connection to itself. Thus, the Hopfield network is a form of artificial neural network that serves as a content-addressable memory system with binary threshold units, and it is guaranteed to converge to a stable state. The aim of the Hopfield network is to recognize a partial or distorted input pattern as one of its previously memorized vectors and to output the perfect and complete memorized version. The Hopfield network, proposed as a theory of memory, thus has the following features:

1. Distributed Representation: A memory is stored as a pattern of activation across a set of processing elements. Further, the memories can be superimposed on one another; different memories are represented by different patterns over the same set of processing elements.
2. Distributed Asynchronous Control: Each processing element makes decisions based only on its own local situation. All these local actions add up to a global solution.
3. Content-addressable Memory: A number of patterns can be stored in a network. To retrieve a pattern, we need only specify a portion of it; the network automatically finds the closest match.
4. Fault Tolerance: If a few processing elements misbehave or fail completely, the network will still function properly.

The main concept underlying the Hopfield network is that a single network of interconnected, binary-valued neurons can store multiple stable states, the library patterns, also called attractors. Similar to the brain, the full pattern can be recovered even if the network is presented with only partial information, just as described for associative memories. Furthermore, there is a degree of stability in the system: if just a few of the connections between neurons are severed, the recalled memory is not too badly corrupted, and the network can respond with a "best guess".

The nodes in the Hopfield network are vast simplifications of real neurons; they can exist in only one of two possible "states": {firing, not firing}, or {0, 1}, or {-1, 1}. Every node is connected to every other node with some strength (synaptic weight). At any instant of time a node will change its state (i.e. start or stop firing) depending on the inputs it receives from the other nodes. If we start the system off with any general pattern of firing and non-firing nodes, then this pattern will in general change with time. To see this, think of starting the network with just one firing node. It will send a signal to all the other nodes via its connections, so that a short time later some of these other nodes will fire. These newly firing nodes will then excite others after a further short time interval, and a whole cascade of different firing patterns will occur. One might imagine that the firing pattern of the network would change in a complicated, perhaps random, way with time. The crucial property of the Hopfield network which renders it useful for simulating memory recall is the following: it guarantees that the pattern will settle down, after a long enough time, to some fixed pattern in which certain nodes are always "on" and others always "off". Furthermore, it is possible to arrange that these stable firing patterns of the network correspond to the desired memories we wish to store.

Example 1: Suppose we have trained our Hopfield net on the three patterns given in Fig. 5.

Fig. 5: Three library patterns stored in the network

Next we input a pattern which is a bit like one of these, say the leftmost one shown in Fig. 6, and leave the network to run. It gradually alters the pattern we give it until it has reconstructed one of the original patterns, the rightmost one.

Fig. 6: A corrupted input pattern (left) being restored to a stored pattern (right)

The Hopfield net is left to iterate (loop round and round) until it does not change any more; then it should match one of the stored patterns. The program should then compare the reconstructed pattern with the library of originals and see which one matches.

9.4 STRUCTURE OF HOPFIELD NETWORKS

The structure of the Hopfield network model is shown in Fig. 7.

Fig. 7: Hopfield network model (inputs applied at time zero at the bottom; outputs valid after convergence at the top)

The inputs to the Hopfield net are applied at the bottom. Each node produces an output which is fed back into all the nodes except the one that produced it (i.e. all the nodes refer to each other, but not to themselves), and the nodes use this feedback to produce the next output. The final output of the nodes is extracted at the top.

For a network of N neurons, the state of the network is denoted by the vector

    s = (s_1, s_2, ..., s_N)

Each s_j = ±1; the state of neuron j represents one bit of information, and the state vector s represents a binary word of N bits of information.

Since the units/neurons are binary threshold units, the dynamical rule for the activation a_i of neuron i may be defined in either of two ways:

    a_i =  1   if Σ_j w_ij s_j >= θ_i,
          -1   otherwise,                                               (4)

or,

    a_i =  1   if Σ_j w_ij s_j >= θ_i,
           0   otherwise,                                               (5)

where w_ij is the connection (synaptic) weight from neuron i to neuron j (cf. Eqn. (3)), s_j is the state of neuron (unit) j, and θ_i is the threshold of neuron i. Taking the thresholds as zero, the relation in Eqns. (4) and (5) may be written in the compact form

    a_i = sgn[ Σ_j w_ij s_j ],   j = 1, 2, ..., N,

where sgn is the signum function. If a_i is zero the action can be arbitrary; conventionally, neuron i remains in its previous state, regardless of whether it is on or off. Being a feedback network with no self-looping, the Hopfield network also has the following two restrictions on its interconnections:

i) w_ii = 0 for all neurons i, i.e. there is no loop/connection from a neuron to itself, and
ii) w_ij = w_ji for all neurons i and j, i.e. the connections are symmetric.

Hopfield nets also have a scalar value associated with each state of the network, referred to as the energy E of the network. The energy is defined as

    E = - Σ_{i<j} w_ij s_i s_j + Σ_i θ_i s_i                            (6)

The reason for this is somewhat technical, but we can proceed by analogy. Imagine a ball rolling on some bumpy surface. We take the position of the ball at any instant to represent the activity of the nodes in the network. Memories are represented by special patterns of node activity corresponding to wells in the surface. Thus, if the ball is let go, it will execute some complicated motion, but we are certain that eventually it will end up in one of the wells of the surface. We can think of the height of the surface as representing the energy of the ball: the ball will seek to minimize its energy by seeking out the lowest spots on the surface, the wells. In the language of memory recall, if we start/initiate the network with some firing pattern, the network must approximate one of the "stable firing patterns" held in the memories. It will, "under its own steam", end up in a nearby (with respect to the threshold) well in the energy surface, thereby recalling the original perfect memory.
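As a small illustration of the update rule of Eqn. (4) and the energy of Eqn. (6), the following C sketch computes both for a bipolar state vector, taking all thresholds θ_i as zero; the 4-node symmetric weight matrix in main() is an illustrative example, not one of the unit's worked examples.

```c
#include <stdio.h>

#define N 4   /* network size, assumed for illustration */

/* Eqn. (4) with theta_i = 0: next state of neuron i given current states s */
int activate(const double W[N][N], const int s[N], int i)
{
    double field = 0.0;
    for (int j = 0; j < N; j++)
        field += W[i][j] * s[j];            /* weighted input from all neurons */
    if (field > 0) return  1;
    if (field < 0) return -1;
    return s[i];                            /* field == 0: keep previous state */
}

/* Eqn. (6) with zero thresholds: E = - sum_{i<j} w_ij s_i s_j */
double energy(const double W[N][N], const int s[N])
{
    double E = 0.0;
    for (int i = 0; i < N; i++)
        for (int j = i + 1; j < N; j++)
            E -= W[i][j] * s[i] * s[j];
    return E;
}

int main(void)
{
    /* illustrative symmetric weights with zero diagonal (not from the unit's examples) */
    double W[N][N] = { { 0,  1, -1,  1 },
                       { 1,  0, -1,  1 },
                       {-1, -1,  0, -1 },
                       { 1,  1, -1,  0 } };
    int s[N] = { 1, -1, 1, -1 };

    printf("E = %.1f, next state of neuron 0 = %d\n", energy(W, s), activate(W, s, 0));
    return 0;
}
```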

In the following section, we shall focus on the functionality of Hopfield networks.

9.5 THE FUNCTIONALITY OF HOPFIELD NETWORKS

As mentioned earlier, Hopfield nets are a special case of associative networks or content-addressable memory nets. Therefore, Hopfield nets too have the corresponding two phases, namely the Storage phase and the Retrieval phase.

Storage Phase: Let us have M N-dimensional vectors to be stored in the Hopfield net, say P = {p_k | k = 1, 2, ..., M}. The M vectors compose the library, or the fundamental memories, representing the patterns to be memorized by the network. Let p_{k,i} denote the i-th element of the fundamental memory p_k. According to Hebb's rule the synaptic weight from neuron i to neuron j can be computed using Eqn. (1), i.e.

    w_{j,i} = (1/N) ( p_{1,j} p_{1,i} + p_{2,j} p_{2,i} + ... + p_{M,j} p_{M,i} )        (7)

and w_{i,i} = 0 for all neurons i = 1, 2, ..., N. Let W denote the N x N synaptic weight matrix, or connectivity matrix, of the network. Then

    W = (1/N) [ Σ_{k=1}^{M} p_k p_k^T  -  M I ]                                          (8)

where p_k p_k^T denotes the outer product of the k-th fundamental memory with its transpose, and I is the identity matrix. From the above computation the following three points are reconfirmed:

The output of each neuron in the network is fed back to all other neurons.
There is no self-looping in the network.
The weight matrix of the network is symmetric.

Retrieval Phase: This phase is initiated on input of an N-dimensional vector, say p_t, to the trained network. This input is also called a probe. Typically the probe has elements ±1 and is thought of as a noisy or incomplete version of one of the attractors already in the library (fundamental memory). The information retrieval proceeds by a dynamical rule in which each neuron of the network, randomly but at some fixed rate, examines its activation a_i resulting from the connectivity with its neighbouring neurons. If a_i is greater than zero, the state is changed to +1 (or else remains in its previous state); if a_i is less than zero, the state is switched to -1; and if a_i is equal to zero, the state is left unchanged, regardless of whether it was on or off. The updating of the states of the neurons is deterministic, but the selection of neurons for updating is done either serially in random order (asynchronously) or synchronously. Starting with the test input or probe vector, the network must finally reach a state of stability, i.e. produce a time-invariant state vector as the output pattern y, whose individual elements satisfy the stability condition

    y_j = sgn( Σ_{i=1}^{N} w_{j,i} p_{t,i} ),   j = 1, 2, ..., N,                        (9)

or, in matrix form,

    y = sgn( W p_t )                                                                     (10)
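A minimal C sketch of the retrieval phase just described, under the simplifying assumptions of bipolar states, zero thresholds and a fixed sequential (asynchronous) visiting order instead of random selection; the weights in main() store the single illustrative pattern (1, 1, -1, -1) via Eqn. (7) (with the 1/N factor omitted, since only the sign of the field matters), and the probe flips one bit of that pattern.

```c
#include <stdio.h>

#define N 4           /* network size, assumed for illustration */
#define MAX_SWEEPS 100

/* Asynchronous retrieval: visit neurons one at a time and update them in place
 * until a full sweep produces no change, i.e. a stable state (Eqn. (9)). */
void recall(const double W[N][N], int s[N])
{
    for (int sweep = 0; sweep < MAX_SWEEPS; sweep++) {
        int changed = 0;
        for (int i = 0; i < N; i++) {
            double field = 0.0;
            for (int j = 0; j < N; j++)
                field += W[i][j] * s[j];
            int next = (field > 0) ? 1 : (field < 0) ? -1 : s[i];
            if (next != s[i]) { s[i] = next; changed = 1; }
        }
        if (!changed) break;   /* time-invariant state vector reached */
    }
}

int main(void)
{
    /* unscaled Hebbian weights for the single stored pattern (1, 1, -1, -1) */
    double W[N][N] = { { 0,  1, -1, -1 },
                       { 1,  0, -1, -1 },
                       {-1, -1,  0,  1 },
                       {-1, -1,  1,  0 } };
    int probe[N] = { 1, -1, -1, -1 };   /* one bit flipped relative to the stored pattern */

    recall(W, probe);
    for (int i = 0; i < N; i++) printf("%d ", probe[i]);   /* prints 1 1 -1 -1 */
    printf("\n");
    return 0;
}
```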

Example 2: Consider the following example.

i) Input patterns: M = 3 patterns IP_1, IP_2, IP_3, each an N = 4 dimensional bipolar vector.

ii) The average correlations across the three patterns are used to design and train the Hopfield network: each weight w_{i,j}, i ≠ j, is obtained from Eqn. (1) as the correlation of components i and j over IP_1, IP_2 and IP_3. Writing the three patterns as the rows of X, the weight matrix for the autoassociative network is W = X^T X. Here it is clear that in the connectivity matrix w_{1,4} = w_{4,1}; in general the lower and upper triangles of the product matrix carry the same six weights, w_{i,j} = w_{j,i}. We can scale the weights by dividing them by p = 3, the number of patterns. For iterative purposes it is easier to scale by p at the end, instead of scaling the entire weight matrix W prior to testing.

Now let us construct the Hopfield net and train it with the inputs IP_1, IP_2, IP_3, taking the initial states s_i = 0 for i = 1, 2, 3, 4.

Fig. 8: The trained Hopfield network (edge labels show the scaled weights, e.g. -1/3 and +1/3)

iii) Now consider the testing input (1, 0, 0, -1), i.e.

    s_1 = 1;  s_2 = 0;  s_3 = 0;  s_4 = -1.

Fig. 9: The network with input (1, 0, 0, -1)

iv) A synchronous iteration is used to update all the nodes. Table 1 lists, for each node, the inputs received from the other nodes and the resulting output; the diagonal of the table carries the values taken from the input layer. Scaling the output by dividing by p = 3 and using the threshold θ = 0, we obtain a vector with entries ±1/3 which thresholds to IP_1, an attractor in the library. The Hopfield net has thus reached a stable state, with the output vector as computed above. The graphical representation of the input and the recalled output is given in Fig. 10.

Fig. 10: The test input (1, 0, 0, -1) is mapped to the stored pattern IP_1
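The synchronous iteration of step (iv) updates every node from the same previous state vector, s_new = sgn(W s_old). A C sketch of one such sweep is given below; the weight matrix and probe in main() are an illustrative single-pattern example, not the unit's IP_1, IP_2, IP_3. Repeating synchronous_step until the state stops changing reproduces the behaviour described in the example.

```c
#include <stdio.h>

#define N 4   /* network size, assumed for illustration */

/* One synchronous sweep: every node is updated from the SAME old state vector,
 * s_new = sgn(W s_old), with ties keeping the previous state. */
void synchronous_step(const double W[N][N], const int s_old[N], int s_new[N])
{
    for (int i = 0; i < N; i++) {
        double field = 0.0;
        for (int j = 0; j < N; j++)
            field += W[i][j] * s_old[j];
        s_new[i] = (field > 0) ? 1 : (field < 0) ? -1 : s_old[i];
    }
}

int main(void)
{
    /* weights storing the single pattern (1, 1, -1, -1), diagonal zeroed (illustrative) */
    double W[N][N] = { { 0,  1, -1, -1 },
                       { 1,  0, -1, -1 },
                       {-1, -1,  0,  1 },
                       {-1, -1,  1,  0 } };
    int s[N] = { 1, 0, 0, -1 };   /* a probe with unknown (0) entries */
    int t[N];

    synchronous_step(W, s, t);
    for (int i = 0; i < N; i++) printf("%d ", t[i]);   /* prints the recalled pattern */
    printf("\n");
    return 0;
}
```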

Now, try some exercises.

E1) Run the above Hopfield network for the test input vector (1, 0, 0, 0).

E2) Draw the graph of the input and output vectors obtained in E1).

The smart thing about the Hopfield network is that there exists a rather simple way of setting up the connections between nodes in such a way that any desired set of patterns can be made "stable firing patterns". Thus any set of memories can be burned into the network at the beginning. Then, if we kick the network off with any old set of node activity, we are guaranteed that a "memory" will be recalled. Not too surprisingly, the memory that is recalled is the one which is "closest" to the starting pattern. In other words, we can give the network a corrupted image or memory and the network will "all by itself" try to reconstruct the perfect image. Of course, if the input image is sufficiently poor, it may recall the incorrect memory; the network can become "confused", just like the human brain. We know that when we try to remember someone's telephone number we will sometimes produce the wrong one! Notice also that the network is reasonably robust: if we change a few connection strengths just a little, the recalled images are "roughly right", and we do not lose any of the images completely.

Example 3: Consider a Hopfield network on three neurons whose symmetric, zero-diagonal weight matrix W is given, together with two test input vectors p_t1 and p_t2 with ±1 entries. Using Eqn. (9), the product W p_t1 thresholds back to p_t1 itself, i.e. y = sgn(W p_t1) = p_t1, and similarly for p_t2. Since the CM (connectivity matrix), i.e. the weight matrix, is symmetric, we can equally compute the vector-matrix product p_t W and apply the signum function to it, obtaining the same vector, y = sgn(p_t W) = p_t.

Eqn. (10) is also known as the alignment condition. The state vectors that satisfy the alignment condition are stable states; therefore the given patterns are stable state vectors too.

If we consider as probes vectors that differ from the first stored memory in a single element, the resulting output in each case is that stored memory: each of the probes had a single error compared to the stored memory. Similarly, for another set of test patterns input to the Hopfield network, each again with a single error, the resulting output vector is the second stored memory.
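The alignment condition of Eqn. (10) gives a direct test for stability: a state p is stable exactly when p = sgn(W p). The following C sketch checks this condition; the 3 x 3 weight values are an illustrative single-pattern example, not the (unreproduced) matrix of Example 3.

```c
#include <stdio.h>

#define N 3   /* three neurons, as in Example 3; weights below are illustrative */

/* Returns 1 if p satisfies the alignment condition p = sgn(W p), i.e. p is a stable state. */
int is_stable(const int W[N][N], const int p[N])
{
    for (int j = 0; j < N; j++) {
        int field = 0;
        for (int i = 0; i < N; i++)
            field += W[j][i] * p[i];
        int y = (field > 0) ? 1 : (field < 0) ? -1 : p[j];
        if (y != p[j]) return 0;
    }
    return 1;
}

int main(void)
{
    /* symmetric, zero-diagonal weights storing the pattern (1, -1, 1) (illustrative) */
    int W[N][N] = { { 0, -1,  1 },
                    {-1,  0, -1 },
                    { 1, -1,  0 } };
    int p[N] = { 1, -1, 1 };

    printf("stable: %d\n", is_stable(W, p));   /* prints 1 */
    return 0;
}
```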

Now, try the following exercises.

E3) Run the Hopfield network of Example 3 for the given test input vector (a probe containing 0 entries). Also, draw the graphical representation of the output.

E4) Consider the set P of four pattern vectors, each of dimension 10. Obtain the connectivity matrix (CM) for the patterns in P.

In the following section, we shall discuss the storage capacity of Hopfield networks.

9.6 HOPFIELD NETWORKS: STORAGE CAPACITY

There are two major limitations of Hopfield networks. One is the tendency towards local minima, which is good for association but bad for optimisation. The other limitation is on the network capacity: for a network of N binary nodes, the capacity limit is of the order of N rather than 2^N. If the patterns (vectors) are N-dimensional, then there must be a network of N binary nodes. The capacity expresses the relationship between the number of patterns that can be stored and retrieved without error and the size of the network. Let M be the number of patterns to be stored (fundamental memories). So long as the storage capacity of the network is not overloaded, i.e. M is small compared to N,

    Capacity = M / N,   or   M / (number of weights).

If we use the following definition of 100% correct retrieval, namely that when any of the stored patterns is entered completely (no noise), that same pattern is returned by the network, i.e. the pattern is a stable attractor, then a detailed proof shows that a Hopfield network of N nodes can achieve 100% correct retrieval on P patterns if

    P < N / (4 ln N),   i.e.   P_max = N / (4 ln N).

In general, as more patterns are added to a network, the average correlations will be less likely to match the correlations in any particular pattern. Hence the likelihood of retrieval error will increase, just as happens in the case of human memory: when more information is crammed into a memory, recall of the stored patterns is slower than when there are fewer patterns to remember. The key to perfect recall is selective ignorance!
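As a quick numerical illustration of the bound P < N/(4 ln N), the following C sketch prints the maximum number of perfectly retrievable patterns for a few arbitrarily chosen network sizes.

```c
#include <stdio.h>
#include <math.h>

/* Upper bound on the number of patterns retrievable without error: P < N / (4 ln N). */
int max_patterns(int n)
{
    return (int)(n / (4.0 * log((double)n)));
}

int main(void)
{
    int sizes[] = { 100, 1000, 10000 };   /* arbitrary example sizes */
    for (int k = 0; k < 3; k++)
        printf("N = %5d  ->  P_max = %d\n", sizes[k], max_patterns(sizes[k]));
    return 0;
}
```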

Example 4: A three-neuron network trained with three patterns. The figure shows the three fully interconnected neurons O_A, O_B and O_C, with weights W_{A,B}, W_{A,C} and W_{B,C} (and their symmetric counterparts) on the connections.

Let us say we would like to train three bipolar patterns, pattern k being specified by the values (O_A(k), O_B(k), O_C(k)), k = 1, 2, 3. Using the Hebbian (outer-product) rule, the self-weights are zero and each cross-weight is the sum, over the three patterns, of the products of the corresponding node values:

    W_{A,A} = 0,
    W_{A,B} = O_A(1) O_B(1) + O_A(2) O_B(2) + O_A(3) O_B(3),
    W_{A,C} = O_A(1) O_C(1) + O_A(2) O_C(2) + O_A(3) O_C(3),

    W_{B,B} = 0,
    W_{B,A} = O_B(1) O_A(1) + O_B(2) O_A(2) + O_B(3) O_A(3) = W_{A,B},
    W_{B,C} = O_B(1) O_C(1) + O_B(2) O_C(2) + O_B(3) O_C(3),

    W_{C,C} = 0,
    W_{C,A} = O_C(1) O_A(1) + O_C(2) O_A(2) + O_C(3) O_A(3) = W_{A,C},
    W_{C,B} = O_C(1) O_B(1) + O_C(2) O_B(2) + O_C(3) O_B(3) = W_{B,C}.
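A C sketch of the weight computation of Example 4, using three hypothetical bipolar patterns (the unit's own pattern values are not reproduced here); node A corresponds to index 0, B to 1 and C to 2.

```c
#include <stdio.h>

#define NODES 3
#define PATTERNS 3

int main(void)
{
    /* three hypothetical bipolar patterns (O_A, O_B, O_C); illustrative values only */
    int O[PATTERNS][NODES] = { { 1, -1,  1 },
                               {-1,  1, -1 },
                               { 1,  1, -1 } };
    int W[NODES][NODES] = { 0 };

    /* Hebbian rule of Example 4: W_xy = sum_k O_x(k) * O_y(k), with W_xx = 0 */
    for (int x = 0; x < NODES; x++)
        for (int y = 0; y < NODES; y++)
            if (x != y)
                for (int k = 0; k < PATTERNS; k++)
                    W[x][y] += O[k][x] * O[k][y];

    for (int x = 0; x < NODES; x++) {
        for (int y = 0; y < NODES; y++) printf("%3d ", W[x][y]);
        printf("\n");
    }
    return 0;
}
```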

Now try the following exercise.

E5) Train the three-neuron network shown (nodes O_A, O_B and O_C, with weights W_{A,B}, W_{A,C} and W_{B,C} on the connections) on the three patterns given in the accompanying figure.

Applications of the Hopfield Network

The various applications of the Hopfield network are given below.

1. The Hopfield network can be used as an effective interface between analog and digital devices, where the input signals to the network are analog and the output signals are discrete values.
2. Associative memory is a major application of the Hopfield network.
3. The energy-minimization ability of the Hopfield network is used to solve optimization problems.
4. The Hopfield network remembers cues from the past and does not complicate the training procedure.
5. The Hopfield network can be used for converting analog signals into digital format, and for associative memory.

Now, let us summarize this unit.

9.7 SUMMARY

The summary of this unit covers the four basic operations needed to design and run a Hopfield network. These four operations are learning, initialization (of the network), iteration until convergence, and finally outputting the output vector.

i) Learning/Training: Let p_1, p_2, ..., p_M be the N-dimensional patterns to be stored. Using the dot-product rule, i.e. the Hebbian postulate of learning, the synaptic weights of the entire recurrent network (prohibiting self-looping) are computed by

    w_ji = (1/N) Σ_{k=1}^{M} p_{k,j} p_{k,i},   j ≠ i,
    w_ji = 0,                                   j = i.

ii) Initialization: Let p_t be the N-dimensional probe or test input to the Hopfield network. The initialization is carried out by

    s_i(0) = p_{i,t},   i = 1, 2, ..., N,

where s_i(0) is the state of the i-th neuron at time n = 0, and p_{i,t} is the i-th element of the test input p_t.

iii) Iteration until convergence: Update the elements of the state vector s(n) synchronously or asynchronously using the rule

    s_i(n+1) = sgn( Σ_{j=1}^{N} w_ij s_j(n) ),   i = 1, 2, ..., N.

Repeat the iteration until the states of the neurons do not change, i.e. the state vector s remains unchanged.

iv) Outputting: Let s_fixed denote the stable state computed at the end of the above iterative step. The resulting output pattern OP of the network is

    OP = s_fixed.

The storage phase uses the learning/training operation. The subsequent operations, initialization, iteration until convergence and outputting, are applied in the retrieval phase.
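Putting the four operations together, the following compact C sketch trains on a small set of bipolar patterns and then retrieves from a noisy probe; all sizes, pattern values and names are illustrative assumptions, and the 1/N factor of the learning rule is dropped since only the sign of the local field matters.

```c
#include <stdio.h>

#define N 4   /* neurons  (illustrative) */
#define M 2   /* patterns (illustrative) */

int main(void)
{
    /* i) Learning: unscaled Hebbian weights w_ij = sum_k p_ki p_kj, zero diagonal. */
    int P[M][N] = { { 1,  1, -1, -1 },
                    { 1, -1,  1, -1 } };   /* fundamental memories (illustrative) */
    int W[N][N] = { 0 };
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            if (i != j)
                for (int k = 0; k < M; k++)
                    W[i][j] += P[k][i] * P[k][j];

    /* ii) Initialization: load a noisy probe (one bit of P[0] flipped). */
    int s[N] = { -1, 1, -1, -1 };

    /* iii) Iteration until convergence (asynchronous, in-place updates). */
    for (int sweep = 0, changed = 1; changed && sweep < 100; sweep++) {
        changed = 0;
        for (int i = 0; i < N; i++) {
            int field = 0;
            for (int j = 0; j < N; j++) field += W[i][j] * s[j];
            int next = (field > 0) ? 1 : (field < 0) ? -1 : s[i];
            if (next != s[i]) { s[i] = next; changed = 1; }
        }
    }

    /* iv) Outputting: the stable state is the recalled pattern. */
    for (int i = 0; i < N; i++) printf("%d ", s[i]);
    printf("\n");
    return 0;
}
```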

9.8 SOLUTIONS/ANSWERS

E1) This test pattern too has elements other than ±1. Use Eqn. (4), or equivalently the product p_t W, and divide the result by 3. For the input (1, 0, 0, 0), the synchronous iteration updates all the nodes from this input, and after scaling and thresholding the resulting vector is a stable state and one of the input patterns stored in the library. Though an asynchronous update can result in a spurious output here, the synchronous update gives a pattern from the fundamental memory.

E2) The figure shows the graphical representation of the input vector and of the output vector obtained in E1).

E3) First of all, the input definitely has some serious noise: the value 0, which does not occur in the stored patterns, is present in the probe. Using Eqn. (4), or the product p_t W, the resulting vector is a stable state, but it is not one of the input patterns stored in the library (it is a spurious state).

E4) Here N = 10 and M = 4. Use either Eqn. (7) or Eqn. (8) to obtain the weights w_{i,j} for all i and j = 1, 2, ..., 10 with i ≠ j. Substituting the values of the patterns P_1, ..., P_4 into Eqn. (8), we get

    CM = (1/10) [ Σ_{k=1}^{4} P_k P_k^T  -  4 I ]

9.9 PRACTICAL ASSIGNMENTS

Session 8

Write a program in C language to find:

i) the average correlation matrix for M given input patterns of dimension N, to design and train the Hopfield network;
ii) the weight matrix;
iii) the output of the Hopfield network.

Also test your program on the input patterns given in the examples above.
