CS407 Neural Computation


1 CS407 Neural Computation. Lecture 8: Neural Networks for Constrained Optimization. Lecturer: A/Prof. M. Bennamoun

2 Neural Nets for Constrained Optimization: Introduction; Boltzmann machine (Introduction; Architecture and Algorithm); Boltzmann machine: application to the TSP; Continuous Hopfield nets; Continuous Hopfield nets: application to the TSP; References and suggested reading

3 Introduction. There are nets that are designed for constrained optimization problems (such as the Traveling Salesman Problem, TSP). These nets have fixed weights that incorporate information concerning the constraints and the quantity to be optimized. The nets iterate to find a pattern of output signals that represents a solution to the problem. Examples of such nets are the Boltzmann machine (without learning), the continuous Hopfield net, and several variations (Gaussian and Cauchy nets). Other optimization problems to which this type of NN can be applied are: job shop scheduling, space allocation, etc.

4 Traveling Salesman Problem (TSP). The aim of the TSP is to find a tour of a given set of cities that is of minimum length. A tour consists of visiting each city exactly once and returning to the starting city. The tour of minimum distance is desired. The difficulty of finding a solution increases rapidly as the number of cities increases. Many approaches other than NNs to solve this problem are extensively reported in the literature.
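To see how quickly the search space grows, recall the standard count: a symmetric TSP on n cities has (n-1)!/2 distinct tours (fix the starting city, halve for direction). A minimal Python check (helper name illustrative):

```python
from math import factorial

def num_tours(n):
    """Distinct tours of n cities for a symmetric TSP: (n-1)!/2."""
    return factorial(n - 1) // 2

# Factorial growth: 10 cities already admit 181440 distinct tours.
for n in (5, 10, 15):
    print(n, num_tours(n))
```

For 15 cities the count is already over 4 x 10^10, which is why exhaustive search stops being practical so early.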

5 Introduction: NN approach to constrained optimization. Each unit represents a hypothesis, with the unit on if the hypothesis is true, off if the hypothesis is false. The weights are fixed to represent both the constraints of the problem and the function to be optimized. The solution of the problem corresponds to the minimum of an energy function or the maximum of a consensus function for the net. NNs have several potential advantages over traditional techniques for certain types of optimization problems: they can find near-optimal solutions quickly for large problems.

6 Introduction: NN approach to constrained optimization. They can also handle situations in which some constraints are weak (desirable but not absolutely required). For example, in the TSP, it is physically impossible to visit two cities simultaneously, but it may be desirable to visit each city only once. The difference in these types of constraints could be reflected by making the penalty for having two units in the same column on simultaneously larger than the penalty for having two units in the same row on simultaneously. If it is more important to visit some cities than others, these cities can be given larger self-connection weights.

7 Introduction: NN architecture for the TSP. For n cities, we use n² units, arranged in a square array. A valid tour is represented by exactly one unit being on in each row and in each column. Two units being on in a row indicates that the corresponding city was visited twice; two units being on in a column shows that the salesman was in two cities at the same time. The units in each row are fully interconnected; similarly, the units in each column are fully interconnected. The weights are set so that units within the same row (or the same column) will tend not to be on at the same time. In addition, there are connections between units in adjacent columns and between units in the first and last columns, corresponding to the distances between cities (see later).

8 [Figure: NN architecture for the TSP — a 10 x 10 grid of units, rows labelled by city (A to J) and columns by tour position (1 to 10); each unit has a self-connection of weight b.]

10 Boltzmann machine. The states of the units of a Boltzmann machine NN are binary valued, with probabilistic state transitions. The configuration of the net is the vector of the states of the units. The Boltzmann machine described in this lecture has fixed weights w_{ij}, which express the degree of desirability that units X_i and X_j both be on. In applying the Boltzmann machine to constrained optimization problems, the weights represent the constraints of the problem and the quantity to be optimized. Note that the description presented here is based on the maximization of a consensus function (rather than the minimization of a cost function). The architecture of a Boltzmann machine is quite general, consisting of a set of units (X_i and X_j are representative units) and a set of bidirectional connections between pairs of units. If units X_i and X_j are connected, w_{ij} ≠ 0. The bidirectional nature of the connection is often represented as w_{ij} = w_{ji}.

11 Boltzmann machine. A unit may also have a self-connection w_{ii} (or equivalently, there may be a bias unit, which is always on and connected to every other unit; in this interpretation, the self-connection weight would be replaced by the bias weight). The state x_i of unit X_i is either 1 (on) or 0 (off). The objective of the NN is to maximize the consensus function: C = Σ_i Σ_{j ≤ i} w_{ij} x_i x_j. The sum runs over all units of the net. The net finds this maximum (or at least a local maximum) by letting each unit attempt to change its state (from on to off or vice versa). The attempts may be made either sequentially (one unit at a time) or in parallel (several units simultaneously). Only the sequential Boltzmann machine will be discussed here.
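In code, the consensus of a small net can be computed directly from the formula above (the two-unit weights here are made up for illustration):

```python
def consensus(w, x):
    """Consensus C = sum over i and j <= i of w[i][j] * x[i] * x[j];
    the diagonal terms w[i][i] * x[i] act as self-connection bonuses."""
    return sum(w[i][j] * x[i] * x[j]
               for i in range(len(x)) for j in range(i + 1))

# Toy net: bonus 1 on each self-connection, penalty -2 between the two units.
w = [[1.0, 0.0],
     [-2.0, 1.0]]            # only the lower triangle (j <= i) is summed
print(consensus(w, [1, 0]))  # 1.0 (one unit on: bonus only)
print(consensus(w, [1, 1]))  # 0.0 (both on: 1 + 1 - 2)
```

Because the states are binary, x_i² = x_i, so the diagonal terms simply add w_{ii} for every unit that is on.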

12 Boltzmann machine. In C = Σ_i Σ_{j ≤ i} w_{ij} x_i x_j: x_i = 1 if unit X_i is on, and x_i = 0 if unit X_i is off.

13 Boltzmann machine. The change in consensus if unit X_i were to change its state (from 1 to 0 or from 0 to 1) is ΔC(i) = [1 − 2x_i] ( w_{ii} + Σ_{j ≠ i} w_{ij} x_j ), where x_i is the current state of unit X_i and the sum is the contribution from all nodes which are on and connected to X_i. The coefficient [1 − 2x_i] will be +1 if unit X_i is currently off (x_i = 0) and −1 if unit X_i is currently on (x_i = 1). NOTE: if unit X_i were to change its activation, the resulting change in consensus can be computed from information that is local to unit X_i, i.e. from the weights on its connections and the activations of the units to which unit X_i is connected (with w_{ij} = 0 if unit X_i is not connected to unit X_j).

14 Boltzmann machine. However, unit X_i does not necessarily change its state, even if doing so would increase the consensus of the net. The probability of the net accepting a change in state for unit X_i is A(i, T) = 1 / (1 + exp(−ΔC(i)/T)). The control parameter T (called temperature) is gradually reduced as the net searches for a maximal consensus. Lower values of T make it more likely that the net will accept a change of state that increases its consensus and less likely that it will accept a change that reduces its consensus. As T → 0, exp(−ΔC(i)/T) → 0, so A(i, T) → 1 (assuming ΔC(i) > 0).
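The acceptance probability and its temperature limits can be checked directly (function name illustrative):

```python
import math

def acceptance(delta_C, T):
    """Probability of accepting a proposed state change:
    A(i, T) = 1 / (1 + exp(-delta_C / T))."""
    return 1.0 / (1.0 + math.exp(-delta_C / T))

# High T: close to 0.5 whatever the sign of delta_C.
print(acceptance(+1.0, 100.0))  # just above 0.5
print(acceptance(-1.0, 100.0))  # just below 0.5
# Low T: beneficial changes (delta_C > 0) are almost surely accepted,
# detrimental ones almost surely rejected.
print(acceptance(+1.0, 0.01))   # ~1.0
print(acceptance(-1.0, 0.01))   # ~0.0
```

This is the same logistic form used in simulated annealing generally; only the sign convention differs from the cost-minimization version, since here an increase in consensus is the desirable direction.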

15 Boltzmann machine. The use of a probabilistic update procedure for the activations, with the control parameter decreasing as the net searches for the optimal solution to the problem represented by its weights, reduces the chances of the net getting stuck in a local maximum. This process of gradually reducing T is called simulated annealing. It is analogous to the physical annealing process used to produce a strong metal (with a regular crystalline structure): during annealing, a molten metal is cooled gradually in order to avoid imperfections in the crystalline structure of the metal due to freezing.

17 Boltzmann machine: Architecture. Here is the architecture of a Boltzmann machine for n² units in a 2D array. The units within each row are fully interconnected; similarly, the units within each column are also fully interconnected. The weight on each of these connections is −p (where p > 0). Each unit has a self-connection, with weight b > 0. A typical unit is labelled U_{i,j}. [Figure: n x n array of units U_{1,1} ... U_{n,n}, each with self-connection weight b.]

18 Boltzmann machine: Setting the weights (Algorithm). The weights for a Boltzmann machine are fixed so that the net will tend to make state transitions toward a maximum of the consensus function defined above. If we wish the net (shown in the previous slide) to have exactly one unit on in each row and in each column, we must choose the values of the weights p and b so that improving the configuration corresponds to increasing the consensus. Each unit is connected to every other unit in the same row with weight −p (p > 0); similarly, each unit is connected to every other unit in the same column with weight −p. These weights are penalties for violating the condition that at most one unit be on in each row and each column. In addition, each unit has a self-connection, of weight b > 0.

19 Boltzmann machine: Algorithm. The self-connection weight is an incentive (bonus) to encourage a unit to turn on if it can do so without causing more than one unit to be on in a row or column. If p > b, the net will function as desired: if unit U_{i,j} is off (u_{i,j} = 0) and none of the units connected to it is on, changing the status of U_{i,j} to on will increase the consensus of the net by the amount b (which is a desirable change). On the other hand, if one of the units in row i or in column j (say, U_{i,j+1}) is already on, attempting to turn unit U_{i,j} on would result in a change of consensus by the amount b − p. Thus, for b − p < 0 (i.e. p > b), the effect would be to decrease the consensus (the net will tend to reject this unfavorable change). Bonus and penalty connections, with p > b, will be used in the net for the TSP to represent the constraints for a valid tour.

20 Boltzmann machine: Algorithm. Application procedure. The weight between units U_{i,j} and U_{I,J} is denoted w(i, j; I, J): w(i, j; I, J) = −p if i = I or j = J (but not both); w(i, j; i, j) = b. The application procedure is as follows:
Step 0. Initialize the weights to represent the constraints of the problem; initialize the control parameter (temperature) T; initialize the activations of the units (random binary values).
Step 1. While the stopping condition is false, do Steps 2-8.
Step 2. Do Steps 3-6 n² times (this constitutes an epoch).
Step 3. Choose integers I and J at random between 1 and n (unit U_{I,J} is the current candidate to change its state).
Step 4. Compute the change in consensus that would result: ΔC = [1 − 2u_{I,J}] ( w(I, J; I, J) + Σ_{(i,j) ≠ (I,J)} w(i, j; I, J) u_{i,j} ).

21 Boltzmann machine: Algorithm.
Step 5. Compute the probability of acceptance of the change: A(T) = 1 / (1 + exp(−ΔC/T)).
Step 6. Determine whether or not to accept the change. Let R be a random number between 0 and 1. If R < A, accept the change: u_{I,J} = 1 − u_{I,J} (this changes the state of unit U_{I,J}). If R > A, reject the proposed change.
Step 7. Reduce the control parameter: T(new) = 0.95 T(old).
Step 8. Test the stopping condition: if there has been no change of state for a specified number of epochs, or if the temperature has reached a specified value, stop; otherwise continue.
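Steps 0-8 can be sketched as a short loop. The example below (helper names, the 3 x 3 size, and the weight values b = 1, p = 2 are all illustrative) builds the row/column constraint net from the architecture slide and anneals it; an overflow guard on exp is added for very low temperatures:

```python
import math, random

def accept_prob(delta_C, T):
    """Step 5: A = 1 / (1 + exp(-delta_C / T)), guarded against overflow."""
    z = -delta_C / T
    if z > 700:            # exp would overflow; change is effectively rejected
        return 0.0
    return 1.0 / (1.0 + math.exp(z))

def anneal(w, u, T0=10.0, alpha=0.95, epochs=200, seed=0):
    """Sequential Boltzmann machine: each epoch proposes len(u) random
    single-unit flips (Steps 2-6); T shrinks by alpha per epoch (Step 7)."""
    rng = random.Random(seed)
    m, T = len(u), T0
    for _ in range(epochs):
        for _ in range(m):
            k = rng.randrange(m)                  # Step 3
            # Step 4: local change in consensus if unit k flips.
            delta = (1 - 2 * u[k]) * (w[k][k] + sum(
                w[k][j] * u[j] for j in range(m) if j != k))
            if rng.random() < accept_prob(delta, T):  # Steps 5-6
                u[k] = 1 - u[k]
        T *= alpha
    return u

# Weights for the "one unit on per row and per column" net, n = 3:
# bonus b on self-connections, penalty -p within each row and column (p > b).
n, b, p = 3, 1.0, 2.0
idx = lambda i, j: i * n + j
w = [[0.0] * (n * n) for _ in range(n * n)]
for i in range(n):
    for j in range(n):
        w[idx(i, j)][idx(i, j)] = b
        for k in range(n):
            if k != j:
                w[idx(i, j)][idx(i, k)] = -p
            if k != i:
                w[idx(i, j)][idx(k, j)] = -p
u = anneal(w, [0] * (n * n))
rows = [sum(u[idx(i, j)] for j in range(n)) for i in range(n)]
cols = [sum(u[idx(i, j)] for i in range(n)) for j in range(n)]
print(rows, cols)  # each should settle to [1, 1, 1]
```

At the final low temperatures the loop becomes effectively greedy, and the only greedy fixed points of this weight pattern are configurations with exactly one unit on per row and per column.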

22 Boltzmann machine: Algorithm. Initial temperature: the initial temperature should be taken large enough so that the probability of accepting a change of state is approximately 0.5, regardless of whether the change is beneficial or detrimental: for T >> |ΔC|, exp(−ΔC/T) ≈ 1, so A(i, T) ≈ 0.5. However, since a high starting temperature increases the required computation time significantly, a lower initial temperature may be more practical in some applications.

23 Boltzmann machine: Algorithm. Cooling schedule: theoretical results show that the temperature should be cooled slowly according to the logarithmic formula T_B(k) = T_0 / log(1 + k), where k is an epoch. An exponential cooling schedule can be used instead: T(new) = α T(old), where the temperature is reduced after each epoch. A larger α (such as α = 0.98) allows for fewer epochs at each temperature; a smaller α (such as α = 0.9) may require more epochs at each temperature.
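The two schedules compared numerically (helper names illustrative): after 100 epochs the logarithmic schedule is still warm while the exponential one is nearly frozen, which is the trade-off between theoretical convergence guarantees and speed.

```python
import math

def t_log(T0, k):
    """Logarithmic schedule: T_B(k) = T0 / log(1 + k), k = epoch (k >= 1)."""
    return T0 / math.log(1 + k)

def t_exp(T0, alpha, k):
    """Exponential schedule applied once per epoch: T(k) = alpha^k * T0."""
    return (alpha ** k) * T0

print(round(t_log(10.0, 100), 3))        # ~2.167
print(round(t_exp(10.0, 0.95, 100), 3))  # ~0.059
```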

25 Boltzmann machine: Application — TSP. Nomenclature:
n: number of cities in the tour (there are n² units in the net);
i: index designating a city, 1 ≤ i ≤ n;
j: index designating position in the tour, taken mod n; i.e. j = n + 1 denotes position 1, and j = 0 denotes position n;
U_{i,j}: unit representing the hypothesis that the ith city is visited at the jth step of the tour;
u_{i,j}: activation of unit U_{i,j}; u_{i,j} = 1 if the hypothesis is true, u_{i,j} = 0 if it is false;
d_{i,k}: distance between city i and city k, k ≠ i;
d: maximum distance between cities.

26 Boltzmann machine: Application — TSP. Architecture: for this application it is convenient to arrange the units of the NN in a grid (figure below). The rows of the grid represent the cities to be visited, and the columns the position of a city in the tour. [Figure: 10 x 10 grid of units U_{i,j}, rows = cities A to J, columns = tour positions 1 to 10.]

27 Boltzmann machine: Application — TSP. U_{i,j} has a self-connection of weight b; this represents the desirability of visiting city i at stage j. U_{i,j} is connected to all other units in row i with penalty weight −p; this represents the constraint that the same city is not to be visited twice. U_{i,j} is connected to all other units in column j with penalty weight −p; this represents the constraint that two cities cannot be visited simultaneously. U_{i,j} is connected to U_{k,j+1} for 1 ≤ k ≤ n, k ≠ i, with weight −d_{i,k}; this represents the distance traveled in making the transition from city i at stage j to city k at stage j + 1. U_{i,j} is connected to U_{k,j−1} for 1 ≤ k ≤ n, k ≠ i, with weight −d_{k,i}; this represents the distance traveled in making the transition from city k at stage j − 1 to city i at stage j.

28 Boltzmann machine: Application — TSP. Setting the weights: the desired net will be constructed in two steps. First, a NN will be formed for which the maximum consensus occurs whenever the constraints of the problem are satisfied, i.e. when exactly one unit is on in each row and in each column. Second, we will add weighted connections to represent the distances between the cities. In order to treat the problem as a maximum consensus problem, the weights representing distances will be negative. A Boltzmann machine with weights representing the constraints (but not the distances) for the TSP is shown below. If p > b, the net will function as desired (as explained earlier). To complete the formulation of a Boltzmann NN for the TSP, weighted connections representing distances must be included. For this purpose, a typical unit U_{i,j} is connected to the units U_{k,j−1} and U_{k,j+1} (for all k ≠ i) by weights that represent the distances between city i and city k.

29 Boltzmann machine: Application — TSP. [Figure: Boltzmann machine for the TSP with constraint weights only — a self-connection of weight b on each unit U_{1,1} ... U_{n,n}, and penalty weights −p within each row and column.]

30 Boltzmann machine: Application — TSP. The distance weights are shown in the figure below for a typical unit U_{i,j}. [Figure: unit U_{i,j} connected to the units U_{k,j−1} and U_{k,j+1} in the adjacent columns (for each k ≠ i, including k = n) with weights −d_{i,k}. Caption: Boltzmann NN for the TSP; weights represent the distances for unit U_{i,j}.]

31 Boltzmann machine: Application — TSP. NOTE: units in the last column are connected to units in the first column by connections representing the appropriate distances. However, units in a particular column are not connected to units in columns other than those immediately adjacent to it. We now consider the relation between the constraint weight b and the distance weights. Let d denote the maximum distance between any two cities in the tour. Assume that no city is visited in the jth position of the tour and that no city is visited twice. In this case, some city, say i, is not visited at all; i.e. no unit is on in column j or in row i. Since allowing U_{i,j} to turn on should be encouraged, the weights should be set so that the consensus will be increased if it turns on. The change in consensus will be b − d_{k1,i} − d_{i,k2}, where k1 indicates the city visited at stage j − 1 of the tour and k2 denotes the city visited at stage j + 1 (and city i is visited at stage j). This change is ≥ b − 2d (the change should be positive even for the maximum distance between cities, d).

32 Boltzmann machine: Application — TSP. However, equality will occur only if the cities visited in positions j − 1 and j + 1 are both the maximum distance, d, away from city i. In general, requiring the change in consensus to be positive will suffice, so we take b > 2d. Thus, we see that if p > b, the consensus function has a higher value for a feasible solution (one that satisfies the constraints) than for a non-feasible solution, and if b > 2d the consensus will be higher for a short feasible solution than for a longer tour. In summary: p > b > 2d.
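The completed weight assignment (bonus b on self-connections, penalty −p within rows and columns, distance weights between adjacent tour positions, with p > b > 2d) can be written as a single function. The 4-city distance matrix below is made up for illustration, and the helper name is hypothetical:

```python
def tsp_weight(i, j, I, J, n, d, b, p):
    """Weight between units U(i,j) and U(I,J): b on the self-connection,
    -p within a row or column, -d[i][I] between adjacent tour positions
    (columns wrap around: position n is adjacent to position 1)."""
    if (i, j) == (I, J):
        return b
    if i == I or j == J:           # same city (row) or same position (column)
        return -p
    if (J - j) % n in (1, n - 1):  # adjacent positions, mod n
        return -d[i][I]
    return 0.0

# Toy 4-city instance; weights chosen to satisfy p > b > 2*d_max.
d = [[0, 1, 2, 3],
     [1, 0, 1, 2],
     [2, 1, 0, 1],
     [3, 2, 1, 0]]
d_max = 3
b, p = 2 * d_max + 1, 2 * d_max + 2   # b = 7, p = 8
assert p > b > 2 * d_max
print(tsp_weight(0, 0, 0, 0, 4, d, b, p))  # 7  (bonus)
print(tsp_weight(0, 0, 0, 1, 4, d, b, p))  # -8 (same city, two positions)
print(tsp_weight(0, 0, 2, 1, 4, d, b, p))  # -2 (distance, city 0 to city 2)
print(tsp_weight(0, 0, 2, 2, 4, d, b, p))  # 0.0 (non-adjacent positions)
```

Note the wraparound: the last tour position is adjacent to the first, matching the note on slide 31 about connections between the last and first columns.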

33 Boltzmann machine: Analysis. The TSP is a nice model for a variety of constrained optimization problems. It is, however, a difficult problem for the Boltzmann machine, because in order to go from one valid tour to another, several invalid tours must be accepted. By contrast, the transition from valid solution to valid solution may not be as difficult in other constrained optimization problems. Equilibrium: the net is in thermal equilibrium (at a particular temperature) when the probabilities P_α and P_β of two configurations of the net, α and β, obey the Boltzmann distribution: P_α / P_β = exp(−(E_α − E_β)/T), where E_α is the energy of configuration α and E_β is the energy of configuration β.

34 Boltzmann machine: Analysis. At higher temperatures, the probabilities of different configurations are more nearly equal: as T → ∞, exp(−(E_α − E_β)/T) → 1, so P_α ≈ P_β. At lower temperatures, there is a stronger bias toward configurations with lower energy. Starting at a sufficiently high temperature ensures that the net will have approximately equal probabilities of accepting or rejecting any proposed state transition. If the temperature is reduced slowly, the net will remain in equilibrium at lower temperatures. It is not practical to verify directly the equilibrium condition at each temperature, as there are too many possible configurations.

35 Boltzmann machine: Analysis. Energy function: the energy of a configuration can be defined as E = −Σ_{i<j} w_{ij} x_i x_j + Σ_i θ_i x_i, where θ_i is a threshold and self-connections (or biases) are not used. The difference in energy between a configuration with unit X_k off and one with X_k on (the states of all other units remaining unchanged) is ΔE(k) = θ_k − Σ_{i≠k} w_{ik} x_i. If the units change their activations randomly and asynchronously and the net always moves to a lower energy (rather than moving to a lower energy with a probability that is less than 1), the discrete Hopfield net results.

36 Boltzmann machine: Analysis. To simplify notation, one may include a unit in the net that is connected to every other unit and is always on. This allows the threshold to be treated as any other weight, so that E = −Σ_{i<j} w_{ij} x_i x_j. The energy gap between the configuration with unit X_k off and that with unit X_k on is then ΔE(k) = −Σ_{i≠k} w_{ik} x_i.

38 Continuous Hopfield net. A modification of the discrete Hopfield net, with continuous-valued output functions, can be used either for associative memory problems (as with the discrete form) or for constrained optimization problems such as the TSP. As with the discrete Hopfield net, the connections between units are bidirectional, so that the weight matrix is symmetric; i.e. the connection from unit X_i to unit X_j (with weight w_{ij}) is the same as the connection from X_j to X_i (with weight w_{ji}). For the continuous Hopfield net, we denote the internal activity of a neuron as u_i; its output signal is v_i = g(u_i). We define an energy function E = −0.5 Σ_i Σ_{j≠i} w_{ij} v_i v_j + Σ_i θ_i v_i.

39 Continuous Hopfield net. From E = −0.5 Σ_i Σ_{j≠i} w_{ij} v_i v_j + Σ_i θ_i v_i, the partial derivative with respect to v_i collects the two appearances of v_i in the double sum (the terms w_{ij} v_i v_j and w_{ji} v_j v_i); since w_{ij} = w_{ji}, the factor 0.5 cancels, giving ∂E/∂v_i = −Σ_{j≠i} w_{ij} v_j + θ_i.

40 Continuous Hopfield net. The net will converge to a stable configuration that is a minimum of the energy function as long as dE/dt ≤ 0. By the chain rule, dE/dt = Σ_i (∂E/∂v_i)(dv_i/du_i)(du_i/dt). If each neuron's activity changes according to du_i/dt = −∂E/∂v_i = net_i, and since dv_i/du_i = g'(u_i) > 0, then dE/dt = −Σ_i g'(u_i)(net_i)² ≤ 0. Hence E is a Lyapunov energy function.

41 Continuous Hopfield net. For this form of the energy function, the net will converge if the activity of each neuron changes with time according to the differential equation du_i/dt = −∂E/∂v_i = Σ_{j≠i} w_{ij} v_j − θ_i. In the original presentation of the continuous Hopfield net, the energy function was E = −0.5 Σ_i Σ_{j≠i} w_{ij} v_i v_j − Σ_i θ_i v_i + (1/τ) Σ_i ∫_0^{v_i} g⁻¹(v) dv, where τ is a time constant.
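The convergence argument can be checked numerically: Euler-integrating du_i/dt = −∂E/∂v_i on a small symmetric net (the two-unit weights and thresholds below are made up for illustration) should produce a non-increasing energy trace, as the Lyapunov property predicts:

```python
import math

def energy(w, theta, v):
    """E = -0.5 * sum over i and j != i of w[i][j] v_i v_j + sum_i theta_i v_i."""
    n = len(v)
    E = sum(theta[i] * v[i] for i in range(n))
    E -= 0.5 * sum(w[i][j] * v[i] * v[j]
                   for i in range(n) for j in range(n) if j != i)
    return E

def simulate(w, theta, u, dt=0.01, steps=500):
    """Euler-integrate du_i/dt = net_i = sum_{j != i} w[i][j] v_j - theta_i,
    with output v_i = g(u_i), the logistic sigmoid. Returns the energy trace."""
    n = len(u)
    trace = []
    for _ in range(steps):
        v = [1.0 / (1.0 + math.exp(-ui)) for ui in u]
        trace.append(energy(w, theta, v))
        net = [sum(w[i][j] * v[j] for j in range(n) if j != i) - theta[i]
               for i in range(n)]
        u = [u[i] + dt * net[i] for i in range(n)]
    return trace

# Symmetric toy net: the energy should be (numerically) non-increasing.
w = [[0.0, 1.0],
     [1.0, 0.0]]
trace = simulate(w, theta=[0.2, -0.1], u=[0.5, -0.5])
print(all(b <= a + 1e-9 for a, b in zip(trace, trace[1:])))  # True
```

The tolerance 1e-9 absorbs the small discretization error of the Euler step; in the continuous-time limit the decrease is exact wherever any net_i is nonzero.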

42 Continuous Hopfield net. If the activity of each neuron changes with time according to the differential equation du_{x,j}/dt = −u_{x,j}/τ − ∂E/∂v_{x,j}, the net will converge. In the Hopfield-Tank solution of the TSP, each unit has two indices: the first index (x, y, etc.) denotes the city, and the second (j, k, etc.) denotes the position in the tour. The Hopfield-Tank energy function for the TSP is:
E = (A/2) Σ_x Σ_j Σ_{k≠j} v_{x,j} v_{x,k} + (B/2) Σ_j Σ_x Σ_{y≠x} v_{x,j} v_{y,j} + (C/2) (Σ_x Σ_j v_{x,j} − N)² + (D/2) Σ_x Σ_{y≠x} Σ_j d_{x,y} v_{x,j} (v_{y,j+1} + v_{y,j−1}).

43 Continuous Hopfield net. The differential equation for the activity of unit U_{x,j} is:
du_{x,j}/dt = −u_{x,j}/τ − A Σ_{k≠j} v_{x,k} − B Σ_{y≠x} v_{y,j} − C (Σ_y Σ_k v_{y,k} − N) − D Σ_{y≠x} d_{x,y} (v_{y,j+1} + v_{y,j−1}).
The output signal is given by applying the sigmoid function (with range between 0 and 1), which Hopfield and Tank expressed as v_{x,j} = g(u_{x,j}) = 0.5 [1 + tanh(α u_{x,j})].

45 Continuous Hopfield net: Approach. Formulate the problem in terms of a Hopfield energy of the form E = −0.5 Σ_i Σ_{j≠i} w_{ij} v_i v_j + Σ_i θ_i v_i. [Diagram: Problem formulation by Hopfield energy → Energy minimization by Hopfield net → Solution state.]

46 Continuous Hopfield net: Architecture for TSP. The units used to solve the 10-city TSP are arranged as shown. [Figure: 10 x 10 grid of units, rows = cities A to J, columns = tour positions 1 to 10.]

47 Continuous Hopfield net: Architecture for TSP. The connection weights are fixed and are usually not shown or even explicitly stated. The weights for the connections within a row correspond to the parameter A in the energy equation; there is a contribution to the energy if two units in the same row are both on. Similarly, the connections within a column have weights corresponding to B. The distance connections appear in the fourth term of the energy equation. More explicitly, the weights between units U_{x,j} and U_{y,k} are
w(x, j; y, k) = −A δ_{x,y} (1 − δ_{j,k}) − B δ_{j,k} (1 − δ_{x,y}) − C − D d_{x,y} (δ_{k,j+1} + δ_{k,j−1}),
where δ_{a,b} is the Kronecker delta (δ_{a,b} = 1 if a = b, and 0 otherwise). In addition, each unit receives an external input signal I = +C N. The parameter N is usually taken to be somewhat larger than the number of cities, n.

48 Continuous Hopfield net: Algorithm for TSP.
Step 0. Initialize the activations of all units; initialize Δt to a small value.
Step 1. While the stopping condition is false, do Steps 2-6.
Step 2. Perform Steps 3-5 n² times (n is the number of cities).
Step 3. Choose a unit at random.
Step 4. Change the activity of the selected unit:
u_{x,j}(new) = u_{x,j}(old) + Δt [ −u_{x,j}(old) − A Σ_{k≠j} v_{x,k} − B Σ_{y≠x} v_{y,j} + C (N − Σ_y Σ_k v_{y,k}) − D Σ_{y≠x} d_{x,y} (v_{y,j+1} + v_{y,j−1}) ].
Step 5. Apply the output function: v_{x,j} = 0.5 [1 + tanh(α u_{x,j})].
Step 6. Check the stopping condition.

49 Continuous Hopfield net: Algorithm for TSP. Hopfield and Tank used the following parameter values in their solution of the problem: A = B = 500, C = 200, D = 500, N = 15, α = 50. The large value of α gives a very steep sigmoid function, which approximates a step function. The large coefficients and a correspondingly small Δt result in very little contribution from the decay term (−u_{x,j}(old) Δt). The initial activity levels u_{x,j} were chosen so that Σ_x Σ_j v_{x,j} ≈ 10 (the desired total activation for a valid tour). However, some noise was included so that not all units started with the same activity (or output signal).
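Steps 4-5 can be sketched for a toy instance. Everything below the function is illustrative: the 4-city distances are made up, Δt and the value of N are chosen for the smaller problem, and only A, B, C, D, α follow the slide; a full 10-city run is famously sensitive to these choices, so this is a sketch of the update rule, not a tuned solver:

```python
import math, random

def update_unit(u, v, x, j, d, params, dt):
    """Steps 4-5 for unit (x, j): Euler step on the Hopfield-Tank activity
    (decay, row/column penalties, global C term, distance term), then the
    steep sigmoid output. u and v are n_cities x n_positions nested lists."""
    A, B, C, D, N, alpha = params
    n_c, n_p = len(u), len(u[0])
    total = sum(sum(row) for row in v)
    grad = (-u[x][j]
            - A * sum(v[x][k] for k in range(n_p) if k != j)
            - B * sum(v[y][j] for y in range(n_c) if y != x)
            + C * (N - total)
            - D * sum(d[x][y] * (v[y][(j + 1) % n_p] + v[y][(j - 1) % n_p])
                      for y in range(n_c) if y != x))
    u[x][j] += dt * grad
    v[x][j] = 0.5 * (1.0 + math.tanh(alpha * u[x][j]))

# Toy 4-city demo: small noisy initial activities, random unit selection.
rng = random.Random(1)
n = 4
d = [[0 if x == y else 1 + abs(x - y) for y in range(n)] for x in range(n)]
u = [[0.02 * rng.random() - 0.01 for _ in range(n)] for _ in range(n)]
v = [[0.5 * (1 + math.tanh(50 * u[x][j])) for j in range(n)] for x in range(n)]
params = (500, 500, 200, 500, n + 2, 50)   # A, B, C, D, N, alpha
for _ in range(200 * n * n):
    update_unit(u, v, rng.randrange(n), rng.randrange(n), d, params, dt=1e-5)
print(all(0.0 <= v[x][j] <= 1.0 for x in range(n) for j in range(n)))  # True
```

The steep sigmoid keeps every output in (0, 1) no matter how large the activities grow, which is what lets the very large coefficients A, B, C, D coexist with a small Δt.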

51 Suggested Reading. 1. Fausett, L., Fundamentals of Neural Networks, Prentice-Hall, 1994, Chapter 7.


More information

2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification

2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification E395 - Pattern Recognton Solutons to Introducton to Pattern Recognton, Chapter : Bayesan pattern classfcaton Preface Ths document s a soluton manual for selected exercses from Introducton to Pattern Recognton

More information

1 Convex Optimization

1 Convex Optimization Convex Optmzaton We wll consder convex optmzaton problems. Namely, mnmzaton problems where the objectve s convex (we assume no constrants for now). Such problems often arse n machne learnng. For example,

More information

8 Derivation of Network Rate Equations from Single- Cell Conductance Equations

8 Derivation of Network Rate Equations from Single- Cell Conductance Equations Physcs 178/278 - Davd Klenfeld - Wnter 2019 8 Dervaton of Network Rate Equatons from Sngle- Cell Conductance Equatons Our goal to derve the form of the abstract quanttes n rate equatons, such as synaptc

More information

Wavelet chaotic neural networks and their application to continuous function optimization

Wavelet chaotic neural networks and their application to continuous function optimization Vol., No.3, 04-09 (009) do:0.436/ns.009.307 Natural Scence Wavelet chaotc neural networks and ther applcaton to contnuous functon optmzaton Ja-Ha Zhang, Yao-Qun Xu College of Electrcal and Automatc Engneerng,

More information

The Geometry of Logit and Probit

The Geometry of Logit and Probit The Geometry of Logt and Probt Ths short note s meant as a supplement to Chapters and 3 of Spatal Models of Parlamentary Votng and the notaton and reference to fgures n the text below s to those two chapters.

More information

COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS

COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS Avalable onlne at http://sck.org J. Math. Comput. Sc. 3 (3), No., 6-3 ISSN: 97-537 COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS

More information

Feature Selection: Part 1

Feature Selection: Part 1 CSE 546: Machne Learnng Lecture 5 Feature Selecton: Part 1 Instructor: Sham Kakade 1 Regresson n the hgh dmensonal settng How do we learn when the number of features d s greater than the sample sze n?

More information

COS 521: Advanced Algorithms Game Theory and Linear Programming

COS 521: Advanced Algorithms Game Theory and Linear Programming COS 521: Advanced Algorthms Game Theory and Lnear Programmng Moses Charkar February 27, 2013 In these notes, we ntroduce some basc concepts n game theory and lnear programmng (LP). We show a connecton

More information

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems Numercal Analyss by Dr. Anta Pal Assstant Professor Department of Mathematcs Natonal Insttute of Technology Durgapur Durgapur-713209 emal: anta.bue@gmal.com 1 . Chapter 5 Soluton of System of Lnear Equatons

More information

princeton univ. F 17 cos 521: Advanced Algorithm Design Lecture 7: LP Duality Lecturer: Matt Weinberg

princeton univ. F 17 cos 521: Advanced Algorithm Design Lecture 7: LP Duality Lecturer: Matt Weinberg prnceton unv. F 17 cos 521: Advanced Algorthm Desgn Lecture 7: LP Dualty Lecturer: Matt Wenberg Scrbe: LP Dualty s an extremely useful tool for analyzng structural propertes of lnear programs. Whle there

More information

Resource Allocation with a Budget Constraint for Computing Independent Tasks in the Cloud

Resource Allocation with a Budget Constraint for Computing Independent Tasks in the Cloud Resource Allocaton wth a Budget Constrant for Computng Independent Tasks n the Cloud Wemng Sh and Bo Hong School of Electrcal and Computer Engneerng Georga Insttute of Technology, USA 2nd IEEE Internatonal

More information

INF 5860 Machine learning for image classification. Lecture 3 : Image classification and regression part II Anne Solberg January 31, 2018

INF 5860 Machine learning for image classification. Lecture 3 : Image classification and regression part II Anne Solberg January 31, 2018 INF 5860 Machne learnng for mage classfcaton Lecture 3 : Image classfcaton and regresson part II Anne Solberg January 3, 08 Today s topcs Multclass logstc regresson and softma Regularzaton Image classfcaton

More information

3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X

3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X Statstcs 1: Probablty Theory II 37 3 EPECTATION OF SEVERAL RANDOM VARIABLES As n Probablty Theory I, the nterest n most stuatons les not on the actual dstrbuton of a random vector, but rather on a number

More information

Difference Equations

Difference Equations Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1

More information

6. Stochastic processes (2)

6. Stochastic processes (2) Contents Markov processes Brth-death processes Lect6.ppt S-38.45 - Introducton to Teletraffc Theory Sprng 5 Markov process Consder a contnuous-tme and dscrete-state stochastc process X(t) wth state space

More information

b ), which stands for uniform distribution on the interval a x< b. = 0 elsewhere

b ), which stands for uniform distribution on the interval a x< b. = 0 elsewhere Fall Analyss of Epermental Measurements B. Esensten/rev. S. Errede Some mportant probablty dstrbutons: Unform Bnomal Posson Gaussan/ormal The Unform dstrbuton s often called U( a, b ), hch stands for unform

More information

6. Stochastic processes (2)

6. Stochastic processes (2) 6. Stochastc processes () Lect6.ppt S-38.45 - Introducton to Teletraffc Theory Sprng 5 6. Stochastc processes () Contents Markov processes Brth-death processes 6. Stochastc processes () Markov process

More information

Portfolios with Trading Constraints and Payout Restrictions

Portfolios with Trading Constraints and Payout Restrictions Portfolos wth Tradng Constrants and Payout Restrctons John R. Brge Northwestern Unversty (ont wor wth Chrs Donohue Xaodong Xu and Gongyun Zhao) 1 General Problem (Very) long-term nvestor (eample: unversty

More information

The Feynman path integral

The Feynman path integral The Feynman path ntegral Aprl 3, 205 Hesenberg and Schrödnger pctures The Schrödnger wave functon places the tme dependence of a physcal system n the state, ψ, t, where the state s a vector n Hlbert space

More information

VQ widely used in coding speech, image, and video

VQ widely used in coding speech, image, and video at Scalar quantzers are specal cases of vector quantzers (VQ): they are constraned to look at one sample at a tme (memoryless) VQ does not have such constrant better RD perfomance expected Source codng

More information

Lecture Space-Bounded Derandomization

Lecture Space-Bounded Derandomization Notes on Complexty Theory Last updated: October, 2008 Jonathan Katz Lecture Space-Bounded Derandomzaton 1 Space-Bounded Derandomzaton We now dscuss derandomzaton of space-bounded algorthms. Here non-trval

More information

9 Derivation of Rate Equations from Single-Cell Conductance (Hodgkin-Huxley-like) Equations

9 Derivation of Rate Equations from Single-Cell Conductance (Hodgkin-Huxley-like) Equations Physcs 171/271 - Chapter 9R -Davd Klenfeld - Fall 2005 9 Dervaton of Rate Equatons from Sngle-Cell Conductance (Hodgkn-Huxley-lke) Equatons We consder a network of many neurons, each of whch obeys a set

More information

Numerical Heat and Mass Transfer

Numerical Heat and Mass Transfer Master degree n Mechancal Engneerng Numercal Heat and Mass Transfer 06-Fnte-Dfference Method (One-dmensonal, steady state heat conducton) Fausto Arpno f.arpno@uncas.t Introducton Why we use models and

More information

The Minimum Universal Cost Flow in an Infeasible Flow Network

The Minimum Universal Cost Flow in an Infeasible Flow Network Journal of Scences, Islamc Republc of Iran 17(2): 175-180 (2006) Unversty of Tehran, ISSN 1016-1104 http://jscencesutacr The Mnmum Unversal Cost Flow n an Infeasble Flow Network H Saleh Fathabad * M Bagheran

More information

Kernels in Support Vector Machines. Based on lectures of Martin Law, University of Michigan

Kernels in Support Vector Machines. Based on lectures of Martin Law, University of Michigan Kernels n Support Vector Machnes Based on lectures of Martn Law, Unversty of Mchgan Non Lnear separable problems AND OR NOT() The XOR problem cannot be solved wth a perceptron. XOR Per Lug Martell - Systems

More information

EEL 6266 Power System Operation and Control. Chapter 3 Economic Dispatch Using Dynamic Programming

EEL 6266 Power System Operation and Control. Chapter 3 Economic Dispatch Using Dynamic Programming EEL 6266 Power System Operaton and Control Chapter 3 Economc Dspatch Usng Dynamc Programmng Pecewse Lnear Cost Functons Common practce many utltes prefer to represent ther generator cost functons as sngle-

More information

A Tutorial on Data Reduction. Linear Discriminant Analysis (LDA) Shireen Elhabian and Aly A. Farag. University of Louisville, CVIP Lab September 2009

A Tutorial on Data Reduction. Linear Discriminant Analysis (LDA) Shireen Elhabian and Aly A. Farag. University of Louisville, CVIP Lab September 2009 A utoral on Data Reducton Lnear Dscrmnant Analss (LDA) hreen Elhaban and Al A Farag Unverst of Lousvlle, CVIP Lab eptember 009 Outlne LDA objectve Recall PCA No LDA LDA o Classes Counter eample LDA C Classes

More information

Time-Varying Systems and Computations Lecture 6

Time-Varying Systems and Computations Lecture 6 Tme-Varyng Systems and Computatons Lecture 6 Klaus Depold 14. Januar 2014 The Kalman Flter The Kalman estmaton flter attempts to estmate the actual state of an unknown dscrete dynamcal system, gven nosy

More information

Introduction to the Introduction to Artificial Neural Network

Introduction to the Introduction to Artificial Neural Network Introducton to the Introducton to Artfcal Neural Netork Vuong Le th Hao Tang s sldes Part of the content of the sldes are from the Internet (possbly th modfcatons). The lecturer does not clam any onershp

More information

ONE DIMENSIONAL TRIANGULAR FIN EXPERIMENT. Technical Advisor: Dr. D.C. Look, Jr. Version: 11/03/00

ONE DIMENSIONAL TRIANGULAR FIN EXPERIMENT. Technical Advisor: Dr. D.C. Look, Jr. Version: 11/03/00 ONE IMENSIONAL TRIANGULAR FIN EXPERIMENT Techncal Advsor: r..c. Look, Jr. Verson: /3/ 7. GENERAL OJECTIVES a) To understand a one-dmensonal epermental appromaton. b) To understand the art of epermental

More information

CHAPTER III Neural Networks as Associative Memory

CHAPTER III Neural Networks as Associative Memory CHAPTER III Neural Networs as Assocatve Memory Introducton One of the prmary functons of the bran s assocatve memory. We assocate the faces wth names, letters wth sounds, or we can recognze the people

More information

CS : Algorithms and Uncertainty Lecture 17 Date: October 26, 2016

CS : Algorithms and Uncertainty Lecture 17 Date: October 26, 2016 CS 29-128: Algorthms and Uncertanty Lecture 17 Date: October 26, 2016 Instructor: Nkhl Bansal Scrbe: Mchael Denns 1 Introducton In ths lecture we wll be lookng nto the secretary problem, and an nterestng

More information

Thermodynamics Second Law Entropy

Thermodynamics Second Law Entropy Thermodynamcs Second Law Entropy Lana Sherdan De Anza College May 8, 2018 Last tme the Boltzmann dstrbuton (dstrbuton of energes) the Maxwell-Boltzmann dstrbuton (dstrbuton of speeds) the Second Law of

More information

Kernel Methods and SVMs Extension

Kernel Methods and SVMs Extension Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general

More information

Support Vector Machines CS434

Support Vector Machines CS434 Support Vector Machnes CS434 Lnear Separators Many lnear separators exst that perfectly classfy all tranng examples Whch of the lnear separators s the best? Intuton of Margn Consder ponts A, B, and C We

More information

Some modelling aspects for the Matlab implementation of MMA

Some modelling aspects for the Matlab implementation of MMA Some modellng aspects for the Matlab mplementaton of MMA Krster Svanberg krlle@math.kth.se Optmzaton and Systems Theory Department of Mathematcs KTH, SE 10044 Stockholm September 2004 1. Consdered optmzaton

More information

Advanced Introduction to Machine Learning

Advanced Introduction to Machine Learning Advanced Introducton to Machne Learnng 10715, Fall 2014 The Kernel Trck, Reproducng Kernel Hlbert Space, and the Representer Theorem Erc Xng Lecture 6, September 24, 2014 Readng: Erc Xng @ CMU, 2014 1

More information

Turing Machines (intro)

Turing Machines (intro) CHAPTER 3 The Church-Turng Thess Contents Turng Machnes defntons, examples, Turng-recognzable and Turng-decdable languages Varants of Turng Machne Multtape Turng machnes, non-determnstc Turng Machnes,

More information

Hidden Markov Models

Hidden Markov Models Hdden Markov Models Namrata Vaswan, Iowa State Unversty Aprl 24, 204 Hdden Markov Model Defntons and Examples Defntons:. A hdden Markov model (HMM) refers to a set of hdden states X 0, X,..., X t,...,

More information

Design and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm

Design and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm Desgn and Optmzaton of Fuzzy Controller for Inverse Pendulum System Usng Genetc Algorthm H. Mehraban A. Ashoor Unversty of Tehran Unversty of Tehran h.mehraban@ece.ut.ac.r a.ashoor@ece.ut.ac.r Abstract:

More information

Speech and Language Processing

Speech and Language Processing Speech and Language rocessng Lecture 3 ayesan network and ayesan nference Informaton and ommuncatons Engneerng ourse Takahro Shnozak 08//5 Lecture lan (Shnozak s part) I gves the frst 6 lectures about

More information

Discriminative classifier: Logistic Regression. CS534-Machine Learning

Discriminative classifier: Logistic Regression. CS534-Machine Learning Dscrmnatve classfer: Logstc Regresson CS534-Machne Learnng robablstc Classfer Gven an nstance, hat does a probablstc classfer do dfferentl compared to, sa, perceptron? It does not drectl predct Instead,

More information

Lab 2e Thermal System Response and Effective Heat Transfer Coefficient

Lab 2e Thermal System Response and Effective Heat Transfer Coefficient 58:080 Expermental Engneerng 1 OBJECTIVE Lab 2e Thermal System Response and Effectve Heat Transfer Coeffcent Warnng: though the experment has educatonal objectves (to learn about bolng heat transfer, etc.),

More information

Discriminative classifier: Logistic Regression. CS534-Machine Learning

Discriminative classifier: Logistic Regression. CS534-Machine Learning Dscrmnatve classfer: Logstc Regresson CS534-Machne Learnng 2 Logstc Regresson Gven tranng set D stc regresson learns the condtonal dstrbuton We ll assume onl to classes and a parametrc form for here s

More information

Problem Set 9 Solutions

Problem Set 9 Solutions Desgn and Analyss of Algorthms May 4, 2015 Massachusetts Insttute of Technology 6.046J/18.410J Profs. Erk Demane, Srn Devadas, and Nancy Lynch Problem Set 9 Solutons Problem Set 9 Solutons Ths problem

More information

Mean Field / Variational Approximations

Mean Field / Variational Approximations Mean Feld / Varatonal Appromatons resented by Jose Nuñez 0/24/05 Outlne Introducton Mean Feld Appromaton Structured Mean Feld Weghted Mean Feld Varatonal Methods Introducton roblem: We have dstrbuton but

More information

Supporting Information

Supporting Information Supportng Informaton The neural network f n Eq. 1 s gven by: f x l = ReLU W atom x l + b atom, 2 where ReLU s the element-wse rectfed lnear unt, 21.e., ReLUx = max0, x, W atom R d d s the weght matrx to

More information

10.34 Numerical Methods Applied to Chemical Engineering Fall Homework #3: Systems of Nonlinear Equations and Optimization

10.34 Numerical Methods Applied to Chemical Engineering Fall Homework #3: Systems of Nonlinear Equations and Optimization 10.34 Numercal Methods Appled to Chemcal Engneerng Fall 2015 Homework #3: Systems of Nonlnear Equatons and Optmzaton Problem 1 (30 ponts). A (homogeneous) azeotrope s a composton of a multcomponent mxture

More information

A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS

A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS HCMC Unversty of Pedagogy Thong Nguyen Huu et al. A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS Thong Nguyen Huu and Hao Tran Van Department of mathematcs-nformaton,

More information

Why BP Works STAT 232B

Why BP Works STAT 232B Why BP Works STAT 232B Free Energes Helmholz & Gbbs Free Energes 1 Dstance between Probablstc Models - K-L dvergence b{ KL b{ p{ = b{ ln { } p{ Here, p{ s the eact ont prob. b{ s the appromaton, called

More information

On the Multicriteria Integer Network Flow Problem

On the Multicriteria Integer Network Flow Problem BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 5, No 2 Sofa 2005 On the Multcrtera Integer Network Flow Problem Vassl Vasslev, Marana Nkolova, Maryana Vassleva Insttute of

More information

Solution Thermodynamics

Solution Thermodynamics Soluton hermodynamcs usng Wagner Notaton by Stanley. Howard Department of aterals and etallurgcal Engneerng South Dakota School of nes and echnology Rapd Cty, SD 57701 January 7, 001 Soluton hermodynamcs

More information

Physics 5153 Classical Mechanics. D Alembert s Principle and The Lagrangian-1

Physics 5153 Classical Mechanics. D Alembert s Principle and The Lagrangian-1 P. Guterrez Physcs 5153 Classcal Mechancs D Alembert s Prncple and The Lagrangan 1 Introducton The prncple of vrtual work provdes a method of solvng problems of statc equlbrum wthout havng to consder the

More information

Outline. Communication. Bellman Ford Algorithm. Bellman Ford Example. Bellman Ford Shortest Path [1]

Outline. Communication. Bellman Ford Algorithm. Bellman Ford Example. Bellman Ford Shortest Path [1] DYNAMIC SHORTEST PATH SEARCH AND SYNCHRONIZED TASK SWITCHING Jay Wagenpfel, Adran Trachte 2 Outlne Shortest Communcaton Path Searchng Bellmann Ford algorthm Algorthm for dynamc case Modfcatons to our algorthm

More information

DUE: WEDS FEB 21ST 2018

DUE: WEDS FEB 21ST 2018 HOMEWORK # 1: FINITE DIFFERENCES IN ONE DIMENSION DUE: WEDS FEB 21ST 2018 1. Theory Beam bendng s a classcal engneerng analyss. The tradtonal soluton technque makes smplfyng assumptons such as a constant

More information

CHAPTER 14 GENERAL PERTURBATION THEORY

CHAPTER 14 GENERAL PERTURBATION THEORY CHAPTER 4 GENERAL PERTURBATION THEORY 4 Introducton A partcle n orbt around a pont mass or a sphercally symmetrc mass dstrbuton s movng n a gravtatonal potental of the form GM / r In ths potental t moves

More information

Foundations of Arithmetic

Foundations of Arithmetic Foundatons of Arthmetc Notaton We shall denote the sum and product of numbers n the usual notaton as a 2 + a 2 + a 3 + + a = a, a 1 a 2 a 3 a = a The notaton a b means a dvdes b,.e. ac = b where c s an

More information

Answers Problem Set 2 Chem 314A Williamsen Spring 2000

Answers Problem Set 2 Chem 314A Williamsen Spring 2000 Answers Problem Set Chem 314A Wllamsen Sprng 000 1) Gve me the followng crtcal values from the statstcal tables. a) z-statstc,-sded test, 99.7% confdence lmt ±3 b) t-statstc (Case I), 1-sded test, 95%

More information

Simulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests

Simulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests Smulated of the Cramér-von Mses Goodness-of-Ft Tests Steele, M., Chaselng, J. and 3 Hurst, C. School of Mathematcal and Physcal Scences, James Cook Unversty, Australan School of Envronmental Studes, Grffth

More information

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U) Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of

More information

Chapter 7 Channel Capacity and Coding

Chapter 7 Channel Capacity and Coding Wreless Informaton Transmsson System Lab. Chapter 7 Channel Capacty and Codng Insttute of Communcatons Engneerng atonal Sun Yat-sen Unversty Contents 7. Channel models and channel capacty 7.. Channel models

More information

Math Review. CptS 223 Advanced Data Structures. Larry Holder School of Electrical Engineering and Computer Science Washington State University

Math Review. CptS 223 Advanced Data Structures. Larry Holder School of Electrical Engineering and Computer Science Washington State University Math Revew CptS 223 dvanced Data Structures Larry Holder School of Electrcal Engneerng and Computer Scence Washngton State Unversty 1 Why do we need math n a data structures course? nalyzng data structures

More information

U.C. Berkeley CS294: Beyond Worst-Case Analysis Luca Trevisan September 5, 2017

U.C. Berkeley CS294: Beyond Worst-Case Analysis Luca Trevisan September 5, 2017 U.C. Berkeley CS94: Beyond Worst-Case Analyss Handout 4s Luca Trevsan September 5, 07 Summary of Lecture 4 In whch we ntroduce semdefnte programmng and apply t to Max Cut. Semdefnte Programmng Recall that

More information

Expectation Maximization Mixture Models HMMs

Expectation Maximization Mixture Models HMMs -755 Machne Learnng for Sgnal Processng Mture Models HMMs Class 9. 2 Sep 200 Learnng Dstrbutons for Data Problem: Gven a collecton of eamples from some data, estmate ts dstrbuton Basc deas of Mamum Lelhood

More information

A Simple Inventory System

A Simple Inventory System A Smple Inventory System Lawrence M. Leems and Stephen K. Park, Dscrete-Event Smulaton: A Frst Course, Prentce Hall, 2006 Hu Chen Computer Scence Vrgna State Unversty Petersburg, Vrgna February 8, 2017

More information

Which Separator? Spring 1

Which Separator? Spring 1 Whch Separator? 6.034 - Sprng 1 Whch Separator? Mamze the margn to closest ponts 6.034 - Sprng Whch Separator? Mamze the margn to closest ponts 6.034 - Sprng 3 Margn of a pont " # y (w $ + b) proportonal

More information

MMA and GCMMA two methods for nonlinear optimization

MMA and GCMMA two methods for nonlinear optimization MMA and GCMMA two methods for nonlnear optmzaton Krster Svanberg Optmzaton and Systems Theory, KTH, Stockholm, Sweden. krlle@math.kth.se Ths note descrbes the algorthms used n the author s 2007 mplementatons

More information

E O C NO N MIC C D I D SP S A P T A C T H C H A N A D N D UN U I N T T CO C MMITM T EN E T

E O C NO N MIC C D I D SP S A P T A C T H C H A N A D N D UN U I N T T CO C MMITM T EN E T Chapter 4 ECOOMIC DISPATCH AD UIT COMMITMET ITRODUCTIO A power system has several power plants. Each power plant has several generatng unts. At any pont of tme, the total load n the system s met by the

More information