Non-Linear Back-propagation: Doing Back-Propagation without Derivatives of the Activation Function
John Hertz
Nordita, Blegdamsvej 17, 2100 Copenhagen, Denmark

Anders Krogh
Electronics Institute, Technical University of Denmark, Lyngby, Denmark

Benny Lautrup
The Niels Bohr Institute, Blegdamsvej 17, 2100 Copenhagen, Denmark

Torsten Lehmann
Electronics Institute, Technical University of Denmark, Lyngby, Denmark

March 4, 1994

Abstract

The conventional linear back-propagation algorithm is replaced by a non-linear version, which avoids the necessity for calculating the derivative of the activation function. This may be exploited in hardware realizations of neural processors. In this paper we derive the non-linear back-propagation algorithms in the framework of recurrent back-propagation and present some numerical simulations of feed-forward networks on the NetTalk problem. A discussion of implementation in analog VLSI electronics concludes the paper.
1 Introduction

From a simple rewriting of the back-propagation algorithm [RHW86] a new family of learning algorithms emerges which we call non-linear back-propagation. In the normal back-propagation algorithm one calculates the errors on the hidden units from the errors on the output neurons by means of a linear expression. The non-linear algorithms presented here have the advantage that the back-propagation of errors goes through the same non-linear units as the forward propagation of activities. Using this method it is no longer necessary to calculate the derivative of the activation function for each neuron in the network, as it is in standard back-propagation. Whereas the derivatives are trivial to calculate in a simulation, they appear to be of major concern when implementing the algorithm in hardware, be it electronic or optical. For these reasons we believe that non-linear back-propagation is very well suited for hardware implementation. This is the main motivation for this work.

In the limit of infinitely small learning rate the non-linear algorithms become identical to standard back-propagation. For small learning rates the performance of the new algorithms is therefore comparable to standard back-propagation, whereas for larger learning rates it performs better. The algorithms generalize easily to recurrent back-propagation [Am87, Pin87], of which the standard back-propagation algorithm for feed-forward networks is a special case.

In this paper we derive the non-linear back-propagation (NLBP) algorithms in the framework of recurrent back-propagation and present some numerical simulations of feed-forward networks on the NetTalk problem [SR87]. A discussion of implementation in analog VLSI electronics concludes the paper.

2 The Algorithms

In this section we present the algorithms for non-linear back-propagation. The derivation can be found in the next section. We consider a general network, recurrent or feed-forward, with $N$ neurons. The neuron activities are denoted $V_i$ and the weight of the connection from neuron $j$ to neuron $i$ is denoted $w_{ij}$. Threshold values are included by means of a fixed bias neuron numbered $i = 0$.
The activation (output) of unit $i$ in the network is then

$$V_i = g(h_i), \qquad h_i = \sum_j w_{ij} V_j + \xi_i \qquad (1)$$
where $g$ is the activation function and $\xi_i$ is the external input to the unit. These equations are applied repeatedly for all neurons until the state of activity converges towards a fixed point. For a feed-forward network this is guaranteed to happen, and the activities should be evaluated in the forward direction, i.e. from input towards output. For a recurrent network there is no guarantee that these equations will converge towards a fixed point, but we shall assume this to be the case. The equations are, however, simplest in the general form.

The error of the network on input pattern $\mu$ is defined as

$$\epsilon_i^\mu = \zeta_i^\mu - V_i^\mu \qquad (2)$$

where $\zeta_i^\mu$ are the target values for the output units when the input is pattern $\mu$, and $V_i^\mu$ is the actual output for that same input pattern. For non-output units $\epsilon_i^\mu \equiv 0$.

We define the backward activations as

$$y_i = g\Big(h_i + \frac{1}{\lambda_i}\Big[\sum_k \lambda_k (y_k - V_k) w_{ki} + \eta_i \epsilon_i\Big]\Big) \qquad (3)$$

where the constants $\lambda_i$ and $\eta_i$ will be discussed shortly. These variables are "effective" or "moving" targets for hidden units in the network. For output units in a feed-forward network the sum over $k$ is empty, and if the errors are all zero, these equations have the simple solution $y_i = V_i$. For non-zero error, iteration of these equations is likewise assumed to lead to a fixed point in the backwards activation state (one can easily show that if the forward equations (1) converge, the backwards equations will also converge). Notice that during iteration of the backward activations we keep the forward activations fixed.

Now, consider a set of input-output patterns indexed by $\mu = 1, \ldots, p$, and assume that the squared error is used as the cost function,

$$E_{sq} = \frac{1}{2} \sum_{\mu=1}^{p} \sum_i (\epsilon_i^\mu)^2. \qquad (4)$$

In terms of these new variables the non-linear back-propagation is then like delta-rule learning,

$$\Delta w_{ij} = \lambda_i \sum_\mu (y_i^\mu - V_i^\mu) V_j^\mu. \qquad (5)$$

The constants $\lambda_i$ and $\eta_i$ are replacements for the usual learning rate, and it is required that $\eta_i/\lambda_i$ is "small". The reason we speak of a family of algorithms is that different choices of $\lambda_i$ yield different algorithms. Here the parameters are allowed to differ from unit to unit, but usually they will be
the same for large groups of units (e.g. ones forming a layer), which simplifies the equations. We consider two choices of $\lambda_i$ as particularly interesting: $\lambda_i = \eta_i$ and $\lambda_i = 1$. For the first of these, $\lambda$ plays a role similar to the learning rate in delta-rule learning, since $\eta$ is replaced by $\lambda$ in (5).

For the entropic error measure [HKP91] the weight update is the same (5), but $y_i$ is defined as

$$y_i = g\Big(h_i + \frac{1}{\lambda_i}\sum_k \lambda_k (y_k - V_k) w_{ki}\Big) + \frac{\eta_i}{\lambda_i}\epsilon_i. \qquad (6)$$

In this case the weight update for an output unit is exactly like the standard one, $\Delta w_{ij} = \eta_i \epsilon_i V_j$. For a network with linear output units optimizing the squared error, the two equations for $y_i$ coincide.

Obviously these algorithms can be used online, i.e. changing the weights after each pattern, just as is common when using the standard back-propagation algorithm.

Finally we would like to explicitly show the important cases of $\lambda = \eta$ and $\lambda = 1$ for a feed-forward network. For simplicity the pattern index $\mu$ will be dropped. Notation: $\ell$ labels the layers from 0 (output) to $L$ (input). $w_{ij}^\ell$ is the weight from unit $j$ in layer $\ell+1$ to unit $i$ in layer $\ell$. Any other variable (like $y_i^\ell$ and $V_i^\ell$) with superscript $\ell$ refers to that variable in layer $\ell$. It will be assumed that $\eta_i$ is the same for all units in a layer, $\eta_i = \eta$. The errors on the output units are denoted $\epsilon_i$ as before. Here are the two versions:
$\lambda = \eta$.
Output unit: $y_i^0 = g(h_i^0 + \epsilon_i)$ (squared error); $y_i^0 = V_i^0 + \epsilon_i$ (entropic error).
Hidden unit: $y_i^\ell = g\big(h_i^\ell + \sum_j (y_j^{\ell-1} - V_j^{\ell-1}) w_{ji}^{\ell-1}\big)$
Weight update: $\Delta w_{ij}^\ell = \eta\,(y_i^\ell - V_i^\ell)\, V_j^{\ell+1}$

$\lambda = 1$.
Output unit: $y_i^0 = g(h_i^0 + \eta \epsilon_i)$ (squared error); $y_i^0 = V_i^0 + \eta \epsilon_i$ (entropic error).
Hidden unit: $y_i^\ell = g\big(h_i^\ell + \sum_j (y_j^{\ell-1} - V_j^{\ell-1}) w_{ji}^{\ell-1}\big)$
Weight update: $\Delta w_{ij}^\ell = (y_i^\ell - V_i^\ell)\, V_j^{\ell+1}$

3 Derivation of NLBP

In this section we derive the non-linear back-propagation in the framework of recurrent back-propagation. As an introduction, we follow the derivation of recurrent back-propagation in [HKP91, pp. 172-175]. See also [Pin87].

3.1 Standard recurrent back-propagation

Assume fixed points of the network are given by (1), and that the learning is governed by an error measure $E$ like (4). If we define $\epsilon_i = -\partial E/\partial V_i$, which for (4) is identical to (2), the gradient descent learning rule is

$$\Delta w_{pq} = -\eta \frac{\partial E}{\partial w_{pq}}. \qquad (7)$$

Differentiation of the fixed point equation (1) with respect to $w_{pq}$ gives

$$\frac{\partial V_i}{\partial w_{pq}} = (L^{-1})_{ip} V'_p V_q \qquad (8)$$
with $V'_i = g'(h_i)$ and the matrix $L$ given by

$$L_{ij} = \delta_{ij} - V'_i w_{ij}. \qquad (9)$$

If this matrix is positive definite, the dynamics will be contractive around a fixed point. According to our assumptions this must therefore be the case. Defining

$$\delta_p = V'_p \sum_i \epsilon_i (L^{-1})_{ip} \qquad (10)$$

the weight update can be written as

$$\Delta w_{pq} = \eta\, \delta_p V_q \qquad (11)$$

and the $\delta$'s are the solutions to

$$\delta_i = V'_i \Big(\sum_k \delta_k w_{ki} + \epsilon_i\Big). \qquad (12)$$

These are the standard back-propagation equations for a general network. In a feed-forward network they converge to a fixed point when iterated in the backwards direction from output towards input. For a general recurrent net they will converge towards a fixed point when the $L$-matrix is positive definite, as may easily be demonstrated.

3.2 Non-linear back-propagation

If the error measure is given by (4), the derivatives of the error measure are

$$\epsilon_i = -\frac{\partial E}{\partial V_i} = \zeta_i - V_i \ \text{for the output units, 0 otherwise.} \qquad (13)$$

For an output unit in a feed-forward network we thus find $\delta_i = V'_i(\zeta_i - V_i)$. One of the ideas of the non-linear back-propagation is to force that interpretation on all the units, defining "effective targets" $y_i$ such that

$$\delta_i \propto y_i - V_i. \qquad (14)$$

For small $\eta$, eq. (11) can then be interpreted as a first order Taylor expansion:

$$\Delta w_{ij} = \eta\, \delta_i V_j = \eta V'_i \Big(\sum_k \delta_k w_{ki} + \epsilon_i\Big) V_j \simeq \Big[g\Big(h_i + \eta\Big[\sum_k \delta_k w_{ki} + \epsilon_i\Big]\Big) - g(h_i)\Big] V_j \qquad (15)$$

where $g(h_i) = V_i$. The first term in the brackets is just the output of a unit with the normal input and the back-propagated error added, the effective target:

$$y_i = g\Big(h_i + \eta\Big[\sum_k \delta_k w_{ki} + \epsilon_i\Big]\Big). \qquad (16)$$
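The first-order equivalence in eq. (15) is easy to verify numerically. The following sketch (test values are arbitrary illustrations, not from the paper) compares the non-linear update with the linearized standard one for $g = \tanh$:

```python
import numpy as np

# Numerical check of eq. (15): the non-linear update [g(h + eta*b) - g(h)] V_j
# approaches the standard back-propagation update eta * g'(h) * b * V_j
# as eta -> 0.  All test values below are arbitrary.

g = np.tanh
g_prime = lambda h: 1.0 - np.tanh(h) ** 2

h = 0.7        # net input h_i of the unit
b = 0.3        # bracketed term: sum_k delta_k w_ki + eps_i
V_j = -0.5     # presynaptic activity

for eta in (0.5, 0.05, 0.005):
    dw_nlbp = (g(h + eta * b) - g(h)) * V_j   # non-linear form, eq. (15)
    dw_bp = eta * g_prime(h) * b * V_j        # linearized (standard) form
    rel_diff = abs(dw_nlbp - dw_bp) / abs(dw_bp)
    print(f"eta={eta:5.3f}  relative difference {rel_diff:.1e}")
```

The relative difference shrinks roughly linearly with $\eta$, in line with the claim that NLBP reduces to standard back-propagation in the small-learning-rate limit.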
Note however that the "integration" in (15) is quite arbitrary; it is just one possibility out of many given by

$$\Delta w_{ij} = \eta\, \delta_i V_j \simeq \lambda_i \Big[g\Big(h_i + \frac{1}{\lambda_i}\Big[\sum_k \delta_k w_{ki} + \eta_i \epsilon_i\Big]\Big) - g(h_i)\Big] V_j \qquad (17)$$

where the $\lambda_i$'s are arbitrary parameters similar to the learning rates $\eta_i$. For consistency one now has to replace $\delta_i$ by

$$\delta_i = \lambda_i (y_i - V_i). \qquad (18)$$

Then $y_i$ is finally given by (3) and the weight update by (5).

Formally the "integration" in eq. (15) is only valid for small $\eta$ (or small $\eta_i/\lambda_i$ in (17)). But for larger $\eta$ there is no guarantee that the clean gradient descent converges anyway, and these non-linear versions might well turn out to work better. By making $\lambda_i$ very large compared to $\eta_i$, one can make the NLBP indistinguishable from standard back-propagation (the Taylor expansions will be almost exact). That would be at the expense of high numerical instability, because $y_i$ would be very close to $V_i$ and the formula for the weight update, $\Delta w_{ij} = \lambda_i (y_i - V_i) V_j$, would require very high precision. On the other hand, very small $\lambda$'s are likely to take the algorithm too far from gradient descent. For these reasons we believe that the most interesting range is $\eta_i \leq \lambda_i \leq 1$ (assuming that $\eta_i < 1$). The limit $\lambda_i = 1$ is the most stable, numerically, and $\lambda_i = \eta_i$ is the most gradient-descent-like limit.

Notice that if the parameters are the same for all neurons in the network, $\lambda_i = \lambda$ and $\eta_i = \eta$, then with the ratio $\beta = \eta/\lambda$ the equations take the simpler form

$$y_i = g\Big(h_i + \sum_k (y_k - V_k) w_{ki} + \beta \epsilon_i\Big) \qquad (19)$$

and

$$\Delta w_{ij} = \lambda (y_i - V_i) V_j. \qquad (20)$$

3.3 Entropic error measure

The entropic error measure is

$$E = \sum_i \Big[\frac{1}{2}(1+\zeta_i)\log\frac{1+\zeta_i}{1+V_i} + \frac{1}{2}(1-\zeta_i)\log\frac{1-\zeta_i}{1-V_i}\Big] \qquad (21)$$

if the activation function $g$ is equal to $\tanh$. Similar error measures exist for other activation functions like $g(x) = (1+e^{-x})^{-1}$. It can be shown that
for this and similar error measures

$$-\frac{\partial E}{\partial V_i} = \frac{\epsilon_i}{V'_i}. \qquad (22)$$

Instead of (3), $y_i$ should then be defined as in (6).

3.4 Internal representations

For a feedforward architecture with a single hidden layer, the weight change formulas resemble those obtained using the method of internal representations [AKH90]. However, they are not quite the same. Using our present notation, in the present method we find a change for the weight from hidden unit $j$ to output unit $i$ of $\eta[\zeta_i - g(\sum_k w_{ik} V_k)]V_j$, while in the internal representation approach it is $\eta[\zeta_i - g(\sum_k w_{ik} y_k)]y_j$. For the input-to-hidden layer the expressions for the weight changes in the two approaches look the same, but the effective targets $y_j$ in them are different. They are both calculated by back-propagating errors $\zeta_i - V_i$ from the output units, but in the present case these $V_i$ are simply the result $g(\sum_j w_{ij} V_j)$ of the forward propagation, while in the internal representations approach $V_i = g(\sum_j w_{ij} y_j)$, i.e. they are obtained by propagating the effective targets on the hidden layer forward through the hidden-to-output weights.

4 Test of Algorithm

The algorithms have been tested on the NetTalk problem using a feed-forward network with an input window of 7 letters and one hidden layer consisting of 80 hidden units. The algorithms were run in online mode and a momentum term of 0.8 was used. They were tested for these values of $\lambda$: $\eta$, 0.25, 0.5, 0.75, and 1.0. The learning rate was scaled according to the number $n_i$ of connections feeding into unit $i$, such that $\eta_i = \eta/\sqrt{n_i}$, a standard method often used in back-propagation. Several values were tried for $\eta$. The initial weights were chosen at random uniformly between $-0.5/\sqrt{n_i}$ and $0.5/\sqrt{n_i}$. For each value of $\lambda$ and $\eta$, 50 cycles of learning were done, and the squared error normalized by the number of examples was recorded at each cycle. For all runs the final error was plotted as a function of $\eta$; see figure 1. Clearly the runs with $\lambda = 1$ are almost indistinguishable from standard back-propagation, which is also shown. As a "corollary" these plots show that the entropic error measure is superior, even when the object is to minimize squared error (see also [SLF88]).
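The property of the entropic measure behind eq. (22), that for $g = \tanh$ the gradient with respect to the net input is simply $\zeta_i - V_i$, can be confirmed with a quick finite-difference check (the test values below are illustrative, not taken from the paper):

```python
import numpy as np

# Finite-difference check: with the entropic error of eq. (21) and g = tanh,
# -dE/dh equals zeta - V directly, so the output-unit update needs no
# derivative of the activation function.  Test values are arbitrary.

def entropic_error(h, zeta):
    V = np.tanh(h)
    return (0.5 * (1 + zeta) * np.log((1 + zeta) / (1 + V))
            + 0.5 * (1 - zeta) * np.log((1 - zeta) / (1 - V)))

h, zeta = 0.4, 0.9     # net input and target, both inside (-1, 1)
dh = 1e-6              # step for a central finite difference
grad = -(entropic_error(h + dh, zeta) - entropic_error(h - dh, zeta)) / (2 * dh)
eps = zeta - np.tanh(h)

print(grad, eps)       # the two agree up to finite-difference error
```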
In figure 2 the time development of the error is shown for the extreme values of $\lambda$ and fixed $\eta$.
Figure 1: Plots showing the squared error after 50 training epochs as a function of the size of the learning rate $\eta$. The left plot is for the squared error cost function and the right plot for the entropic error measure. The solid line represents the standard back-propagation algorithm, the dashed line the non-linear back-propagation algorithm with $\lambda = 1$, and the dot-dashed line the one with $\lambda = \eta$. The NLBP with $\lambda = 0.25$ (*), $\lambda = 0.5$ (x), and $\lambda = 0.75$ (o) are also shown.

Figure 2: The decrease in error as a function of learning time for standard back-propagation (solid line), NLBP with $\lambda = 1$ (dashed line), and $\lambda = \eta$ (dot-dashed line). In all three cases the squared error function and $\eta = 0.05$ were used.
5 Hardware Implementations

All the algorithms can be mapped topologically onto analog VLSI in a straightforward manner, though selecting the most stable one is the best choice because of the limited precision of this technology. In this section, we will give two examples of the implementation of the algorithms for a feed-forward network using $\lambda = 1$ and the squared error cost function. Defining a neuron error, $e_i^\ell$, for each layer (note $\epsilon_i \neq 0$ only for the output layer), we can write (18) for $\lambda = 1$ as

$$e_i^\ell = \begin{cases} \eta\,(\zeta_i - V_i^0) & \text{for } \ell = 0 \\ \sum_j \delta_j^{\ell-1} w_{ji}^{\ell-1} & \text{for } \ell > 0 \end{cases} \qquad (23)$$

$$\delta_i^\ell = g(h_i^\ell + e_i^\ell) - g(h_i^\ell). \qquad (24)$$

Thus, topologically this version of non-linear back-propagation maps in exactly the same way onto hardware as the original back-propagation algorithm, the only difference being how $\delta_i^\ell$ is calculated ($\delta_i^\ell = g'(h_i^\ell)\, e_i^\ell$ for back-propagation) [LB93]: The system consists of two modules: a "synapse module" which calculates $\Delta w_{ij}^\ell$, propagates $h_i^\ell$ forward and propagates $e_j^{\ell+1}$ backward, and a "neuron module" which forward propagates $V_i^\ell$ and backward propagates $\delta_i^\ell$. To change the learning algorithm to non-linear back-propagation, it is thus necessary only to change the "neuron module".

Figure 3: Continuous time non-linear back-propagation "neuron module" with hyperbolic tangent activation function. The precision is determined by the matching of the two differential pairs.

A simple way to implement a sigmoid-like activation function is by the use of a differential pair. Figure 3 shows a non-linear back-propagation "neuron module" with a hyperbolic tangent activation function. Applying $h_i$ and $-e_i$ as voltages gives the $V_i$ and $\delta_i$ outputs as differential currents (the "synapse module" can just as well calculate $-e_i$ as $e_i$). It is interesting to notice that the circuit structure is identical to the one used in [Bog93] to calculate the derivative of the activation function: Replacing $e_i$ by a small constant, the output will approximate $g'(h_i)$. Using the circuit in the proposed way, however, gives better accuracy: The $e_i$ is not a "small" quantity, which makes the inherent inaccuracies less significant, relatively. Further, the circuit calculates the desired $\delta_i$ directly, eliminating the need for an extra multiplier and thus eliminating a source of error. The accuracy of the circuit is determined by the matching of the two differential pairs and of the bias sources. This can be in the order of 1% of the output current magnitude.

In a "standard implementation" of the "synapse module", the $h_i$ and $e_i$ outputs will be available as currents and the $V_i$ and $\delta_i$ inputs must be applied as voltages. Thus the above implementation requires accurate transresistances to function properly. Also, as the same function is used to calculate the $V_i$'s and the $\delta_i$'s, it would be preferable to use the same hardware, as this eliminates the need for matched components. This is possible if the system is not required to function in continuous time, though the output has to be sampled (which introduces errors).

Figure 4: Discrete time non-linear back-propagation "neuron module" with non-linear activation function. The precision is determined by the accuracy of the switched capacitor.

In figure 4 such a simplified discrete time "neuron module", which reuses the activation function block and which has current inputs and voltage outputs, is shown. During the $\phi_1$ clock phase, $V_i$ is available at the output and is sampled at the capacitor. During the $\phi_2$ clock phase, $\delta_i$ is available at the output. The activation function saturates such that $V_{min} \leq V_i \leq V_{max}$, though it is not very well defined. This is of no major concern, however; the accuracy is determined by the switched capacitor. Using design techniques to
reduce charge injection and redistribution, the accuracy can be in the order of 0.1% of the output voltage swing.

As illustrated, the non-linear back-propagation learning algorithm is well suited for analogue hardware implementation, though offset compensation techniques still have to be employed. It maps topologically onto hardware in the same way as ordinary back-propagation, but the circuit to calculate the $\delta_i^\ell$'s is much more efficient: It can approximate the learning algorithm equations more accurately, and as the algorithm requires only simple operations apart from the activation function, design efforts can be put on the electrical specifications of the hardware (input impedance, speed, noise immunity, etc.) and on the general shape of the sigmoid-like activation function. Further, as the algorithm requires only one "special function", it has the potential of very high accuracy through reuse of this function block.

6 Conclusion

A new family of learning algorithms has been derived that can be thought of as "non-linear gradient descent" type algorithms. For appropriate values of the parameters they are almost identical to standard back-propagation. By numerical simulations of feed-forward networks learning the NetTalk problem it was shown that the performance of these algorithms was very similar to standard back-propagation for the range of parameters tested.

The algorithms have two important properties that we believe make them easier to implement in electronic hardware than the standard back-propagation algorithm. First, no derivatives of the activation function need to be calculated. Second, the back-propagation of errors is through the same non-linear network as the forward propagation, and not a linearized network as in standard back-propagation. Two examples of how analogue electronic hardware can utilize these properties have been given. These advantages may also be expected to carry over to optical implementations.
References

[AKH90] A. Krogh, G. I. Thorbergsson, and J. A. Hertz. A cost function for internal representations. In D. S. Touretzky, editor, Advances in Neural Information Processing Systems 2, pages 733-740. Morgan Kaufmann, San Mateo, CA, 1990.

[Am87] L. B. Almeida. A learning rule for asynchronous perceptrons with feedback in a combinatorial environment. In M. Caudill and C. Butler, editors, IEEE International Conference on Neural Networks, volume 2, pages 609-618, San Diego, 1987. IEEE, New York.

[AR88] J. A. Anderson and E. Rosenfeld, editors. Neurocomputing: Foundations of Research. MIT Press, Cambridge, 1988.

[Bog93] G. Bogason. Generation of a neuron transfer function and its derivative. Electronics Letters, 29:1867-1869, 1993.

[HKP91] J. A. Hertz, A. Krogh, and R. G. Palmer. Introduction to the Theory of Neural Computation. Addison-Wesley, Redwood City, 1991.

[LB93] T. Lehmann and E. Bruun. Analogue VLSI implementation of back-propagation learning in artificial neural networks. In Proc. 11th European Conference on Circuit Theory and Design, pages 491-496, Davos, 1993.

[Pin87] F. J. Pineda. Generalization of back-propagation to recurrent neural networks. Physical Review Letters, 59:2229-2232, 1987.

[RHW86] D. E. Rumelhart, G. E. Hinton, and R. J. Williams. Learning representations by back-propagating errors. Nature, 323:533-536, 1986. Reprinted in [AR88].

[SLF88] S. A. Solla, E. Levin, and M. Fleisher. Accelerated learning in layered neural networks. Complex Systems, 2:625-639, 1988.

[SR87] T. J. Sejnowski and C. R. Rosenberg. Parallel networks that learn to pronounce English text. Complex Systems, 1:145-168, 1987.
More informationApproximate merging of a pair of BeÂzier curves
COMPUTER-AIDED DESIGN Computer-Aded Desgn 33 (1) 15±136 www.esever.com/ocate/cad Approxmate mergng of a par of BeÂzer curves Sh-Mn Hu a,b, *, Rou-Feng Tong c, Tao Ju a,b, Ja-Guang Sun a,b a Natona CAD
More informationXin Li Department of Information Systems, College of Business, City University of Hong Kong, Hong Kong, CHINA
RESEARCH ARTICLE MOELING FIXE OS BETTING FOR FUTURE EVENT PREICTION Weyun Chen eartment of Educatona Informaton Technoogy, Facuty of Educaton, East Chna Norma Unversty, Shangha, CHINA {weyun.chen@qq.com}
More informationThe work described by this report was supported by NSF under contract
BINAR ULTIPLICATION UING PARTIALL REDUNDANT ULTIPLE Gary Bewck chae J. Fynn Technca Report No. CL-TR-92-528 June 992 The work descrbed by ths report was supported by NF under contract IP88-2296 BINAR ULTIPLICATION
More informationNeural Networks. Neural Network Motivation. Why Neural Networks? Comments on Blue Gene. More Comments on Blue Gene
Motvaton for non-lnear Classfers Neural Networs CPS 27 Ron Parr Lnear methods are wea Mae strong assumptons Can only express relatvely smple functons of nputs Comng up wth good features can be hard Why
More informationLower Bounding Procedures for the Single Allocation Hub Location Problem
Lower Boundng Procedures for the Snge Aocaton Hub Locaton Probem Borzou Rostam 1,2 Chrstoph Buchhem 1,4 Fautät für Mathemat, TU Dortmund, Germany J. Faban Meer 1,3 Uwe Causen 1 Insttute of Transport Logstcs,
More informationMulti-layer neural networks
Lecture 0 Mult-layer neural networks Mlos Hauskrecht mlos@cs.ptt.edu 5329 Sennott Square Lnear regresson w Lnear unts f () Logstc regresson T T = w = p( y =, w) = g( w ) w z f () = p ( y = ) w d w d Gradent
More informationSingular Value Decomposition: Theory and Applications
Sngular Value Decomposton: Theory and Applcatons Danel Khashab Sprng 2015 Last Update: March 2, 2015 1 Introducton A = UDV where columns of U and V are orthonormal and matrx D s dagonal wth postve real
More informationDiscriminating Fuzzy Preference Relations Based on Heuristic Possibilistic Clustering
Mutcrtera Orderng and ankng: Parta Orders, Ambgutes and Apped Issues Jan W. Owsńsk and aner Brüggemann, Edtors Dscrmnatng Fuzzy Preerence eatons Based on Heurstc Possbstc Custerng Dmtr A. Vattchenn Unted
More informationNeural Networks. Class 22: MLSP, Fall 2016 Instructor: Bhiksha Raj
Neural Networs Class 22: MLSP, Fall 2016 Instructor: Bhsha Raj IMPORTANT ADMINSTRIVIA Fnal wee. Project presentatons on 6th 18797/11755 2 Neural Networs are tang over! Neural networs have become one of
More informationQuantum Mechanics I - Session 4
Quantum Mechancs I - Sesson 4 Aprl 3, 05 Contents Operators Change of Bass 4 3 Egenvectors and Egenvalues 5 3. Denton....................................... 5 3. Rotaton n D....................................
More information[WAVES] 1. Waves and wave forces. Definition of waves
1. Waves and forces Defnton of s In the smuatons on ong-crested s are consdered. The drecton of these s (μ) s defned as sketched beow n the goba co-ordnate sstem: North West East South The eevaton can
More informationCABLE STRUCTURE WITH LOAD-ADAPTING GEOMETRY
Compostes n Constructon Thrd Internatona Conerence Lon, France, Ju 3, CABLE STRUCTURE WITH LOAD-ADAPTING GEOMETRY A.S. Jüch, J.F. Caron and O. Bavere Insttut Naver - Lam, ENPC-LCPC 6 et, avenue Base Pasca
More informationA DIMENSION-REDUCTION METHOD FOR STOCHASTIC ANALYSIS SECOND-MOMENT ANALYSIS
A DIMESIO-REDUCTIO METHOD FOR STOCHASTIC AALYSIS SECOD-MOMET AALYSIS S. Rahman Department of Mechanca Engneerng and Center for Computer-Aded Desgn The Unversty of Iowa Iowa Cty, IA 52245 June 2003 OUTLIE
More informationChapter - 2. Distribution System Power Flow Analysis
Chapter - 2 Dstrbuton System Power Flow Analyss CHAPTER - 2 Radal Dstrbuton System Load Flow 2.1 Introducton Load flow s an mportant tool [66] for analyzng electrcal power system network performance. Load
More informationFormulas for the Determinant
page 224 224 CHAPTER 3 Determnants e t te t e 2t 38 A = e t 2te t e 2t e t te t 2e 2t 39 If 123 A = 345, 456 compute the matrx product A adj(a) What can you conclude about det(a)? For Problems 40 43, use
More informationOptimum Selection Combining for M-QAM on Fading Channels
Optmum Seecton Combnng for M-QAM on Fadng Channes M. Surendra Raju, Ramesh Annavajjaa and A. Chockangam Insca Semconductors Inda Pvt. Ltd, Bangaore-56000, Inda Department of ECE, Unversty of Caforna, San
More informationL-Edge Chromatic Number Of A Graph
IJISET - Internatona Journa of Innovatve Scence Engneerng & Technoogy Vo. 3 Issue 3 March 06. ISSN 348 7968 L-Edge Chromatc Number Of A Graph Dr.R.B.Gnana Joth Assocate Professor of Mathematcs V.V.Vannaperuma
More information9 Derivation of Rate Equations from Single-Cell Conductance (Hodgkin-Huxley-like) Equations
Physcs 171/271 - Chapter 9R -Davd Klenfeld - Fall 2005 9 Dervaton of Rate Equatons from Sngle-Cell Conductance (Hodgkn-Huxley-lke) Equatons We consder a network of many neurons, each of whch obeys a set
More informationSensitivity Analysis Using Neural Network for Estimating Aircraft Stability and Control Derivatives
Internatona Conference on Integent and Advanced Systems 27 Senstvty Anayss Usng Neura Networ for Estmatng Arcraft Stabty and Contro Dervatves Roht Garhwa a, Abhshe Hader b and Dr. Manoranan Snha c Department
More informationTypical Neuron Error Back-Propagation
x Mutayer Notaton 1 2 2 2 y Notaton ayer of neuron abeed 1,, N neuron n ayer = vector of output from neuron n ayer nput ayer = x (the nput pattern) output ayer = y (the actua output) = weght between ayer
More informationPredicting Model of Traffic Volume Based on Grey-Markov
Vo. No. Modern Apped Scence Predctng Mode of Traffc Voume Based on Grey-Marov Ynpeng Zhang Zhengzhou Muncpa Engneerng Desgn & Research Insttute Zhengzhou 5005 Chna Abstract Grey-marov forecastng mode of
More informationResearch Article Green s Theorem for Sign Data
Internatonal Scholarly Research Network ISRN Appled Mathematcs Volume 2012, Artcle ID 539359, 10 pages do:10.5402/2012/539359 Research Artcle Green s Theorem for Sgn Data Lous M. Houston The Unversty of
More informationModule 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur
Module 3 LOSSY IMAGE COMPRESSION SYSTEMS Verson ECE IIT, Kharagpur Lesson 6 Theory of Quantzaton Verson ECE IIT, Kharagpur Instructonal Objectves At the end of ths lesson, the students should be able to:
More informationCHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE
CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE Analytcal soluton s usually not possble when exctaton vares arbtrarly wth tme or f the system s nonlnear. Such problems can be solved by numercal tmesteppng
More informationDistributed Moving Horizon State Estimation of Nonlinear Systems. Jing Zhang
Dstrbuted Movng Horzon State Estmaton of Nonnear Systems by Jng Zhang A thess submtted n parta fufment of the requrements for the degree of Master of Scence n Chemca Engneerng Department of Chemca and
More informationPHYS 705: Classical Mechanics. Newtonian Mechanics
1 PHYS 705: Classcal Mechancs Newtonan Mechancs Quck Revew of Newtonan Mechancs Basc Descrpton: -An dealzed pont partcle or a system of pont partcles n an nertal reference frame [Rgd bodes (ch. 5 later)]
More informationNumerical Heat and Mass Transfer
Master degree n Mechancal Engneerng Numercal Heat and Mass Transfer 06-Fnte-Dfference Method (One-dmensonal, steady state heat conducton) Fausto Arpno f.arpno@uncas.t Introducton Why we use models and
More information8 Derivation of Network Rate Equations from Single- Cell Conductance Equations
Physcs 178/278 - Davd Klenfeld - Wnter 2015 8 Dervaton of Network Rate Equatons from Sngle- Cell Conductance Equatons We consder a network of many neurons, each of whch obeys a set of conductancebased,
More informationMODEL TUNING WITH THE USE OF HEURISTIC-FREE GMDH (GROUP METHOD OF DATA HANDLING) NETWORKS
MODEL TUNING WITH THE USE OF HEURISTIC-FREE (GROUP METHOD OF DATA HANDLING) NETWORKS M.C. Schrver (), E.J.H. Kerchoffs (), P.J. Water (), K.D. Saman () () Rswaterstaat Drecte Zeeand () Deft Unversty of
More informationA parametric Linear Programming Model Describing Bandwidth Sharing Policies for ABR Traffic
parametrc Lnear Programmng Mode Descrbng Bandwdth Sharng Poces for BR Traffc I. Moschoos, M. Logothets and G. Kokknaks Wre ommuncatons Laboratory, Dept. of Eectrca & omputer Engneerng, Unversty of Patras,
More information1 GSW Iterative Techniques for y = Ax
1 for y = A I m gong to cheat here. here are a lot of teratve technques that can be used to solve the general case of a set of smultaneous equatons (wrtten n the matr form as y = A), but ths chapter sn
More informationQuantum Runge-Lenz Vector and the Hydrogen Atom, the hidden SO(4) symmetry
Quantum Runge-Lenz ector and the Hydrogen Atom, the hdden SO(4) symmetry Pasca Szrftgser and Edgardo S. Cheb-Terrab () Laboratore PhLAM, UMR CNRS 85, Unversté Le, F-59655, France () Mapesoft Let's consder
More informationEinstein Summation Convention
Ensten Suaton Conventon Ths s a ethod to wrte equaton nvovng severa suatons n a uncuttered for Exape:. δ where δ or 0 Suaton runs over to snce we are denson No ndces appear ore than two tes n the equaton
More information1 Matrix representations of canonical matrices
1 Matrx representatons of canoncal matrces 2-d rotaton around the orgn: ( ) cos θ sn θ R 0 = sn θ cos θ 3-d rotaton around the x-axs: R x = 1 0 0 0 cos θ sn θ 0 sn θ cos θ 3-d rotaton around the y-axs:
More information1 Input-Output Mappings. 2 Hebbian Failure. 3 Delta Rule Success.
Task Learnng 1 / 27 1 Input-Output Mappngs. 2 Hebban Falure. 3 Delta Rule Success. Input-Output Mappngs 2 / 27 0 1 2 3 4 5 6 7 8 9 Output 3 8 2 7 Input 5 6 0 9 1 4 Make approprate: Response gven stmulus.
More informationAndre Schneider P622
Andre Schneder P6 Probem Set #0 March, 00 Srednc 7. Suppose that we have a theory wth Negectng the hgher order terms, show that Souton Knowng β(α and γ m (α we can wrte β(α =b α O(α 3 (. γ m (α =c α O(α
More informationNote On Some Identities of New Combinatorial Integers
Apped Mathematcs & Informaton Scences 5(3 (20, 500-53 An Internatona Journa c 20 NSP Note On Some Identtes of New Combnatora Integers Adem Kııçman, Cenap Öze 2 and Ero Yımaz 3 Department of Mathematcs
More informationMATH 567: Mathematical Techniques in Data Science Lab 8
1/14 MATH 567: Mathematcal Technques n Data Scence Lab 8 Domnque Gullot Departments of Mathematcal Scences Unversty of Delaware Aprl 11, 2017 Recall We have: a (2) 1 = f(w (1) 11 x 1 + W (1) 12 x 2 + W
More informationNONLINEAR SYSTEM IDENTIFICATION BASE ON FW-LSSVM
Journa of heoretca and Apped Informaton echnoogy th February 3. Vo. 48 No. 5-3 JAI & LLS. A rghts reserved. ISSN: 99-8645 www.jatt.org E-ISSN: 87-395 NONLINEAR SYSEM IDENIFICAION BASE ON FW-LSSVM, XIANFANG
More informationReactive Power Allocation Using Support Vector Machine
Reactve Power Aocaton Usng Support Vector Machne M.W. Mustafa, S.N. Khad, A. Kharuddn Facuty of Eectrca Engneerng, Unverst Teknoog Maaysa Johor 830, Maaysa and H. Shareef Facuty of Eectrca Engneerng and
More informationarxiv: v1 [physics.comp-ph] 17 Dec 2018
Pressures nsde a nano-porous medum. The case of a snge phase fud arxv:1812.06656v1 [physcs.comp-ph] 17 Dec 2018 Oav Gateand, Dck Bedeaux, and Sgne Kjestrup PoreLab, Department of Chemstry, Norwegan Unversty
More informationTime-Varying Systems and Computations Lecture 6
Tme-Varyng Systems and Computatons Lecture 6 Klaus Depold 14. Januar 2014 The Kalman Flter The Kalman estmaton flter attempts to estmate the actual state of an unknown dscrete dynamcal system, gven nosy
More informationarxiv:nlin/ v1 [nlin.si] 24 Jun 2005
Determnant Form of Dark Soton Soutons of the Dscrete onnear Schrödnger Equaton arxv:nn/050605v [nn.si] 4 Jun 005 Ken-ch Maruno and Yasuhro Ohta Facuty of Mathematcs, Kyushu Unversty, Hakozak, Hgash-ku,
More informationSPATIAL KINEMATICS OF GEARS IN ABSOLUTE COORDINATES
SATIAL KINEMATICS OF GEARS IN ABSOLUTE COORDINATES Dmtry Vasenko and Roand Kasper Insttute of Mobe Systems (IMS) Otto-von-Guercke-Unversty Magdeburg D-39016, Magdeburg, Germany E-ma: Dmtr.Vasenko@ovgu.de
More informationPhysics 5153 Classical Mechanics. Principle of Virtual Work-1
P. Guterrez 1 Introducton Physcs 5153 Classcal Mechancs Prncple of Vrtual Work The frst varatonal prncple we encounter n mechancs s the prncple of vrtual work. It establshes the equlbrum condton of a mechancal
More informationIntroduction to Neural Networks. David Stutz
RWTH Aachen Unversty Char of Computer Scence 6 Prof. Dr.-Ing. Hermann Ney Selected Topcs n Human Language Technology and Pattern Recognton WS 13/14 Introducton to Neural Networs Davd Stutz Matrculaton
More information