
Supplementary Figure S1. Comparison of memory performance in networks with nonlinear neuronal input-output relations. a, Nonlinear firing rate (f) vs. input current (I) relationship. b-d, Network structures of the positive feedback (b), negative derivative feedback (c), and hybrid positive plus negative derivative feedback (d) models. e-g, Activity of the excitatory population in response to transient inputs with different strengths. h-j, Activity of the excitatory population in response to step-like inputs with different strengths.
Nature Neuroscience: doi:10.1038/nn.3492

Supplementary Figure S2. Negative derivative feedback networks with a mixture of NMDA and AMPA synapses in all excitatory pathways. a-b, Contour plots of the time constant of decay of network activity, τ_network, as a function of the fractions of NMDA synapses in the two excitatory pathways for a fixed NMDA time constant (a), and as a function of the time constants of the NMDA synapses for fixed NMDA fractions q_EE = q_IE = 0.5 (b). Regions labeled "Unstable" mark parameter combinations for which network activity is unstable. The remaining parameters (connection strengths and the AMPA and inhibitory synaptic time constants) were held fixed.

Supplementary Figure S3. Effect of perturbations in NMDA-type receptors. Traces compare the tuned network with networks in which NMDA receptor-mediated currents are increased or decreased by 5%. a-b, Positive feedback networks with only NMDA-mediated excitatory currents (a) and with a mixture of NMDA- and AMPA-mediated excitatory currents (b); the NMDA time constants and fractions were chosen for consistency of the average time constant of excitatory synapses with that in Figure 4. c-f, Purely derivative feedback networks (c,d) and hybrid positive plus negative derivative feedback networks (e,f) with only NMDA-mediated excitatory currents in E-to-E connections (c,e), or with a mixture of NMDA- and AMPA-mediated excitatory currents in E-to-E and E-to-I connections (d,f). When the fractions of NMDA-mediated currents are equal in E-to-E and E-to-I connections, but with slower kinetics in E-to-E than in E-to-I, persistent activity is maintained following perturbations in the negative derivative feedback networks (d,f), unlike in positive feedback networks. In d and f, the remaining parameters are the same as in Figure 4.

Supplementary Figure S4. Integration of inputs in spiking networks with negative derivative feedback. Activity of the excitatory population in response to step-like inputs. The 3 traces show the responses to inputs with 3 different strengths. The network structure is the same as in Fig. 5. Instantaneous, population-averaged activity of the excitatory neurons was computed within time bins of two different widths (gray and black traces).

Supplementary Figure S5. Robust memory performance in networks of two competing populations with negative derivative feedback. a-g, Firing rates of the two competing populations (solid and dashed) with a 5% increase in the intrinsic gains of one population (a), the other population (b), or both (c), and with a 5% increase in the strengths of the external inputs to both populations (d), or of the recurrent synapses from one population (e), the other population (f), or both (g). h, Firing rates of the two populations with Gaussian white noise presented with the stimulus onset in the external inputs to both populations.

Supplementary Figure S6. Plasticity rule that recovers persistent activity and the balance condition in negative derivative feedback networks. Illustration that the balance condition and persistent activity in negative derivative feedback networks can be obtained through a differential Hebbian learning rule in the recurrent synapses onto excitatory neurons. We consider a learning rule for stabilizing persistent activity adapted from that of Xie and Seung [53] and having the form τ_learning dW_ij/dt = c_ij (dr_i/dt) r_j for |dr_i/dt| < K, and τ_learning dW_ij/dt = c_ij K sgn(dr_i/dt) r_j for |dr_i/dt| > K, where sgn(x) = x/|x| gives the sign of x and K gives the maximum amplitude derivative that can be sensed by the learning mechanism. As shown in Xie and Seung [53], this form can be derived from a spike-timing dependent plasticity (STDP) rule in the limit that firing rates vary much more slowly than the width of the STDP window. Extending that work, we consider plasticity both in excitatory and inhibitory synapses onto the excitatory neurons, with anti-Hebbian plasticity in the excitatory synapses (c = -1) and Hebbian plasticity in the inhibitory synapses (c = +1). Plasticity in either E-to-E or I-to-E synapses alone produced similar results (data not shown). a-h, Recovery of persistent activity and the balance condition in circuits with the structure of the two-population memory circuit (a-d) or of the four-population push-pull circuit of Fig. 6 (e-h). In each network, the initial strength of the I-to-E connections is decreased 5% from perfect tuning, resulting in a balance ratio of 0.95 (b,f) and activity decaying rapidly to a baseline (c,g). As the balance condition recovers to nearly perfect tuning (b,f), the time constant of activity decay gets longer (a,e) until persistent activity is maintained nearly perfectly (d,h). Simulations shown used fixed values of the learning time constant τ_learning and the derivative bound K.
The equations and parameters for the firing rate models were the same as those for the corresponding main-text figures, but with a nonlinear firing rate vs. input current relationship as in Fig. 1c,d (bottom) to prevent negative firing rates. External inputs were presented every 3 seconds with strengths chosen independently and randomly from a uniform distribution.
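The clipped differential Hebbian rule quoted in the caption can be sketched directly in code. The sketch below is illustrative only: the function name and the default values of K and τ_learning are ours, not taken from the paper.

```python
import numpy as np

def dw_dt(c_ij, dr_dt, r_j, K=1.0, tau_learning=1.0):
    """Differential Hebbian update from the caption:
    tau_learning * dW_ij/dt = c_ij * (dr_i/dt) * r_j   for |dr_i/dt| < K,
    with the sensed derivative saturating at amplitude K otherwise."""
    sensed = dr_dt if abs(dr_dt) < K else K * np.sign(dr_dt)
    return c_ij * sensed * r_j / tau_learning

# Anti-Hebbian excitatory synapse (c = -1): activity drifting upward
# weakens the synapse, pushing the network back toward balance.
print(dw_dt(-1.0, 0.5, 2.0))   # -1.0
# Saturation: a fast derivative is sensed only up to amplitude K.
print(dw_dt(+1.0, 5.0, 2.0))   # 2.0
```

The saturation at K models the limited bandwidth of the biological derivative-sensing mechanism: very fast rate changes contribute no more than a change of amplitude K.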

Supplementary Figure S7. Negative derivative feedback networks with or without slow GABA_B-type inhibitory currents. a, Activity of the control network of Figures 3 and 4, in which neurons receive a mixture of NMDA- and AMPA-mediated excitatory currents, and only fast GABA_A-type inhibitory synaptic currents. The three different traces represent the response to three different amplitudes of transient input, as in Figure 1c. b-c, Activities of networks receiving the same mixture of NMDA- and AMPA-mediated excitatory currents, but with a mixture of fast GABA_A-type and slow GABA_B-type inhibitory synaptic currents. Even in the presence of slow inhibitory current, when GABA_B-type synaptic currents are equally present in the I-to-E and I-to-I connections, the time constant of decay of network activity is unchanged. This is because the network decay time constant depends only upon the difference in the average inhibitory time constants aver(τ_EI) = q_EI τ^GB + (1 − q_EI) τ^GA and aver(τ_II) = q_II τ^GB + (1 − q_II) τ^GA, and this difference remains zero. Here, the superscripts GA and GB denote the fast (GABA_A) and slow (GABA_B) components, and q_EI and q_II denote the proportions of GABA_B currents (a,b). Even if the fraction of GABA_B-type synaptic currents is somewhat higher in the I-to-E connection (panel c, where q_EI = 0.3), negative derivative feedback still arises due to the slower and more NMDA-dominant composition of receptors in the E-to-E connection (c). For the simulations shown here, the time constants and fractions of NMDA-mediated synaptic currents were the same as in Figure 4; in b the GABA_B fractions were equal in the two inhibitory connections, and in c the I-to-E fraction was higher.

Balanced cortical microcircuitry for maintaining short-term memory

Supplementary Modeling

Sukbin Lim, Mark S. Goldman

Table of Contents
1. Analytical description of firing rate model
  1.1. Simplified firing rate model illustrating negative derivative feedback
  1.2. Conditions for generation of persistent activity in full-dimensional models with linear dynamics
  1.3. Stability conditions for the derivative-feedback network
  1.4. Activity patterns during persistent firing and the optimal input direction
  1.5. Robustness against perturbations in the network connectivity
  1.6. Negative derivative feedback for networks of neurons with input-output nonlinearity
2. Analysis of firing rate models of two competing populations
  2.1. Previous models with positive feedback
  2.2. Construction of two competing populations with negative derivative feedback
  2.3. Robustness against perturbations in the network connectivity
3. Analysis of spiking network models
4. Parameters
  4.1. Firing rate model of a single population
  4.2. Spiking network model with leaky integrate-and-fire neurons
References

1. Analytical description of firing rate model

In this section, we provide the analytical calculations underlying the results on the firing rate model described in the main text, and additionally provide simplified versions of the firing rate model that elucidate the core principles underlying negative derivative feedback networks. Using a control-theoretic analysis, we find conditions on the network parameters for the network to generate persistent activity through derivative feedback control. We show that, unlike in previous models based on positive feedback, different temporal dynamics of recurrent excitatory and inhibitory inputs is critical to generating persistent activity through derivative feedback control. Furthermore, we show analytically that persistent firing in these networks is more robust against many natural perturbations than in traditional positive feedback based models.

The structure of this section is as follows. In Section 1.1, we first identify important features for derivative feedback control from a simple reduced-dimensionality firing rate model. In Section 1.2, we analyze the dynamics of the full-dimensional models of recurrently connected excitatory and inhibitory populations used in the main paper and find conditions on the network parameters for the models to generate persistent activity through positive feedback and/or negative-derivative feedback. Section 1.3 derives additional conditions assuring that the non-persistent modes of the derivative-feedback networks are stable. Section 1.4 describes the relationship between the rates of inhibitory and excitatory neurons during persistent firing, as well as the optimal input direction for driving maximal responses in derivative-feedback networks. In Section 1.5, we investigate the robustness of the maintenance of persistent activity against perturbations in the network connectivity parameters W_ij. In Section 1.6, we show how generation of persistent activity with negative derivative feedback control can be extended to networks whose neurons have a nonlinear firing rate versus input current relationship.

1.1. Simplified firing rate model illustrating negative derivative feedback

Here, we present a simplified network model that provides mathematical intuition for how derivative feedback can arise in a balanced network. Specifically, we show how derivative-like feedback arises from balanced positive feedback and negative feedback with different kinetics, and we relate the properties of this derivative-like feedback to the strengths and time scales of the positive and negative feedback pathways. The reader is referred to Sections 1.4 and 1.5 of this Supplement for rigorous derivations of the analogous properties in the full network of Fig. 1a. Consider a population of neurons that receives excitatory and inhibitory recurrent inputs with equal strengths but with different filtering time constants:

τ dr/dt = −r + W s_E − W s_I + I_O δ(t)
τ_E ds_E/dt = −s_E + r
τ_I ds_I/dt = −s_I + r.    (A1)

Here, r denotes the firing rate of a neuron with time constant τ, s_E denotes recurrent excitatory synaptic input that conveys positive feedback with time constant τ_E, and s_I similarly denotes recurrent inhibitory synaptic input conveying negative feedback with time constant τ_I. Excitatory and inhibitory synaptic inputs are assigned equal strengths W. External input is modeled as a brief, delta-function δ(t) pulse of input of strength I_O. The key feature of the above model is that the synaptic inputs conveying positive and negative feedback, s_E and s_I, arrive with equal strengths but offset kinetics due to the different time constants τ_E and τ_I. We next show that, due to this balance in strength but difference in kinetics of the individual synaptic inputs, the total recurrent input approximates derivative feedback for the low-frequency responses characteristic of persistent activity.

To show this, we recall that the Laplace transform of the time-derivative of a signal, dr(t)/dt, equals u R(u), where R(u) is the Laplace transform of r(t) and u is the complex-valued frequency. Using that the synaptic variables s_E and s_I are exponentially filtered transformations of the firing rate r(t), we obtain that the Laplace transform of the total recurrent input is

W [s_E − s_I] → W [1/(1 + τ_E u) − 1/(1 + τ_I u)] R(u) = −W (τ_E − τ_I) u R(u) / [(1 + τ_E u)(1 + τ_I u)],    (A2)

where R(u) is the amplitude of the activity r(t) at frequency u. For low frequencies u, W [s_E − s_I] ≈ −W (τ_E − τ_I) u R(u), which is a constant multiple of the Laplace transform of the derivative of the activity, u R(u). Thus, at the low frequencies characteristic of persistent activity, the difference between s_E and s_I is approximately proportional to the derivative of the activity dr/dt, defining the derivative feedback strength W_der of the main text:

W (s_E − s_I) ≈ −W (τ_E − τ_I) dr/dt ≡ −W_der dr/dt for low-frequency r.    (A3)

By contrast, at high frequencies (large u), W [s_E − s_I] → −[W (τ_E − τ_I)/(τ_E τ_I u)] R(u) ~ R(u)/u. This shows that high frequencies are suppressed, rather than differentiated, by the recurrent inputs.
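The frequency-domain behavior of Eq. (A2) is easy to check numerically. The sketch below uses hypothetical values of W, τ_E, and τ_I (not taken from the paper) and compares the exact recurrent gain with the derivative-feedback approximation of Eq. (A3):

```python
import numpy as np

W, tau_E, tau_I = 50.0, 0.1, 0.01          # hypothetical: slower excitation

def recurrent_gain(u):
    """W [S_E(u) - S_I(u)] / R(u), using S(u) = R(u)/(1 + tau*u) (Eq. A2)."""
    return W * (1.0 / (1.0 + tau_E * u) - 1.0 / (1.0 + tau_I * u))

u_low = 1e-3
exact = recurrent_gain(u_low)
approx = -W * (tau_E - tau_I) * u_low       # -W_der * u, as in Eq. (A3)
assert abs(exact - approx) / abs(approx) < 1e-3   # derivative-like at low u

# At high frequency the feedback is suppressed (~ R(u)/u), not differentiated:
assert abs(recurrent_gain(1e4)) < abs(W * (tau_E - tau_I) * 1e4) * 1e-3
```

The same two-line comparison at other frequencies traces out the crossover from differentiation at low u to suppression at high u.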
As noted in the main text, this may be a useful feature, because high frequencies are often associated with noise and would be amplified by an exact derivative feedback mechanism.

From the simple recurrent network defined in Eq. (A1), we can identify a few important features of negative derivative feedback. First, the time constant of network activity increases with the strength of the recurrent feedback and the difference between the time scales for excitatory and inhibitory feedback. From the phenomenological equation of the main text and Eq. (A3), the time constant of decay of the activity is

τ_eff = τ + W_der = τ + W (τ_E − τ_I) ≈ W (τ_E − τ_I) for large W.    (A4)

Second, although the negative derivative feedback network is resistant against drift of activity in the absence of external input, external input whose strength is comparable to that of the recurrent inputs results in a significant change of activity. For pulse-like input of strength I_O, as in Eq. (A1), the jump in activity is given by

Δr = I_O / τ_eff ≈ I_O / [W (τ_E − τ_I)].    (A5)

From the equation above, we see that Δr does not approach zero even with large W if the strength of the external input I_O scales similarly to the strength of the recurrent inputs. Indeed, since W represents the strength of the total (excitatory or inhibitory) recurrent synaptic connections averaged across the population, W should scale with the number of recurrent connections. Similarly, I_O scales with the number of connections onto the memory network from the population transmitting the information about the stimulus. Since the number of external connections scales with the network size in the same way as the number of recurrent connections, I_O should be of the same order as W. Thus, even with large derivative feedback, external inputs can produce large changes in the level of the persistent firing rate. In addition, for networks with separate excitatory and inhibitory populations, proper spatial arrangement of the external inputs can reduce the derivative feedback during stimulus presentation and thereby enhance the effect of the external inputs (see Section 1.4).

Finally, we note that this simple model is robust against uniform changes in neuronal gains or loss of a fraction of the neuronal population, because such changes maintain the balance of excitation and inhibition.
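Both features can be checked by direct simulation of Eq. (A1). The following sketch uses hypothetical parameter values (τ = 10 ms, τ_E = 100 ms, τ_I = 5 ms, W = 100; none taken from the paper) and forward-Euler integration, implementing the delta pulse as an initial jump of r by I_O/τ:

```python
import numpy as np

tau, tau_E, tau_I, W, I_O = 0.01, 0.1, 0.005, 100.0, 1.0   # hypothetical
tau_eff = tau + W * (tau_E - tau_I)      # Eq. (A4): ~9.5 s from ~100 ms parts

dt = 2e-5
r, sE, sI = I_O / tau, 0.0, 0.0          # delta-pulse input applied at t = 0
trace = []
for _ in range(int(1.5 / dt)):
    dr = (-r + W * (sE - sI)) / tau
    dsE = (-sE + r) / tau_E
    dsI = (-sI + r) / tau_I
    r, sE, sI = r + dt * dr, sE + dt * dsE, sI + dt * dsI
    trace.append(r)
trace = np.array(trace)

jump = trace[int(0.1 / dt)]              # plateau once fast transients settle
assert abs(jump - I_O / tau_eff) / (I_O / tau_eff) < 0.1       # Eq. (A5)

r1, r2 = trace[int(0.5 / dt)], trace[int(1.4 / dt)]
tau_measured = 0.9 / np.log(r1 / r2)     # decay constant of the slow mode
assert abs(tau_measured - tau_eff) / tau_eff < 0.1             # Eq. (A4)
```

Even though every intrinsic time constant in the sketch is 100 ms or shorter, the measured decay time is close to τ_eff ≈ 9.5 s, and the pulse response settles near I_O/τ_eff rather than I_O/τ.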
However, this simple model is not robust against perturbations in excitatory or inhibitory synapses, since these disrupt the balance between excitation and inhibition. This is a critical difference from the full-dimensional models described below, which exhibit robustness against perturbations in recurrent excitatory or inhibitory synapses (see Section 1.5).

1.2. Conditions for generation of persistent activity in full-dimensional models with linear dynamics

In this section and the remainder of Section 1, we analytically derive conditions for producing stable persistent activity in linear networks consisting of one excitatory and one

inhibitory population. Through this analysis, we separately identify parameter regimes for positive feedback control and derivative feedback control, and show that derivative feedback control requires recurrent excitation and inhibition to exhibit a close balance in strength but different temporal dynamics.

MATHEMATICAL CONDITIONS FOR GENERATION OF PERSISTENT ACTIVITY

To analyze the linear network, we use the eigenvector decomposition to decompose the coupled neuronal activities into non-interacting modes (eigenvectors) that can be considered independently [1]. For a linear network obeying the equation dy/dt = Ay, the right eigenvectors q_i and corresponding eigenvalues λ_i of the matrix A satisfy the equation A q_i = λ_i q_i for each i = 1 to n, where n denotes the number of state variables. The decay of each mode is exponential with time constant τ_i = −1/λ_i. For the system defined by Eq. (5) in the main text:

τ_E dr_E/dt = −r_E + f_E(s_EE − s_EI + I_OE(t))
τ_I dr_I/dt = −r_I + f_I(s_IE − s_II + I_OI(t))
τ_ij ds_ij/dt = −s_ij + W_ij r_j for i, j = E or I,    (A6)

where for the linear case f_E(x) = f_I(x) = x, y = (r_E, r_I, s_EE, s_IE, s_EI, s_II)^T and the matrix A is given by

A = [ −1/τ_E      0           1/τ_E     0          −1/τ_E     0
      0           −1/τ_I      0         1/τ_I      0          −1/τ_I
      W_EE/τ_EE   0           −1/τ_EE   0          0          0
      W_IE/τ_IE   0           0         −1/τ_IE    0          0
      0           W_EI/τ_EI   0         0          −1/τ_EI    0
      0           W_II/τ_II   0         0          0          −1/τ_II ].    (A7)

For persistent firing (λ ≈ 0, τ_eff large), the system dy/dt = Ay defined by Eq. (A7) should have at least one eigenvector with its corresponding eigenvalue equal to or close to 0. Below we show two different manners by which one can obtain an eigenvalue equal to or close to 0 in networks of recurrently connected excitatory and inhibitory populations. One case corresponds to positive feedback based models, and the other corresponds to negative derivative feedback based models. In the former case, the recurrent connections in the network mediate positive feedback that precisely offsets the intrinsic leakiness of neurons [2-4], where this leakiness is represented mathematically by the decay terms −r_E and −r_I in equation (A6). In the latter case, the recurrent

feedback may not cancel the intrinsic leakiness precisely; instead, the recurrent connections mediate a balance between large positive and negative feedback that are offset in time, resulting in derivative-like feedback that opposes any drifts in activity. In the following, we identify these two parameter regimes in linear firing rate models.

To find the conditions on network parameters for which the network has an eigenvalue equal to or close to 0, we utilize the characteristic function of the linear system. The characteristic polynomial of a linear system is defined by char(x) = det(A − xI), where I is the n-by-n identity matrix. Eigenvalues λ of the system correspond to roots of the characteristic polynomial. In our 6-dimensional network model described by the matrix of Eq. (A7), the characteristic polynomial is given by

char(x) = det(A − xI) = x^6 + a_5 x^5 + a_4 x^4 + a_3 x^3 + a_2 x^2 + a_1 x + a_0 = (x − λ_1)(x − λ_2)⋯(x − λ_6),    (A8)

where the coefficients a_i of char(x) are functions of the network parameters W_ij and τ_ij, with i,j = E or I, and can be expressed in terms of the eigenvalues. We examine the conditions for this characteristic polynomial to have roots whose values are 0 or close to 0. In particular, the constant term a_0 of the characteristic polynomial char(x) determines whether char(x) has a zero-valued root, since a_0 is proportional to the product of all eigenvalues of A. However, this condition only determines the parameter sets having a precisely 0 eigenvalue. In the case that there is one eigenvalue λ close to 0 and another eigenvalue whose magnitude is larger than the magnitude of 1/λ, the product of all eigenvalues represented by a_0 can be finite. Additionally, the ratio between the coefficient a_1 of the x term and the constant term a_0 can be used to identify a parameter set which allows the system to have an eigenvalue close to 0. This can be shown using the expressions for a_0 and a_1 in terms of the eigenvalues:

a_0 = (−1)^n Π_i λ_i,  a_1 = (−1)^(n−1) Σ_i Π_{j≠i} λ_j,  so that  a_1/a_0 = −Σ_i 1/λ_i.    (A9)

If an eigenvalue λ is close to zero, the magnitude of its reciprocal 1/λ will be large. Thus, if the magnitude of the ratio between a_1 and a_0 is large, there exists at least one eigenvalue close to zero.
(Note that this condition is a sufficient but not necessary condition for the existence of an eigenvalue close to 0. In the case that there exist multiple eigenvalues close to zero having different signs, the reciprocal of each eigenvalue can be large but the sum can be finite due to cancellation.)
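As a concrete illustration of this diagnostic (using toy eigenvalues chosen by us, not derived from the network), consider a system with one eigenvalue near zero among fast ones:

```python
import numpy as np

lams = np.array([-0.01, -5.0, -7.0])   # one slow mode among fast ones
A = np.diag(lams)
# np.poly returns the coefficients of char(x) = det(xI - A)
coeffs = np.poly(A)
a0, a1 = coeffs[-1], coeffs[-2]
assert np.isclose(a1 / a0, -np.sum(1.0 / lams))   # the identity of Eq. (A9)
assert abs(a1 / a0) > 50               # large ratio <=> eigenvalue near zero
```

Here a_1/a_0 ≈ 100 even though a_0 itself is finite, which is exactly the signature exploited in the derivative-feedback regime below.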

To find conditions on the network parameters for having an eigenvalue equal to or close to 0, we use the explicit expressions for a_0 and a_1 in terms of W_ij and τ_ij. The constant term is

a_0 = [(1 − W_EE)(1 + W_II) + W_EI W_IE] / (τ_E τ_I τ_EE τ_IE τ_EI τ_II),

and a_1 is a similar expression in which each term carries an additional factor of one of the time constants. To leading order in the W_ij, their ratio is

a_1/a_0 ≈ [W_EI W_IE (τ_EE + τ_II) − W_EE W_II (τ_EI + τ_IE)] / [(1 − W_EE)(1 + W_II) + W_EI W_IE].    (A10)

In the above expression, the ratio between a_1 and a_0 becomes large either when the denominator is small (corresponding to small a_0) or when the numerator is much larger than the denominator, if the denominator is not close to zero. Below, we show that the former provides a condition for positive feedback networks, and the latter provides conditions for negative derivative feedback networks.

CONDITIONS FOR POSITIVE FEEDBACK NETWORKS

As described above, one condition that leads to an eigenvalue equal to 0 is to have the term a_0 of the characteristic polynomial of Eq. (A8) equal zero. From the set of equations above, this occurs when

(1 − W_EE)(1 + W_II) + W_EI W_IE = 0.    (A11)

Biologically, this condition corresponds to the precise cancellation of the intrinsic leakiness of the neurons by network-mediated positive feedback, a mechanism that has been suggested previously to underlie persistent firing [5]. To see how the above equation corresponds to such a mechanism, note that during persistent activity dy/dt = 0, so that the firing rate of the inhibitory population r_I in Eq. (A6) can be expressed in terms of the firing rate of the excitatory population r_E as r_I = [W_IE/(1 + W_II)] r_E. Then, in the equation for r_E, the inhibitory feedback strength through the inhibitory population becomes W_EI W_IE/(1 + W_II), and the net recurrent feedback strength becomes the difference between the excitatory synaptic strength and the strength of the inhibitory feedback, W_EE − W_EI W_IE/(1 + W_II). The amount of this net recurrent feedback precisely cancels the intrinsic leakiness if W_EE − W_EI W_IE/(1 + W_II) = 1, which is the condition given in Eq.

(A11). Thus, Eq. (A11) corresponds to the condition used by traditional positive feedback models, in which excess positive feedback is tuned to offset intrinsic neuronal leakiness.

CONDITIONS FOR NEGATIVE DERIVATIVE FEEDBACK NETWORKS

We next consider the alternative mathematical condition for having an eigenvalue close to 0, i.e. that the ratio between a_1 and a_0 becomes large (even if a_0 itself is not very close to zero). That is, even if (1 − W_EE)(1 + W_II) + W_EI W_IE is not small, the network can have an eigenvalue close to zero if the numerator in Eq. (A10) is relatively large compared to the term in the denominator. Here, we show that this condition leads to two core requirements for negative derivative feedback control: first, a balance between positive and negative feedback in strength and, second, slower positive than negative feedback.

Networks can have an eigenvalue close to 0, that is, large a_1/a_0 in Eq. (A10) with finite a_0 ∝ (1 − W_EE)(1 + W_II) + W_EI W_IE, in two ways: either having a large time constant τ (case 1) or, for finite τ, having large W_ij's under special relations between the W_ij's (case 2). In the first case, having long time constants of synapses obviously results in slow dynamics in the system and leads to slow decay of neural activity. Indeed, previous works have suggested that the use of long intrinsic or synaptic time constants may lessen the strictness of the tuning requirement that feedback connections must precisely offset intrinsic neuronal decay processes [6-8]. However, the slowest intrinsic time constant in most models is of order 100 ms (e.g., the time constant of NMDA decay kinetics), much shorter than observed memory periods of many seconds.

In the second case, the network can have an eigenvalue close to 0 with finite (1 − W_EE)(1 + W_II) + W_EI W_IE if the numerator is much larger than the denominator in Eq. (A10). As shown next, this can occur when the values of the W_ij's are large. In this case, we can approximate the numerator and the denominator of Eq. (A10) with their leading terms in the W_ij's, W_EI W_IE (τ_EE + τ_II) − W_EE W_II (τ_EI + τ_IE) and W_EI W_IE − W_EE W_II, respectively. A sufficient condition for the ratio of these terms to be large is then that:

W_EI W_IE (τ_EE + τ_II) − W_EE W_II (τ_EI + τ_IE) ~ O(W^2),    (A12)

W_EI W_IE − W_EE W_II ~ O(W), i.e., (W_EE W_II)/(W_EI W_IE) ~ 1.    (A13)
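These two requirements can be illustrated numerically. The sketch below uses hypothetical parameter values of our own choosing, and treats the firing rates as instantaneously tracking their inputs so that only the four synaptic variables remain; it then compares a balanced network with slower excitatory than inhibitory kinetics against the same weights with identical kinetics.

```python
import numpy as np

def synapse_matrix(W_EE, W_IE, W_EI, W_II, t_EE, t_IE, t_EI, t_II):
    """Reduced synaptic system with instantaneous rates:
    r_E = s_EE - s_EI, r_I = s_IE - s_II,
    tau_ij ds_ij/dt = -s_ij + W_ij r_j; state (s_EE, s_IE, s_EI, s_II)."""
    return np.array([
        [(W_EE - 1) / t_EE, 0, -W_EE / t_EE, 0],
        [W_IE / t_IE, -1 / t_IE, -W_IE / t_IE, 0],
        [0, W_EI / t_EI, -1 / t_EI, -W_EI / t_EI],
        [0, W_II / t_II, 0, -(W_II + 1) / t_II]])

def slowest_decay_time(A):
    ev = np.linalg.eigvals(A)
    assert np.all(ev.real < 0)           # all modes stable
    return -1.0 / ev.real.max()          # time constant of the slowest mode

# Balanced strong weights (W_EE*W_II ~ W_EI*W_IE), slower E than I kinetics:
slow = slowest_decay_time(
    synapse_matrix(100, 101, 101, 100, 0.1, 0.005, 0.005, 0.05))
# Same weights, identical kinetics: the derivative feedback disappears.
fast = slowest_decay_time(
    synapse_matrix(100, 101, 101, 100, 0.05, 0.05, 0.05, 0.05))
assert slow > 1.0     # seconds-long decay from synapses of <= 100 ms
assert fast < 0.2     # no slow mode without a kinetic offset
```

With balance and a kinetic offset the slowest mode decays over several seconds; equalizing the kinetics while leaving every weight unchanged removes the slow mode entirely, in line with the two conditions above.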

Networks with parameters satisfying the above conditions operate in a regime that corresponds to maintaining persistent firing through negative derivative feedback. To see this, recall from our discussion of the positive feedback mechanism that W_EI W_IE/(1 + W_II) represents the strength of inhibitory feedback onto the excitatory population through the inhibitory population, and W_EE is the strength of recurrent excitatory feedback onto the excitatory population (Fig. 1a). When the W_ij's are large, W_EI W_IE/(1 + W_II) ≈ W_EI W_IE/W_II; thus, Eq. (A13) implies that the strengths of the two feedbacks are similar, and we refer to this equation as the balance condition. The second condition, given by Eq. (A12), constrains the time constants of the positive and negative feedback. In Eq. (A12), the time constants multiplying the feedback strengths correspond to the time scales of the positive and negative feedback, that is, τ_EE + τ_II and τ_EI + τ_IE; thus, from Eq. (A12), τ_EE + τ_II ≠ τ_EI + τ_IE (below, in separate stability analyses in Section 1.3, we will show that τ_EE + τ_II must be greater than τ_EI + τ_IE). Qualitatively, τ_EI + τ_IE approximates the time for signals to traverse the negative feedback loop. Similarly, τ_EE is the time constant of the direct positive feedback onto the excitatory population, and τ_II represents the time constant for indirect positive feedback onto the excitatory population by suppressing the inhibitory population (Fig. 1a).

Note that the conditions corresponding to positive feedback networks and negative derivative feedback networks are not mutually exclusive. If the W_ij's are large, the condition for the positive feedback models given in Eq. (A11) becomes a subset of the balance condition for the negative derivative feedback models described in Eq. (A13). In particular, if the network satisfies both Eq. (A11) and Eq. (A12) for large W_ij's, that is, the amount of large positive feedback is similar to, but slightly larger than, that of negative feedback and the time scales of the two feedbacks are different, then the network receives large negative derivative feedback as well as additional positive feedback that precisely cancels off the intrinsic neuronal leakiness.
This corresponds to the hybrid positive feedback and negative derivative feedback model of Fig. 1c.

CONNECTION TO PHENOMENOLOGICAL AND SIMPLIFIED FIRING RATE MODELS

We next show how the negative-derivative feedback models described above can be directly connected to the simpler phenomenological model of the main text that was defined by overall positive feedback and negative-derivative feedback strengths W_pos and W_der, respectively. Specifically, for the case that the W_ij's are large, as in the negative derivative feedback networks, we express W_pos and W_der in terms of the synaptic strengths W_ij and their time constants τ_ij, and show that the amount of negative derivative feedback W_der is proportional to the product of the synaptic strength scale and the difference between τ_EE + τ_II and τ_EI + τ_IE.
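This claimed scaling can be checked against the eigenvalues of the reduced synaptic system (hypothetical parameters of our own choosing, with the firing rates treated as fast relative to the synapses):

```python
import numpy as np

# Hypothetical parameters: balanced strong weights, slower E than I kinetics.
W_EE, W_IE, W_EI, W_II = 100.0, 101.0, 101.0, 100.0
t_EE, t_IE, t_EI, t_II = 0.1, 0.005, 0.005, 0.05   # seconds

# Reduced synaptic system, state (s_EE, s_IE, s_EI, s_II).
A = np.array([[(W_EE - 1) / t_EE, 0, -W_EE / t_EE, 0],
              [W_IE / t_IE, -1 / t_IE, -W_IE / t_IE, 0],
              [0, W_EI / t_EI, -1 / t_EI, -W_EI / t_EI],
              [0, W_II / t_II, 0, -(W_II + 1) / t_II]])
tau_network = -1.0 / np.linalg.eigvals(A).real.max()

W_bal = W_EI * W_IE / (1 + W_II)   # balanced feedback strength
W_pos = W_EE - W_bal               # residual (net) positive feedback
# Predicted network time constant ~ W_bal * (time-constant difference) / (1 - W_pos)
predicted = W_bal * (t_EE + t_II - t_EI - t_IE) / (1 - W_pos)
assert abs(tau_network - predicted) / predicted < 0.1
```

For these values both the eigenvalue computation and the prediction give a network time constant of roughly 7 s, even though no individual time constant exceeds 100 ms.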

To derive the expression for the amounts of negative derivative feedback and positive feedback in terms of network parameters, we examine the expression for the longest time constant of decay of network activity, τ_network. Since τ_i = −1/λ_i, the longest time constant of decay of network activity τ_network is the reciprocal of the eigenvalue closest to 0, whose expression is given by Eq. (A10). Denoting the balanced amount of positive and negative feedback, W_EI W_IE/(W_II + 1), as W_bal, and the difference between them, W_EE − W_EI W_IE/(W_II + 1), as W_pos, the approximate τ_network from Eq. (A10) for large W_ij's is

τ_network ≈ c W_bal (τ_EE + τ_II − τ_EI − τ_IE) / (1 − W_pos),    (A14)

where c is a constant of order 1. The above expression is analogous to the effective network time constant of the phenomenological model of the main text, τ_eff = (τ + W_der)/(1 − W_pos), where τ was the intrinsic (cellular or synaptic) time constant. Thus, we identify W_pos with the residual positive feedback defined above and W_der ~ W_bal (τ_EE + τ_II − τ_EI − τ_IE). Thus, the amount of negative derivative feedback increases linearly with the W_ij's and with the difference between the time constants of positive and negative feedback, similar to what was found for the simplified firing rate model network of Section 1.1 (Eq. (A4)).

In summary, in this section we found the conditions for persistent firing with positive feedback or derivative feedback control. The derivative feedback models are distinct from the previously studied positive feedback models: they require a close balance between excitation and inhibition (Eq. (3) in the main text) and different kinetics of excitation and inhibition (Eq. (4) in the main text). However, the positive feedback models and derivative feedback models are not mutually exclusive, and we showed how a hybrid of the two models can be constructed.

1.3. Stability conditions for the derivative-feedback network

In the previous section, we discussed the conditions for the network parameters to have an eigenvalue equal to or close to zero. We found a new parameter regime in which the network uses a derivative-like feedback mechanism to maintain persistent activity.
Unlike previous models, the negative derivative feedback mechanism does not require perfect cancellation of intrinsic leakiness by positive feedback. Instead, it requires large positive and negative feedback inputs which balance each other but have different dynamics (Eqs. (A12) and (A13)). Here, we identify additional conditions on the network parameters for the networks to maintain persistent activity without unbounded growth of activity in the non-persistent modes. Specifically, the system requires that all eigenvalues except those close to 0 have a negative real part, and we refer to this as the stability condition for the network. In the following, we first show the necessary and sufficient stability conditions for a 4-dimensional reduced network in which the

intrinsic neuronal responses are assumed to be fast. Next, we show necessary stability conditions for the full 6-dimensional system.

STABILITY CONDITION FOR THE 4-DIMENSIONAL SYSTEM

To simplify the analytical calculation of the stability condition, here we assume the dynamics of the firing rates is rapid [9], so that the firing rates instantaneously follow their inputs in Eq. (A6). That is, τ_E and τ_I are considered small and the dynamics is reduced to the 4-dimensional system:

r_E = s_EE − s_EI + I_OE(t)
r_I = s_IE − s_II + I_OI(t)
τ_ij ds_ij/dt = −s_ij + W_ij r_j for i, j = E or I.    (A15)

To determine the signs of the eigenvalues in this 4-dimensional system, we use the well-known stability test for linear systems, the Routh stability criterion [10]. In the Routh stability criterion, the number of eigenvalues with positive real part is determined by examining functions of the coefficients of the characteristic polynomial char(x) = a_n x^n + a_{n-1} x^{n-1} + … + a_1 x + a_0 through the use of a Routh table, defined as follows:

x^n     | a_n      a_{n-2}   a_{n-4}   …
x^{n-1} | a_{n-1}  a_{n-3}   a_{n-5}   …
x^{n-2} | b_1      b_2       b_3       …
x^{n-3} | c_1      c_2       c_3       …
⋮

with b_1 = (a_{n-1} a_{n-2} − a_n a_{n-3})/a_{n-1}, b_2 = (a_{n-1} a_{n-4} − a_n a_{n-5})/a_{n-1}, …, and c_1 = (b_1 a_{n-3} − a_{n-1} b_2)/b_1, c_2 = (b_1 a_{n-5} − a_{n-1} b_3)/b_1, and so on. The number of roots with positive real parts is equal to the number of changes of sign of the elements of the first column of the Routh table.

The persistent activity networks considered here contain an eigenvalue close to 0, which can have either positive or negative real part, so the system is marginally stable. Thus, before directly applying the Routh-Hurwitz criterion to the characteristic polynomial of the system of Eq. (A15), we factor out the root of the characteristic polynomial whose value is close to 0, denoted by λ, from the characteristic polynomial:

x^4 + a_3 x^3 + a_2 x^2 + a_1 x + a_0
= (x − λ)[x^3 + (a_3 + λ) x^2 + (a_2 + λ(a_3 + λ)) x + (a_1 + λ(a_2 + λ(a_3 + λ)))]
+ [a_0 + λ(a_1 + λ(a_2 + λ(a_3 + λ)))].

Here, the remainder a_0 + λ(a_1 + λ(a_2 + λ(a_3 + λ))) equals 0, since λ is a root of the characteristic polynomial. In particular, if the system has only one eigenvalue close to 0, then from Eq. (A9), λ can be approximated by −a_0/a_1, and the quotient function becomes

Q(x) = x^3 + (a_3 − a_0/a_1) x^2 + (a_2 − (a_0/a_1)(a_3 − a_0/a_1)) x + (a_1 − (a_0/a_1)(a_2 − (a_0/a_1)(a_3 − a_0/a_1))).    (A16)

Thus, we apply the Routh stability criterion to this third-order quotient function. Moreover, using that the W_ij's are large, we further approximate the coefficients with their leading terms in W as follows:

a_3 ≈ W_II/τ_II − W_EE/τ_EE ~ O(W)
a_2 ≈ W_EI W_IE/(τ_EI τ_IE) − W_EE W_II/(τ_EE τ_II) ~ O(W^2)
a_1 ≈ [W_EI W_IE (τ_EE + τ_II) − W_EE W_II (τ_EI + τ_IE)] / (τ_EE τ_IE τ_EI τ_II) ~ O(W^2)
a_0 = [(1 − W_EE)(1 + W_II) + W_EI W_IE] / (τ_EE τ_IE τ_EI τ_II) ~ O(W).

Note that a_0 is at most of order W, since in the balance condition in Eq. (A13) we additionally assume that the difference between the strengths of positive and negative feedback is of order 1, or equivalently, W_EI W_IE − W_EE W_II ~ O(W). Applying the Routh-Hurwitz criterion to these asymptotic expressions for the coefficients of the quotient function Q(x) in Eq. (A16), we obtain the stability conditions

W_EE/τ_EE < W_II/τ_II
τ_EE τ_II > τ_EI τ_IE
τ_EE + τ_II > τ_EI + τ_IE.    (A17)

The last condition is similar to Eq. (A12), which showed that the time scales for the positive and negative feedback must be different to have stable persistent firing. The stability condition above additionally specifies that the positive feedback should be slower than the

20 negatve feedback. The second condton above s smlar to the last condton except that t constrans the product of the tme constants. The frst condton compares the magntudes of recurrent exctaton and recurrent nhbton; that s, for other non-persstent modes to be stable, the normalzed strength of the nhbtory feedback must be larger than that of the exctatory feedback. STABLTY CONDTON FOR TH 6-DMNSONAL SYSTM For the full 6-dmensonal system gven n q.x(a6)x, the complete stablty condtons also can be calculated by the Routh stablty crteron. However, the stablty condtons are far more complcated expressons n terms of the network parameters. Here, for ease of nterpretaton, we nstead provde smpler, necessary condtons for stablty. These necessary condtons are determned by the sgn of the coeffcents of the characterstc polynomal. To have a stable system, all egenvalues must be negatve and, correspondngly, the coeffcents of the characterstc polynomal must all be postve. However, n our persstent actvty network, the leadng egenvalue (the one close to ) may be slghtly postve (correspondng to very slow growth of actvty n the persstent mode). Therefore, as n the 4-dmensonal system, we factor out the egenvalue close to, denoted by λ, and fnd condtons for all coeffcents of the quotent functon to be postve. To leadng order n the s, these condtons are gven by. (A8) The last two condtons are the same as those obtaned for the 4-dmensonal system (q. X(A7)X). The frst condton s smlar to the 4-D case, but now the tme constants of the populaton actvty, τ and τ, contrbute to the postve and negatve feedback smlarly to the synaptc tme constants, τ and τ, respectvely. The second condton s smlar to the frst condton, but wth extra terms contanng the varous τ s. Thus, havng slower exctatory tme constants than nhbtory ones s benefcal to stable persstent frng. 
The sufficient and necessary conditions obtained through the Routh stability criterion also follow these general rules, but have much more complicated forms and thus are not shown here.
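As a numerical sanity check on these stability claims (not part of the original analysis), one can build the reduced 4-dimensional synaptic system with balanced, large synaptic weights satisfying the conditions above and inspect its spectrum. The parameter values below are illustrative choices of my own, not the paper's fitted values:

```python
# Reduced 4-d synaptic system (s_EE, s_IE, s_EI, s_II) under exact balance
# J_EE * J_II = J_EI * J_IE, with slow E-to-E synapses. All numbers are
# illustrative assumptions, not the paper's parameters.
import numpy as np

J_EE = J_IE = J_EI = J_II = 20.0                 # balanced and large
tEE, tIE, tEI, tII = 0.100, 0.025, 0.010, 0.010  # time constants (s)

# The three stability conditions derived in the text hold for these values.
assert J_EE / tEE < J_II / tII
assert tEE * tII > tEI * tIE
assert tEE + tII > tEI + tIE

A = np.array([
    [(J_EE - 1) / tEE, 0.0, -J_EI / tEE, 0.0],
    [J_EE / tIE, -1.0 / tIE, -J_EI / tIE, 0.0],
    [0.0, J_IE / tEI, -1.0 / tEI, -J_II / tEI],
    [0.0, J_IE / tII, 0.0, -(J_II + 1) / tII],
])
ev = np.sort(np.linalg.eigvals(A).real)
print(ev)  # one eigenvalue near zero (slow memory mode); the rest strongly negative
```

With these values the slow mode decays on a multi-second timescale while all other modes relax within milliseconds, which is the signature of persistence through balanced derivative feedback rather than fine-tuned positive feedback.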

B.4. Activity patterns during persistent firing and the optimal input direction

In this section, we analytically obtain the activity patterns observed during persistent firing and the optimal input direction that maximizes the response of the network. We show below that the firing rates of the excitatory and inhibitory populations change proportionally for different levels of persistent firing, as has been observed experimentally [11]. On the other hand, we show that the response to external input is maximized when the external input excites the excitatory neurons and suppresses the inhibitory ones, as has been suggested to lead to a transient amplification of activity in sensory networks composed of excitatory and inhibitory populations [12].

To find the activity pattern for persistent firing and the best input direction, we decompose the network activity into its eigenvector components. In a linear system that is eigenvector-decomposable, the network activity in response to a transient input can be described by its eigenvalues and corresponding eigenvectors. In particular, when one eigenvalue has real part much larger than those of the remaining eigenvalues, the network activity can be expressed approximately in terms of this leading eigenvalue and its corresponding left and right eigenvectors [13]. Since the system with derivative feedback discussed in the previous sections has one eigenvalue λ_0 close to 0 and the remaining eigenvalues have real parts strictly less than 0, the network activity with derivative feedback is well-described by

y(t) = e^{At} (y(0) + v) ≈ e^{λ_0 t} (q_l^T (y(0) + v)) q_r,   (A19)

where A is the matrix defined in Eq. (A7), y(0) is the vector of states before the arrival of the transient input, and v is the external input vector.

In Eq. (A19), if y(0) = 0, the right eigenvector q_r corresponds to the activity pattern that is maintained during persistent firing, and the amplitude of this pattern is proportional to q_l^T v, that is, the projection of the input vector v onto the left eigenvector q_l. Thus, the ratio between r_E and r_I is proportional to the ratio between the first and second elements of q_r.
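Both sign properties claimed above (E and I rates covarying in the persistent pattern, and an optimal input of opposite signs onto E and I) can be checked numerically on the reduced 4-dimensional synaptic system, rebuilt here so the sketch is self-contained. The parameters are illustrative assumptions of mine, not the paper's, and external input to the E (resp. I) population is taken to enter through the synapses driven by E (resp. I) cells:

```python
# Leading right/left eigenvectors of the reduced 4-d synaptic system.
# r_E = J_EE s_EE - J_EI s_EI and r_I = J_IE s_IE - J_II s_II; input to the
# E population perturbs (s_EE, s_IE), input to the I population (s_EI, s_II).
import numpy as np

J_EE = J_IE = J_EI = J_II = 20.0                 # balanced, illustrative
tEE, tIE, tEI, tII = 0.100, 0.025, 0.010, 0.010  # time constants (s)

A = np.array([
    [(J_EE - 1) / tEE, 0.0, -J_EI / tEE, 0.0],
    [J_EE / tIE, -1.0 / tIE, -J_EI / tIE, 0.0],
    [0.0, J_IE / tEI, -1.0 / tEI, -J_II / tEI],
    [0.0, J_IE / tII, 0.0, -(J_II + 1) / tII],
])

w, V = np.linalg.eig(A)
q_r = V[:, np.argmax(w.real)].real            # persistent-activity pattern
wl, U = np.linalg.eig(A.T)
q_l = U[:, np.argmax(wl.real)].real           # optimal input direction

r_E = J_EE * q_r[0] - J_EI * q_r[2]           # E rate in the persistent mode
r_I = J_IE * q_r[1] - J_II * q_r[3]           # I rate in the persistent mode
proj_E = q_l @ np.array([1 / tEE, 1 / tIE, 0.0, 0.0])  # sensitivity to E input
proj_I = q_l @ np.array([0.0, 0.0, 1 / tEI, 1 / tII])  # sensitivity to I input

print(r_E * r_I > 0)        # True: E and I rates covary in the persistent mode
print(proj_E * proj_I < 0)  # True: optimal input has opposite signs on E and I
```

Because an eigenvector's overall sign is arbitrary, both checks are phrased as sign-invariant products.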
In the derivative feedback networks, q_r can be found through its defining equation A q_r = λ_0 q_r ≈ 0. Setting the right-hand sides of the dynamical equations approximately to zero, each synaptic variable equals the firing rate of its presynaptic population, s_EE = s_IE = r_E and s_EI = s_II = r_I, so that q_r is expressed in terms of the network parameters according to

q_r = (r_E, r_I, s_EE, s_IE, s_EI, s_II)^T ∝ (1, ρ, 1, 1, ρ, ρ),   with ρ = J_IE / (1 + J_II) ≈ (J_EE − 1) / J_EI,

since the two expressions for ρ coincide under the balance condition, and the ratio between r_E and r_I during persistent firing is (1 + J_II)/J_IE ≈ J_EI/J_EE. Since the J_ij's are positive, this ratio is positive; that is, r_E and r_I positively covary for different levels of persistent firing.

The left eigenvector q_l can be computed similarly from its definition A^T q_l = λ_0 q_l ≈ 0. The expression for q_l in terms of the network parameters is found, to leading order in the J's, to be

q_l ∝ (τ_E, −(J_EE/J_IE) τ_I, J_EE τ_EE, −J_EE τ_IE, −J_EI τ_EI, (J_EE J_II/J_IE) τ_II)^T.

Notably, the first and second elements of q_l have different signs. Since the amplitude of persistent activity is proportional to q_l^T v, which is maximized when v is parallel to q_l, the optimal external input to the excitatory and inhibitory populations should have different signs, that is, excite the excitatory populations and suppress the inhibitory populations. Note, by contrast, that the activities of the excitatory and inhibitory populations during persistent firing have the same sign. This difference between the persistent activity pattern and the optimal direction of the input, that is, the difference between the left and right eigenvectors, arises from the asymmetry of the network connectivity. Thus, it is inherent in networks of excitatory and inhibitory populations [12].

In summary, in this section we found the activity patterns during persistent firing and the external input pattern that maximizes the network response. We showed that the firing rates of the excitatory and inhibitory populations positively covary for different levels of persistent activity, while the maximal response is attained in response to input that excites the excitatory population and inhibits the inhibitory population.

B.5. Robustness against perturbations in the network connectivity

In this section, we study the effects of perturbations in the network connectivity on the ability to maintain persistent activity. We find that persistent firing in the derivative feedback network is robust against many commonly studied perturbations such as gain changes, changes of excitation or inhibition, and inactivation of a fraction of the excitatory or inhibitory populations (Fig. 4). To show this robustness, we check how the balance condition Eq. (A13) is affected under such perturbations.
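As a preview of the computation, the balance check under multiplicative scaling of the synaptic strengths (each J_ij scaled by a factor m_ij, as introduced below) amounts to a one-line ratio. The scaling values in this sketch are arbitrary illustrations:

```python
# The balance condition under multiplicative scalings m_ij of the synaptic
# strengths reduces to the ratio below. Scaling values are arbitrary
# illustrations of the perturbations discussed in the text.
def balance_ratio(m_EE=1.0, m_IE=1.0, m_EI=1.0, m_II=1.0, J=20.0):
    # (m_EE J_EE)(m_II J_II) / ((m_EI J_EI)(m_IE J_IE)), with all J_ij = J here
    return (m_EE * J) * (m_II * J) / ((m_EI * J) * (m_IE * J))

print(balance_ratio(1.3, 1.3, 1.3, 1.3))  # 1.0 -> uniform gain change: balance kept
print(balance_ratio(m_EE=0.7, m_IE=0.7))  # 1.0 -> loss of presynaptic E drive: balance kept
print(balance_ratio(m_EE=0.7, m_EI=0.7))  # 1.0 -> postsynaptic change in E cells: balance kept
print(balance_ratio(m_EE=1.3))            # 1.3 -> E-to-E-only change breaks the balance
```

Any perturbation that enters both the numerator and the denominator cancels, which is the algebraic content of the robustness argument below.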

We examine the types of perturbations of the network parameters under which the system still maintains persistent activity, that is, has an eigenvalue close to 0 and satisfies the stability conditions. In particular, we consider multiplicative scalings m_ij of the synaptic strengths, that is, the synaptic strengths become m_ij J_ij. Then, gain control in the entire population corresponds to a uniform increase or decrease of all m_ij's, and selective gain control in the excitatory or inhibitory population corresponds to a uniform increase in m_Ej or m_Ij for j = E, I, and O. Similarly, inactivation (or loss) of a subpopulation of the excitatory or inhibitory populations, or presynaptic changes in transmission, can be modeled by multiplicative changes in the strengths of excitatory or inhibitory synapses, corresponding to uniform increases in the m_iE's and m_iO's, or in the m_iI's, respectively, for i = E or I. Under this multiplicative change in the synaptic strengths, the balance condition for the existence of an eigenvalue close to 0 becomes m_EE J_EE m_II J_II / (m_EI J_EI m_IE J_IE) ≈ 1, i.e., m_EE m_II / (m_EI m_IE) ≈ 1.

First, we note that for changes in intrinsic neuronal gains in the entire network, that is, a uniform increase in the m_ij's, this condition is satisfied. This reflects that, since the positive and negative feedback change in the same manner, the net recurrent input continues to provide derivative feedback (Fig. 4i). Second, we see that multiplicative changes in the gain of the excitatory or inhibitory population, or changes in excitatory (Fig. 4j) or inhibitory (Fig. 4k) synapses, or inactivation of a subpopulation of the excitatory or inhibitory populations, similarly maintain the balance condition, since presynaptic excitation, presynaptic inhibition, postsynaptic excitation, and postsynaptic inhibition each appear in both the numerator and the denominator of the above expression.

The stability conditions given in Eq. (A17) also are satisfied under moderate perturbations in synaptic strengths. Only the first condition in Eq. (A17) depends on the synaptic strengths, requiring that m_EE J_EE/τ_EE not exceed m_II J_II/τ_II. In our models with biologically plausible parameters, τ_EE is an order of magnitude larger than τ_II and J_EE is of the same order as J_II.
Thus, even in the presence of perturbations which increase or decrease the J's, the system satisfies the stability conditions for a large range of perturbations. However, too large an increase in the overall excitatory input to the system could break the stability condition and make the system unstable.

We remark that the derivative feedback models are not robust against all forms of perturbations. For example, if the NMDA conductance is larger in excitatory-to-excitatory than in excitatory-to-inhibitory connections, perturbation specifically of NMDA-type synapses disrupts persistent firing, since increasing m_EE more than m_IE breaks the balance condition m_EE m_II / (m_EI m_IE) ≈ 1. However, the disruption resulting from this deviation in the balance condition is similar to that observed in positive feedback models: if only m_EE changes while all other m_ij = 1, the time constant of the network activity becomes

τ_network ≈ (m_EE τ_EE J_EE − τ_EI J_EI J_IE/(1 + J_II)) / ((1 − m_EE) J_EE) ≈ (m_EE τ_EE − τ_EI) / (1 − m_EE) ≈ τ_EE / (1 − m_EE).   (A20)

In the first approximation above, the balance condition Eq. (A13) is used to replace J_EI J_IE/(1 + J_II) by J_EE; the final approximation holds for m_EE near 1 and τ_EE ≫ τ_EI. The final expression above is similar to the time constant of decay τ/(1 − W_pos) in simple positive feedback networks (e.g., equation (1) of the main text when W_der = 0 and when the dominant intrinsic cellular or synaptic time constant is τ_EE). When W_pos is perturbed by m from 1, the time constant becomes τ/(1 − m), similar to Eq. (A20).

We note that negative derivative feedback models with NMDA-type synapses of approximately equal strength at all excitatory synapses, but with slower NMDA synapses in the E-to-E pathway, can be far more robust against perturbations in NMDA-type synapses. To see this, we consider network models in which all excitatory connections are mediated by two different types of synaptic currents, NMDA-mediated currents and AMPA-mediated currents (Fig. 3 and Online Methods Eq. (7)). If we assume that the ratios of NMDA- to AMPA-type synapses are the same in all excitatory pathways, so that q_EE = q_IE = q, but the NMDA-type synapses in the E-to-E connection have slower kinetics, τ^N_EE > τ^N_IE, then perturbations of the NMDA-type synapses by a fraction m maintain the balance condition as follows:

(1 − q_EE + q_EE m) / (1 − q_IE + q_IE m) = 1 + O((q_EE − q_IE)(m − 1)) = 1   for q_EE = q_IE = q.

Thus, the persistent activity is minimally affected by perturbations in NMDA-type synapses, in contrast to the gross disruptions that occur in pure positive feedback models or in derivative feedback models with NMDA-type synapses only in E-to-E connections. Even in the case that q_EE ≠ q_IE, the disruption of the persistent activity is less severe if the E-to-E NMDA-type synapses have relatively slow kinetics, τ^N_EE > τ^N_IE. By contrast, if all NMDA synapses have the same kinetics, τ^N_EE = τ^N_IE, and negative derivative feedback is accomplished by having stronger NMDA conductance in E-to-E connections, q_EE > q_IE, then the network will exhibit the same disruption in persistent activity as in negative derivative feedback models in which NMDA appears exclusively in E-to-E connections (calculations not shown). Thus, having slower NMDA-type synapses in E-to-E than in E-to-I connections [14-15] is advantageous in making the system more robust to disruptions of NMDA-type conductances.

B.6. Negative derivative feedback for networks of neurons with input-output nonlinearity

In this section, we consider a network model in which the individual neurons have a nonlinear firing rate versus input current relationship, and show that the network implements derivative feedback control under conditions similar to the linear networks. In the presence of such nonlinearity, global analysis of the network dynamics through the eigenvector decomposition is not possible. Instead, we identify possible sets of steady states and check the local stability around those steady states.

Let us assume that there exists a steady state. To characterize this steady state, we linearize the system locally around it. For this steady state to belong to a continuous attractor, there should exist at least one eigenvalue equal to or close to 0 in the local linearization. If we denote this steady state as y* = (r*_E, r*_I, s*_EE, s*_IE, s*_EI, s*_II)^T and move the origin to the steady state, the linearization of Eq. (A6) becomes

τ_E dδr_E/dt = −δr_E + c_E (J_EE δs_EE − J_EI δs_EI),   with c_E = f′(J_EE s*_EE − J_EI s*_EI)
τ_I dδr_I/dt = −δr_I + c_I (J_IE δs_IE − J_II δs_II),   with c_I = f′(J_IE s*_IE − J_II s*_II)
τ_EE dδs_EE/dt = −δs_EE + δr_E
τ_IE dδs_IE/dt = −δs_IE + δr_E                           (A21)
τ_EI dδs_EI/dt = −δs_EI + δr_I
τ_II dδs_II/dt = −δs_II + δr_I,

where f′(x) denotes the derivative of f(x) evaluated at x, δr_i = r_i − r*_i, and δs_ij = s_ij − s*_ij. Eq. (A21) is the same as the system with linear input-output relationships in the previous sections, but with different slopes c_E and c_I. Thus, for the system to have an eigenvalue close to 0 through negative derivative feedback, we obtain similar conditions but with the replacement of each J_ij by c_i J_ij for i = E or I, so that

c_E J_EE · c_I J_II ≈ c_E J_EI · c_I J_IE,   with c_i J_ij ~ O(J),

for large J's. If c_E and c_I are not too small, then the constants can be ignored and the above conditions are the same as for the linear dynamics given by Eq. (A12) and Eq. (A13). Thus, the conditions for negative derivative feedback do not depend on the specific form of the input-output nonlinearity in the regime in which the slopes of the input-output function are not small in magnitude. Typical input-output nonlinearities such as sigmoid functions have a non-zero slope away from the threshold and the saturation. Thus, a continuum set of steady states corresponding to persistent activity will be located in such a regime.

The stability conditions at each steady state (Eq. (A17)) do depend on c_E and c_I, as c_I J_II/τ_II > c_E J_EE/τ_EE. However, for τ_EE one to two orders of magnitude larger than τ_II, this condition can hold for a wide range of c_E and c_I. Thus, in contrast to positive feedback networks (Supplementary Fig. S1e,h), the memory performance in derivative feedback networks (Supplementary Fig. S1f,i) or hybrid networks containing a large derivative feedback component (Supplementary Fig. S1g,j) is robust to adding an input-output nonlinearity.

3B. Analysis of firing rate models of two competing populations

In the previous sections, we discussed a derivative feedback network model consisting of one excitatory population and one inhibitory population. In parametric working memory tasks [16] and decision making tasks such as two-alternative forced choice tasks [17], it has been suggested that there exist two competing populations whose firing rates vary in opposite directions as a function of the remembered stimulus parameter. In many traditional models, positive feedback within and between the populations has been utilized for the maintenance or integration of evidence toward one choice or another [16, 18]: when the two competing populations are connected through mutual inhibition or through a common inhibitory pool, this forms a disinhibitory positive feedback loop between the populations. Thus, in such models, both recurrent excitatory and recurrent inhibitory synaptic interactions provide positive feedback that prolongs the time constant of decay of the network activity. In contrast to these traditional models, we suggest a model of two competing populations based on negative derivative feedback. In Section 3B.1, we show that previously suggested model architectures for competing populations cannot generate persistent firing through derivative feedback control.
In Section 3B.2, we construct a new network model of two competing populations. In the new network architecture, we find conditions on the network parameters for derivative feedback control and describe its dynamical features. In Section 3B.3, we show that a network model with derivative feedback is robust against the same types of perturbations in network parameters considered previously.

3B.1. Previous models with positive feedback

In this section, we analyze previously proposed short-term memory models with two competing populations (Figs. 6a,b). Network interactions in these previous models mediated positive feedback through recurrent excitation within a population and mutual inhibition between the two populations. Here, we show that these network architectures cannot contain large derivative feedback in their synaptic interactions. The essence of the explanation is as follows: in previous models, inhibitory inputs are arranged as part of disinhibitory loops that contribute positive feedback to the system. Since the total amount of positive feedback to each neuron should be balanced with the intrinsic leakiness during persistent activity, the amounts of excitatory and inhibitory input are bounded in such positive feedback models. Thus, the models cannot have the large balanced excitatory and inhibitory inputs required for strong derivative feedback. Below, we prove this mathematically.

First, we consider a positive feedback model with disynaptic mutual inhibition as shown in Fig. 6a. It consists of two populations, each of which consists of excitatory and inhibitory sub-populations. The inhibitory neurons receive inputs from the excitatory neurons in the opposing population and inhibit the excitatory neurons in the same population. The system can be described by the state variables y = (r_E1, r_I1, s_EE1, s_IE1, s_EI1, s_II1, r_E2, r_I2, s_EE2, s_IE2, s_EI2, s_II2)^T, where E and I stand for the excitatory and inhibitory populations and the subscript 1 or 2 is the index of the population. To see that large excitatory and inhibitory inputs are not allowed, we simplify the system by assuming that all variables except s_EE1 and s_EE2 have fast kinetics, and approximate them as achieving their steady states instantaneously. Then the system is described by the following equations:

r_E1 = J_EE s_EE1 − J_EI s_EI1 + I_E1(t)
r_I1 = J_IE s_IE2 − J_II s_II1 + I_I1(t)
r_E2 = J_EE s_EE2 − J_EI s_EI2 + I_E2(t)
r_I2 = J_IE s_IE1 − J_II s_II2 + I_I2(t)                  (A22)
s_ij,k = r_j,k   for i, j = E or I and k = 1, 2, except when ij = EE,

τ_EE ds_EE,k/dt = −s_EE,k + r_E,k,   k = 1, 2.   (A23)

In the absence of external input, when I(t) = 0, the firing rates of the excitatory and inhibitory sub-populations can be expressed in terms of the slow variables s_EE1 and s_EE2 by solving Eq. (A22). Using these expressions, we obtain two conditions for the system defined by Eq. (A23) to have one eigenvalue close to zero and one negative eigenvalue, so that the system maintains persistent activity stably. The conditions are given by

J_EE + J_EI J_IE/(1 + J_II) ≈ 1,   J_EE − J_EI J_IE/(1 + J_II) < 1.   (A24)
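The boundedness argument behind these conditions can be checked with a small eigenvalue computation on the reduced two-variable system. In the sketch below, beta stands for the effective disynaptic inhibition gain J_EI J_IE/(1 + J_II); all numerical values are illustrative choices of mine:

```python
# Reduced two-variable dynamics tau ds/dt = -s + r(s) for the positive
# feedback model with disynaptic mutual inhibition. beta is the effective
# mutual-inhibition gain; values are illustrative, not fitted.
import numpy as np

def eigenvalues(J_EE, beta, tau=0.1):
    # Eliminating the fast variables gives r_E1 = J_EE s1 - beta r_E2 and
    # symmetrically; solving for the rates yields linear dynamics in (s1, s2),
    # whose modes are the sum and difference of the two populations.
    M = np.array([[J_EE, -beta * J_EE], [-beta * J_EE, J_EE]]) / (1 - beta**2)
    A = (M - np.eye(2)) / tau
    return np.sort(np.linalg.eigvals(A).real)

# Tuned for a marginal difference mode: J_EE = 1 - beta.
beta = 0.4
ev = eigenvalues(J_EE=1 - beta, beta=beta)
print(ev)  # one eigenvalue ~ 0 (memory mode), one negative (sum mode)
```

Marginality of the difference mode pins J_EE + beta at the leak value 1, so both J_EE and beta must stay below 1; this is the boundedness that rules out the large balanced excitation and inhibition needed for derivative feedback in this architecture.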


More information

Inductance Calculation for Conductors of Arbitrary Shape

Inductance Calculation for Conductors of Arbitrary Shape CRYO/02/028 Aprl 5, 2002 Inductance Calculaton for Conductors of Arbtrary Shape L. Bottura Dstrbuton: Internal Summary In ths note we descrbe a method for the numercal calculaton of nductances among conductors

More information

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems

Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems Numercal Analyss by Dr. Anta Pal Assstant Professor Department of Mathematcs Natonal Insttute of Technology Durgapur Durgapur-713209 emal: anta.bue@gmal.com 1 . Chapter 5 Soluton of System of Lnear Equatons

More information

LECTURE 9 CANONICAL CORRELATION ANALYSIS

LECTURE 9 CANONICAL CORRELATION ANALYSIS LECURE 9 CANONICAL CORRELAION ANALYSIS Introducton he concept of canoncal correlaton arses when we want to quantfy the assocatons between two sets of varables. For example, suppose that the frst set of

More information

Model of Neurons. CS 416 Artificial Intelligence. Early History of Neural Nets. Cybernetics. McCulloch-Pitts Neurons. Hebbian Modification.

Model of Neurons. CS 416 Artificial Intelligence. Early History of Neural Nets. Cybernetics. McCulloch-Pitts Neurons. Hebbian Modification. Page 1 Model of Neurons CS 416 Artfcal Intellgence Lecture 18 Neural Nets Chapter 20 Multple nputs/dendrtes (~10,000!!!) Cell body/soma performs computaton Sngle output/axon Computaton s typcally modeled

More information

Lecture Notes on Linear Regression

Lecture Notes on Linear Regression Lecture Notes on Lnear Regresson Feng L fl@sdueducn Shandong Unversty, Chna Lnear Regresson Problem In regresson problem, we am at predct a contnuous target value gven an nput feature vector We assume

More information

Implicit Integration Henyey Method

Implicit Integration Henyey Method Implct Integraton Henyey Method In realstc stellar evoluton codes nstead of a drect ntegraton usng for example the Runge-Kutta method one employs an teratve mplct technque. Ths s because the structure

More information

The Order Relation and Trace Inequalities for. Hermitian Operators

The Order Relation and Trace Inequalities for. Hermitian Operators Internatonal Mathematcal Forum, Vol 3, 08, no, 507-57 HIKARI Ltd, wwwm-hkarcom https://doorg/0988/mf088055 The Order Relaton and Trace Inequaltes for Hermtan Operators Y Huang School of Informaton Scence

More information

Rhythmic activity in neuronal ensembles in the presence of conduction delays

Rhythmic activity in neuronal ensembles in the presence of conduction delays Rhythmc actvty n neuronal ensembles n the presence of conducton delays Crstna Masoller Carme Torrent, Jord García Ojalvo Departament de Fsca Engnyera Nuclear Unverstat Poltecnca de Catalunya, Terrassa,

More information

C/CS/Phy191 Problem Set 3 Solutions Out: Oct 1, 2008., where ( 00. ), so the overall state of the system is ) ( ( ( ( 00 ± 11 ), Φ ± = 1

C/CS/Phy191 Problem Set 3 Solutions Out: Oct 1, 2008., where ( 00. ), so the overall state of the system is ) ( ( ( ( 00 ± 11 ), Φ ± = 1 C/CS/Phy9 Problem Set 3 Solutons Out: Oct, 8 Suppose you have two qubts n some arbtrary entangled state ψ You apply the teleportaton protocol to each of the qubts separately What s the resultng state obtaned

More information

U.C. Berkeley CS294: Spectral Methods and Expanders Handout 8 Luca Trevisan February 17, 2016

U.C. Berkeley CS294: Spectral Methods and Expanders Handout 8 Luca Trevisan February 17, 2016 U.C. Berkeley CS94: Spectral Methods and Expanders Handout 8 Luca Trevsan February 7, 06 Lecture 8: Spectral Algorthms Wrap-up In whch we talk about even more generalzatons of Cheeger s nequaltes, and

More information

Temperature. Chapter Heat Engine

Temperature. Chapter Heat Engine Chapter 3 Temperature In prevous chapters of these notes we ntroduced the Prncple of Maxmum ntropy as a technque for estmatng probablty dstrbutons consstent wth constrants. In Chapter 9 we dscussed the

More information

Section 8.3 Polar Form of Complex Numbers

Section 8.3 Polar Form of Complex Numbers 80 Chapter 8 Secton 8 Polar Form of Complex Numbers From prevous classes, you may have encountered magnary numbers the square roots of negatve numbers and, more generally, complex numbers whch are the

More information

j) = 1 (note sigma notation) ii. Continuous random variable (e.g. Normal distribution) 1. density function: f ( x) 0 and f ( x) dx = 1

j) = 1 (note sigma notation) ii. Continuous random variable (e.g. Normal distribution) 1. density function: f ( x) 0 and f ( x) dx = 1 Random varables Measure of central tendences and varablty (means and varances) Jont densty functons and ndependence Measures of assocaton (covarance and correlaton) Interestng result Condtonal dstrbutons

More information

SUPPLEMENTARY INFORMATION

SUPPLEMENTARY INFORMATION do: 0.08/nature09 I. Resonant absorpton of XUV pulses n Kr + usng the reduced densty matrx approach The quantum beats nvestgated n ths paper are the result of nterference between two exctaton paths of

More information

The Geometry of Logit and Probit

The Geometry of Logit and Probit The Geometry of Logt and Probt Ths short note s meant as a supplement to Chapters and 3 of Spatal Models of Parlamentary Votng and the notaton and reference to fgures n the text below s to those two chapters.

More information

MMA and GCMMA two methods for nonlinear optimization

MMA and GCMMA two methods for nonlinear optimization MMA and GCMMA two methods for nonlnear optmzaton Krster Svanberg Optmzaton and Systems Theory, KTH, Stockholm, Sweden. krlle@math.kth.se Ths note descrbes the algorthms used n the author s 2007 mplementatons

More information

Linear Regression Analysis: Terminology and Notation

Linear Regression Analysis: Terminology and Notation ECON 35* -- Secton : Basc Concepts of Regresson Analyss (Page ) Lnear Regresson Analyss: Termnology and Notaton Consder the generc verson of the smple (two-varable) lnear regresson model. It s represented

More information

Introduction to Vapor/Liquid Equilibrium, part 2. Raoult s Law:

Introduction to Vapor/Liquid Equilibrium, part 2. Raoult s Law: CE304, Sprng 2004 Lecture 4 Introducton to Vapor/Lqud Equlbrum, part 2 Raoult s Law: The smplest model that allows us do VLE calculatons s obtaned when we assume that the vapor phase s an deal gas, and

More information

Resource Allocation and Decision Analysis (ECON 8010) Spring 2014 Foundations of Regression Analysis

Resource Allocation and Decision Analysis (ECON 8010) Spring 2014 Foundations of Regression Analysis Resource Allocaton and Decson Analss (ECON 800) Sprng 04 Foundatons of Regresson Analss Readng: Regresson Analss (ECON 800 Coursepak, Page 3) Defntons and Concepts: Regresson Analss statstcal technques

More information

Mathematical Equivalence of Two Common Forms of Firing Rate Models of Neural Networks

Mathematical Equivalence of Two Common Forms of Firing Rate Models of Neural Networks NOTE Communcated by Terrence Sejnowsk Mathematcal Equvalence of Two Common Forms of Frng Rate Models of Neural Networks Kenneth D Mller ken@neurotheorycolumbaedu Center for Theoretcal Neuroscence, Dept

More information

Open Systems: Chemical Potential and Partial Molar Quantities Chemical Potential

Open Systems: Chemical Potential and Partial Molar Quantities Chemical Potential Open Systems: Chemcal Potental and Partal Molar Quanttes Chemcal Potental For closed systems, we have derved the followng relatonshps: du = TdS pdv dh = TdS + Vdp da = SdT pdv dg = VdP SdT For open systems,

More information

Lecture 4. Instructor: Haipeng Luo

Lecture 4. Instructor: Haipeng Luo Lecture 4 Instructor: Hapeng Luo In the followng lectures, we focus on the expert problem and study more adaptve algorthms. Although Hedge s proven to be worst-case optmal, one may wonder how well t would

More information

( ) = ( ) + ( 0) ) ( )

( ) = ( ) + ( 0) ) ( ) EETOMAGNETI OMPATIBIITY HANDBOOK 1 hapter 9: Transent Behavor n the Tme Doman 9.1 Desgn a crcut usng reasonable values for the components that s capable of provdng a tme delay of 100 ms to a dgtal sgnal.

More information

FTCS Solution to the Heat Equation

FTCS Solution to the Heat Equation FTCS Soluton to the Heat Equaton ME 448/548 Notes Gerald Recktenwald Portland State Unversty Department of Mechancal Engneerng gerry@pdx.edu ME 448/548: FTCS Soluton to the Heat Equaton Overvew 1. Use

More information

Stanford University CS359G: Graph Partitioning and Expanders Handout 4 Luca Trevisan January 13, 2011

Stanford University CS359G: Graph Partitioning and Expanders Handout 4 Luca Trevisan January 13, 2011 Stanford Unversty CS359G: Graph Parttonng and Expanders Handout 4 Luca Trevsan January 3, 0 Lecture 4 In whch we prove the dffcult drecton of Cheeger s nequalty. As n the past lectures, consder an undrected

More information

Hila Etzion. Min-Seok Pang

Hila Etzion. Min-Seok Pang RESERCH RTICLE COPLEENTRY ONLINE SERVICES IN COPETITIVE RKETS: INTINING PROFITILITY IN THE PRESENCE OF NETWORK EFFECTS Hla Etzon Department of Technology and Operatons, Stephen. Ross School of usness,

More information

Solving Nonlinear Differential Equations by a Neural Network Method

Solving Nonlinear Differential Equations by a Neural Network Method Solvng Nonlnear Dfferental Equatons by a Neural Network Method Luce P. Aarts and Peter Van der Veer Delft Unversty of Technology, Faculty of Cvlengneerng and Geoscences, Secton of Cvlengneerng Informatcs,

More information

Physics 5153 Classical Mechanics. Principle of Virtual Work-1

Physics 5153 Classical Mechanics. Principle of Virtual Work-1 P. Guterrez 1 Introducton Physcs 5153 Classcal Mechancs Prncple of Vrtual Work The frst varatonal prncple we encounter n mechancs s the prncple of vrtual work. It establshes the equlbrum condton of a mechancal

More information

ECE559VV Project Report

ECE559VV Project Report ECE559VV Project Report (Supplementary Notes Loc Xuan Bu I. MAX SUM-RATE SCHEDULING: THE UPLINK CASE We have seen (n the presentaton that, for downlnk (broadcast channels, the strategy maxmzng the sum-rate

More information

Physics 5153 Classical Mechanics. D Alembert s Principle and The Lagrangian-1

Physics 5153 Classical Mechanics. D Alembert s Principle and The Lagrangian-1 P. Guterrez Physcs 5153 Classcal Mechancs D Alembert s Prncple and The Lagrangan 1 Introducton The prncple of vrtual work provdes a method of solvng problems of statc equlbrum wthout havng to consder the

More information

Appendix B. The Finite Difference Scheme

Appendix B. The Finite Difference Scheme 140 APPENDIXES Appendx B. The Fnte Dfference Scheme In ths appendx we present numercal technques whch are used to approxmate solutons of system 3.1 3.3. A comprehensve treatment of theoretcal and mplementaton

More information

NON-CENTRAL 7-POINT FORMULA IN THE METHOD OF LINES FOR PARABOLIC AND BURGERS' EQUATIONS

NON-CENTRAL 7-POINT FORMULA IN THE METHOD OF LINES FOR PARABOLIC AND BURGERS' EQUATIONS IJRRAS 8 (3 September 011 www.arpapress.com/volumes/vol8issue3/ijrras_8_3_08.pdf NON-CENTRAL 7-POINT FORMULA IN THE METHOD OF LINES FOR PARABOLIC AND BURGERS' EQUATIONS H.O. Bakodah Dept. of Mathematc

More information

2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification

2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification E395 - Pattern Recognton Solutons to Introducton to Pattern Recognton, Chapter : Bayesan pattern classfcaton Preface Ths document s a soluton manual for selected exercses from Introducton to Pattern Recognton

More information

VQ widely used in coding speech, image, and video

VQ widely used in coding speech, image, and video at Scalar quantzers are specal cases of vector quantzers (VQ): they are constraned to look at one sample at a tme (memoryless) VQ does not have such constrant better RD perfomance expected Source codng

More information

2016 Wiley. Study Session 2: Ethical and Professional Standards Application

2016 Wiley. Study Session 2: Ethical and Professional Standards Application 6 Wley Study Sesson : Ethcal and Professonal Standards Applcaton LESSON : CORRECTION ANALYSIS Readng 9: Correlaton and Regresson LOS 9a: Calculate and nterpret a sample covarance and a sample correlaton

More information

The Hopfield model. 1 The Hebbian paradigm. Sebastian Seung Lecture 15: November 7, 2002

The Hopfield model. 1 The Hebbian paradigm. Sebastian Seung Lecture 15: November 7, 2002 MIT Department of Bran and Cogntve Scences 9.29J, Sprng 2004 - Introducton to Computatonal euroscence Instructor: Professor Sebastan Seung The Hopfeld model Sebastan Seung 9.64 Lecture 5: ovember 7, 2002

More information

Psychology 282 Lecture #24 Outline Regression Diagnostics: Outliers

Psychology 282 Lecture #24 Outline Regression Diagnostics: Outliers Psychology 282 Lecture #24 Outlne Regresson Dagnostcs: Outlers In an earler lecture we studed the statstcal assumptons underlyng the regresson model, ncludng the followng ponts: Formal statement of assumptons.

More information

Solution Thermodynamics

Solution Thermodynamics Soluton hermodynamcs usng Wagner Notaton by Stanley. Howard Department of aterals and etallurgcal Engneerng South Dakota School of nes and echnology Rapd Cty, SD 57701 January 7, 001 Soluton hermodynamcs

More information

Random Walks on Digraphs

Random Walks on Digraphs Random Walks on Dgraphs J. J. P. Veerman October 23, 27 Introducton Let V = {, n} be a vertex set and S a non-negatve row-stochastc matrx (.e. rows sum to ). V and S defne a dgraph G = G(V, S) and a drected

More information

The optimal delay of the second test is therefore approximately 210 hours earlier than =2.

The optimal delay of the second test is therefore approximately 210 hours earlier than =2. THE IEC 61508 FORMULAS 223 The optmal delay of the second test s therefore approxmately 210 hours earler than =2. 8.4 The IEC 61508 Formulas IEC 61508-6 provdes approxmaton formulas for the PF for smple

More information

COMPLEX NUMBERS AND QUADRATIC EQUATIONS

COMPLEX NUMBERS AND QUADRATIC EQUATIONS COMPLEX NUMBERS AND QUADRATIC EQUATIONS INTRODUCTION We know that x 0 for all x R e the square of a real number (whether postve, negatve or ero) s non-negatve Hence the equatons x, x, x + 7 0 etc are not

More information

Chapter Newton s Method

Chapter Newton s Method Chapter 9. Newton s Method After readng ths chapter, you should be able to:. Understand how Newton s method s dfferent from the Golden Secton Search method. Understand how Newton s method works 3. Solve

More information

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4)

Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4) I. Classcal Assumptons Econ7 Appled Econometrcs Topc 3: Classcal Model (Studenmund, Chapter 4) We have defned OLS and studed some algebrac propertes of OLS. In ths topc we wll study statstcal propertes

More information

Uncertainty in measurements of power and energy on power networks

Uncertainty in measurements of power and energy on power networks Uncertanty n measurements of power and energy on power networks E. Manov, N. Kolev Department of Measurement and Instrumentaton, Techncal Unversty Sofa, bul. Klment Ohrdsk No8, bl., 000 Sofa, Bulgara Tel./fax:

More information

Time-Varying Systems and Computations Lecture 6

Time-Varying Systems and Computations Lecture 6 Tme-Varyng Systems and Computatons Lecture 6 Klaus Depold 14. Januar 2014 The Kalman Flter The Kalman estmaton flter attempts to estmate the actual state of an unknown dscrete dynamcal system, gven nosy

More information

Physics 53. Rotational Motion 3. Sir, I have found you an argument, but I am not obliged to find you an understanding.

Physics 53. Rotational Motion 3. Sir, I have found you an argument, but I am not obliged to find you an understanding. Physcs 53 Rotatonal Moton 3 Sr, I have found you an argument, but I am not oblged to fnd you an understandng. Samuel Johnson Angular momentum Wth respect to rotatonal moton of a body, moment of nerta plays

More information

Feature Selection: Part 1

Feature Selection: Part 1 CSE 546: Machne Learnng Lecture 5 Feature Selecton: Part 1 Instructor: Sham Kakade 1 Regresson n the hgh dmensonal settng How do we learn when the number of features d s greater than the sample sze n?

More information

Remarks on the Properties of a Quasi-Fibonacci-like Polynomial Sequence

Remarks on the Properties of a Quasi-Fibonacci-like Polynomial Sequence Remarks on the Propertes of a Quas-Fbonacc-lke Polynomal Sequence Brce Merwne LIU Brooklyn Ilan Wenschelbaum Wesleyan Unversty Abstract Consder the Quas-Fbonacc-lke Polynomal Sequence gven by F 0 = 1,

More information

Lecture Note 3. Eshelby s Inclusion II

Lecture Note 3. Eshelby s Inclusion II ME340B Elastcty of Mcroscopc Structures Stanford Unversty Wnter 004 Lecture Note 3. Eshelby s Incluson II Chrs Wenberger and We Ca c All rghts reserved January 6, 004 Contents 1 Incluson energy n an nfnte

More information

Chapter 8 Indicator Variables

Chapter 8 Indicator Variables Chapter 8 Indcator Varables In general, e explanatory varables n any regresson analyss are assumed to be quanttatve n nature. For example, e varables lke temperature, dstance, age etc. are quanttatve n

More information

Uncertainty and auto-correlation in. Measurement

Uncertainty and auto-correlation in. Measurement Uncertanty and auto-correlaton n arxv:1707.03276v2 [physcs.data-an] 30 Dec 2017 Measurement Markus Schebl Federal Offce of Metrology and Surveyng (BEV), 1160 Venna, Austra E-mal: markus.schebl@bev.gv.at

More information

Week 5: Neural Networks

Week 5: Neural Networks Week 5: Neural Networks Instructor: Sergey Levne Neural Networks Summary In the prevous lecture, we saw how we can construct neural networks by extendng logstc regresson. Neural networks consst of multple

More information