
Weierstraß-Institut
für Angewandte Analysis und Stochastik
Leibniz-Institut im Forschungsverbund Berlin e. V.

Preprint ISSN

Neurons' death and rebirth in sparse heterogeneous inhibitory networks

David Angulo-Garcia 1,2,3, Stefano Luccioli 4,5, Simona Olmi 1,4,6, Alessandro Torcini 1,2,3,4

submitted: September 26, 2016

1 Aix Marseille Univ, Inserm, INS, Institut de Neurosciences des Systèmes, Marseille, France
2 Aix Marseille Univ, Inserm, INMED, Institut de Neurobiologie de la Méditerranée, Marseille, France
3 Aix-Marseille Université, Université de Toulon, CNRS, CPT, UMR 7332, Marseille, France
4 CNR - Consiglio Nazionale delle Ricerche, Istituto dei Sistemi Complessi, 50019 Sesto Fiorentino, Italy
5 INFN - Istituto Nazionale di Fisica Nucleare, Sezione di Firenze, 50019 Sesto Fiorentino, Italy
6 Weierstrass Institute, Mohrenstr. 39, 10117 Berlin, Germany
  E-Mail: simona.olmi@wias-berlin.de

No. 236
Berlin 2016

Physics and Astronomy Classification Scheme: 87.19.lj, 05.45.Xt, 87.19.lm.
Key words and phrases: pulse-coupled heterogeneous inhibitory neural networks.

This work has been partially supported by the European Commission under the program "Marie Curie Network for Initial Training", through the project "Neural Engineering Transformative Technologies (NETT)" (D.A.-G., S.O., and A.T.), by the A*MIDEX grant (No. ANR-11-IDEX-0001-02) funded by the French Government program "Investissements d'Avenir" (D.A.-G. and A.T.), and by "Departamento Administrativo de Ciencia, Tecnologia e Innovacion - Colciencias" through the program "Doctorados en el exterior - 2013" (D.A.-G.). This work has been completed at the Max Planck Institute for the Physics of Complex Systems in Dresden (Germany) as part of the activity of the Advanced Study Group 2016 entitled "From Microscopic to Collective Dynamics in Neural Circuits".

Edited by
Weierstraß-Institut für Angewandte Analysis und Stochastik (WIAS)
Leibniz-Institut im Forschungsverbund Berlin e. V.
Mohrenstraße 39
10117 Berlin
Germany

Fax:
E-Mail: preprint@wias-berlin.de
World Wide Web:

Abstract

Inhibition is a key aspect of neural dynamics playing a fundamental role for the emergence of neural rhythms and the implementation of various information coding strategies. Inhibitory populations are present in several brain structures and the comprehension of their dynamics is strategical for the understanding of neural processing. In this paper, we discuss a general mechanism present in pulse-coupled heterogeneous inhibitory networks: inhibition can induce not only suppression of the neural activity, as expected, but it can also promote neural reactivation. In particular, for globally coupled systems, the number of firing neurons monotonically reduces upon increasing the strength of inhibition (neurons' death). The introduction of a sparse connectivity in the network is able to reverse the action of inhibition, i.e. a sufficiently strong synaptic strength can surprisingly promote, rather than depress, the activity of the neurons (neurons' rebirth). Specifically, for small synaptic strengths, one observes an asynchronous activity of nearly independent supra-threshold neurons. By increasing the inhibition, a transition occurs towards a regime where the neurons are all effectively sub-threshold and their irregular firing is driven by current fluctuations. We explain this transition from a mean-driven to a fluctuation-driven regime by deriving an analytic mean-field approach able to provide the fraction of active neurons together with the first two moments of the firing time distribution. We show that, by varying the synaptic time scale, the mechanism underlying the reported phenomenon remains unchanged. However, for sufficiently slow synapses the effect becomes dramatic. For small synaptic coupling the fraction of active neurons is frozen over long times and their firing activity is perfectly regular. For larger inhibition the active neurons display an irregular bursting behaviour induced by the emergence of correlations in the current fluctuations. In this latter regime the model gives predictions consistent with experimental findings for a specific class of neurons, namely the medium spiny neurons in the striatum.

1 Introduction

The presence of inhibition in excitable systems induces a rich dynamical repertoire, which is extremely relevant for biological [4], physical [36] and chemical systems [87]. In particular, inhibitory coupling has been invoked to explain cell navigation [9], morphogenesis in animal coat pattern formation [49], and the rhythmic activity of central pattern generators in many biological systems [3, 48]. In brain circuits the role of inhibition is fundamental to balance massive recurrent excitation [76] in order to generate physiologically relevant cortical rhythms [75, 2]. In particular, γ-rhythms can emerge as collective oscillations in hippocampal and neocortical networks of mutually inhibitory interneurons when tonically excited [34, 3], spindle oscillations during the early sleep stage are initiated by the inhibitory action of thalamic reticular neurons [77], and rhythmic whisking in rodents is controlled by inhibition [9]. Inhibitory networks are important not only for the emergence of rhythms in the brain, but also for the fundamental role they play in information encoding in the olfactory system [43] as well as in controlling and regulating motor and learning activity in the basal ganglia [5, 5, 6].
Furthermore, stimulus-dependent sequential activation of neurons or groups of neurons, reported for asymmetrically connected inhibitory cells [55, 37], has been suggested as a possible mechanism to explain sequential memory storage and feature binding [7]. These findings explain the long-standing interest in numerical and theoretical investigations of the dynamics of inhibitory networks. Already the study of globally coupled homogeneous systems revealed

Figure 1: Fraction of active neurons n_A as a function of the inhibitory synaptic strength g for a globally coupled system (a), where K = N, and a sparsely connected network with K = 20 (b). In a) the asymptotic value of n_A, calculated after a time t_S = 10^6, is reported. Conversely, in b) n_A is reported at successive times: namely, t_S = 985 (red squares), t_S = 1.0 × 10^4 (brown stars), t_S = 5 × 10^5 (blue diamonds) and t_S = 10^6 (green triangles). These values have also been averaged over random realizations of the network. The reported data refer to a system size N = 400 and a uniform distribution P(I) with [l_1 : l_2] = [1.0 : 1.5] and θ = 1.

interesting dynamical features, ranging from full synchronization to the appearance of clustering [26, 86, 88], from the emergence of splay states [93] to oscillator death [6]. The introduction of disorder, e.g. random dilution, noise or other forms of heterogeneity, in these systems leads to more complex dynamics, ranging from fast global oscillations [9] in neural networks and self-sustained activity in excitable systems [4], to irregular dynamics [93, 32, 33, 52, 45, 3, 29, 35, 85]. In particular, inhibitory spiking networks, due to stable chaos [66], can display extremely long erratic transients even in linearly stable regimes [93, 92, 32, 33, 52, 3, 85, 45].

One of the most studied inhibitory neural populations is represented by the medium spiny neurons (MSNs) in the striatum (which is the main input structure of the basal ganglia) [6, 4]. In a series of papers Ponzi and Wickens have shown that the main features of the MSN dynamics can be reproduced by considering a sparsely connected inhibitory network of conductance-based neurons subject to external stochastic excitatory inputs [67, 68, 69]. Our study has been motivated by an interesting phenomenon reported for this model in [69]: namely, upon increasing the synaptic strength the system passes from a regularly firing regime, characterized by a large fraction of quiescent neurons, to a biologically relevant regime where almost all cells exhibit a bursting activity, characterized by an alternation of periods of silence and of high firing. The same phenomenology has recently been reproduced by employing a much simpler model [], thus suggesting that this phenomenon is not related to the specific neural model employed, but is indeed a quite general property of inhibitory networks.

In order to exemplify the problem addressed in this paper we report in Fig. 1 the fraction of active neurons n_A (i.e. the ones emitting at least one spike during the simulation time) as a function of the strength g of the synaptic inhibition in a heterogeneous network. For a fully coupled network, n_A decreases monotonically with g (Fig. 1(a)), while for a sparse network n_A has a non-monotonic behaviour, displaying a minimum at an intermediate strength g_m (Fig. 1(b)). In fully coupled networks the effect of inhibition is simply to reduce the number of active neurons (neurons' death). However, quite counter-intuitively, in presence of dilution, by increasing the synaptic strength the previously silenced neurons can return to fire (neurons' rebirth). Our aim is to clarify which physical mechanisms underlie neurons' death and rebirth in inhibitory networks for different synaptic time scales.

In particular, we consider a deterministic network of purely inhibitory pulse-coupled Leaky Integrate-and-Fire (LIF) neurons with a heterogeneous distribution of excitatory DC currents, accounting for the different level of excitability of the neurons. The evolution of this model is studied for fully coupled and sparse topologies, as well as for synapses with different time courses. For the fully coupled case, it is possible to derive, within a self-consistent mean-field approach, the analytic expressions for the fraction of active neurons and for the average firing frequency ν̄ as a function of the coupling strength g. In this case the monotonic decrease of n_A with g can be interpreted as a Winner Takes All (WTA) mechanism [23, 7, 9], where only the most excitable neurons survive the increase of inhibition.
For sparse networks, the neurons' rebirth can be interpreted as a re-activation process induced by erratic fluctuations in the synaptic currents. Within this framework it is possible to obtain analytically, for instantaneous synapses, a closed set of equations for n_A as well as for the average firing rate and coefficient of variation as a function of the coupling strength g. In particular, the firing statistics of the network can be obtained via a mean-field approach by extending the formulation derived in [73] to account for synaptic shot noise with constant amplitude. The introduction of a finite synaptic time scale does not modify the overall scenario as long as this is shorter than the membrane time constant. As soon as the synaptic dynamics becomes slower, the phenomenology of the transition is modified. At g < g_m we have a frozen phase where n_A does not evolve in time on the explored time scales, since the current fluctuations are negligible. Above g_m we have a bursting regime, which can be related to the

emergence of correlated fluctuations as discussed within the adiabatic approach in [53, 54].

The paper is organized as follows: In Sect. 2 we present the models that will be considered in the paper as well as the methods adopted to characterize their dynamics. In Sect. 3 we consider the globally coupled network, where we provide self-consistent expressions accounting for the fraction of active neurons and the average firing rate. Section 4 is devoted to the study of sparsely connected networks with instantaneous synapses and to the analytic derivation of the set of self-consistent equations providing n_A, the average firing rate and the coefficient of variation. In Section 5 we discuss the effect of synaptic filtering, with particular attention on slow synapses. Finally, in Sect. 6 we briefly discuss the obtained results with a focus on the biological relevance of our model.

2 Model and Methods

We examine the dynamical properties of a heterogeneous inhibitory sparse network made of N LIF neurons. The time evolution of the membrane potential v_i of the i-th neuron is ruled by the following first-order ordinary differential equation:

  dv_i(t)/dt = I_i − v_i(t) − g E_i(t) ;   (1)

where g > 0 is the inhibitory synaptic strength, I_i is the neuronal excitability of the i-th neuron, encompassing both the intrinsic neuronal properties and the excitatory stimuli originating from areas outside the considered neural circuit, and E_i(t) represents the synaptic current due to the recurrent interactions within the considered network. The membrane potential v_i of neuron i evolves according to Eq. (1) until it overcomes a constant threshold θ = 1; this leads to the emission of a spike (action potential) transmitted to all the connected post-synaptic neurons, while v_i is reset to its rest value v_r = 0. The model in (1) is expressed in adimensional units; this amounts to assuming a membrane time constant τ_m = 1, for the conversion to dimensional variables see Appendix A. The heterogeneity is introduced in the model by assigning to each neuron a different value of input excitability I_i drawn from a flat distribution P(I), whose support is I ∈ [l_1 : l_2] with l_1 ≥ θ, therefore all the neurons are supra-threshold.

The synaptic current E_i(t) is given by the linear super-position of all the inhibitory post-synaptic potentials (IPSPs) η(t) emitted at previous times t_n^j < t by the pre-synaptic neurons connected to neuron i, namely

  E_i(t) = (1/K) Σ_{j ≠ i} C_ij Σ_{t_n^j < t} η(t − t_n^j) ;   (2)

where K is the number of pre-synaptic neurons. C_ij represent the elements of the N × N connectivity matrix associated to an undirected random network, whose entries are 1 if there is a synaptic connection from neuron j to neuron i, and 0 otherwise. For the sparse network, we select the matrix entries randomly; however, to reduce the sources of variability in the network, we assume that the number of pre-synaptic neurons is fixed, namely Σ_{j ≠ i} C_ij = K << N for each neuron i, where autaptic connections are not allowed. We have verified that the results do not change if we choose the links randomly according to an Erdös-Renyi distribution with a probability K/N. For a fully coupled network we have K = N.

The shape of the IPSP characterizes the type of filtering performed by the synapses on the received action potentials. We have considered two kinds of synapses: instantaneous ones, where η(t) = δ(t), and synapses where the PSP is an α-pulse, namely

  η(t) = H(t) α² t e^{−αt} ;   (3)

with H denoting the Heaviside step function. In this latter case the rise and decay times of the pulse are the same, namely τ_α = 1/α, and therefore the pulse duration τ_P can be assumed to be twice the characteristic time τ_α. The equations of the model, Eqs. (1) and (2), are integrated exactly in terms of the associated event-driven maps for the different synaptic filterings; these correspond to Poincaré maps performed at the firing times (for details see Appendix A) [94, 56]. For instantaneous synapses, we have usually considered system sizes N = 400 and N = 1400, and for the sparse case in-degrees 20 ≤ K ≤ 80 for N = 400 and 20 ≤ K ≤ 600 for N = 1400, with integration times up to t_S = 10^6. For synapses with a finite decay time we limit the analysis to N = 400 and K = 20 and to maximal integration times t_S = 10^5. Finite-size dependences on N are negligible with these parameter choices, as we have verified.

In order to characterize the network dynamics we measure the fraction of active neurons n_A(t_S) at time t_S, i.e. the fraction of neurons emitting at least one spike in the time interval [0 : t_S]. Therefore a neuron will be considered silent if it has a frequency smaller than 1/t_S; with our choices of t_S = 10^5 - 10^6, this corresponds to neurons with frequencies smaller than 10^-3 - 10^-4 Hz, by assuming as timescale a membrane time constant τ_m = 10 ms. The estimation of the number of active neurons is always started after a sufficiently long transient time has been discarded, usually corresponding to the time needed to deliver 10^6 spikes in the network. Furthermore, for each neuron we estimate the time-averaged inter-spike interval (ISI) T_ISI, the associated firing frequency ν = 1/T_ISI, as well as the coefficient of variation CV, which is the ratio of the standard deviation of the ISI distribution divided by T_ISI. For a regular spike train CV = 0, for a Poissonian distributed one CV = 1, while CV > 1 is usually an indication of bursting activity. The indicators usually reported in the following to characterize the network activity are ensemble averages over all the active neurons, which we denote as ā for a generic observable a.

To analyze the linear stability of the dynamical evolution we measure the maximal Lyapunov exponent λ, which is positive for chaotic evolution, and negative (zero) for stable (marginally stable) dynamics [4]. In particular, by following [58, 2], λ is estimated by linearizing the corresponding event-driven map, as explained in detail in Appendix B.

3 Fully Coupled Networks: Winner Takes All

In the fully coupled case we observe that the number of active neurons n_A saturates, after a short transient, to a value which remains constant in time. In this case, it is possible to derive a self-consistent mean-field approach to obtain an analytic expression for the fraction of active neurons n_A and for the average firing frequency ν̄ of the neurons in the network. In a fully coupled network each neuron receives the spikes emitted by the other N − 1 neurons, therefore each neuron is essentially subject to the same effective input µ, apart from corrections O(1/N). The effective input current, for a neuron with an excitability I, is given by

  µ = I − g ν̄ n_A ;   (4)

where n_A (N − 1) is the number of active pre-synaptic neurons, assumed to fire with the same average frequency ν̄.
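The model and the indicators described above can also be probed by direct simulation. The following is a minimal sketch, not the exact event-driven integration of Appendix A: it integrates Eqs. (1)-(2) with a simple Euler scheme for instantaneous IPSPs and measures n_A, the average rate and the average CV. All parameter values and helper names are ours and purely illustrative; setting K = N − 1 approximates the fully coupled case discussed in this Section, while K << N gives the sparse case of Sect. 4.

```python
import numpy as np

# Minimal Euler-scheme sketch of Eqs. (1)-(2) with instantaneous (delta) IPSPs.
# The paper uses exact event-driven maps (Appendix A); this is only an illustration
# and every parameter value below is a placeholder, not taken from the paper.
rng = np.random.default_rng(0)

N, K, g = 400, 20, 1.0           # network size, in-degree, synaptic strength
theta, v_r = 1.0, 0.0            # firing threshold and reset value
l1, l2 = 1.0, 1.5                # support of the flat excitability distribution P(I)
dt, t_sim = 1e-2, 2000.0         # time step and total time (membrane time constant = 1)

I = rng.uniform(l1, l2, N)       # neuronal excitabilities
v = rng.uniform(v_r, theta, N)   # random initial membrane potentials
# fixed in-degree K without autapses; pre[i] lists the K pre-synaptic neurons of i
pre = np.array([rng.choice(np.delete(np.arange(N), i), K, replace=False)
                for i in range(N)])
post = [np.where((pre == j).any(axis=1))[0] for j in range(N)]  # targets of neuron j

spikes = [[] for _ in range(N)]
for step in range(int(t_sim / dt)):
    v += dt * (I - v)                      # leaky integration between pulses
    for j in np.where(v >= theta)[0]:      # threshold crossing -> spike and reset
        v[j] = v_r
        spikes[j].append(step * dt)
        v[post[j]] -= g / K                # delta IPSP of amplitude G = g/K

# indicators of Sect. 2: fraction of active neurons, mean rate, mean CV
isi = [np.diff(s) for s in spikes if len(s) > 1]
n_A = sum(len(s) > 0 for s in spikes) / N
print("n_A =", n_A,
      " rate =", np.mean([1 / x.mean() for x in isi]),
      " CV =", np.mean([x.std() / x.mean() for x in isi]))
```

With K = N − 1 one can check the rapid saturation of n_A mentioned above; with K << N and long simulation times, the slow growth of n_A discussed in Sect. 4 appears instead.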

In a mean-field approach, each neuron can be seen as isolated from the network and driven by the effective input current µ. Taking into account the distribution of the excitabilities P(I), one obtains the following self-consistent expression for the average firing frequency

  ν̄ = [∫_{{I_A}} dI P(I)]^(-1) ∫_{{I_A}} dI P(I) {ln[(I − gν̄n_A − v_r)/(I − gν̄n_A − θ)]}^(-1)   (5)

where the integral is restricted only to active neurons, i.e. to I ∈ {I_A} values for which the logarithm is defined, and it is normalized by the measure of their support ∫_{{I_A}} dI P(I). In (5) we have used the fact that for an isolated LIF neuron with constant excitability C, the ISI is simply given by T_ISI = ln[(C − v_r)/(C − θ)] [].

An implicit expression for n_A can be obtained by estimating the neurons with effective input µ > θ; in particular, the fraction of silent neurons is given by

  1 − n_A = ∫_{l_1}^{l̄} dI P(I) ,   (6)

where l_1 is the lower limit of the support of the distribution, while l̄ = gν̄n_A + θ. By solving Eqs. (5) and (6) self-consistently one can obtain the analytic expression for n_A and ν̄ for any distribution P(I).

In particular, for excitabilities distributed uniformly in the interval [l_1 : l_2], the expression for the average frequency Eq. (5) becomes

  ν̄ = [n_A (l_2 − l_1)]^(-1) ∫_{{I_A}} dI {ln[(I − gν̄n_A − v_r)/(I − gν̄n_A − θ)]}^(-1) ;   (7)

while the fraction of active neurons is given by the following expression

  n_A = (l_2 − θ) / (l_2 − l_1 + gν̄) ;   (8)

with the constraint that n_A cannot be larger than one.

The analytic results for these quantities compare quite well with the numerical findings estimated for different distribution intervals [l_1 : l_2], different coupling strengths and system sizes, as shown in Fig. 2, apart from definitely large couplings, where some discrepancies between the mean-field estimations and the simulation results for ν̄ are observable (see Fig. 2(b)). These differences are probably due to the discreteness of the pulses, which cannot be neglected for very large synaptic strengths. As a general feature we observe that n_A steadily decreases with g, thus indicating that a group of neurons with higher effective inputs (winners) silence the other neurons (losers) and that the number of winners eventually vanishes for sufficiently large coupling in the limit of large system sizes. Furthermore, the average excitability of the active neurons (the winners), Ī_A, increases with g, as shown in the inset of Fig. 2(a), thus revealing that only the neurons with higher excitabilities survive the silencing action exerted by the other neurons. At the same time, as an effect of the growing inhibition, the average firing rate of the winners dramatically slows down. Therefore, despite the increase of Ī_A, the average effective input µ̄ indeed decreases for increasing inhibition. This represents a clear example of the winner-takes-all (WTA) mechanism obtained via (lateral) inhibition, which has been shown to have biological relevance for neural systems [9, 22, 24, 63].
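A possible way to solve Eqs. (7)-(8) numerically is a damped fixed-point iteration, as in the sketch below. This is our own implementation, not the authors' code; scipy is used only for the quadrature and the coupling values are illustrative.

```python
import numpy as np
from scipy.integrate import quad

# Self-consistent solution of Eqs. (7)-(8) for the fully coupled network with a
# uniform excitability distribution on [l1, l2]; parameter values are illustrative.
theta, v_r = 1.0, 0.0
l1, l2 = 1.0, 1.5

def mean_field(g, n_iter=2000, tol=1e-10):
    nA, nu = 1.0, 1.0                                    # initial guess
    for _ in range(n_iter):
        mu_shift = g * nu * nA                           # inhibitory mean field g*nu*nA
        lbar = max(l1, mu_shift + theta)                 # lowest excitability still active
        if lbar >= l2:                                   # every neuron silenced
            return 0.0, 0.0
        nA_new = min(1.0, (l2 - theta) / (l2 - l1 + g * nu))          # Eq. (8)
        # Eq. (7): average of the single-neuron rate 1/ln[(I-mu_shift-v_r)/(I-mu_shift-theta)]
        rate = lambda I: 1.0 / np.log((I - mu_shift - v_r) / (I - mu_shift - theta))
        nu_new = quad(rate, lbar, l2)[0] / (nA_new * (l2 - l1))
        if abs(nu_new - nu) < tol and abs(nA_new - nA) < tol:
            break
        nu = 0.5 * nu + 0.5 * nu_new                     # damped update for stability
        nA = 0.5 * nA + 0.5 * nA_new
    return nA, nu

for g in (0.5, 1.0, 5.0, 10.0):
    nA, nu = mean_field(g)
    print(f"g = {g:5.1f}:  n_A = {nA:.3f},  nu_bar = {nu:.3f}")
```

The same loop, run with n_A fixed to 1, gives the coupling below which no neuron is yet silenced, i.e. the critical value g_c discussed in the next Section.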

Figure 2: Fraction of active neurons n_A (a) and average network frequency ν̄ (b) as a function of the synaptic strength g for uniform distributions P(I) with different supports. Inset: average neuronal excitability of the active neurons Ī_A versus g. Empty (filled) symbols refer to numerical simulations with N = 400 (N = 1400) and dashed lines to the corresponding analytic solution. Symbols and lines correspond, from bottom to top, to [l_1 : l_2] = [1.0 : 1.5] (black), [l_1 : l_2] = [1.0 : 1.8] (red) and [l_1 : l_2] = [1.2 : 2.0] (blue). The data have been averaged over a time interval t_S = 10^6 after discarding a transient of 10^6 spikes.

It is important to understand which is the minimal coupling value g_c for which the firing neurons start to die. In order to estimate g_c it is sufficient to set n_A = 1 in Eqs. (7) and (8). In particular, one gets

  g_c = (l_1 − θ) / ν̄ ,   (9)

thus for l_1 = θ even an infinitesimally small coupling is in principle sufficient to silence some neurons. Furthermore, from Fig. 3(a) it is evident that whenever the excitabilities become homogeneous, i.e. for l_1 → l_2, the critical synaptic coupling g_c diverges towards infinity. Thus heterogeneity in the excitability distribution is a necessary condition in order to observe a (gradual) neurons' death, as shown in Fig. 2(a). This is in agreement with the results reported in [7], where homogeneous fully coupled networks of inhibitory LIF neurons have been examined. In particular, for finite systems and slow synapses the authors in [7] reveal the existence of a sub-critical Hopf bifurcation from a fully synchronized state to a regime characterized by oscillator death occurring at some critical g_c. However, in the thermodynamic limit g_c → ∞ for fast as well as slow synapses, in agreement with our mean-field result for instantaneous synapses.

We also proceed to investigate the isolines corresponding to the same critical g_c in the (l_1, l_2)-plane; the results are reported in Fig. 3(b) for three selected values of g_c. It is evident that the l_1- and l_2-values associated to the isolines display a direct proportionality among them. However, despite lying on the same g_c-isoline, different parameter values induce a completely different behaviour of n_A as a function of the synaptic strength, as shown in the inset of Fig. 3(b). Direct simulations of the network at finite sizes, namely for N = 400 and N = 1400, show that for sufficiently large coupling neurons with similar excitabilities tend to form clusters, similarly to what is reported in [45] for the same model studied here, but with a delayed pulse transmission. However, at variance with [45], the overall macroscopic dynamics is asynchronous and no collective oscillations can be detected for the whole range of considered synaptic strengths.

4 Sparse Networks: Neurons' Rebirth

In this Section we will consider a network with sparse connectivity, namely each neuron is supra-threshold and it receives instantaneous IPSPs from K << N randomly chosen neurons in the network. Due to the sparseness, the input spike trains can be considered as uncorrelated and, at a first approximation, it can be assumed that each spike train is Poissonian with a frequency ν̄ corresponding to the average firing rate of the neurons in the network [9, 8]. Usually, the mean activity of a LIF neural network has been estimated in the context of the diffusion approximation [72, 83]. This approximation is valid whenever the arrival frequency of the IPSPs is high with respect to the firing emission, while the amplitude of each IPSP (namely, G = g/K) is small with respect to the firing threshold θ. This latter hypothesis is not valid in our case for sufficiently large (small) synaptic strength g (in-degree K), therefore the synaptic inputs should be treated as shot noise. In particular, here we apply an extended version of the approach derived by Richardson and Swarbrick in [73] to estimate the average firing rate and the average coefficient of variation for LIF neurons subject to inhibitory shot noise with constant amplitude.

At variance with the fully coupled case, the fraction of active neurons n_A does not saturate to a constant value after a short transient.
Instead, n_A increases in time, due to the rebirth of losers which had been previously silenced by the firing activity of the winners. As shown in

Figure 3: a) Critical value g_c as a function of the lower limit l_1 of the excitability distribution for several choices of the upper limit, namely l_2/θ = 1.5, 2, 2.5 and 3. b) Isolines corresponding to constant values of g_c in the (l_1, l_2)-plane: namely, g_c = 0.5 (black solid line), g_c = 1.0 (red dashed line), g_c = 2.0 (blue dotted line). Inset: dependence of n_A on g for three couples of values (l_1, l_2) chosen along each of the isolines reported in the main figure.

Fig. 1(b), the variation of n_A slows down for increasing integration times t_S and n_A approaches some apparently asymptotic profile for sufficiently long integration times. Furthermore, n_A has a non-monotonic behaviour with g, as opposed to the fully coupled case. In particular, n_A reveals a minimum n_Am at some intermediate synaptic strength g_m, followed by an increase towards n_A = 1 at large g. As we have verified, as long as 1 < K << N, finite-size effects are negligible and the actual value of n_A depends only on the in-degree K and the considered simulation time t_S.

In the following we will try to explain the origin of such a behaviour. Despite the model being fully deterministic, due to the random connectivity the observed phenomena can be interpreted in the framework of activation processes. In particular, we can assume that each neuron in the network receives n_A K independent Poissonian trains of inhibitory kicks of constant amplitude G, characterized by an average frequency ν̄; thus the synaptic input can be regarded as a single Poissonian train with total frequency R = n_A K ν̄. Therefore, each neuron, characterized by its own excitability I, will be subject to an average effective input µ(I) (as reported in Eq. (4)) plus fluctuations in the synaptic current of amplitude

  σ = g √(n_A ν̄ / K) .   (10)

Indeed, we have verified that (10) gives a quantitatively correct estimation of the synaptic current fluctuations over the whole range of synaptic coupling here considered (as shown in Fig. 4). For instantaneous IPSPs, these fluctuations are due to stable chaos [66], since the maximal Lyapunov exponent is negative for the whole range of coupling, as we have verified. Therefore, as reported by many authors, erratic fluctuations in inhibitory neural networks with instantaneous synapses are due to finite-amplitude instabilities, while at the infinitesimal level the system is stable [93, 32, 33, 45, 52, 3, 85].

In this picture, the silent neurons stay in a quiescent state corresponding to the minimum of the effective potential U(v) = v²/2 − µv, and in order to fire they should overcome a barrier ΔU = (θ − µ)²/2. The average time t_A required to overcome such a barrier can be estimated according to the Kramers theory for activation processes [28, 83], namely

  t_A ≈ τ_0 exp[(θ − µ(I))² / σ²] ;   (11)

where τ_0 is an opportune time scale. It is reasonable to assume that at a given time t_S all neurons with t_A < t_S will have fired at least once, and the fraction of active neurons n_A at that time can be estimated by identifying the specific neuron for which t_S = t_A. This neuron will be characterized by the following excitability

  Î = l_2 − n_A (l_2 − l_1) ;   (12)

for excitabilities I uniformly distributed in the interval [l_1 : l_2]. In order to obtain the fraction of active neurons at time t_S, one should solve the equation t_S = t_A for the neuron with excitability Î, thus obtaining the following solution

  n_A = [2(l_2 − θ)γ + φ + √(φ² + 4(l_2 − θ)γφ)] / (2γ²) ,   (13)

where γ = (l_2 − l_1) + gν̄ and φ = (g²/K) ν̄ ln(t_S/τ_0).
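Given ν̄ (from simulation, or from the shot-noise self-consistency of Eq. (14) below), the expression (13) can be evaluated directly, as in the sketch below. The prefactors follow the reconstruction of Eqs. (10)-(13) given above; τ_0 is the fitted activation time scale of Eq. (11), and the numerical values of ν̄, τ_0 and g are placeholders.

```python
import numpy as np

# Evaluation of Eq. (13): fraction of neurons expected to have fired at least once
# within the time t_S, given the average rate nu_bar and the activation time scale
# tau_0 of Eq. (11). nu_bar, tau_0 and the couplings below are placeholders; in the
# paper nu_bar is obtained self-consistently from Eq. (14) and tau_0 from a fit.
theta, l1, l2 = 1.0, 1.0, 1.5

def n_active(g, K, nu_bar, t_S, tau_0):
    a = l2 - theta
    gamma = (l2 - l1) + g * nu_bar                      # gamma as defined after Eq. (13)
    phi = g**2 * nu_bar * np.log(t_S / tau_0) / K       # phi   as defined after Eq. (13)
    n_A = (2*a*gamma + phi + np.sqrt(phi**2 + 4*a*gamma*phi)) / (2 * gamma**2)
    return min(n_A, 1.0)                                # n_A cannot exceed one

for g, nu_bar in [(0.1, 0.6), (1.0, 0.3), (10.0, 0.05)]:
    print(f"g = {g:5.2f}, nu_bar = {nu_bar:.2f} -> "
          f"n_A = {n_active(g, 20, nu_bar, 1e6, 10.0):.3f}")
```

Note that for φ → 0 (negligible fluctuations or short observation times) the expression reduces to Eq. (8), so the deviation from the fully coupled estimate is controlled entirely by the fluctuation term φ.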

Figure 4: Effective average input µ̄_A of the active neurons (black circles) and average fluctuations σ of the synaptic currents (red squares) as a function of the inhibitory coupling g. The threshold potential θ = 1 is marked by the (blue) horizontal dotted line and g_m by the (green) vertical dash-dotted line. The dashed black (red) line refers to the theoretical estimation for µ̄_A (σ) reported in Eq. (4) (Eq. (10)) and averaged only over the active neurons. The data refer to N = 400, K = 40, [l_1 : l_2] = [1.0 : 1.5] and to a simulation time t_S = 10^6.

Equation (13) gives the dependence of n_A on the coupling strength g for a fixed integration time t_S, whenever we can provide the value of the average frequency ν̄. This in turn can be obtained analytically by following the approach introduced in [73] for LIF neurons subject to excitatory and inhibitory synaptic shot noise with exponentially distributed PSP amplitudes. In Appendix C we detail how to obtain with this method ν̄ for inhibitory spike trains with constant PSP amplitudes. In particular, the average frequency can be found via the following self-consistent equation

  ν̄ = ∫_{{I_A}} dI P(I) ν(I, G, n_A, ν̄) ;   (14)

where the explicit expression of ν is given by Eq. (43) in Appendix C. The simultaneous solution of Eqs. (13) and (14) provides a theoretical estimation of n_A and ν̄ for the whole considered range of synaptic strengths, once the unknown time scale τ_0 is fixed. This time scale has been determined via an optimal fitting procedure for sparse networks with N = 400 and K = 20, 40 and 80 at a fixed integration time t_S = 10^6.

The results for n_A are reported in Fig. 5(a); the estimated curves reproduce reasonably well the numerical data for K = 20 and 40, while for K = 80 the agreement worsens at large coupling strengths. This should be due to the fact that by increasing g and K the spike trains stimulating each neuron can no longer be assumed to be completely independent, as done for the derivation of Eqs. (13) and (14). Nevertheless, the average frequency ν̄ is quantitatively well reproduced for the considered K values over the entire range of synaptic strengths, as is evident from Figs. 5(b-d).

We have also estimated analytically the average coefficient of variation of the firing neurons, CV, by extending the method derived in [73] to obtain the response of a neuron receiving synaptic shot-noise inputs. The analytic expressions of the coefficient of variation for LIF neurons subject to inhibitory shot noise with fixed post-synaptic amplitude are obtained by estimating the second moment of the associated first-passage-time distribution; the details are reported in Appendix D. The coefficient of variation can be estimated once the self-consistent values for n_A and ν̄ have been obtained via Eqs. (13) and (14). The comparison with the numerical data, reported in Figs. 5(e-g), reveals a good agreement over the whole range of synaptic strengths for all the considered in-degrees.

At sufficiently small synaptic coupling the neurons fire tonically and almost independently, as is clear from the raster plot in Fig. 5(h) and from the fact that ν̄ approaches the average value for the uncoupled system (namely, 0.605) and CV ≈ 0. By increasing the coupling, n_A decreases: as an effect of the inhibition, more and more neurons are silenced and the average firing rate decreases; at the same time the dynamics becomes slightly more irregular, as shown in Fig. 5(i). At large coupling g > g_m, a new regime appears, where almost all neurons become active but with an extremely slow dynamics which is essentially stochastic, with CV ≈ 1, as testified also by the raster plot reported in Fig. 5(j).

Furthermore, from Fig. 5(a) it is clear that the value of the minimum of the fraction of active neurons, n_Am, decreases by increasing the network in-degree K, while g_m increases with K. This behaviour is further investigated in a larger network, namely N = 1400, and reported in the inset of Fig. 6. It is evident that n_A stays close to the globally coupled solution over larger and larger g-intervals for increasing K.
This can be qualitatively understood by the fact that the current fluctuations (Eq. (10)), responsible for the rebirth of silent neurons, are proportional to g and scale as 1/√K; therefore at larger in-degrees the fluctuations have similar amplitudes only for larger synaptic coupling.

The general mechanism behind the neurons' rebirth can be understood by considering the value of the effective neuronal input and of the current fluctuations as a function of g. As shown in Fig. 4,

Figure 5: a) Fraction of active neurons n_A as a function of the inhibition g for several values of K. b-d) Average network firing rate ν̄ for the same cases depicted in a), and the corresponding CV (e-g). In all panels, filled symbols correspond to numerical data and dashed lines to analytic values: black circles correspond to K = 20 (τ_0 = 10), red squares to K = 40 (τ_0 = 19) and blue diamonds to K = 80 (τ_0 = 26.6). The data are averaged over a time interval t_S = 10^6 and different realizations of the random network. h-j) Raster plots for three different synaptic strengths for N = 400 and K = 20: namely, h) g = 0.1; i) g = 1 and j) g = 10. The corresponding values for the fraction of active neurons, average frequency and average coefficient of variation are n_A = (0.95, 0.83, 0.97), ν̄ = (0.55, 0.32, 0.01) and CV = (0.04, 0.3, 0.83), respectively. The neurons are ordered in terms of their intrinsic excitability and the time is rescaled by the average frequency ν̄. The remaining parameters as in Fig. 1(b).

the effective input current µ̄_A, averaged over the active neurons, essentially coincides with the average excitability Ī_A for g → 0, where the neurons can be considered as independent one from the other. The inhibition leads to a decrease of µ̄_A, and to a crossing of the threshold θ exactly at g = g_m. This indicates that for g < g_m the active neurons, being on average supra-threshold, fire almost tonically, inhibiting the losers via a WTA mechanism. In this case the firing neurons are essentially mean-driven and the current fluctuations play a role in the rebirth of silent neurons only on extremely long time scales; this is confirmed by the low values of σ in such a range, as evident from Fig. 4. On the other hand, for g > g_m, the active neurons are on average below threshold, while the fluctuations dominate the dynamics. In particular, the firing is now extremely irregular, being due mainly to reactivation processes. Therefore the origin of the minimum in n_A can be understood as a transition from a mean-driven to a fluctuation-driven regime [7].

A quantitative definition of g_m can be given by requiring that the average input current of the active neurons µ̄_A crosses the threshold θ at g = g_m, namely

  µ̄_A(g_m) = Ī_A(g_m) − g_m ν̄_m n_Am = θ ,   (15)

where Ī_A is the average excitability of the firing neurons, while n_Am and ν̄_m are the fraction of active neurons and the average frequency at the minimum. For a uniform distribution P(I), a simple expression for g_m can be derived, namely

  g_m = [ (l_2 − θ)/n_Am + (l_1 − l_2)/2 ] / ν̄_m .   (16)

The estimations obtained with this expression are compared with the numerical data reported in Fig. 6 for a network of size N = 1400 and various in-degrees. The overall agreement is more than satisfactory for in-degrees ranging over almost two decades (namely, for 20 ≤ K ≤ 600).

It should be stressed that, as we have verified for various system sizes (namely, N = 700, 1400 and 2800) and for a constant average in-degree K = 40, for instantaneous synapses the network is in a heterogeneous asynchronous state for all the considered values of the synaptic coupling. This is demonstrated by the fact that the amplitude of the fluctuations of the average firing activity, measured by considering the low-pass filtered linear super-position of all the spikes emitted in the network, vanishes as 1/√N [88]. Therefore, the observed transition at g = g_m is not associated to the emergence of irregular collective behaviours as reported for globally coupled heterogeneous inhibitory networks of LIF neurons with delay [45] and of pulse-coupled phase oscillators [85].

5 Effect of synaptic filtering

In this Section we will investigate how synaptic filtering can influence the previously reported results. In particular, we will consider non-instantaneous IPSPs with an α-function profile (3), whose evolution is ruled by a single time scale τ_α.

5.1 Fully Coupled Networks

Let us first examine the fully coupled topology. In this case we observe, analogously to the δ-pulse coupling, that by increasing the inhibition the number of active neurons steadily decreases

Figure 6: g_m as a function of the in-degree K. The symbols refer to numerical data, while the dashed line to the expression (16). Inset: n_A versus g for the fully coupled case (solid black line) and for diluted networks (dashed lines), from top to bottom K = 20, 40, 80, 140. A network of size N = 1400 is evolved during a period t_S = 10^5 after discarding a transient of 10^6 spikes; the data are averaged over 5 different random realizations of the network. Other parameters as in Fig. 1.
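Equation (16) only needs the values of n_Am and ν̄_m measured at the minimum; a direct evaluation, with placeholder (n_Am, ν̄_m) pairs standing in for simulation data, could look as follows.

```python
import numpy as np

# Evaluation of Eq. (16): the coupling g_m at which the average input of the active
# neurons crosses threshold, given the measured n_Am and nu_m at the minimum.
# The (K, n_Am, nu_m) triples below are placeholders, not values from the paper.
theta, l1, l2 = 1.0, 1.0, 1.5

def g_m(n_Am, nu_m):
    return ((l2 - theta) / n_Am + 0.5 * (l1 - l2)) / nu_m   # Eq. (16)

for K, n_Am, nu_m in [(20, 0.80, 0.30), (80, 0.70, 0.25), (320, 0.60, 0.20)]:
    print(f"K = {K:4d}:  g_m = {g_m(n_Am, nu_m):.2f}")
```

Since n_Am decreases and ν̄_m decreases with K, the estimate grows with the in-degree, consistently with the trend of g_m reported in the main panel of Fig. 6.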

Figure 7: Fraction of active neurons n_A as a function of the inhibition g for IPSPs with α-profile, for a fully coupled topology (a) and a sparse network (b) with K = 20. a) Black (red) symbols correspond to τ_α = 10 (τ_α = 0.125), while the dashed lines are the theoretical predictions (7) and (8) previously reported for instantaneous synapses. The data are averaged over a time window t_S = 10^5. Inset: average frequency ν̄ as a function of g. b) n_A is measured at successive times: from lower to upper curve the considered times are t_S = {100, 500, 1000, 5000, 10000}, while τ_α = 10. The system size is N = 400 in both cases, the distribution of excitabilities is uniform with [l_1 : l_2] = [1.0 : 1.5] and θ = 1.

towards a limit where only few neurons (or eventually only one) will survive. At the same time the average frequency also decreases monotonically, as shown in Fig. 7 for two different τ_α differing by almost two orders of magnitude. Furthermore, the mean-field estimations (7) and (8) obtained for n_A and ν̄ represent a very good approximation also for α-pulses (as shown in Fig. 7). In particular, the mean-field estimation essentially coincides with the numerical values for slow synapses, as evident from the data reported in Fig. 7 for τ_α = 10 (black filled circles). This can be explained by the fact that for sufficiently slow synapses, with τ_P > T̄_ISI, the neurons feel the synaptic input current as continuous, because each input pulse has essentially no time to decay between two firing events. Therefore the mean-field approximation for the input current (4) works definitely well in this case. This is particularly true for τ_α = 10, where τ_P = 20 and T̄_ISI ≈ 2 − 6 in the range of the considered couplings. For τ_α = 0.125, instead, we observe some deviations from the mean-field results (red squares in Fig. 7), and the reason for these discrepancies resides in the fact that τ_P < T̄_ISI for any coupling strength; therefore the discreteness of the pulses cannot be completely neglected, in particular for large amplitudes (large synaptic couplings), analogously to what is observed for instantaneous synapses.

5.2 Sparse Networks

For sparse networks, n_A has the same qualitative behaviour as a function of the synaptic inhibition observed for instantaneous IPSPs, as shown in Fig. 7(b) and Fig. 8(a). The value of n_A decreases with g and reaches a minimal value at g_m; afterwards it increases towards n_A = 1 at larger coupling. The origin of the minimum of n_A as a function of g is the same as for instantaneous synapses: for g < g_m the active neurons are subject on average to a supra-threshold effective input µ̄_A, while at larger coupling µ̄_A < θ, as shown in the inset of Fig. 8(b). This is true for any value of τ_α; however, the transition from mean- to fluctuation-driven becomes dramatic for slow synapses. As evidenced by the data for the average output firing rate ν̄ and the average coefficient of variation CV, these quantities have almost discontinuous jumps at g = g_m, as shown in Fig. 9.

Therefore, let us first concentrate on slow synapses with τ_α larger than the membrane time constant, which is one in adimensional units. For g < g_m the fraction of active neurons is frozen in time, at least on the considered time scales, as revealed by the data in Fig. 7(b). Furthermore, for g < g_m, the mean-field approximation obtained for the fully coupled case works almost perfectly both for n_A and ν̄, as reported in Fig. 8(a). The frozen phase is characterized by extremely small values of the current fluctuations σ (as shown in Fig. 8(b)) and a quite high firing rate ν̄ ≈ 0.4 − 0.5, with an associated average coefficient of variation CV almost zero (see black circles and red squares in Fig. 9). Instead, for g > g_m the number of active neurons increases in time, similarly to what is observed for instantaneous synapses, while the average frequency becomes extremely small, ν̄ ≈ 0.04 − 0.09, and the value of the coefficient of variation becomes definitely larger than one.

These effects can be explained by the fact that, below g_m, the active neurons (the winners) are subject to an effective input µ̄_A > θ that induces a quite regular firing, as testified by the raster plot displayed in Fig. 8(c). The supra-threshold activity of the winners, joined with the filtering action of the synapses, guarantees that on average each neuron in the network receives an almost continuous current, with small fluctuations in time. These results explain why the mean-field approximation still works in the frozen phase, where the fluctuations in the synaptic currents are essentially negligible and unable to induce any neuron's rebirth, at least on realistic time scales. In this regime the only mechanism in action is the WTA, fluctuations beginning to have a role, for slow synapses, only at g > g_m. Indeed, as shown in Fig. 8(b), the synaptic fluctuations σ for τ_α = 10 (black circles) are almost negligible for g < g_m and show an enormous increase at g = g_m of almost two orders of magnitude. Similarly, at τ_α = 2 (red squares) a noticeable increase of σ is observable at the transition.

In order to better understand the abrupt changes in ν̄ and in CV observable for slow synapses at g = g_m, let us consider the case τ_α = 10. As shown in Fig. 9(c), τ_P > T̄_ISI ≈ 2 − 3 for g < g_m, therefore for these couplings the IPSPs have no time to decay between a firing emission and the next one; thus the synaptic fluctuations σ are definitely small in this case, as already shown. At g_m an abrupt jump to large values T̄_ISI > τ_P is observable; this is due to the fact that the neurons now display bursting activities, as evident from the raster plot shown in Fig. 8(d). The bursting is due to the fact that for g > g_m the active neurons are subject to an effective input which is on average sub-threshold, therefore the neurons preferentially tend to be silent. However, due to current fluctuations, a neuron can pass the threshold and the silent periods can be interrupted by bursting phases where the neuron fires almost regularly. As a matter of fact, the silent (inter-burst) periods are very long, ≈ 170 − 190, if compared to the duration of the bursting periods, namely ≈ 25 − 50, as shown in Fig. 9(c). This explains the abrupt decrease of the average firing rate reported in Fig. 9(a). Furthermore, the inter-burst periods are exponentially distributed, with an associated coefficient of variation ≈ 0.8 − 1.0, which clearly indicates the stochastic nature of the switching from the silent to the bursting phase.
The firing periods within the bursting phase are instead quite regular, with an associated coefficient of variation ≈ 0.2, and with a duration similar to the T̄_ISI measured in the frozen phase (shaded gray circles in Fig. 9(c)). Therefore, above g_m the distribution of the ISIs exhibits a long exponential tail associated to the bursting activity, and this explains the very large values of the measured coefficient of variation. By increasing the coupling, the fluctuations in the input current become larger, thus the fraction of neurons that fire at least once within a certain time interval increases. At the same time, ν̄, the average inter-burst periods and the firing periods within the bursting phase remain almost constant at sufficiently large g, as shown in Fig. 9(a). This indicates that the decrease of µ̄_A and the increase of σ due to the increased inhibitory coupling essentially compensate each other in this range. Indeed, we have verified that for τ_α = 10 and τ_α = 2

Figure 8: a) Fraction of active neurons for a network of α-pulse coupled neurons as a function of g for various τ_α: namely, τ_α = 10 (black circles), τ_α = 2 (red squares), τ_α = 0.5 (blue diamonds) and τ_α = 0.125 (green triangles). For instantaneous synapses, the fully coupled analytic solution is reported (solid line), as well as the measured n_A for the sparse network with the same level of dilution and estimated over the same time interval (dashed line). b) Average fluctuations of the synaptic current σ versus g for IPSPs with α-profile; the symbols refer to the same τ_α as in panel (a). Inset: average input current µ̄_A of the active neurons vs g; the dashed line is the threshold value θ = 1. The simulation time has been fixed to t_S = 10^5. c-d) Raster plots for two different synaptic strengths for τ_α = 10: namely, c) g = 1 corresponds to n_A ≃ 0.52, ν̄ ≃ 0.45 and CV ≃ 10^{-3} − 10^{-4}; while d) g = 10 to n_A ≃ 0.99, ν̄ ≃ 0.06 and CV ≃ 4.0. The neurons are ordered according to their intrinsic excitability and the time is rescaled by the average frequency ν̄. The data have been obtained for a system size N = 400 and K = 20, other parameters as in Fig. 7.

µ̄_A (σ) decreases (increases) linearly with g with similar slopes. For faster synapses, the frozen phase is no longer present; furthermore, due to rebirths induced by current fluctuations, n_A is always larger than the fully coupled mean-field result (8), even at g < g_m. It is interesting to notice that by decreasing τ_α we are approaching the instantaneous limit. As indicated by the results reported for n_A in Fig. 8(a) and CV in Fig. 9(b), in particular for τ_α = 0.125 (green triangles), the data almost collapse onto the corresponding values measured for instantaneous synapses in a sparse network with the same characteristics and over a similar time interval (dashed line). Furthermore, for fast synapses with τ_α < 1 the bursting activity is no longer present, as can be appreciated by the fact that at most CV approaches one in the very large coupling limit.

The reported results can be interpreted in the framework of the so-called adiabatic approach developed by Moreno-Bote and Parga in [53, 54] to estimate ν̄ analytically. This method applies to LIF neurons with a synaptic time scale definitely longer than the membrane time constant. In

Figure 9: a) Average firing rate ν̄ vs g for a network of α-pulse coupled neurons, for four values of τ_α. Theoretical estimations for ν̄ calculated with the adiabatic approach (57) are reported as dashed lines of colors corresponding to the relative symbols. b) Average coefficient of variation CV for four values of τ_α as a function of the inhibition. The dashed line refers to the values obtained for instantaneous synapses and a sparse network with the same value of dilution. c) Average inter-spike interval T̄_ISI (filled black circles) as a function of g for τ_α = 10. For g > g_m the average inter-burst interval (empty circles) and the average ISI measured within bursts (gray circles) are also shown, together with the position of g_m (green vertical line). The symbols and colors denote the same τ_α values as in Fig. 8. All the reported data were calculated for a system size N = 400 and K = 20 and for a fixed simulation time of t_S = 10^5.

these conditions, the output firing rate can be reproduced by assuming that the current fluctuations correspond to colored noise, with a correlation time given by the pulse duration τ_P = 2τ_α (for more details see Appendix E). In this case we are unable to develop an analytic expression for n_A that can be solved self-consistently with that for the average frequency. However, once n_A is provided, the analytic estimate (57) obtained with the adiabatic approach gives very good agreement with the numerical data for sufficiently slow synapses, namely for τ_P ≥ 1, as shown in Fig. 9(a) for τ_α = 10, 2 and 0.5. The theoretical expression (57) is even able to reproduce the jump in the average frequencies observable at g_m and therefore to capture the bursting phenomenon. For τ_P < 1, as expected, the theoretical expression fails to reproduce the numerical data, in particular at large coupling (see the dashed green line in Fig. 9(a) corresponding to τ_α = 0.125).

The bursting phenomenon observed for τ_α > 1 and g > g_m can be seen at a mean-field level as the response of a sub-threshold LIF neuron subject to colored noise with correlation time τ_P. The neuron is definitely sub-threshold, but in presence of a large fluctuation it can be led to fire and, due to the finite correlation time, it can stay supra-threshold, regularly firing, for a period ≈ τ_P. Indeed, the average bursting periods we measured are of the order of τ_P = 2τ_α, namely ≈ 27 ± 5 (≈ 7 ± 4) for τ_α = 10 (τ_α = 2).

As a final point, to better understand the dynamical origin of the measured fluctuations in this deterministic model, we have estimated the maximal Lyapunov exponent λ (for details see Appendix B). As expected from previous analyses, for non-instantaneous synapses we can observe the emergence of regular chaos in purely inhibitory networks [33, 92]. In particular, for sufficiently fast synapses, we typically note a transition from a chaotic state at low coupling to a linearly stable regime (with λ < 0) at large synaptic strengths, as shown in Fig. 10(a) for τ_α = 0.125, despite the fact that the current fluctuations are monotonically increasing with the synaptic strength. Therefore, fluctuations are due to chaos at small coupling, while at larger g they are due to finite-amplitude instabilities, as expected for stable chaotic systems [3]. However, the passage from positive to negative values of the maximal Lyapunov exponent is not related to the transition occurring at g_m from a mean-driven to a fluctuation-driven dynamics in the network.
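The mean-field picture invoked here, a sub-threshold LIF neuron driven by colored current fluctuations with correlation time τ_P, can be illustrated with a simple stochastic simulation. The sketch below uses an Ornstein-Uhlenbeck process as a stand-in for the network-generated fluctuations; all numbers are illustrative rather than taken from the paper.

```python
import numpy as np

# Single sub-threshold LIF neuron driven by colored (Ornstein-Uhlenbeck) current
# fluctuations with correlation time tau_P, mimicking the adiabatic picture of the
# bursting regime for slow synapses. Parameters are illustrative placeholders.
rng = np.random.default_rng(1)

theta, v_r = 1.0, 0.0
mu, sigma = 0.85, 0.1        # sub-threshold mean input and fluctuation amplitude
tau_P = 20.0                 # correlation time of the fluctuations (= 2 * tau_alpha)
dt, t_sim = 1e-2, 5e4

v, z, spikes = v_r, 0.0, []
for step in range(int(t_sim / dt)):
    # OU update: zero mean, stationary standard deviation sigma, correlation time tau_P
    z += -z * dt / tau_P + sigma * np.sqrt(2 * dt / tau_P) * rng.standard_normal()
    v += dt * (mu + z - v)   # LIF membrane driven by the slowly varying current mu + z
    if v >= theta:
        v = v_r
        spikes.append(step * dt)

spikes = np.asarray(spikes)
if spikes.size > 2:
    isi = np.diff(spikes)
    print(f"{spikes.size} spikes, mean ISI = {isi.mean():.1f}, "
          f"CV = {isi.std() / isi.mean():.2f}")
# Excursions with z > theta - mu sustain a regular burst lasting roughly tau_P;
# in between the neuron is silent, so the ISI distribution develops a long tail
# and the coefficient of variation exceeds one.
```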


More information

INFLUENCE OF TUBE BUNDLE GEOMETRY ON HEAT TRANSFER TO FOAM FLOW

INFLUENCE OF TUBE BUNDLE GEOMETRY ON HEAT TRANSFER TO FOAM FLOW HEFAT7 5 th International Conference on Heat Transfer, Fluid Mechanics and Thermodynamics Sun City, South Africa Paper number: GJ1 INFLUENCE OF TUBE BUNDLE GEOMETRY ON HEAT TRANSFER TO FOAM FLOW Gylys

More information

Dynamical Constraints on Computing with Spike Timing in the Cortex

Dynamical Constraints on Computing with Spike Timing in the Cortex Appears in Advances in Neural Information Processing Systems, 15 (NIPS 00) Dynamical Constraints on Computing with Spike Timing in the Cortex Arunava Banerjee and Alexandre Pouget Department of Brain and

More information

Synaptic dynamics. John D. Murray. Synaptic currents. Simple model of the synaptic gating variable. First-order kinetics

Synaptic dynamics. John D. Murray. Synaptic currents. Simple model of the synaptic gating variable. First-order kinetics Synaptic dynamics John D. Murray A dynamical model for synaptic gating variables is presented. We use this to study the saturation of synaptic gating at high firing rate. Shunting inhibition and the voltage

More information

arxiv: v3 [q-bio.nc] 18 Feb 2015

arxiv: v3 [q-bio.nc] 18 Feb 2015 1 Inferring Synaptic Structure in presence of eural Interaction Time Scales Cristiano Capone 1,2,5,, Carla Filosa 1, Guido Gigante 2,3, Federico Ricci-Tersenghi 1,4,5, Paolo Del Giudice 2,5 arxiv:148.115v3

More information

Collective Dynamics in Neural Networks

Collective Dynamics in Neural Networks Aarhus 8/12/11 p. Collective Dynamics in Neural Networks Alessandro Torcini alessandro.torcini@cnr.it Istituto dei Sistemi Complessi - CNR - Firenze Istituto Nazionale di Fisica Nucleare - Sezione di Firenze

More information

arxiv:cond-mat/ v1 [cond-mat.str-el] 15 Dec 2003

arxiv:cond-mat/ v1 [cond-mat.str-el] 15 Dec 2003 Imperial Collee Preprints PREPRINT arxiv:cond-mat/0312367v1 [cond-mat.str-el] 15 Dec 2003 First- and Second Order Phase Transitions in the Holstein- Hubbard Model W. Koller 1, D. Meyer 1, Y. Ōno2 and A.

More information

Computing with Inter-spike Interval Codes in Networks of Integrate and Fire Neurons

Computing with Inter-spike Interval Codes in Networks of Integrate and Fire Neurons Computing with Inter-spike Interval Codes in Networks of Integrate and Fire Neurons Dileep George a,b Friedrich T. Sommer b a Dept. of Electrical Engineering, Stanford University 350 Serra Mall, Stanford,

More information

THE TRANSFER AND PROPAGATION OF CORRELATED NEURONAL ACTIVITY

THE TRANSFER AND PROPAGATION OF CORRELATED NEURONAL ACTIVITY THE TRANSFER AND PROPAGATION OF CORRELATED NEURONAL ACTIVITY A Dissertation Presented to the Faculty of the Department of Mathematics University of Houston In Partial Fulfillment of the Requirements for

More information

The Spike Response Model: A Framework to Predict Neuronal Spike Trains

The Spike Response Model: A Framework to Predict Neuronal Spike Trains The Spike Response Model: A Framework to Predict Neuronal Spike Trains Renaud Jolivet, Timothy J. Lewis 2, and Wulfram Gerstner Laboratory of Computational Neuroscience, Swiss Federal Institute of Technology

More information

Population dynamics of interacting spiking neurons

Population dynamics of interacting spiking neurons Population dynamics of interacting spiking neurons Maurizio Mattia* Physics Laboratory, Istituto Superiore di Sanità, INFN - Gr. Coll. Roma I, V.le Regina Elena 299, 00161 Roma, Italy Paolo Del Giudice

More information

Mathematical Models of Dynamic Behavior of Individual Neural Networks of Central Nervous System

Mathematical Models of Dynamic Behavior of Individual Neural Networks of Central Nervous System Mathematical Models of Dynamic Behavior of Individual Neural Networks of Central Nervous System Dimitra Despoina Pagania 1, Adam Adamopoulos 1,2 and Spiridon D. Likothanassis 1 1 Pattern Recognition Laboratory,

More information

When do Correlations Increase with Firing Rates? Abstract. Author Summary. Andrea K. Barreiro 1* and Cheng Ly 2

When do Correlations Increase with Firing Rates? Abstract. Author Summary. Andrea K. Barreiro 1* and Cheng Ly 2 When do Correlations Increase with Firing Rates? Andrea K. Barreiro 1* and Cheng Ly 2 1 Department of Mathematics, Southern Methodist University, Dallas, TX 75275 U.S.A. 2 Department of Statistical Sciences

More information

Formal modeling with multistate neurones and multidimensional synapses

Formal modeling with multistate neurones and multidimensional synapses BioSystems 79 (25) 2 32 Formal modelin with multistate neurones and multidimensional synapses Briitte Quenet, Ginette Horcholle-Bossavit, Adrien Wohrer, Gérard Dreyfus Laboratoire d Electronique, Ecole

More information

Is the superposition of many random spike trains a Poisson process?

Is the superposition of many random spike trains a Poisson process? Is the superposition of many random spike trains a Poisson process? Benjamin Lindner Max-Planck-Institut für Physik komplexer Systeme, Dresden Reference: Phys. Rev. E 73, 2291 (26) Outline Point processes

More information

5 Shallow water Q-G theory.

5 Shallow water Q-G theory. 5 Shallow water Q-G theory. So far we have discussed the fact that lare scale motions in the extra-tropical atmosphere are close to eostrophic balance i.e. the Rossby number is small. We have examined

More information

Memory capacity of networks with stochastic binary synapses

Memory capacity of networks with stochastic binary synapses Memory capacity of networks with stochastic binary synapses Alexis M. Dubreuil 1,, Yali Amit 3 and Nicolas Brunel 1, 1 UMR 8118, CNRS, Université Paris Descartes, Paris, France Departments of Statistics

More information

Fast neural network simulations with population density methods

Fast neural network simulations with population density methods Fast neural network simulations with population density methods Duane Q. Nykamp a,1 Daniel Tranchina b,a,c,2 a Courant Institute of Mathematical Science b Department of Biology c Center for Neural Science

More information

Decorrelation of neural-network activity by inhibitory feedback

Decorrelation of neural-network activity by inhibitory feedback Decorrelation of neural-network activity by inhibitory feedback Tom Tetzlaff 1,2,#,, Moritz Helias 1,3,#, Gaute T. Einevoll 2, Markus Diesmann 1,3 1 Inst. of Neuroscience and Medicine (INM-6), Computational

More information

LFFs in Vertical Cavity Lasers

LFFs in Vertical Cavity Lasers LFFs in Vertical Cavity Lasers A. Torcini, G. Giacomelli, F. Marin torcini@inoa.it, S. Barland ISC - CNR - Firenze (I) Physics Dept - Firenze (I) INLN - Nice (F) FNOES - Nice, 14.9.5 p.1/21 Plan of the

More information

DEVS Simulation of Spiking Neural Networks

DEVS Simulation of Spiking Neural Networks DEVS Simulation of Spiking Neural Networks Rene Mayrhofer, Michael Affenzeller, Herbert Prähofer, Gerhard Höfer, Alexander Fried Institute of Systems Science Systems Theory and Information Technology Johannes

More information

1 CHAPTER 7 PROJECTILES. 7.1 No Air Resistance

1 CHAPTER 7 PROJECTILES. 7.1 No Air Resistance CHAPTER 7 PROJECTILES 7 No Air Resistance We suppose that a particle is projected from a point O at the oriin of a coordinate system, the y-axis bein vertical and the x-axis directed alon the round The

More information

arxiv:cond-mat/ v2 [cond-mat.str-el] 10 Feb 2004

arxiv:cond-mat/ v2 [cond-mat.str-el] 10 Feb 2004 Europhysics Letters PREPRINT arxiv:cond-mat/0312367v2 [cond-mat.str-el] 10 Feb 2004 First- and Second Order Phase Transitions in the Holstein- Hubbard Model W. Koller 1, D. Meyer 1, Y. Ōno2 and A. C. Hewson

More information

Synfire Waves in Small Balanced Networks

Synfire Waves in Small Balanced Networks Synfire Waves in Small Balanced Networks Yuval Aviel 1,3, David Horn 2 and Moshe Abeles 1 1 Interdisciplinary Center for Neural Computation, Hebrew University, Jerusalem, Israel. 2 School of Physics and

More information

Biological Modeling of Neural Networks:

Biological Modeling of Neural Networks: Week 14 Dynamics and Plasticity 14.1 Reservoir computing - Review:Random Networks - Computing with rich dynamics Biological Modeling of Neural Networks: 14.2 Random Networks - stationary state - chaos

More information

Localized activity patterns in excitatory neuronal networks

Localized activity patterns in excitatory neuronal networks Localized activity patterns in excitatory neuronal networks Jonathan Rubin Amitabha Bose February 3, 2004 Abstract. The existence of localized activity patterns, or bumps, has been investigated in a variety

More information

Robust Semiparametric Optimal Testing Procedure for Multiple Normal Means

Robust Semiparametric Optimal Testing Procedure for Multiple Normal Means Veterinary Dianostic and Production Animal Medicine Publications Veterinary Dianostic and Production Animal Medicine 01 Robust Semiparametric Optimal Testin Procedure for Multiple Normal Means Pen Liu

More information

Advanced Methods Development for Equilibrium Cycle Calculations of the RBWR. Andrew Hall 11/7/2013

Advanced Methods Development for Equilibrium Cycle Calculations of the RBWR. Andrew Hall 11/7/2013 Advanced Methods Development for Equilibrium Cycle Calculations of the RBWR Andrew Hall 11/7/2013 Outline RBWR Motivation and Desin Why use Serpent Cross Sections? Modelin the RBWR Axial Discontinuity

More information

Effects of synaptic conductance on the voltage distribution and firing rate of spiking neurons

Effects of synaptic conductance on the voltage distribution and firing rate of spiking neurons PHYSICAL REVIEW E 69, 051918 (2004) Effects of synaptic conductance on the voltage distribution and firing rate of spiking neurons Magnus J. E. Richardson* Laboratory of Computational Neuroscience, Brain

More information

The Bionic Lightweight Design of the Mid-rail Box Girder Based on the Bamboo Structure

The Bionic Lightweight Design of the Mid-rail Box Girder Based on the Bamboo Structure Wenmin Chen 1, Weian Fu 1, Zeqian Zhan 1, Min Zhan 2 Research Institute of Mechanical Enineerin, Southwest Jiaoton University, Chendu, Sichuan, China (1), Department of Industrial and System Enineerin,

More information

Experiment 1: Simple Pendulum

Experiment 1: Simple Pendulum COMSATS Institute of Information Technoloy, Islamabad Campus PHY-108 : Physics Lab 1 (Mechanics of Particles) Experiment 1: Simple Pendulum A simple pendulum consists of a small object (known as the bob)

More information

An Improved Logical Effort Model and Framework Applied to Optimal Sizing of Circuits Operating in Multiple Supply Voltage Regimes

An Improved Logical Effort Model and Framework Applied to Optimal Sizing of Circuits Operating in Multiple Supply Voltage Regimes n Improved Loical Effort Model and Framework pplied to Optimal Sizin of Circuits Operatin in Multiple Supply Voltae Reimes Xue Lin, Yanzhi Wan, Shahin Nazarian, Massoud Pedram Department of Electrical

More information

Adjustment of Sampling Locations in Rail-Geometry Datasets: Using Dynamic Programming and Nonlinear Filtering

Adjustment of Sampling Locations in Rail-Geometry Datasets: Using Dynamic Programming and Nonlinear Filtering Systems and Computers in Japan, Vol. 37, No. 1, 2006 Translated from Denshi Joho Tsushin Gakkai Ronbunshi, Vol. J87-D-II, No. 6, June 2004, pp. 1199 1207 Adjustment of Samplin Locations in Rail-Geometry

More information

A Learning Theory for Reward-Modulated Spike-Timing-Dependent Plasticity with Application to Biofeedback

A Learning Theory for Reward-Modulated Spike-Timing-Dependent Plasticity with Application to Biofeedback A Learning Theory for Reward-Modulated Spike-Timing-Dependent Plasticity with Application to Biofeedback Robert Legenstein, Dejan Pecevski, Wolfgang Maass Institute for Theoretical Computer Science Graz

More information

Information Theory and Neuroscience II

Information Theory and Neuroscience II John Z. Sun and Da Wang Massachusetts Institute of Technology October 14, 2009 Outline System Model & Problem Formulation Information Rate Analysis Recap 2 / 23 Neurons Neuron (denoted by j) I/O: via synapses

More information

A Mathematical Model for the Fire-extinguishing Rocket Flight in a Turbulent Atmosphere

A Mathematical Model for the Fire-extinguishing Rocket Flight in a Turbulent Atmosphere A Mathematical Model for the Fire-extinuishin Rocket Fliht in a Turbulent Atmosphere CRISTINA MIHAILESCU Electromecanica Ploiesti SA Soseaua Ploiesti-Tiroviste, Km 8 ROMANIA crismihailescu@yahoo.com http://www.elmec.ro

More information

Probabilistic Models in Theoretical Neuroscience

Probabilistic Models in Theoretical Neuroscience Probabilistic Models in Theoretical Neuroscience visible unit Boltzmann machine semi-restricted Boltzmann machine restricted Boltzmann machine hidden unit Neural models of probabilistic sampling: introduction

More information

Physics 20 Homework 1 SIMS 2016

Physics 20 Homework 1 SIMS 2016 Physics 20 Homework 1 SIMS 2016 Due: Wednesday, Auust 17 th Problem 1 The idea of this problem is to et some practice in approachin a situation where you miht not initially know how to proceed, and need

More information

Ch 14: Feedback Control systems

Ch 14: Feedback Control systems Ch 4: Feedback Control systems Part IV A is concerned with sinle loop control The followin topics are covered in chapter 4: The concept of feedback control Block diaram development Classical feedback controllers

More information

Investigation of ternary systems

Investigation of ternary systems Investiation of ternary systems Introduction The three component or ternary systems raise not only interestin theoretical issues, but also have reat practical sinificance, such as metallury, plastic industry

More information

Activity Driven Adaptive Stochastic. Resonance. Gregor Wenning and Klaus Obermayer. Technical University of Berlin.

Activity Driven Adaptive Stochastic. Resonance. Gregor Wenning and Klaus Obermayer. Technical University of Berlin. Activity Driven Adaptive Stochastic Resonance Gregor Wenning and Klaus Obermayer Department of Electrical Engineering and Computer Science Technical University of Berlin Franklinstr. 8/9, 187 Berlin fgrewe,obyg@cs.tu-berlin.de

More information

Phase Diagrams: construction and comparative statics

Phase Diagrams: construction and comparative statics 1 / 11 Phase Diarams: construction and comparative statics November 13, 215 Alecos Papadopoulos PhD Candidate Department of Economics, Athens University of Economics and Business papadopalex@aueb.r, https://alecospapadopoulos.wordpress.com

More information

Asymptotic Behavior of a t Test Robust to Cluster Heterogeneity

Asymptotic Behavior of a t Test Robust to Cluster Heterogeneity Asymptotic Behavior of a t est Robust to Cluster Heteroeneity Andrew V. Carter Department of Statistics University of California, Santa Barbara Kevin. Schnepel and Doulas G. Steierwald Department of Economics

More information

2.3. PBL Equations for Mean Flow and Their Applications

2.3. PBL Equations for Mean Flow and Their Applications .3. PBL Equations for Mean Flow and Their Applications Read Holton Section 5.3!.3.1. The PBL Momentum Equations We have derived the Reynolds averaed equations in the previous section, and they describe

More information

ANALYSIS OF POWER EFFICIENCY FOR FOUR-PHASE POSITIVE CHARGE PUMPS

ANALYSIS OF POWER EFFICIENCY FOR FOUR-PHASE POSITIVE CHARGE PUMPS ANALYSS OF POWER EFFCENCY FOR FOUR-PHASE POSTVE CHARGE PUMPS Chien-pin Hsu and Honchin Lin Department of Electrical Enineerin National Chun-Hsin University, Taichun, Taiwan e-mail:hclin@draon.nchu.edu.tw

More information

Effects of refractory periods in the dynamics of a diluted neural network

Effects of refractory periods in the dynamics of a diluted neural network Effects of refractory periods in the dynamics of a diluted neural network F. A. Tamarit, 1, * D. A. Stariolo, 2, * S. A. Cannas, 2, *, and P. Serra 2, 1 Facultad de Matemática, Astronomía yfísica, Universidad

More information

The homogeneous Poisson process

The homogeneous Poisson process The homogeneous Poisson process during very short time interval Δt there is a fixed probability of an event (spike) occurring independent of what happened previously if r is the rate of the Poisson process,

More information

Information capacity of genetic regulatory elements

Information capacity of genetic regulatory elements PHYSICAL REVIEW E 78, 9 8 Information capacity of enetic reulatory elements Gašper Tkačik,, Curtis G. Callan, Jr.,,,3 and William Bialek,,3 Joseph Henry Laboratories of Physics, Princeton University, Princeton,

More information

arxiv: v1 [q-bio.nc] 13 Feb 2018

arxiv: v1 [q-bio.nc] 13 Feb 2018 Gain control with A-type potassium current: I A as a switch between divisive and subtractive inhibition Joshua H Goldwyn 1*, Bradley R Slabe 2, Joseph B Travers 3, David Terman 2 arxiv:182.4794v1 [q-bio.nc]

More information

Dynamical Synapses Give Rise to a Power-Law Distribution of Neuronal Avalanches

Dynamical Synapses Give Rise to a Power-Law Distribution of Neuronal Avalanches Dynamical Synapses Give Rise to a Power-Law Distribution of Neuronal Avalanches Anna Levina 3,4, J. Michael Herrmann 1,2, Theo Geisel 1,2,4 1 Bernstein Center for Computational Neuroscience Göttingen 2

More information

Introduction to Neural Networks U. Minn. Psy 5038 Spring, 1999 Daniel Kersten. Lecture 2a. The Neuron - overview of structure. From Anderson (1995)

Introduction to Neural Networks U. Minn. Psy 5038 Spring, 1999 Daniel Kersten. Lecture 2a. The Neuron - overview of structure. From Anderson (1995) Introduction to Neural Networks U. Minn. Psy 5038 Spring, 1999 Daniel Kersten Lecture 2a The Neuron - overview of structure From Anderson (1995) 2 Lect_2a_Mathematica.nb Basic Structure Information flow:

More information

Assessment of the MCNP-ACAB code system for burnup credit analyses. N. García-Herranz 1, O. Cabellos 1, J. Sanz 2

Assessment of the MCNP-ACAB code system for burnup credit analyses. N. García-Herranz 1, O. Cabellos 1, J. Sanz 2 Assessment of the MCNP-ACAB code system for burnup credit analyses N. García-Herranz, O. Cabellos, J. Sanz 2 Departamento de Ineniería Nuclear, Universidad Politécnica de Madrid 2 Departamento de Ineniería

More information

arxiv: v3 [q-bio.nc] 17 Oct 2018

arxiv: v3 [q-bio.nc] 17 Oct 2018 Evaluating performance of neural codes in model neural communication networks Chris G. Antonopoulos 1, Ezequiel Bianco-Martinez 2 and Murilo S. Baptista 3 October 18, 2018 arxiv:1709.08591v3 [q-bio.nc]

More information

Time varying networks and the weakness of strong ties

Time varying networks and the weakness of strong ties Supplementary Materials Time varying networks and the weakness of strong ties M. Karsai, N. Perra and A. Vespignani 1 Measures of egocentric network evolutions by directed communications In the main text

More information

How Spike Generation Mechanisms Determine the Neuronal Response to Fluctuating Inputs

How Spike Generation Mechanisms Determine the Neuronal Response to Fluctuating Inputs 628 The Journal of Neuroscience, December 7, 2003 23(37):628 640 Behavioral/Systems/Cognitive How Spike Generation Mechanisms Determine the Neuronal Response to Fluctuating Inputs Nicolas Fourcaud-Trocmé,

More information

arxiv: v1 [cond-mat.stat-mech] 6 Mar 2008

arxiv: v1 [cond-mat.stat-mech] 6 Mar 2008 CD2dBS-v2 Convergence dynamics of 2-dimensional isotropic and anisotropic Bak-Sneppen models Burhan Bakar and Ugur Tirnakli Department of Physics, Faculty of Science, Ege University, 35100 Izmir, Turkey

More information

Mathematical Models of Dynamic Behavior of Individual Neural Networks of Central Nervous System

Mathematical Models of Dynamic Behavior of Individual Neural Networks of Central Nervous System Mathematical Models of Dynamic Behavior of Individual Neural Networks of Central Nervous System Dimitra-Despoina Pagania,*, Adam Adamopoulos,2, and Spiridon D. Likothanassis Pattern Recognition Laboratory,

More information

REVIEW: Going from ONE to TWO Dimensions with Kinematics. Review of one dimension, constant acceleration kinematics. v x (t) = v x0 + a x t

REVIEW: Going from ONE to TWO Dimensions with Kinematics. Review of one dimension, constant acceleration kinematics. v x (t) = v x0 + a x t Lecture 5: Projectile motion, uniform circular motion 1 REVIEW: Goin from ONE to TWO Dimensions with Kinematics In Lecture 2, we studied the motion of a particle in just one dimension. The concepts of

More information

Experiment 3 The Simple Pendulum

Experiment 3 The Simple Pendulum PHY191 Fall003 Experiment 3: The Simple Pendulum 10/7/004 Pae 1 Suested Readin for this lab Experiment 3 The Simple Pendulum Read Taylor chapter 5. (You can skip section 5.6.IV if you aren't comfortable

More information

AT2 Neuromodeling: Problem set #3 SPIKE TRAINS

AT2 Neuromodeling: Problem set #3 SPIKE TRAINS AT2 Neuromodeling: Problem set #3 SPIKE TRAINS Younesse Kaddar PROBLEM 1: Poisson spike trains Link of the ipython notebook for the code Brain neuron emit spikes seemingly randomly: we will aim to model

More information

Teams to exploit spatial locality among agents

Teams to exploit spatial locality among agents Teams to exploit spatial locality amon aents James Parker and Maria Gini Department of Computer Science and Enineerin, University of Minnesota Email: [jparker,ini]@cs.umn.edu Abstract In many situations,

More information

Convergence of DFT eigenvalues with cell volume and vacuum level

Convergence of DFT eigenvalues with cell volume and vacuum level Converence of DFT eienvalues with cell volume and vacuum level Sohrab Ismail-Beii October 4, 2013 Computin work functions or absolute DFT eienvalues (e.. ionization potentials) requires some care. Obviously,

More information

Fast and exact simulation methods applied on a broad range of neuron models

Fast and exact simulation methods applied on a broad range of neuron models Fast and exact simulation methods applied on a broad range of neuron models Michiel D Haene michiel.dhaene@ugent.be Benjamin Schrauwen benjamin.schrauwen@ugent.be Ghent University, Electronics and Information

More information

Analysis of Neural Networks with Chaotic Dynamics

Analysis of Neural Networks with Chaotic Dynamics Chaos, Solitonr & Fructals Vol. 3, No. 2, pp. 133-139, 1993 Printed in Great Britain @60-0779/93$6.00 + 40 0 1993 Pergamon Press Ltd Analysis of Neural Networks with Chaotic Dynamics FRANCOIS CHAPEAU-BLONDEAU

More information

THE USE OF POWER FOR THE CHARACTERIZATION OF EARTHQUAKE GROUND MOTIONS

THE USE OF POWER FOR THE CHARACTERIZATION OF EARTHQUAKE GROUND MOTIONS 13 th World Conference on Earthquake Enineerin Vancouver, B.C., Canada Auust 1-6, 004 Paper No. 574 THE UE OF POWER FOR THE CHARACTERIZATION OF EARTHQUAKE GROUND MOTION Kenneth ELWOOD 1 and Erik NIIT UMMARY

More information

Causality and communities in neural networks

Causality and communities in neural networks Causality and communities in neural networks Leonardo Angelini, Daniele Marinazzo, Mario Pellicoro, Sebastiano Stramaglia TIRES-Center for Signal Detection and Processing - Università di Bari, Bari, Italy

More information

Motion in Two Dimensions Sections Covered in the Text: Chapters 6 & 7, except 7.5 & 7.6

Motion in Two Dimensions Sections Covered in the Text: Chapters 6 & 7, except 7.5 & 7.6 Motion in Two Dimensions Sections Covered in the Tet: Chapters 6 & 7, ecept 7.5 & 7.6 It is time to etend the definitions we developed in Note 03 to describe motion in 2D space. In doin so we shall find

More information

arxiv:cond-mat/ v1 [cond-mat.stat-mech] 17 Sep 1999

arxiv:cond-mat/ v1 [cond-mat.stat-mech] 17 Sep 1999 Shape Effects of Finite-Size Scalin Functions for Anisotropic Three-Dimensional Isin Models Kazuhisa Kaneda and Yutaka Okabe Department of Physics, Tokyo Metropolitan University, Hachioji, Tokyo 92-397,

More information

The story of the Universe

The story of the Universe The story of the Universe From the Bi Ban to today's Universe Quantum ravity era Grand unification era Electroweak era Protons and neutrons form Nuclei are formed Atoms and liht era Galaxy formation Today

More information

How to read a burst duration code

How to read a burst duration code Neurocomputing 58 60 (2004) 1 6 www.elsevier.com/locate/neucom How to read a burst duration code Adam Kepecs a;, John Lisman b a Cold Spring Harbor Laboratory, Marks Building, 1 Bungtown Road, Cold Spring

More information

IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING 1

IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING 1 IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING 1 Intervention in Gene Reulatory Networks via a Stationary Mean-First-Passae-Time Control Policy Golnaz Vahedi, Student Member, IEEE, Babak Faryabi, Student

More information

IN THIS turorial paper we exploit the relationship between

IN THIS turorial paper we exploit the relationship between 508 IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 10, NO. 3, MAY 1999 Weakly Pulse-Coupled Oscillators, FM Interactions, Synchronization, Oscillatory Associative Memory Eugene M. Izhikevich Abstract We study

More information

arxiv:chao-dyn/ v1 5 Mar 1996

arxiv:chao-dyn/ v1 5 Mar 1996 Turbulence in Globally Coupled Maps M. G. Cosenza and A. Parravano Centro de Astrofísica Teórica, Facultad de Ciencias, Universidad de Los Andes, A. Postal 26 La Hechicera, Mérida 5251, Venezuela (To appear,

More information