Triplets of Spikes in a Model of Spike Timing-Dependent Plasticity


The Journal of Neuroscience, September 20, 2006, 26(38) · Behavioral/Systems/Cognitive

Jean-Pascal Pfister and Wulfram Gerstner
Laboratory of Computational Neuroscience, School of Computer and Communication Sciences and Brain-Mind Institute, Ecole Polytechnique Fédérale de Lausanne, CH-1015 Lausanne, Switzerland

Classical experiments on spike timing-dependent plasticity (STDP) use a protocol based on pairs of presynaptic and postsynaptic spikes repeated at a given frequency to induce synaptic potentiation or depression. Therefore, standard STDP models have expressed the weight change as a function of pairs of presynaptic and postsynaptic spikes. Unfortunately, those pair-based STDP models cannot account for the dependence on the repetition frequency of the pairs of spikes. Moreover, those STDP models cannot reproduce recent triplet and quadruplet experiments. Here, we examine a triplet rule (i.e., a rule that considers sets of three spikes: two pre and one post, or one pre and two post) and compare it to classical pair-based STDP learning rules. With such a triplet rule, it is possible to fit experimental data from visual cortical slices as well as from hippocampal cultures. Moreover, when assuming stochastic spike trains, the triplet learning rule can be mapped to a Bienenstock-Cooper-Munro learning rule.

Key words: STDP; spike triplet; modeling; computational neuroscience; Hebbian learning; long-term potentiation

Introduction
During the last decade, an increasing number of experiments have shown that synaptic strength changes as a function of the precise spike timing of the presynaptic and postsynaptic neurons.
In the early experiments (Markram et al., 1997; Bi and Poo, 1998, 2001; Zhang et al., 1998), potentiation was elicited by a sequence of n pairs of pre-then-post spikes, whereas depression occurred when the timing was reversed (i.e., when each postsynaptic spike preceded a presynaptic one). At this point, it was natural to characterize synaptic plasticity as a function of the time difference Δt = t_post − t_pre between pairs of spikes. However, performing experiments with pairs of spikes does not mean that pairs of spikes are the elementary building block. There is no a priori reason to think that pairs of spikes are more relevant than three spikes (triplets), four spikes (quadruplets), or even more. It is clear that many other neuronal variables, such as the calcium concentration (Malenka et al., 1988; Lisman, 1989; Lisman and Zhabotinsky, 2001; Shouval et al., 2002) or the postsynaptic membrane potential (Rao and Sejnowski, 2001; Sjöström et al., 2001; Lisman and Spruston, 2005), play an important role in triggering potentiation or depression. The point of this study is to see how far experiments that use only spike timing as a parameter can be explained by models that use only spike timing. Recent experiments (Bi and Wang, 2002; Froemke and Dan, 2002; Wang et al., 2005; Froemke et al., 2006) have studied the detailed role of spike timing by triggering synaptic plasticity with spike triplets (one presynaptic spike combined with two postsynaptic spikes, or one postsynaptic spike combined with two presynaptic spikes). The results of those experiments indicate that classical spike timing-dependent plasticity (STDP) models based on pairs of spikes are not sufficient to explain synaptic changes triggered by triplets or quadruplets of spikes.

In the first part of this study, we review some experimental protocols performed in visual cortex (Sjöström et al., 2001) and hippocampal culture (Wang et al., 2005) and show why the classical pair-based STDP models fail to reproduce those experimental data. In the second part of this study, we show that if we assume that synaptic plasticity is governed by a suitable combination of pairs and triplets of spikes, the results from the above-mentioned protocols can be surprisingly well reproduced. Moreover, we show that our triplet learning rule elicits input selectivity analogous to that of the Bienenstock-Cooper-Munro (BCM) theory (Bienenstock et al., 1982). Claiming that triplets of spikes are more relevant than pairs of spikes is not enough to construct a model of synaptic plasticity. It is also necessary to determine how those pairs or triplets of spikes integrate. For both the pair-based models and the triplet-based models, we consider the case in which a presynaptic spike interacts with all previous postsynaptic spikes, or vice versa (we call this the All-to-All interaction) (Gerstner et al., 1996; Kempter et al., 1999; Kistler and van Hemmen, 2000; Song et al., 2000), and the case in which only neighboring spikes are taken into account (Nearest-Spike interaction) (van Rossum et al., 2000; Bi, 2002; Izhikevich and Desai, 2003; Burkitt et al., 2004; Pfister and Gerstner, 2006). We found a slight preference for All-to-All interactions.

Received April 4, 2006; revised Aug. 4, 2006; accepted Aug. 4, 2006. This work was supported by the Swiss National Science Foundation. We gratefully acknowledge discussions with Taro Toyoizumi and Magnus Richardson. Correspondence should be addressed to Prof. Wulfram Gerstner, Ecole Polytechnique Fédérale de Lausanne, Laboratory of Computational Neuroscience, Station 15, 1015 Lausanne, Switzerland. E-mail: wulfram.gerstner@epfl.ch. Copyright 2006 Society for Neuroscience.
Materials and Methods
We compared a new triplet-based model with experimental data from the hippocampus and visual cortex. The visual cortex data set (Sjöström

et al., 2001) used in this study consists of a standard pairing protocol in which the frequency of the pairing has been varied. We also considered a hippocampal culture data set (Wang et al., 2005), which consists of pair, triplet, and quadruplet protocols. Because the two data sets disagree on some specific protocols (at low pairing frequency, no potentiation is elicited in Sjöström's data, whereas a large amount of potentiation is present in Wang's data) and because the preparations are different, we fitted our models with different parameters for each data set.

Synaptic learning rule. Our new triplet-based model of STDP is an extension of classical pair-based STDP models. Traditional mechanistic models of STDP involve a small number of variables that are updated by presynaptic and postsynaptic firing events (Kistler and van Hemmen, 2000; Abarbanel et al., 2002; Gerstner and Kistler, 2002; Karmarkar and Buonomano, 2002). The new triplet rule is formulated in the same framework. To introduce the variables used in our model, we consider the process of synaptic transmission. Whenever a presynaptic spike arrives at an excitatory synapse, glutamate is released into the synaptic cleft and binds to glutamate receptors. Let r1 denote the amount of glutamate bound to a postsynaptic receptor. The variable r1 increases whenever there is a presynaptic spike and otherwise decays back to zero with a time constant τ+. This can be written as follows:

\frac{dr_1(t)}{dt} = -\frac{r_1(t)}{\tau_+}; \quad \text{if } t = t_{pre}, \text{ then } r_1 \to r_1 + 1. \tag{1}

Here, t_pre denotes the moment of spike arrival at the presynaptic terminal. The units of r1 are chosen such that glutamate binding increases by one unit after spike arrival. We emphasize that r1 is an abstract variable. Instead of glutamate binding, it could equally well describe some other quantity that increases after presynaptic spike arrival.
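To make the detector dynamics concrete, here is a minimal sketch of such an event detector in Python (the function name and the query times are our own, not from the paper): a trace that jumps by 1 at each spike and decays exponentially in between, which is exactly the behavior prescribed by Eq. 1 with All-to-All interactions.

```python
import math

def detector_trace(spike_times, tau, query_times):
    """Value of an Eq.-1-style detector (e.g., r1) at the given query times.

    The detector jumps by 1 at each spike and decays exponentially with
    time constant tau (ms) in between; with All-to-All interactions the
    contributions of all past spikes simply add up.
    """
    return [sum(math.exp(-(t - s) / tau) for s in spike_times if s <= t)
            for t in query_times]

# One presynaptic spike at t = 0 ms, read out with tau_+ = 16.8 ms:
trace = detector_trace([0.0], 16.8, [0.0, 16.8, 33.6])
```

Immediately after the spike the trace equals 1; one time constant later it has decayed by a factor 1/e.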
We call r1 a detector of presynaptic events. Instead of having only one process triggered by a presynaptic spike, it is possible to consider several different quantities that increase in the presence of a presynaptic spike. In our model, we consider two different detectors of presynaptic events, namely r1 and r2. The dynamics of r2 is analogous to that of r1 except that its time constant τx is larger than τ+. Similarly, we assume that each postsynaptic spike at time t_post induces an increase of two different quantities, which we denote o1 and o2. Potential interpretations of o1 and o2 are given below. In the absence of postsynaptic spiking, these postsynaptic detectors decay with time constants τ− and τy, respectively. Formally, this gives the following:

\frac{dr_2(t)}{dt} = -\frac{r_2(t)}{\tau_x}; \quad \text{if } t = t_{pre}, \text{ then } r_2 \to r_2 + 1
\frac{do_1(t)}{dt} = -\frac{o_1(t)}{\tau_-}; \quad \text{if } t = t_{post}, \text{ then } o_1 \to o_1 + 1 \tag{2}
\frac{do_2(t)}{dt} = -\frac{o_2(t)}{\tau_y}; \quad \text{if } t = t_{post}, \text{ then } o_2 \to o_2 + 1.

Figure 1. Schematic description of the triplet learning rules. A, Schematic description of the two terms contributing to long-term depression (LTD), controlled by A2− and A3−, and the two long-term potentiation (LTP) terms, controlled by A2+ and A3+. A presynaptic spike after a postsynaptic one (post → pre) induces LTD if the temporal difference is not much larger than τ− (pair term, A2−). The presence of a previous presynaptic spike gives an additional contribution (2-pre-1-post triplet term, A3−) if the interval between the two presynaptic spikes is not much larger than τx. Similarly, the triplet term for LTP depends on one presynaptic spike but two postsynaptic spikes. The presynaptic spike must occur before the second postsynaptic one with a temporal difference not much larger than τ+. B, Time course of the detectors of presynaptic and postsynaptic events r1, r2, o1, and o2. The presynaptic variables r1 and r2 are increased by a fixed amount after arrival of a presynaptic spike. Analogously, the postsynaptic variables are updated after postsynaptic firing. With All-to-All interactions, each postsynaptic spike interacts with all previous presynaptic spikes and vice versa (i.e., the internal variables r1, r2, o1, and o2 accumulate over several spike timings). The red and blue dots denote the values of those internal variables read by the triplet model whenever a spike occurs [e.g., the value of the postsynaptic variable o1 is read out at the moment of presynaptic spike arrival, leading to synaptic depression proportional to the momentary value of o1 (blue dot)]. Similarly, the values of the presynaptic variable r1 and the postsynaptic variable o2 are read out at the moment of the second postsynaptic spike and determine the amplitude of synaptic potentiation. C, Same as in B but with Nearest-Spike interactions: the extent of the spike interaction is restricted to the last spike; no accumulation occurs.

We do not want to identify the variables r1, r2, o1, and o2 with specific biophysical quantities. Candidate detectors of presynaptic events are, for example, the amount of glutamate bound (Karmarkar and Buonomano, 2002) or the number of NMDA receptors in an activated state (Senn et al., 2001). The postsynaptic detectors o1 and o2 could represent the influx of calcium through voltage-gated Ca2+ channels and NMDA channels (Karmarkar and Buonomano, 2002), the number of secondary messengers in a deactivated state of the NMDA receptor (Senn et al., 2001), or the voltage trace of a back-propagating action potential (Shouval et al., 2002). Because our present model is formulated as a mechanistic model, it is possible to define the changes of synaptic efficacy for our triplet learning rule with All-to-All interactions as a function of those four detectors without making any assumption about the biophysical quantities they represent.
We assume that the weight decreases after presynaptic spike arrival by an amount that is proportional to the value of the postsynaptic variable o1 but also depends on the value of the second presynaptic detector r2. Hence, presynaptic spike arrival at time t_pre triggers a change given by the following:

w(t) \to w(t) - o_1(t)\left[A_2^- + A_3^- \, r_2(t-\epsilon)\right] \quad \text{if } t = t_{pre}. \tag{3}

Similarly, a postsynaptic spike at time t_post triggers a change that depends on the presynaptic variable r1 and the second postsynaptic variable o2 as follows:

w(t) \to w(t) + r_1(t)\left[A_2^+ + A_3^+ \, o_2(t-\epsilon)\right] \quad \text{if } t = t_{post}. \tag{4}

Here, A2+ and A2− denote the amplitudes of the weight change for a single pre-post or post-pre pair, respectively. Similarly, A3+ and A3− denote the amplitudes of the triplet terms for potentiation and depression, respectively (Fig. 1A). All four amplitude parameters are assumed to be greater than or equal to zero. ε is a small positive constant that ensures that the weight is updated before the detectors r2 and o2; in other words, r2(t − ε) is zero unless a previous presynaptic spike has already led to an increase of r2. This ensures the detection of spike triplets. Figure 1B illustrates how a 1-pre-2-post triplet is detected by the learning rule. At the time of a postsynaptic spike, the learning rule reads the value of the second postsynaptic variable o2 just before the spike (see the red dot at time t_post in Fig. 1B) as well as the value of the presynaptic

detector r1 (see the blue dot at time t_post in Fig. 1B) and increases the weight by an amount A2+ r1(t_post) + A3+ r1(t_post) o2(t_post − ε) (see Eq. 4). Note that if we set A3+ = 0 and A3− = 0, the model reduces to a classical pair-based STDP model (Gerstner et al., 1996; Kempter et al., 1999; Kistler and van Hemmen, 2000; Song et al., 2000). This pair-based STDP model was used for the results of Figure 2. It should further be noted that the two extra triplet terms vanish if a single spike pair is presented or if spike pairings are repeated at low frequency. This means that in the limit of low frequency, the classical pair-based learning rule is identical to our triplet learning rule. The triplet learning rule of Equations 3 and 4 can also describe a Nearest-Spike interaction scheme if we redefine the update rules of the presynaptic and postsynaptic detectors. Instead of simply low-pass filtering the spike trains (i.e., adding the effects of all spikes), the detector variables saturate at 1 (i.e., 0 ≤ r1, r2, o1, o2 ≤ 1). This is achieved by setting the variables to the value of 1 instead of incrementing them by steps of 1.

Table 1. Experimental weight change Δw as a function of the delay Δt = t_post − t_pre induced by a pairing protocol in the visual cortex, for Δt = 10 ms and Δt = −10 ms at five repetition frequencies from 0.1 Hz upward. Those values are used for the fitting of the pair-based and triplet-based models of visual cortical neurons. Data were obtained from Sjöström et al. (2001).

Table 2. Experimental weight change Δw as a function of the relative spike timings Δt, Δt1, Δt2, and T induced by pairing, triplet, and quadruplet protocols in hippocampal cultures. The data are used for the fitting of the pair-based and triplet-based models of hippocampal culture neurons. Data were obtained from Wang et al. (2005).
In this way, the synapse forgets all previous spikes and keeps only the memory of the last one (Fig. 1C). In this paper, we first consider a full triplet model, which takes into account all four terms of Equations 3 and 4. Then we will see that only some of the terms are really necessary. This is why we define two different minimal models. The first one is intended to fit the visual cortex data and disregards two terms (i.e., A2+ = 0 and A3− = 0). For the hippocampal culture data set, we consider a slightly different minimal model, which disregards only one term (i.e., A3− = 0). In principle, the amplitude parameters A2+, A2−, A3+, and A3− could change on a slow time scale. For example, similar to the threshold in the BCM rule, those parameters could, because of homeostatic processes, depend on the mean postsynaptic firing rate averaged over a time scale of 10 min or more.

Protocols. To compare our model to experimental data, we followed three different experimental protocols (see Fig. 2) in which the synaptic weight changes as a function of the presynaptic and postsynaptic spike statistics. The fourth protocol is of more theoretical value in the sense that it can be compared with the BCM learning rule, which has interesting computational properties.

Pairing protocol. This is the classical STDP protocol (see Fig. 2A) (Markram et al., 1997; Bi and Poo, 1998, 2001; Zhang et al., 1998; Sjöström et al., 2001; Froemke and Dan, 2002). n = 60 pairs of presynaptic and postsynaptic spikes shifted by Δt are elicited at regular intervals of 1/ρ. The interest of the study of Sjöström et al. (2001) is that the authors analyzed, in this pairing protocol, the weight change as a function of the repetition frequency ρ for a fixed Δt. Changing the frequency is a good way to check the validity of a model, especially at high frequency, where many spikes are potentially within the temporal range of interaction.
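The update rules of Eqs. 1-4 are simple enough to simulate event by event. The sketch below (Python; variable names and the illustrative parameter values are our own, not the fitted values of Tables 3 and 4) implements the All-to-All triplet rule and applies it to the pairing protocol; the `nearest` flag switches to the Nearest-Spike scheme by saturating the detectors at 1.

```python
import math

def triplet_dw(pre, post, p, nearest=False):
    """Total weight change of the triplet rule (Eqs. 1-4) for the given
    presynaptic and postsynaptic spike times (ms).

    p holds the amplitudes A2p, A2m, A3p, A3m and the time constants
    tau_p, tau_m, tau_x, tau_y (ms). All-to-All by default; with
    nearest=True the detectors are reset to 1 instead of incremented.
    """
    events = sorted([(t, 'pre') for t in pre] + [(t, 'post') for t in post])
    r1 = r2 = o1 = o2 = 0.0
    t_last = events[0][0]
    dw = 0.0
    for t, kind in events:
        # let all four detectors decay since the previous event
        r1 *= math.exp(-(t - t_last) / p['tau_p'])
        r2 *= math.exp(-(t - t_last) / p['tau_x'])
        o1 *= math.exp(-(t - t_last) / p['tau_m'])
        o2 *= math.exp(-(t - t_last) / p['tau_y'])
        t_last = t
        if kind == 'pre':
            # Eq. 3: depression reads o1 and the value of r2 just BEFORE
            # this spike's own increment (the epsilon in r2(t - eps))
            dw -= o1 * (p['A2m'] + p['A3m'] * r2)
            r1 = 1.0 if nearest else r1 + 1.0
            r2 = 1.0 if nearest else r2 + 1.0
        else:
            # Eq. 4: potentiation reads r1 and the pre-update value of o2
            dw += r1 * (p['A2p'] + p['A3p'] * o2)
            o1 = 1.0 if nearest else o1 + 1.0
            o2 = 1.0 if nearest else o2 + 1.0
    return dw

def pairing_dw(dt, rho, p, n=60):
    """Pairing protocol: n pre->post pairs shifted by dt ms at rho Hz."""
    period = 1000.0 / rho  # ms between successive pairs
    pre = [i * period for i in range(n)]
    post = [i * period + dt for i in range(n)]
    return triplet_dw(pre, post, p)

# Illustrative parameters in the spirit of the visual cortex fit:
# negligible pair potentiation, strong triplet potentiation.
p = dict(A2p=0.0, A2m=7e-3, A3p=6.5e-3, A3m=0.0,
         tau_p=16.8, tau_m=33.7, tau_x=101.0, tau_y=125.0)
dw_low = pairing_dw(10.0, 0.1, p)    # ~0: no potentiation at low frequency
dw_high = pairing_dw(10.0, 40.0, p)  # > 0: the o2 trace accumulates
```

With these parameters the qualitative behavior reported for the visual cortex data emerges directly: at 0.1 Hz the detectors decay to zero between pairs and the net change is negligible, while at 40 Hz the accumulated o2 makes the triplet potentiation term dominate the cross-pair depression.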
It should be noted that the amount of potentiation for a pre-post (Δt = 10 ms) pair reported by Wang et al. (2005) is significantly lower than the value originally measured by Bi and Poo (1998) under the same conditions. As mentioned by Wang et al. (2005), this can be accounted for by the difference in initial synaptic strength, which was higher in the study by Bi and Poo (1998). To test our model on a consistent set of data, we took the measurements of Wang et al. (2005) (compare their supplemental Fig. 1) (i.e., Δw for Δt = 10 ms and Δw for Δt = −10 ms). Data from Wang et al. (2005), including error bars, are redrawn in Figure 3. Because the potentiation and depression time constants are not reported in the study by Wang et al. (2005), we took τ+ = 16.8 ms and τ− = 33.7 ms from Bi and Poo (2001).

Triplet protocol. The first triplet protocol (see Fig. 2C) consists of n = 60 sets of three spikes repeated at a given frequency ρ = 1 Hz. Each triplet consists of two presynaptic spikes and one postsynaptic spike and is characterized by Δt1 = t_post − t_pre,1 and Δt2 = t_post − t_pre,2, where t_pre,1 and t_pre,2 are the times of the first and second presynaptic spikes of the triplet. The second triplet protocol (see Fig. 2D) also consists of n = 60 triplets. The only difference is that each triplet consists of one presynaptic and two postsynaptic spikes. In this case, Δt1 = t_post,1 − t_pre and Δt2 = t_post,2 − t_pre, where t_post,1 and t_post,2 are, respectively, the times of the first and second postsynaptic spikes of the triplet. Experiments with such a triplet protocol have been performed by Froemke and Dan (2002) in L2/3 pyramidal neurons of the rat visual cortex and by Wang et al. (2005) in hippocampal cultures. To have a consistent and broad data set (i.e., pair, triplet, and quadruplet experiments), we decided, in the present study, to focus only on the data of Wang et al. (2005), because we did not find enough quantitative information about quadruplets in the study by Froemke and Dan (2002).

Quadruplet protocol.
This protocol consists of n = 60 quadruplets at frequency ρ = 1 Hz (see Fig. 2B). It was used by Wang et al. (2005) and is characterized as follows: a post-pre pair with a delay of Δt1 = t_post,1 − t_pre,1 < 0 is followed after a time T by a pre-post pair with a delay of Δt2 = t_post,2 − t_pre,2 > 0. When T is negative, the opposite happens: a pre-post pair (Δt2 = t_post,2 − t_pre,2 > 0) is followed by a post-pre pair (Δt1 = t_post,1 − t_pre,1 < 0). Formally, T is defined by T = (t_pre,2 + t_post,2)/2 − (t_pre,1 + t_post,1)/2. Throughout this paper, we took Δt = −Δt1 = Δt2 = 5 ms.

Poisson protocol. The presynaptic and postsynaptic spike trains are Poisson spike trains with firing rates ρx and ρy, respectively. The interest of such a protocol is that it makes it possible to establish a link with the BCM learning rule (Bienenstock et al., 1982), which has attractive theoretical properties. Indeed, this learning rule was originally used to explain the emergence of orientation selectivity in the visual cortex. Even if this protocol has less experimental support than the other protocols, some aspects of it have been indirectly measured in the visual cortex (Kirkwood et al., 1996) and in hippocampal slices (Artola et al., 1990; Dudek and Bear, 1992).

Data fitting. To fit the amplitude parameters A2+, A2−, A3+, and A3− and the time constants τx and τy (τ+ = 16.8 ms and τ− = 33.7 ms are kept fixed), we calculated the total weight change Δw_i^mod for a given pairing or triplet protocol and compared it to the experimental value Δw_i^exp. For the optimization of the parameters, we minimized the normalized mean-square error E, defined by the following:

E = \frac{1}{P} \sum_{i=1}^{P} \left( \frac{\Delta w_i^{exp} - \Delta w_i^{mod}}{\sigma_i} \right)^2, \tag{5}

where Δw_i^exp and σ_i are the experimental mean weight change and the SEM of the weight change for a given data point i, and P is the number of data points within a data set: P = 10 for the visual cortex data set (Table 1) and P = 13 for the hippocampal culture data set (Table 2). Δw_i^mod is the weight change predicted by a given model (pair or triplet model).

Numerical procedures. The weight change Δw^mod for a given model and a given protocol can be either simulated numerically with Equations 1-4 or calculated analytically. See the supplemental material for an example of the analytical calculation of the weight change of the triplet model applied to the pairing protocol with Nearest-Spike interactions. In the present study, the weight changes predicted by all of the different models (pair-based models, minimal and full triplet-based models, each with Nearest-Spike and All-to-All interactions) were calculated analytically and then evaluated numerically with MatLab (MathWorks, Natick, MA) on a Sun machine. The normalized mean-square error E of Equation 5 was minimized with the MatLab built-in function lsqnonlin, which uses a reflective Newton method.

Results
Standard pair-based STDP models fail to reproduce frequency effects
In a first series of experiments, we applied a classical pair-based STDP learning rule (Eqs. 3 and 4 with A3+ = 0 and A3− = 0) to the pairing protocol with 60 pairs of presynaptic and postsynaptic spikes (see Materials and Methods). Obviously, the weight change predicted by the model depends on the precise choice of the parameters A2+, A2−, τ+, and τ−. We therefore set those parameters in such a way that the normalized mean-square error E across all experimental protocols is minimal (see Eq. 5).
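The error function of Eq. 5 is straightforward to evaluate. The sketch below (Python; function and argument names are our own) computes it for a list of data points; this is the quantity that MATLAB's lsqnonlin, or an equivalent such as scipy.optimize.least_squares, would minimize over the amplitude and time-constant parameters.

```python
def normalized_error(dw_exp, dw_mod, sem):
    """Normalized mean-square error of Eq. 5.

    dw_exp: experimental mean weight changes, one per data point
    dw_mod: model-predicted weight changes for the same points
    sem:    standard error of the mean of each experimental point
    """
    assert len(dw_exp) == len(dw_mod) == len(sem)
    # each residual is weighted by the SEM of its data point before squaring
    residuals = [(e - m) / s for e, m, s in zip(dw_exp, dw_mod, sem)]
    return sum(r * r for r in residuals) / len(residuals)

# A perfect fit gives E = 0; a model off by one SEM on every point gives E = 1.
E = normalized_error([0.25, -0.10], [0.25, -0.10], [0.05, 0.04])
```

Weighting by the SEM means that precisely measured data points constrain the fit more strongly than noisy ones, so E is directly comparable across data sets with different numbers of points.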
We found that even with the best set of parameters, the classical STDP model fails, for both the All-to-All interaction and the Nearest-Spike interaction, to reproduce the experimental data (Fig. 2A). This is attributable to the following reasons. First, as pointed out by Sjöström et al. (2001), a surprising aspect of their findings is that at low repetition frequency ρ there is no potentiation. This cannot be captured by standard pair-based STDP models, because for any choice of the parameter A2+ > 0, the pair-based model induces LTP whenever a presynaptic spike precedes a postsynaptic one by a few milliseconds. Second, as we can see in Figure 2, for Δt > 0, potentiation increases as the frequency increases. This behavior can also not be reproduced by classical STDP models. Indeed, in pair-based STDP models, as soon as the frequency increases, the pre-post pairs approach each other and generate an interaction between the postsynaptic spike of one pair and the presynaptic spike of the next pair. The effect of these post-pre pairs should increase with frequency and therefore depress the synapse, which is not what is seen in the experiments. Therefore, classical pair-based models fail to reproduce the pairing experiment of Sjöström et al. (2001). It should be noted that the absence of potentiation at low frequency is in direct conflict with the results of Bi and Poo (1998), Zhang et al. (1998), and Froemke and Dan (2002), in which there is reasonable potentiation at low frequency. Because the preparation of Sjöström et al. (2001) is different from those of Bi and Poo (1998) and Wang et al. (2005) and the results are in conflict, it seems natural to use different parameters in our model for each data set.

Figure 2. Failure of pair-based STDP learning rules. In all four subgraphs, black lines or symbols denote experimental data, blue lines correspond to the All-to-All pair model, and red lines correspond to the Nearest-Spike pair model (see Results for details).
A, Weight change in a pairing protocol as a function of the frequency ρ (solid lines, Δt = 10 ms; dashed lines, Δt = −10 ms). Black lines and data points (with error bars) are redrawn from Sjöström et al. (2001). The experimental data are reproduced at neither high nor low values of the repetition frequency. B, Quadruplet protocol. Black circles are redrawn from Wang et al. (2005). C, D, Triplet protocol for the pre-post-pre case (C) and the post-pre-post case (D). The black dots in B and the black bars (and SEs) in C and D are redrawn from Wang et al. (2005). The asymmetry of the experimental results [no potentiation for (Δt1, Δt2) = (5 ms, −5 ms) in C but strong potentiation for (−5 ms, 5 ms) in D] is not captured by the pair-based models.

Standard pair-based STDP models fail to reproduce triplet and quadruplet experiments
A second set of evidence for the limits of pair-based STDP learning rules is the following. In triplet experiments (Fig. 2C,D), there is a clear asymmetry between a pre-post-pre and a post-pre-post experiment. For example, 60 repetitions of a pre-post-pre triplet with relative timing (Δt1, Δt2) = (5 ms, −5 ms) yield no weight change, whereas the same number of repetitions of a post-pre-post triplet with (Δt1, Δt2) = (−5 ms, 5 ms) yields a weight change of 30%. However, any pair-based model would predict the same result for pre-post-pre and post-pre-post experiments, because the same pairs occur in both. Therefore, the triplet results cannot be explained by a sum of a pre-post potentiation term and a post-pre depression term (Fig. 2C,D). Finally, the asymmetry present in the quadruplet experiments (Fig. 2B) also causes problems for pair-based STDP models. A quadruplet consists of a pre-post-post-pre sequence or a post-pre-pre-post sequence, and T denotes the interval between the first and last pair of spikes within the quadruplet (see Materials and Methods for more details).
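The triplet symmetry argument can be checked directly: summing an exponential pair-based window over all spike pairs gives exactly the same total for a pre-post-pre triplet at (Δt1, Δt2) = (5 ms, −5 ms) as for a post-pre-post triplet at (−5 ms, 5 ms). A minimal sketch in Python (the amplitudes are arbitrary illustrative values, not fitted parameters):

```python
import math

TAU_P, TAU_M = 16.8, 33.7   # ms, window time constants (Bi and Poo, 2001)
A2P, A2M = 1.0, 0.5         # arbitrary pair amplitudes for illustration

def pair_dw(pre, post):
    """All-to-All pair-based STDP: sum the exponential learning window
    over every (pre, post) spike pair."""
    dw = 0.0
    for tp in pre:
        for to in post:
            d = to - tp  # post minus pre
            if d > 0:
                dw += A2P * math.exp(-d / TAU_P)   # pre -> post: LTP
            elif d < 0:
                dw -= A2M * math.exp(d / TAU_M)    # post -> pre: LTD
    return dw

# pre-post-pre: pre at 0 and 10 ms, post at 5 ms -> (dt1, dt2) = (5, -5) ms
dw_prepostpre = pair_dw([0.0, 10.0], [5.0])
# post-pre-post: post at 0 and 10 ms, pre at 5 ms -> (dt1, dt2) = (-5, 5) ms
dw_postprepost = pair_dw([5.0], [0.0, 10.0])
```

Both triplets contain exactly one pre-post pair at +5 ms and one post-pre pair at −5 ms, so the pair-based prediction is identical for the two protocols, whereas the experiments find no change for the former and roughly 30% potentiation for the latter.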
In a pair-based model with All-to-All interactions and for a given interval T between the pairs, the weight changes for post-pre-pre-post and pre-post-post-pre are strictly identical because of the symmetry of the protocol and the symmetry of the All-to-All interaction. The weight change predicted by a pair model can therefore not explain the asymmetry seen in the data. With Nearest-Spike interactions, the situation gets even worse: pre-post-post-pre quadruplets consist of two pre-post pairs and one post-pre pair, whereas in the post-pre-pre-post case the opposite occurs: two post-pre pairs and only one pre-post pair. Therefore, the Nearest-Spike

interaction scheme leads to an asymmetry that is opposite to the one found in the experiments (Fig. 2B).

Triplet rule
So far, we have shown that standard pair-based STDP models fail to reproduce the frequency effects of the pairing protocol as well as the triplet and quadruplet experiments. This is mainly because pair-based models are intrinsically symmetric, in the sense that they predict the same weight change for a pre-post pair followed by a post-pre pair with the same delay Δt as for the inverted order [i.e., a post-pre pair followed by a pre-post pair (with the same delay Δt)]. However, there is no a priori reason to think that a pre-post-pre and a post-pre-post triplet should give the same result, because they will activate different presynaptic and postsynaptic pathways. We therefore included extra terms in the learning rule to break the symmetry of pair-based models. Specifically, we added a triplet depression term (i.e., a 2-pre-1-post term) as well as a triplet potentiation term (i.e., a 1-pre-2-post term) (see Materials and Methods for more details). We call this model a full triplet model, because it includes both pair terms and triplet terms. The full triplet model is described by eight parameters: four amplitude parameters A2+, A2−, A3+, and A3− and four time constants τ+, τ−, τx, and τy. Note that pair-based models are described by four parameters (A2+, A2−, τ+, and τ−). In analogy to our approach in the previous subsection, we applied our triplet model to the protocols described in Materials and Methods. More precisely, we calculated analytically for each protocol the weight change predicted by our triplet learning rule (see the supplemental material for an example of an explicit expression of the weight change). As before, we want our triplet learning rule to fit the experimental data of Sjöström et al. (2001) or Wang et al. (2005) as closely as possible. We therefore minimized the normalized mean-square error across all data points of a given data set (Table 1 or 2) by adjusting the eight parameters mentioned above. The resulting parameters are summarized in Tables 3 and 4.

Table 3. Visual cortex data set. Fitted amplitudes (A2+, A2−, A3+, A3−), time constants τx and τy, and fitting error E for the full and minimal triplet models, each with All-to-All and Nearest-Spike interactions.

Table 4. Hippocampal culture data set. List of parameters used to model the hippocampal culture data set; in this table, the terms "full" and "min." denote the full triplet model and the minimal triplet model, respectively. The additional parameters τ+ = 16.8 ms and τ− = 33.7 ms are taken from Bi and Poo (2001) and kept fixed for all models and data sets. In some cases, parentheses are added in the τx column to indicate that the error function is insensitive to the exact value of τx. The last column corresponds to the fitting error given by Equation 5 and plotted in Figure 6.

As a first test of the triplet learning rule, we checked whether it can reproduce the biphasic learning window observed by Bi and Poo (1998). Our triplet learning rule succeeds in reproducing the classical STDP learning window (Fig. 3), because the triplet terms specific to our model play a minor role at a fixed low frequency.

Triplet learning rules can reproduce frequency effects
In this section, we study the pairing protocol used by Sjöström et al. (2001) in visual cortex (i.e., we apply 60 pairs of presynaptic and postsynaptic spikes at a given frequency ρ). As shown in Figure 4A, our full triplet learning rule succeeds in reproducing the frequency effects of the pairing protocol. Indeed, the two main problems that the pair-based STDP models have with the pairing protocol, as explained in the previous section, are solved by the triplet model for the following reasons.
First, the absence of potentiation at low frequency is achieved by setting A2+ to a low value; second, the increase of potentiation with frequency is implemented via the triplet potentiation term controlled by A3+, which has a stronger effect than the triplet depression term controlled by A3− (Table 3). Thus, our model can explain results at different frequencies without the explicit "potentiation wins" mechanism suggested previously (Sjöström et al., 2001; Wang et al., 2005). Because some of the optimized parameters of the triplet learning rule have values close to zero, we concluded that the terms controlled by these parameters can be neglected. This allowed us to define a minimal triplet model with fewer parameters. The first parameter we can easily drop is the amplitude A2+ of the pair potentiation term, because it is extremely small in both the All-to-All and Nearest-Spike interaction schemes (Table 3). The second parameter we neglect is A3−. This is possible for the following reason: in the All-to-All interaction scheme, we have A3− ≪ A2−, so the effect of the triplet depression term is negligible compared with the depression induced by spike pairs. Results with the minimal triplet model show good agreement with the experimental data (Fig. 5A). Hence, the minimal model with five parameters can explain the visual cortex data that the classical pair-based STDP model with four parameters fails to explain.

Figure 3. The triplet learning rule can reproduce the STDP learning window. Weight change induced by a repetition of 60 pairs of presynaptic and postsynaptic spikes with a delay of Δt at a repetition frequency of 1 Hz, shown as a function of the time difference between postsynaptic and presynaptic spike timing for the full triplet model (A) and the minimal triplet model (B). The parameters used for the triplet models are those that correspond to the hippocampal culture data (Tables 3, 4). Experimental data points and SEs are redrawn from Wang et al. (2005).
Triplet learning rules can reproduce triplet and quadruplet experiments
By following the same procedure as the one described in the previous paragraph, we applied our full triplet learning rule to the second set of data (i.e., the hippocampal culture data set) (Bi and Poo, 1998, 2001; Wang et al., 2005). The parameters resulting from the minimization of the normalized mean square error

across the pair, triplet, and quadruplet data are summarized in Table 3. Our triplet learning rule not only reproduces the classical STDP learning window (Fig. 3) but also captures the results of most of the triplet and quadruplet experiments (Fig. 4B-D). For example, the asymmetry between the pre-post-pre [(Δt1, Δt2) = (5 ms, −5 ms)] and the post-pre-post [(Δt1, Δt2) = (−5 ms, 5 ms)] triplets is well captured by our model.

Figure 4. The full triplet learning rule succeeds in reproducing the pairing experiment and most of the triplet and quadruplet experiments. In all four subgraphs, the black lines and circles denote experimental data, the blue lines correspond to the triplet model with All-to-All interactions, and the red lines correspond to the triplet model with Nearest-Spike interactions. A, Weight change in a pairing protocol as a function of the frequency ρ (solid lines, Δt = +10 ms; dashed lines, Δt = −10 ms). The black lines and data points (with errors) are redrawn from Sjöström et al. (2001). B, Quadruplet protocol. The black circles are redrawn from Wang et al. (2005). C, D, Triplet protocol for the pre-post-pre case (C) and the post-pre-post case (D). The black dots in B and black bars (and SEs) in C and D are redrawn from Wang et al. (2005). The triplet-based models succeed in reproducing the asymmetry of the triplet protocols [no potentiation for (Δt1, Δt2) = (5 ms, −5 ms) in C and strong potentiation for (−5 ms, 5 ms) in D]: for those triplets, the model results (with All-to-All interactions) are within 1.1 SEM of the experimental data, whereas the results of the pair-based models are off by 4 SEM.
For those two specific triplet protocols, the predicted weight change of the full triplet learning rule with All-to-All interactions is within 1.1 SEM of the experimental mean weight change, whereas the pair-based learning predictions are off by 4 SEM. We should, however, note that even if our triplet learning rule captures most of the triplet experiments, the fit is not perfect. For example, the pre-post-pre triplet experiment with (Δt1, Δt2) = (5 ms, −15 ms) is not well reproduced by our triplet learning rule (Figs. 4C, 5C). With arguments similar to those applied above to the visual cortex data set, it is possible to reduce the complexity of the model of the hippocampal data set. Specifically, we have set A3− = 0 as done previously. However, in contrast to the above minimal model for the visual cortex data, the pair term controlled by A2+ is kept as part of the model, because it is necessary to explain the potentiation at 1 Hz repetition frequency. The resulting weight change of the minimal model applied to the triplet and quadruplet experiments is depicted in Figure 5B-D. We emphasize that the minimal model for the hippocampal data is different from the one used for the visual cortex data.

Figure 5. Minimal triplet learning rules are almost as good as full triplet learning rules. In all four subgraphs, black lines or circles denote experimental data, blue lines correspond to the minimal triplet model with All-to-All interactions, and red lines correspond to the minimal triplet model with Nearest-Spike interactions. A, Weight change in a pairing protocol as a function of the frequency ρ (solid lines, Δt = +10 ms; dashed lines, Δt = −10 ms). Black lines and data points (with errors) are redrawn from Sjöström et al. (2001). B, Quadruplet protocol. Black circles are redrawn from Wang et al. (2005). C, D, Triplet protocol for the pre-post-pre case (C) and the post-pre-post case (D). Black dots in B and black bars (and SEs) in C and D are redrawn from Wang et al. (2005).
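Because the contributions of one isolated triplet can be written in closed form, the source of the asymmetry is easy to check by hand. In the sketch below, τ+ and τ− follow the text, while the amplitudes and τy are illustrative assumptions (with A3− = 0, as in the minimal model): the second postsynaptic spike of a post-pre-post triplet sees a nonzero slow trace o2, so its potentiation is boosted by the triplet term, whereas the single postsynaptic spike of a pre-post-pre triplet sees o2 = 0.

```python
from math import exp

# Illustrative amplitudes (assumptions, not the fitted values of Table 3);
# tau+ = 16.8 ms and tau- = 33.7 ms as in the text, tau_y is assumed.
A2P, A2M, A3P = 5e-3, 5e-3, 7e-3
TAU_P, TAU_M, TAU_Y = 16.8, 33.7, 50.0

# post-pre-post, (dt1, dt2) = (-5 ms, +5 ms): spikes post(0), pre(5), post(10).
dep = A2M * exp(-5 / TAU_M)                              # pre(5) vs post(0)
pot = exp(-5 / TAU_P) * (A2P + A3P * exp(-10 / TAU_Y))   # post(10): o2 > 0 boosts it
post_pre_post = pot - dep

# pre-post-pre, (dt1, dt2) = (+5 ms, -5 ms): spikes pre(0), post(5), pre(10).
pot2 = exp(-5 / TAU_P) * A2P         # post(5): o2 = 0, pair term only
dep2 = A2M * exp(-5 / TAU_M)         # pre(10) vs post(5)
pre_post_pre = pot2 - dep2
```

With these numbers the post-pre-post triplet nets clear potentiation, whereas the pre-post-pre triplet nets roughly zero or slight depression, mirroring the experimental asymmetry in Figure 4, C and D.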
To compare the pair models with the minimal and full triplet models, we plotted the fitting error given by Equation 5 as a function of the number of parameters in the model (Fig. 6). The best models are those that predict the experimental data as well as possible while being as simple as possible (i.e., having as few parameters as possible). In this sense, the minimal models are the best, because they perform almost as well as the full triplet models while having only one extra parameter compared with standard pair-based models (two extra parameters for the hippocampal culture data set). Finally, for future tests of the triplet models, we propose two new protocols that have not yet been used experimentally. The first protocol consists of pre-post-pre triplets with relative timing (Δt1, Δt2) = (5 ms, −5 ms), and the second protocol consists of post-pre-post triplets with relative timing (Δt1, Δt2) = (−5 ms, 5 ms). Triplets are repeated 60 times at different frequencies. Figure 6, C and D, depicts the weight change predicted by the minimal triplet models (with All-to-All and Nearest-Spike interactions) for the two triplet protocols. The models predict a frequency dependence with a positive slope. However, the overall level of potentiation predicted by the All-to-All model is clearly different from that of the Nearest-Spike interaction model. Thus,

the above experimental protocol would allow one to test the triplet models and to distinguish between the two interaction schemes.

Figure 6. A, B, Comparison between the pair and triplet models. C, D, Predictions of the triplet models. A, Fitting error (compare Eq. 5) for the visual cortex data set of Sjöström et al. (2001) as a function of the number of parameters in the model. The minimal model has only one extra parameter compared with a pair-based model but performs 20 times better. B, Fitting error for the hippocampal data set of Wang et al. (2005). C, Predicted weight change (visual cortex) for the triplet protocol [solid lines, pre-post-pre with (Δt1, Δt2) = (5, −5) ms; dashed lines, post-pre-post with (Δt1, Δt2) = (−5, 5) ms] with All-to-All interactions (blue lines) and with Nearest-Spike interactions (red lines). D, Same as in C but for the hippocampal culture data set. Black bars correspond to the experimental results also present in subplots C and D of Figures 2, 4, and 5.

Triplet learning rule can be mapped to the BCM learning rule
Functional consequences of our new triplet model can be studied in two different ways (i.e., analytically or by numerical simulations). We used a combination of the two and proceeded as follows. First, we show analytically a close analogy ("mapping") between our triplet model and the traditional BCM theory. As a result of this mapping, we may conclude that, under random spike arrival with rates ρx, our triplet model behaves as a BCM model and inherits all of its functional properties. In particular, we expect our triplet model to exhibit synaptic competition leading to input selectivity, as required for receptive field development. In a second step, we tested this prediction of input selectivity by numerical simulation. First, we show that, unlike standard pair-based STDP learning rules, our triplet learning rule can be mapped to the BCM learning rule.
If we assume that the presynaptic and postsynaptic spike trains have Poisson statistics with firing rates ρx and ρy, respectively, the expected weight change can be calculated analytically. Intuitively, we may expect that a triplet term with one presynaptic and two postsynaptic spikes leads to a weight change that is proportional to the presynaptic rate and to the square of the postsynaptic rate. An analogous argument holds for the other terms. Indeed, a detailed calculation for the All-to-All triplet learning rule based on Equations 1-4 yields an expected weight change as follows:

⟨dw/dt⟩ = −ρx ρy (A2− τ− + A3− τ− τx ρx) + ρx ρy (A2+ τ+ + A3+ τ+ τy ρy)   (6)

Figure 7A depicts the expected weight change of Equation 6 as a function of the postsynaptic frequency ρy.

Figure 7. The triplet learning rule can be mapped to a BCM learning rule. A, Instantaneous weight change as a function of the postsynaptic frequency ρy for a minimal triplet model (compare Eq. 6 with A3− = 0). The presynaptic and postsynaptic spike trains are Poisson spike trains. The lines correspond to different values of the ratio ⟨ρy^p⟩/ρ0^p; the solid line corresponds to 1. B, Energy landscape produced by the minimal triplet learning rule (with p = 2 and ρ0 = 10 Hz) in a two-input environment: ρx(1) = (10 Hz, 0)^T and ρx(2) = (0, 10 Hz)^T. The presence of two specialized (and stable) fixed points as well as two unspecialized (and unstable) fixed points is an essential feature of the BCM learning rule. C, Gaussian stimulation profile across 100 presynaptic neurons. The center of the Gaussian is shifted randomly every 200 ms to one of 10 random positions. Periodic boundary conditions are assumed. D, Evolution of the 100 weights as a function of time under the stimulation described in C. After 1 min of stimulation, the postsynaptic neuron becomes sensitive to a stimulation centered around the 70th presynaptic neuron. The parameters taken in the minimal model are those that correspond to the visual cortex (compare Tables 3, 4).
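Under this mapping, the minimal rule (A3− = 0, p = 2) behaves, up to constant factors, like the BCM update dw/dt ∝ ρx ρy (ρy − θ) with a sliding threshold θ ∝ ⟨ρy²⟩. A minimal rate-based sketch of the two-input environment of Figure 7B follows; the learning rate, threshold time constant, and initial conditions are assumptions for illustration, not the paper's simulation parameters.

```python
import numpy as np

rho0 = 10.0        # reference rate (Hz) entering the sliding threshold
eta = 2e-5         # learning rate (assumed for illustration)
tau_theta = 50.0   # threshold averaging time constant, in presentation steps
patterns = np.array([[10.0, 0.0], [0.0, 10.0]])  # the two input rate vectors (Hz)

w = np.array([0.6, 0.4])  # slightly asymmetric start (a symmetric start is a saddle)
theta = 1.0               # initial sliding threshold

for step in range(200_000):
    x = patterns[step % 2]                        # alternate the two patterns
    y = max(float(w @ x), 0.0)                    # rectified postsynaptic rate
    theta += (y**2 / rho0 - theta) / tau_theta    # theta tracks <y^2>/rho0 (p = 2)
    w += eta * x * y * (y - theta)                # BCM drift: x * y * (y - theta)
    w = np.maximum(w, 0.0)                        # keep weights non-negative

responses = [max(float(w @ p), 0.0) for p in patterns]
```

Starting from the slightly asymmetric weight vector, the dynamics converge to one of the two selective fixed points: the neuron ends up responding strongly to one pattern and ignoring the other, the behavior illustrated in Figure 7B.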
The above weight dynamics can be written as a BCM learning rule. Indeed, the BCM theory requires first that the weight change can be written as dw/dt = φ(ρy, θ) ρx, where the function φ is such that φ(ρy, θ) < 0 for ρy < θ, φ(ρy, θ) > 0 for ρy > θ, and φ(0, θ) = 0. Our Equation 6 satisfies this condition if A3− = 0, as is the case for our minimal triplet models. The second requirement is that the threshold θ between potentiation and depression is proportional to the expectation of the p-th power of the postsynaptic firing rate, i.e., θ ∝ ⟨ρy^p⟩, where p > 1 (Bienenstock et al., 1982; Intrator and Cooper, 1992). This second requirement can be fulfilled if the parameters A2+ and A2− depend on the mean firing rate (or powers thereof) of the postsynaptic neuron. Specifically, we set A2+ → A2+ ⟨ρy^p⟩/ρ0^p as well as A2− → A2− ⟨ρy^p⟩/ρ0^p. By doing so, the threshold becomes θ = ⟨ρy^p⟩ (A2− τ− − A2+ τ+)/(ρ0^p A3+ τ+ τy). Strictly speaking, ⟨ρy^p⟩ corresponds to the expectation, over the input statistics, of the p-th power of the postsynaptic firing rate. Practically, this quantity can be evaluated on-line by low-pass filtering ρy^p with a time constant of the order of 10 min or more. With this range of time scales, ⟨ρy^p⟩ can be considered constant (i.e., ⟨ρy^p⟩ ≈ ρ0^p) over the duration of the pairing, triplet, and

quadruplet protocols we used in this study. As an aside, we note that with Nearest-Spike interactions, our triplet learning rule can almost (but not strictly) be mapped to a BCM learning rule. Because the triplet rule shares properties with the BCM theory, we expect that it generates input selectivity if a neuron receives a large number of inputs. The development of input selectivity is thought to be an important property to account for receptive field development. For a numerical illustration of the input selectivity property of the triplet learning rule, we simulated the following scenario. We assume that our model neuron receives 100 afferents (1 ≤ i ≤ 100), which are stimulated with Gaussian profiles ρi = 1 Hz + 50 Hz · exp[−(i − μ)²/(2σ²)], i = 1, ..., 100, the center μ of which is shifted randomly every 200 ms (Fig. 7C) over 10 possible positions. Presynaptic spikes are generated at times t_i^f with rate ρi. Each presynaptic spike generates an exponential postsynaptic potential with decay time constant τm = 10 ms, so that the total potential is u(t) = Σ_i w_i Σ_{t_i^f < t} ε0 exp[−(t − t_i^f)/τm], where ε0 = 1 mV. The postsynaptic firing rate increases with the membrane potential according to ρpost = 1 Hz + g·u, where g = 10 Hz/mV. The neuron is stimulated over 60 s, whereas synapses change according to our triplet learning rule. As we can see in Figure 7D, the neuron automatically becomes specialized to one of the 10 input patterns (i.e., the one with μ = 70). In other words, learning leads to input selectivity, a necessary property for receptive field development. It is interesting to note that the dynamics of Equation 6 can be seen as a gradient ascent of an objective function L (i.e., dw/dt = ∂L/∂w). Let p = 2, α = A3+ τ+ τy, and β = A2− τ− − A2+ τ+. L can then be written as L = (α/3)⟨ρy³⟩ − (β/(4ρ0²))⟨ρy²⟩². If the model neuron has only two input afferents and, hence, only two synapses, this objective function L (or energy landscape) (Fig.
7B) exhibits two selective fixed points, which correspond to the two maxima of the function L. The first maximum is at w1 = 1 and w2 = 0, and the second is at w1 = 0 and w2 = 1. Therefore, the pattern of synaptic weights corresponds to input selectivity (i.e., the neuron is sensitive to only one of the two inputs). Thus, the objective function can be used for a mathematical demonstration of input selectivity. Note that the objective function exists only if we assume that θ is a function of ⟨ρy^p⟩ and not of ⟨ρy⟩^p (see also Cooper et al., 2004).

Discussion
In this paper, we first showed the limitations of the standard pair-based STDP models in terms of predicting the outcome of several spike timing-based protocols. We then showed that a triplet learning rule is more suitable to reproduce those experimental protocols, namely, the frequency dependence of the pairing protocol as well as the triplet and quadruplet protocols. Finally, we showed the link between our triplet learning rule and the BCM learning rule. We found it noteworthy and somewhat unexpected that our detailed modeling of the frequency dependence of pair-based protocols and of the asymmetries in triplet protocols should lead, under the assumption of Poisson spike trains, to a known theoretical rule with well characterized features. Throughout this paper, we compared All-to-All interactions versus Nearest-Spike interactions for pair-based as well as triplet-based models. Although Nearest-Spike interactions induce some potentially interesting nonlinearities in pair-based models (van Rossum et al., 2000; Izhikevich and Desai, 2003; Burkitt et al., 2004) (especially in the Poisson protocol), it is not possible to make a strict mapping of Nearest-Spike interaction models to the BCM rule, and, more importantly, pair-based models with Nearest-Spike interactions fail to reproduce the correct frequency dependence in the pairing protocol as well as the triplet and quadruplet experiments.
Limitations
Even if our triplet model can capture most of the triplet and quadruplet experiments, it is necessary to keep in mind the kinds of experiments this model cannot reproduce. Because our model predicts weight changes as a function of spike timing only, it fails to make any kind of inference for experiments that explicitly manipulate other biophysical parameters, such as the Ca2+ concentration or the postsynaptic membrane potential. We nevertheless think that this approach is interesting, because, in one way or another, those biophysical parameters depend on the timing of the presynaptic and postsynaptic spikes. For example, the calcium concentration depends on the timing of the postsynaptic spike (via the back-propagating action potential) and of the presynaptic spike (via voltage-gated calcium channels and NMDA channels). Recent experiments (Froemke et al., 2005) show that the shape of the depression part of the learning window depends on the position of the synapse on the dendritic tree. Although our model does not include such geometrical properties, it is possible to account for the position of the synapse by explicitly changing the time constant τ− (characterizing the LTD part of the learning window) as a function of the distance between the synapse and the soma. Even in the context of typical STDP experiments, some aspects are not covered by our model. In most STDP experiments, plasticity is induced by a repetition of a fixed number of pairs of presynaptic and postsynaptic spikes. Clearly, the amount of plasticity depends on the number of pairs. In fact, the amount of potentiation increases with the number of pairs of presynaptic and postsynaptic spikes and saturates at a given value (Senn et al., 2001; Froemke et al., 2006). This saturation is not taken into account in our present model, because the weight dependence is not modeled explicitly.
The dependence on the weights can easily be added to the triplet models (the parameters A2+, A2−, A3+, and A3− could also depend on w). Even if we have some indications (Bi and Poo, 2001; Wang et al., 2005) of how synapses change as a function of w, more experimental data are clearly needed to determine the correct weight dependence. It should be noted that if we add the dependence on the weight, there would no longer be an unambiguous mapping to the BCM theory.

Alternative interpretations of the experimental data
The goal of this study was to go as far as possible in the prediction of the weight change with only spike timing and no other neuronal variables or mechanisms. It is interesting to note that our triplet learning rule can reproduce both the results that have been explained by a postsynaptic potential effect (Sjöström et al., 2001) and those explained by a suppression effect (Wang et al., 2005). In Sjöström's experiment, the increased potentiation at high frequency is explained by the increased membrane potential caused by the accumulation of presynaptic inputs, whereas in our model the increased potentiation is attributable to the increase of the postsynaptic variable o2. Combined with a suitable neuron model, an increased frequency would of course yield a higher potential on average. Wang et al. (2005) interpreted their triplet and quadruplet experiments as the result of a suppression mechanism (i.e., if a pre-post pair is followed by a post-pre pair, the later depression term suppresses the earlier potentiation term, and not the other way around). This phenomenon is captured in our framework by the extra potentiation attributable to the 1-pre-2-post triplet term.
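As a purely illustrative sketch of how such a weight dependence could enter the model, one option is to give the amplitudes soft bounds; the functional form, the exponent mu, and the bound w_max below are assumptions, not constrained by the data discussed in the text.

```python
def a2_plus(w, a0=5e-3, w_max=1.0, mu=1.0):
    """Soft-bound pair potentiation amplitude: vanishes as w approaches w_max.
    a0, w_max, and mu are hypothetical illustration parameters."""
    return a0 * max(w_max - w, 0.0) ** mu

def a2_minus(w, a0=5e-3, mu=1.0):
    """Soft-bound pair depression amplitude: vanishes as w approaches 0."""
    return a0 * max(w, 0.0) ** mu
```

With amplitudes of this kind, repeated pairing saturates as w approaches its bound, in the spirit of the saturation reported by Senn et al. (2001) and Froemke et al. (2006); as noted in the text, the mapping to BCM theory is then no longer unambiguous.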


More information

An Efficient Method for Computing Synaptic Conductances Based on a Kinetic Model of Receptor Binding

An Efficient Method for Computing Synaptic Conductances Based on a Kinetic Model of Receptor Binding NOTE Communicated by Michael Hines An Efficient Method for Computing Synaptic Conductances Based on a Kinetic Model of Receptor Binding A. Destexhe Z. F. Mainen T. J. Sejnowski The Howard Hughes Medical

More information

Effects of Interactive Function Forms and Refractoryperiod in a Self-Organized Critical Model Based on Neural Networks

Effects of Interactive Function Forms and Refractoryperiod in a Self-Organized Critical Model Based on Neural Networks Commun. Theor. Phys. (Beijing, China) 42 (2004) pp. 121 125 c International Academic Publishers Vol. 42, No. 1, July 15, 2004 Effects of Interactive Function Forms and Refractoryperiod in a Self-Organized

More information

IN THIS turorial paper we exploit the relationship between

IN THIS turorial paper we exploit the relationship between 508 IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 10, NO. 3, MAY 1999 Weakly Pulse-Coupled Oscillators, FM Interactions, Synchronization, Oscillatory Associative Memory Eugene M. Izhikevich Abstract We study

More information

Fast neural network simulations with population density methods

Fast neural network simulations with population density methods Fast neural network simulations with population density methods Duane Q. Nykamp a,1 Daniel Tranchina b,a,c,2 a Courant Institute of Mathematical Science b Department of Biology c Center for Neural Science

More information

DISCRETE EVENT SIMULATION IN THE NEURON ENVIRONMENT

DISCRETE EVENT SIMULATION IN THE NEURON ENVIRONMENT Hines and Carnevale: Discrete event simulation in the NEURON environment Page 1 Preprint of a manuscript that will be published in Neurocomputing. DISCRETE EVENT SIMULATION IN THE NEURON ENVIRONMENT Abstract

More information

Structure and Measurement of the brain lecture notes

Structure and Measurement of the brain lecture notes Structure and Measurement of the brain lecture notes Marty Sereno 2009/2010!"#$%&'(&#)*%$#&+,'-&.)"/*"&.*)*-'(0&1223 Neurons and Models Lecture 1 Topics Membrane (Nernst) Potential Action potential/voltage-gated

More information

Nervous Tissue. Neurons Electrochemical Gradient Propagation & Transduction Neurotransmitters Temporal & Spatial Summation

Nervous Tissue. Neurons Electrochemical Gradient Propagation & Transduction Neurotransmitters Temporal & Spatial Summation Nervous Tissue Neurons Electrochemical Gradient Propagation & Transduction Neurotransmitters Temporal & Spatial Summation What is the function of nervous tissue? Maintain homeostasis & respond to stimuli

More information

Matthieu Gilson Anthony N. Burkitt David B. Grayden Doreen A. Thomas J. Leo van Hemmen

Matthieu Gilson Anthony N. Burkitt David B. Grayden Doreen A. Thomas J. Leo van Hemmen Biol Cybern (21) 13:365 386 DOI 1.17/s422-1-45-7 ORIGINAL PAPER Emergence of network structure due to spike-timing-dependent plasticity in recurrent neuronal networks V: self-organization schemes and weight

More information

What Can a Neuron Learn with Spike-Timing-Dependent Plasticity?

What Can a Neuron Learn with Spike-Timing-Dependent Plasticity? LETTER Communicated by Wulfram Gerstner What Can a Neuron Learn with Spike-Timing-Dependent Plasticity? Robert Legenstein legi@igi.tugraz.at Christian Naeger naeger@gmx.de Wolfgang Maass maass@igi.tugraz.at

More information

STUDENT PAPER. Santiago Santana University of Illinois, Urbana-Champaign Blue Waters Education Program 736 S. Lombard Oak Park IL, 60304

STUDENT PAPER. Santiago Santana University of Illinois, Urbana-Champaign Blue Waters Education Program 736 S. Lombard Oak Park IL, 60304 STUDENT PAPER Differences between Stochastic and Deterministic Modeling in Real World Systems using the Action Potential of Nerves. Santiago Santana University of Illinois, Urbana-Champaign Blue Waters

More information

Adaptation in the Neural Code of the Retina

Adaptation in the Neural Code of the Retina Adaptation in the Neural Code of the Retina Lens Retina Fovea Optic Nerve Optic Nerve Bottleneck Neurons Information Receptors: 108 95% Optic Nerve 106 5% After Polyak 1941 Visual Cortex ~1010 Mean Intensity

More information

Liquid Computing in a Simplified Model of Cortical Layer IV: Learning to Balance a Ball

Liquid Computing in a Simplified Model of Cortical Layer IV: Learning to Balance a Ball Liquid Computing in a Simplified Model of Cortical Layer IV: Learning to Balance a Ball Dimitri Probst 1,3, Wolfgang Maass 2, Henry Markram 1, and Marc-Oliver Gewaltig 1 1 Blue Brain Project, École Polytechnique

More information

Introduction to Neural Networks U. Minn. Psy 5038 Spring, 1999 Daniel Kersten. Lecture 2a. The Neuron - overview of structure. From Anderson (1995)

Introduction to Neural Networks U. Minn. Psy 5038 Spring, 1999 Daniel Kersten. Lecture 2a. The Neuron - overview of structure. From Anderson (1995) Introduction to Neural Networks U. Minn. Psy 5038 Spring, 1999 Daniel Kersten Lecture 2a The Neuron - overview of structure From Anderson (1995) 2 Lect_2a_Mathematica.nb Basic Structure Information flow:

More information

Tensor Decomposition by Modified BCM Neurons Finds Mixture Means Through Input Triplets

Tensor Decomposition by Modified BCM Neurons Finds Mixture Means Through Input Triplets Tensor Decomposition by Modified BCM Neurons Finds Mixture Means Through Input Triplets Matthew Lawlor Applied Mathematics Yale University New Haven, CT 0650 matthew.lawlor@yale.edu Steven W. Zucer Computer

More information

Exercise 15 : Cable Equation

Exercise 15 : Cable Equation Biophysics of Neural Computation : Introduction to Neuroinformatics WS 2008-2009 Prof. Rodney Douglas, Kevan Martin, Hans Scherberger, Matthew Cook Ass. Frederic Zubler fred@ini.phys.ethz.ch http://www.ini.uzh.ch/

More information

Dynamic Stochastic Synapses as Computational Units

Dynamic Stochastic Synapses as Computational Units LETTER Communicated by Laurence Abbott Dynamic Stochastic Synapses as Computational Units Wolfgang Maass Institute for Theoretical Computer Science, Technische Universität Graz, A 8010 Graz, Austria Anthony

More information

Spike-Based Reinforcement Learning in Continuous State and Action Space: When Policy Gradient Methods Fail

Spike-Based Reinforcement Learning in Continuous State and Action Space: When Policy Gradient Methods Fail Spike-Based Reinforcement Learning in Continuous State and Action Space: When Policy Gradient Methods Fail Eleni Vasilaki 1,2 *, Nicolas Frémaux 1, Robert Urbanczik 3, Walter Senn 3, Wulfram Gerstner 1

More information

How do synapses transform inputs?

How do synapses transform inputs? Neurons to networks How do synapses transform inputs? Excitatory synapse Input spike! Neurotransmitter release binds to/opens Na channels Change in synaptic conductance! Na+ influx E.g. AMA synapse! Depolarization

More information

Effects of Interactive Function Forms in a Self-Organized Critical Model Based on Neural Networks

Effects of Interactive Function Forms in a Self-Organized Critical Model Based on Neural Networks Commun. Theor. Phys. (Beijing, China) 40 (2003) pp. 607 613 c International Academic Publishers Vol. 40, No. 5, November 15, 2003 Effects of Interactive Function Forms in a Self-Organized Critical Model

More information

Synapse Model. Neurotransmitter is released into cleft between axonal button and dendritic spine

Synapse Model. Neurotransmitter is released into cleft between axonal button and dendritic spine Synapse Model Neurotransmitter is released into cleft between axonal button and dendritic spine Binding and unbinding are modeled by first-order kinetics Concentration must exceed receptor affinity 2 MorphSynapse.nb

More information

Lecture 11 : Simple Neuron Models. Dr Eileen Nugent

Lecture 11 : Simple Neuron Models. Dr Eileen Nugent Lecture 11 : Simple Neuron Models Dr Eileen Nugent Reading List Nelson, Biological Physics, Chapter 12 Phillips, PBoC, Chapter 17 Gerstner, Neuronal Dynamics: from single neurons to networks and models

More information

Action Potentials and Synaptic Transmission Physics 171/271

Action Potentials and Synaptic Transmission Physics 171/271 Action Potentials and Synaptic Transmission Physics 171/271 Flavio Fröhlich (flavio@salk.edu) September 27, 2006 In this section, we consider two important aspects concerning the communication between

More information

Subthreshold cross-correlations between cortical neurons: Areference model with static synapses

Subthreshold cross-correlations between cortical neurons: Areference model with static synapses Neurocomputing 65 66 (25) 685 69 www.elsevier.com/locate/neucom Subthreshold cross-correlations between cortical neurons: Areference model with static synapses Ofer Melamed a,b, Gilad Silberberg b, Henry

More information

Neurophysiology. Danil Hammoudi.MD

Neurophysiology. Danil Hammoudi.MD Neurophysiology Danil Hammoudi.MD ACTION POTENTIAL An action potential is a wave of electrical discharge that travels along the membrane of a cell. Action potentials are an essential feature of animal

More information

Effects of synaptic conductance on the voltage distribution and firing rate of spiking neurons

Effects of synaptic conductance on the voltage distribution and firing rate of spiking neurons PHYSICAL REVIEW E 69, 051918 (2004) Effects of synaptic conductance on the voltage distribution and firing rate of spiking neurons Magnus J. E. Richardson* Laboratory of Computational Neuroscience, Brain

More information

R7.3 Receptor Kinetics

R7.3 Receptor Kinetics Chapter 7 9/30/04 R7.3 Receptor Kinetics Professional Reference Shelf Just as enzymes are fundamental to life, so is the living cell s ability to receive and process signals from beyond the cell membrane.

More information

Coherence detection in a spiking neuron via Hebbian learning

Coherence detection in a spiking neuron via Hebbian learning Neurocomputing 44 46 (2002) 133 139 www.elsevier.com/locate/neucom Coherence detection in a spiking neuron via Hebbian learning L. Perrinet, M. Samuelides ONERA-DTIM, 2 Av. E. Belin, BP 4025, 31055 Toulouse,

More information

Frequency Adaptation and Bursting

Frequency Adaptation and Bursting BioE332A Lab 3, 2010 1 Lab 3 January 5, 2010 Frequency Adaptation and Bursting In the last lab, we explored spiking due to sodium channels. In this lab, we explore adaptation and bursting due to potassium

More information

Hierarchy. Will Penny. 24th March Hierarchy. Will Penny. Linear Models. Convergence. Nonlinear Models. References

Hierarchy. Will Penny. 24th March Hierarchy. Will Penny. Linear Models. Convergence. Nonlinear Models. References 24th March 2011 Update Hierarchical Model Rao and Ballard (1999) presented a hierarchical model of visual cortex to show how classical and extra-classical Receptive Field (RF) effects could be explained

More information

Probabilistic Models in Theoretical Neuroscience

Probabilistic Models in Theoretical Neuroscience Probabilistic Models in Theoretical Neuroscience visible unit Boltzmann machine semi-restricted Boltzmann machine restricted Boltzmann machine hidden unit Neural models of probabilistic sampling: introduction

More information

Abstract. Author Summary

Abstract. Author Summary 1 Self-organization of microcircuits in networks of spiking neurons with plastic synapses Gabriel Koch Ocker 1,3, Ashok Litwin-Kumar 2,3,4, Brent Doiron 2,3 1: Department of Neuroscience, University of

More information

How to do backpropagation in a brain. Geoffrey Hinton Canadian Institute for Advanced Research & University of Toronto

How to do backpropagation in a brain. Geoffrey Hinton Canadian Institute for Advanced Research & University of Toronto 1 How to do backpropagation in a brain Geoffrey Hinton Canadian Institute for Advanced Research & University of Toronto What is wrong with back-propagation? It requires labeled training data. (fixed) Almost

More information

LESSON 2.2 WORKBOOK How do our axons transmit electrical signals?

LESSON 2.2 WORKBOOK How do our axons transmit electrical signals? LESSON 2.2 WORKBOOK How do our axons transmit electrical signals? This lesson introduces you to the action potential, which is the process by which axons signal electrically. In this lesson you will learn

More information

Causality and communities in neural networks

Causality and communities in neural networks Causality and communities in neural networks Leonardo Angelini, Daniele Marinazzo, Mario Pellicoro, Sebastiano Stramaglia TIRES-Center for Signal Detection and Processing - Università di Bari, Bari, Italy

More information

Copyright by. Changan Liu. May, 2017

Copyright by. Changan Liu. May, 2017 Copyright by Changan Liu May, 2017 THE IMPACT OF STDP AND CORRELATED ACTIVITY ON NETWORK STRUCTURE A Dissertation Presented to the Faculty of the Department of Mathematics University of Houston In Partial

More information

Comparing integrate-and-fire models estimated using intracellular and extracellular data 1

Comparing integrate-and-fire models estimated using intracellular and extracellular data 1 Comparing integrate-and-fire models estimated using intracellular and extracellular data 1 Liam Paninski a,b,2 Jonathan Pillow b Eero Simoncelli b a Gatsby Computational Neuroscience Unit, University College

More information

How to do backpropagation in a brain

How to do backpropagation in a brain How to do backpropagation in a brain Geoffrey Hinton Canadian Institute for Advanced Research & University of Toronto & Google Inc. Prelude I will start with three slides explaining a popular type of deep

More information

Neural Conduction. biologyaspoetry.com

Neural Conduction. biologyaspoetry.com Neural Conduction biologyaspoetry.com Resting Membrane Potential -70mV A cell s membrane potential is the difference in the electrical potential ( charge) between the inside and outside of the cell. The

More information

Control and Integration. Nervous System Organization: Bilateral Symmetric Animals. Nervous System Organization: Radial Symmetric Animals

Control and Integration. Nervous System Organization: Bilateral Symmetric Animals. Nervous System Organization: Radial Symmetric Animals Control and Integration Neurophysiology Chapters 10-12 Nervous system composed of nervous tissue cells designed to conduct electrical impulses rapid communication to specific cells or groups of cells Endocrine

More information

Nervous Tissue. Neurons Neural communication Nervous Systems

Nervous Tissue. Neurons Neural communication Nervous Systems Nervous Tissue Neurons Neural communication Nervous Systems What is the function of nervous tissue? Maintain homeostasis & respond to stimuli Sense & transmit information rapidly, to specific cells and

More information

Plasticity Kernels and Temporal Statistics

Plasticity Kernels and Temporal Statistics Plasticity Kernels and Temporal Statistics Peter Dayan1 Michael Hausser2 Michael London1 2 1 GCNU, 2WIBR, Dept of Physiology UCL, Gower Street, London dayan@gats5y.ucl.ac.uk {m.hausser,m.london}@ucl.ac.uk

More information

SPIKE TRIGGERED APPROACHES. Odelia Schwartz Computational Neuroscience Course 2017

SPIKE TRIGGERED APPROACHES. Odelia Schwartz Computational Neuroscience Course 2017 SPIKE TRIGGERED APPROACHES Odelia Schwartz Computational Neuroscience Course 2017 LINEAR NONLINEAR MODELS Linear Nonlinear o Often constrain to some form of Linear, Nonlinear computations, e.g. visual

More information

Synaptic Input. Linear Model of Synaptic Transmission. Professor David Heeger. September 5, 2000

Synaptic Input. Linear Model of Synaptic Transmission. Professor David Heeger. September 5, 2000 Synaptic Input Professor David Heeger September 5, 2000 The purpose of this handout is to go a bit beyond the discussion in Ch. 6 of The Book of Genesis on synaptic input, and give some examples of how

More information

A Learning Theory for Reward-Modulated Spike-Timing-Dependent Plasticity with Application to Biofeedback

A Learning Theory for Reward-Modulated Spike-Timing-Dependent Plasticity with Application to Biofeedback A Learning Theory for Reward-Modulated Spike-Timing-Dependent Plasticity with Application to Biofeedback Robert Legenstein, Dejan Pecevski, Wolfgang Maass Institute for Theoretical Computer Science Graz

More information

STDP Learning of Image Patches with Convolutional Spiking Neural Networks

STDP Learning of Image Patches with Convolutional Spiking Neural Networks STDP Learning of Image Patches with Convolutional Spiking Neural Networks Daniel J. Saunders, Hava T. Siegelmann, Robert Kozma College of Information and Computer Sciences University of Massachusetts Amherst

More information

Nervous Systems: Neuron Structure and Function

Nervous Systems: Neuron Structure and Function Nervous Systems: Neuron Structure and Function Integration An animal needs to function like a coherent organism, not like a loose collection of cells. Integration = refers to processes such as summation

More information

The Bayesian Brain. Robert Jacobs Department of Brain & Cognitive Sciences University of Rochester. May 11, 2017

The Bayesian Brain. Robert Jacobs Department of Brain & Cognitive Sciences University of Rochester. May 11, 2017 The Bayesian Brain Robert Jacobs Department of Brain & Cognitive Sciences University of Rochester May 11, 2017 Bayesian Brain How do neurons represent the states of the world? How do neurons represent

More information

MEMBRANE POTENTIALS AND ACTION POTENTIALS:

MEMBRANE POTENTIALS AND ACTION POTENTIALS: University of Jordan Faculty of Medicine Department of Physiology & Biochemistry Medical students, 2017/2018 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Review: Membrane physiology

More information

Introduction. Previous work has shown that AER can also be used to construct largescale networks with arbitrary, configurable synaptic connectivity.

Introduction. Previous work has shown that AER can also be used to construct largescale networks with arbitrary, configurable synaptic connectivity. Introduction The goal of neuromorphic engineering is to design and implement microelectronic systems that emulate the structure and function of the brain. Address-event representation (AER) is a communication

More information

Sampling-based probabilistic inference through neural and synaptic dynamics

Sampling-based probabilistic inference through neural and synaptic dynamics Sampling-based probabilistic inference through neural and synaptic dynamics Wolfgang Maass for Robert Legenstein Institute for Theoretical Computer Science Graz University of Technology, Austria Institute

More information

Mid Year Project Report: Statistical models of visual neurons

Mid Year Project Report: Statistical models of visual neurons Mid Year Project Report: Statistical models of visual neurons Anna Sotnikova asotniko@math.umd.edu Project Advisor: Prof. Daniel A. Butts dab@umd.edu Department of Biology Abstract Studying visual neurons

More information

Maximising Sensitivity in a Spiking Network

Maximising Sensitivity in a Spiking Network Maximising Sensitivity in a Spiking Network Anthony J. Bell, Redwood Neuroscience Institute 00 El Camino Real, Suite 380 Menlo Park, CA 94025 tbell@rni.org Lucas C. Parra Biomedical Engineering Department

More information

Neocortical Pyramidal Cells Can Control Signals to Post-Synaptic Cells Without Firing:

Neocortical Pyramidal Cells Can Control Signals to Post-Synaptic Cells Without Firing: Neocortical Pyramidal Cells Can Control Signals to Post-Synaptic Cells Without Firing: a model of the axonal plexus Erin Munro Department of Mathematics Boston University 4/14/2011 Gap junctions on pyramidal

More information

Modeling Synaptic Plasticity in Conjunction with the Timing of Pre- and Postsynaptic Action Potentials

Modeling Synaptic Plasticity in Conjunction with the Timing of Pre- and Postsynaptic Action Potentials LETTER Communicated by Misha Tsodyks Modeling Synaptic Plasticity in Conjunction with the Timing of Pre- and Postsynaptic Action Potentials Werner M. Kistler J. Leo van Hemmen Physik Department der TU

More information

Patterns of Synchrony in Neural Networks with Spike Adaptation

Patterns of Synchrony in Neural Networks with Spike Adaptation Patterns of Synchrony in Neural Networks with Spike Adaptation C. van Vreeswijky and D. Hanselz y yracah Institute of Physics and Center for Neural Computation, Hebrew University, Jerusalem, 9194 Israel

More information

Model of a Biological Neuron as a Temporal Neural Network

Model of a Biological Neuron as a Temporal Neural Network Model of a Biological Neuron as a Temporal Neural Network Sean D. Murphy and Edward W. Kairiss Interdepartmental Neuroscience Program, Department of Psychology, and The Center for Theoretical and Applied

More information

Biosciences in the 21st century

Biosciences in the 21st century Biosciences in the 21st century Lecture 1: Neurons, Synapses, and Signaling Dr. Michael Burger Outline: 1. Why neuroscience? 2. The neuron 3. Action potentials 4. Synapses 5. Organization of the nervous

More information