Frequency separation by an excitatory-inhibitory network: separating frequencies


Journal of Computational Neuroscience manuscript No. (will be inserted by the editor)

Frequency separation by an excitatory-inhibitory network: separating frequencies

Alla Borisyuk, Janet Best, David Terman

Received: date / Accepted: date

Abstract We consider a situation in which individual features of the input are represented in the neural system by different frequencies of periodic firing. Thus, if two of the features are presented concurrently, the input to the system consists of a superposition of two periodic trains. In this paper we present an algorithm that is capable of extracting the individual features from the composite signal by separating the signal into periodic spike trains with different frequencies. We show that the algorithm can be implemented in a biophysically based excitatory-inhibitory network model. The frequency separation process works over a range of frequencies determined by the time constants of the model's intrinsic variables. It does not rely on a resonance phenomenon and is not tuned to a discrete set of frequencies. The frequency separation remains reliable when the timing of incoming spikes is noisy.

Keywords excitatory-inhibitory networks, decorrelation, oscillations

A. Borisyuk, Department of Mathematics, University of Utah, 155 S 1400 E, Salt Lake City, UT. E-mail: borisyuk@math.utah.edu

J. Best, Department of Mathematics and Mathematical Biosciences Institute, Ohio State University, 231 W 18th Ave., Columbus, OH. E-mail: jbest@math.ohio-state.edu

D. Terman, Department of Mathematics, Ohio State University, 231 W 18th Ave., Columbus, OH. E-mail: terman@math.ohio-state.edu

1 Introduction

Oscillatory or near-oscillatory activity is a ubiquitous feature of neuronal networks. Examples range from auditory nerve responses to pure tones [Rose et al., 1967], to the theta-rhythm in hippocampus [Green and Arduini, 1954], to odor-evoked oscillations in the mammalian olfactory bulb [Adrian, 1950]. The exact role of these oscillations in neural coding remains elusive. However, in many systems the rate of near-periodic firing is well correlated with features of the stimulus, such as orientation tuning in primary visual cortex [Arieli et al., 1995], coding of head direction [Taube et al., 1990], and air-current direction coding in the cricket cercal system [Landolfa and Miller, 1995]. It has been suggested in numerous earlier studies that the natural frequency of oscillators may be used to represent stimulus features, and that adaptation of oscillator frequencies can serve as a mechanism of learning and memory (e.g. [Torras, 1986, Niebur et al., 2002, Kuramoto, 1991, Kazanovich and Borisyuk, 2006]). However, because sensory input typically encodes multiple features, there must be some mechanism by which the brain extracts individual features from composite signals.

It has also been suggested that the recurrent activity in excitatory-inhibitory networks may serve to decorrelate (reduce the correlations in) the spiking activity [Ecker et al., 2010, Renart et al., 2010, Tetzlaff et al., 2010]. This type of network has been studied in a variety of different contexts, including models of sleep rhythms, Parkinsonian rhythms, olfaction and working memory. In some sense, the present paper is motivated by the work presented in [Bar-Gad et al., 2000, Bar-Gad and Bergman, 2001], where it is suggested that the neuronal activity within the basal ganglia serves to reduce the dimensionality of, and decorrelate, the information coming from cortical areas. Here, we address the problem of frequency separation as a well-formulated first step in studying the disentangling of input information.

The basic setup for the problem we consider is shown in Figure 1A. Two stimuli, represented by different firing frequencies at lower levels of processing, are presented simultaneously to the neuronal network (top panel in Figure 1A). Thus the incoming signal is a superposition of two periodic pulse trains. The main task of the system is to produce two outputs, one at each of the original frequencies making up the input (lowest panel in Figure 1A).

First, we present an algorithm for frequency separation. We formulate a set of rules and prove that most frequencies can be successfully separated. Next, we build a biophysical model (based on [Terman et al., 2002], with some modifications) that implements the algorithm, and we demonstrate its functionality in numerical simulations. Even though the implementation of the algorithm is not exact, we show in numerical simulations that the frequency separation works over a range of frequencies. The mechanism does not rely on precise tuning to a predetermined set of frequencies. We also show that errors in the frequency separation can be avoided by changing the relative phase of the inputs or the initial conditions of the model elements, and that the frequency separation is still successful when the times of the incoming pulses are perturbed by random amounts.

2 Frequency separation algorithm

2.1 Setup

Fig. 1 Frequency separation algorithm. A: Schematic. The input is a superposition of two periodic pulse trains (black and grey); the outputs are the individual periodic trains. The algorithm is represented by the quantities $x_i$, $y_i$, $i = 1, 2$. B: Example of frequency separation. The input pulse trains (bottom panel) have interpulse intervals of 115 and 200 msec. The dynamics of the $x_i$ variables (top) and $y_i$ variables (middle) vary according to the algorithm rules (see text). The rule used at each incoming pulse is indicated by a letter between the top two panels. The numbers above the bottom panel indicate which unit responded to each incoming pulse. Note that after about 400 msec all black incoming pulses are picked out by cell 1 and all grey ones by cell 2 (the frequencies have been separated).

The frequency separator in its algorithmic form consists of two response units, each represented by a pair of variables $(x_i, y_i)$, $i = 1, 2$ (Figure 1A). Both $x$ and $y$ evolve according to the rules below. The input arrives at both cells at times $(t_j)$ obtained by superposition of two pulse trains with interpulse intervals $T_1$ and $T_2$ (without loss of generality $T_1 < T_2$). At each input pulse, one of the units responds by resetting its $x$ and $y$ values according to the rules described in the next section, and the response is recorded. The variable $x_i$ tracks the time since the most recent response of unit $i$, and the variable $y_i$ represents the anticipated time until the next response (the difference between the previous inter-response time and the time since the most recent response). We use the notation $f(t^-)$ and $f(t^+)$ for the left and right limits of a function at $t$.
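To make the setup concrete, the following short Python sketch builds the merged input train from two periodic trains. This is a minimal illustration, not part of the paper's model; the first-pulse times in the usage line are arbitrary choices for the example.

```python
import numpy as np

def merged_input(T1, T2, t1_first, t2_first, t_max):
    """Superpose two periodic pulse trains (periods T1, T2, in msec) into
    a single sorted array of input times, flagging coincident pulses."""
    train1 = np.arange(t1_first, t_max, T1)
    train2 = np.arange(t2_first, t_max, T2)
    times = np.union1d(train1, train2)                  # sorted, merged
    coincident = np.isin(times, train1) & np.isin(times, train2)
    return times, coincident

# Example: the trains of Figure 1B (interpulse intervals 115 and 200 msec)
times, coincident = merged_input(115, 200, t1_first=40, t2_first=75, t_max=1500)
```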

2.2 Rules

The algorithm can be summarized as follows: cell 1 always responds if it is expecting a pulse (i.e., $y_1 \le 0$). The first time a pulse arrives earlier than cell 1 anticipates (i.e., while $y_1 > 0$), cell 2 responds. Thereafter, cell 1 also responds to an unexpected pulse if cell 1 is less surprised than cell 2 (i.e., $y_2 \ge y_1 > 0$). Formally:

- Dynamics: $x_i = y_i = 0$ at time 0; $dx_i/dt = 1$ and $dy_i/dt = -1$.
- Response at pulse time $t_j$:
  - If $y_1 \le 0$, cell 1 responds (rule A). If cell 2 has never responded, it sets $y_2(t_j^+) = x_2(t_j^-)$, $x_2(t_j^+) = x_1(t_j^-)$ (rule A1).
  - If $y_2 \ge y_1 > 0$, cell 1 responds, unless cell 2 has never responded, in which case cell 2 responds (rule B).
  - If $y_1 > 0$ and $y_1 > y_2$, cell 2 responds (rule C).
  - If two input pulses coincide, both cells respond (rule D).
- Reset: when cell $i$ responds at time $t$, it resets $y_i(t^+) = x_i(t^-)$, $x_i(t^+) = 0$; when cell 1 responds and cell 2 has never responded, rule A1 is used.

An example of the algorithm's application is shown in Figure 1B. The top two panels track the evolution of $x_{1,2}$ and $y_{1,2}$ with time. The bottom panel shows the incoming pulse train, and the number over each pulse indicates which of the two cells responded. The two periodic trains have different shades (black and grey) for ease of viewing, but this information is not available to the model system. One can see that after about 400 msec cell 1 responds at all black pulses and cell 2 at all grey pulses, demonstrating the success of frequency separation. To see how the rules are applied, let us consider the input pulse near the 400 msec mark. At that time $y_2 > y_1 > 0$, so rule B is applied: cell 1 responds, and $x_1$ and $y_1$ are reset. At the previous input pulse $y_1 < 0$, so rule A applies and cell 1 responds as well.
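The rules translate directly into code. The following Python sketch implements rules A, A1, B, C and D with the resets as stated above; it consumes the merged pulse times produced by the earlier snippet and records which cell responded to each pulse (0 marks a coincident pulse, to which both cells respond). It is a sketch of the stated rules, not code from the paper.

```python
import numpy as np

def separate(times, coincident):
    """Apply rules A, A1, B, C, D to a merged pulse train.
    Returns, for each pulse, which cell responded (1, 2, or 0 for both)."""
    x1 = y1 = x2 = y2 = 0.0
    cell2_has_responded = False
    t_prev = 0.0
    who = []
    for t, coinc in zip(times, coincident):
        dt = t - t_prev
        x1 += dt; x2 += dt            # x_i: time since last response
        y1 -= dt; y2 -= dt            # y_i: anticipated time to next response
        x1m, x2m = x1, x2             # left limits x_i(t^-), used by the resets
        if coinc:                     # rule D: both cells respond
            x1, y1 = 0.0, x1m
            x2, y2 = 0.0, x2m
            cell2_has_responded = True
            who.append(0)
        elif y1 <= 0:                 # rule A: cell 1 expected this pulse
            x1, y1 = 0.0, x1m
            if not cell2_has_responded:   # rule A1: special early reset
                x2, y2 = x1m, x2m
            who.append(1)
        elif y2 >= y1 and cell2_has_responded:   # rule B: cell 1 less surprised
            x1, y1 = 0.0, x1m
            who.append(1)
        else:                         # rule C (or the first unexpected pulse)
            x2, y2 = 0.0, x2m
            cell2_has_responded = True
            who.append(2)
        t_prev = t
    return np.array(who)

who = separate(times, coincident)     # using the merged train built earlier
```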

2.3 Validity of the algorithm

We will now show formally that for any pair of input frequencies the algorithm works for suitable initial conditions. First we introduce notation and definitions; then we formulate and prove the main result about the algorithm.

Let $T_i$ ($i = 1, 2$) be the periods of the original periodic input pulse trains. We assume that the $T_i$ are integers (in msec) and, without loss of generality, that $T_1 < T_2$ and $T_2 = mT_1 + R$, where $m$ is an integer and $0 \le R < T_1$. We write $t_i^j$ for the time of the $j$th spike from the train with period $T_i$, while $x_k$ and $y_k$ ($k = 1, 2$) are the algorithm variables, as above.

Definition. We say cell $k$ remembers period $T_i$ at time $t$ if its last response at or before time $t$ happened at the last encountered $T_i$ pulse (say, at time $t_i^j \le t$), and $x_k((t_i^j)^-) = T_i$.

We note that the event "cell $k$ remembers $T_i$" has to occur first at the time of a $T_i$ pulse. If it happens at time $t_i^j$ (with $k$ and $i$ each being 1 or 2), then the cell will continue to remember $T_i$ for any time $t > t_i^j$ until one of the following events occurs: cell $k$ does not respond to a $T_i$ pulse; cell $k$ responds to a $T_i$ pulse, but just before the response $x_k \ne T_i$; or cell $k$ responds to a pulse that does not belong to the $T_i$ train.

Definition. We say cell 1 remembers period $T_i$ and cell 2 remembers period $T_k$ at time $t$ if at time $t$ each cell remembers the corresponding period according to the above definition.

Definition. We say that the algorithm separates frequencies if there exists $N$ such that for all input pulse times $t_n$, $n > N$, cell 1 responds to every $T_i$ pulse and cell 2 responds to every $T_k$ pulse ($i \ne k$).

Definition. We say that the initial configuration of the inputs is an increasing-interval initial condition if the time to the first input pulse and the 3 subsequent interpulse intervals form a strictly increasing sequence.

The increasing-interval initial condition occurs frequently. It occurs, for instance, for $m > 1$ when the first spike from the $T_1$ train arrives early and the first spike from the $T_2$ train follows soon after, namely $t_1^1 < T_1/2$ and $2t_1^1 < t_2^1 < t_1^1 + T_1/2$.

Lemma 1 If cell 1 remembers $T_i$ and cell 2 remembers $T_k$ ($i \ne k$) at some time $t$, then the frequencies are separated.

Proof If the event "cell 1 remembers $T_i$ and cell 2 remembers $T_k$" is true at some time $t$ between incoming pulses, it persists until the next incoming pulse. Thus, we need to show that if "cell 1 remembers $T_i$ and cell 2 remembers $T_k$" holds at one of the incoming pulses, then it also holds at the subsequent pulse. By induction it then holds indefinitely, meaning that cell 1 responds to every input from the $T_i$ sequence and cell 2 responds to every input from the $T_k$ sequence.

1. Suppose "cell 1 remembers $T_i$ and cell 2 remembers $T_k$" holds at some $t = t_k^j$ (which may also be coincident with one of the $T_i$ pulses). At $t^-$, $(x_1, y_1, x_2, y_2) = (a, T_i - a, T_k, y_2^*)$, where $a$ is the time from the last $T_i$ input and $y_2^*$ is an arbitrary value. Since cell 2 responds, we have at $t^+$: $(x_1, y_1, x_2, y_2) = (a, T_i - a, 0, T_k)$. (If $t_k^j$ is coincident with one of the $T_i$ pulses, then both cells respond by rule D and the values after the reset are $(0, T_i, 0, T_k)$.) If the next incoming input at time $t$ is from the $T_i$ sequence, we will have $x_1(t^-) = T_i$, $y_1(t^-) = 0$. Thus cell 1 responds by rule A, and we still have "cell 1 remembers $T_i$ and cell 2 remembers $T_k$". Conversely, if the next incoming input at time $t$ is from the $T_k$ sequence (which also implies $T_i > T_k$), then at $t^-$: $(x_1, y_1, x_2, y_2) = (b, T_i - b, T_k, 0)$, where $b$ is the time since the last $T_i$ spike. We have $y_1(t^-) > 0 = y_2(t^-)$, so cell 2 responds by rule C and "cell 1 remembers $T_i$ and cell 2 remembers $T_k$" persists.

Finally, if the next incoming input at time $t$ is coincident, then the values before the reset are $(T_i, 0, T_k, 0)$, both cells respond by rule D, and "cell 1 remembers $T_i$ and cell 2 remembers $T_k$" remains.

2. If "cell 1 remembers $T_i$ and cell 2 remembers $T_k$" holds at some $t_i^j$, we can similarly show that for any subsequent pulse ($T_i$, $T_k$ or coincident) it remains true.

As a result, "cell 1 remembers $T_i$ and cell 2 remembers $T_k$" holds indefinitely, meaning that cell 1 responds to every input from the $T_i$ sequence and cell 2 responds to every input from the $T_k$ sequence.

Theorem 1 Suppose $T_1$, $T_2$ are positive integers with $T_1 < T_2$. Under the setup described above, for any pair $T_1$, $T_2$ there exist initial conditions such that the frequencies will be separated. In particular, this happens in the following situations:

1. for initial conditions in which the first two input pulses belong to the $T_1$ train;
2. for increasing-interval initial conditions with $m > 1$ (where $m$ is such that $T_2 = mT_1 + R$, see above).

Proof Case 1. Suppose that the first two input pulses belong to the $T_1$ train and there are no coincident inputs up to $t_2^2$. Say the first pulse arrives at time $t = t_1^1 = a < T_1$; then at $t = a^-$: $(x_1, y_1, x_2, y_2) = (a, -a, a, -a)$. Here $y_1 < 0$, so cell 1 responds by rule A, and at $t = a^+$ the values of $(x_1, y_1, x_2, y_2)$ become $(0, a, a, a)$ by rule A1. The next pulse is again $T_1$, at $t = t_1^2$. The values of $(x_1, y_1, x_2, y_2)$ at $t^-$ are $(T_1, a - T_1, a + T_1, a - T_1)$. By rule A cell 1 responds, and cell 1 remembers $T_1$. The values are reset to $(0, T_1, T_1, a + T_1)$. Now, as long as $T_1$ pulses continue to arrive, cell 1 will respond by rule A, the reset values each time will be $(0, T_1, T_1, 2T_1)$, and cell 1 will continue to remember $T_1$.

At some point a $T_2$ pulse arrives. Say it arrives a time $b$ after the previous $T_1$ pulse. Just before the first $T_2$ input the values are $(b, T_1 - b, T_1 + b, 2T_1 - b)$, or $(b, T_1 - b, T_1 + b, a + T_1 - b)$ if the $T_2$ pulse arrives right away, as the third pulse of the joint train. In either case we have $y_1 > 0$ and cell 2 has never responded, so cell 2 responds by rule B and the reset values are $(b, T_1 - b, 0, T_1 + b)$. Next, as long as $T_1$ inputs are arriving, we will have $y_1 = 0$ at each pulse, cell 1 responding, and cell 1 still remembering $T_1$. Finally, when $t_2^2$ arrives (some time $c$ after the previous $T_1$ pulse), we will have $(x_1, y_1, x_2, y_2) = (c, T_1 - c, T_2, T_1 + b - T_2)$. Here $y_1 > 0$ and $y_2 = T_1 + b - T_2$. Also, we must have $T_1 + b < T_2$ (even for $m = 1$), otherwise we would have had a $T_2$ pulse earlier. Thus cell 2 responds again, by rule C, and cell 2 remembers $T_2$. The frequencies are separated by Lemma 1.

Now consider the case when the first $T_2$ pulse is coincident with one of the $T_1$ pulses. Note that because of the theorem's assumption it can be coincident with $t_1^3$ or later, which also implies $m \ge 2$ and $T_2 > 2T_1$. Just before the first $T_2$ pulse, similarly to the above, the values of $(x_1, y_1, x_2, y_2)$ are $(T_1, 0, 2T_1, T_1)$ or $(T_1, 0, 2T_1, a)$ (the latter happens if $t_2^1 = t_1^3$). In both cases both cell 1 and cell 2 respond (by rule D), and after the reset the values become $(0, T_1, 0, 2T_1)$,

and cell 1 remembers $T_1$. Then, as long as $T_1$ pulses continue arriving, cell 1 responds by rule A and remembers $T_1$. When the second $T_2$ pulse arrives, it could again be coincident with one of the $T_1$ pulses, in which case the values just before $t_2^2$ will be $(T_1, 0, T_2, 2T_1 - T_2)$, both cells respond by rule D, and cell 2 remembers $T_2$ while cell 1 remembers $T_1$. If the second $T_2$ pulse arrives a time $c$ after the preceding $T_1$ pulse ($0 < c < T_1$), then the values just before it are $(c, T_1 - c, T_2, 2T_1 - T_2)$, cell 2 responds by rule C, and cell 2 remembers $T_2$. The frequencies are separated by Lemma 1.

Similarly, if the first $T_2$ pulse is not coincident with a $T_1$ pulse (occurring a time $b$ after the preceding $T_1$ pulse) but the second one is, then just before $t_2^2$ the values of $(x_1, y_1, x_2, y_2)$ are $(T_1, 0, T_2, T_1 + b - T_2)$, both cells respond by rule D, cell 1 remembers $T_1$ and cell 2 remembers $T_2$, and Lemma 1 applies.

Case 2. Now consider an increasing-interval initial condition with $m > 1$. First, we show that this condition implies that the second incoming pulse is from the $T_2$ train and that the interval between the third and fourth pulses is equal to $T_1$. Say the first input pulse occurs at time $a$, and the subsequent interpulse intervals are $b$, $c$ and $d$. The increasing-interval condition means that $a < b < c < d$. We also know that $d \le T_1$, as $T_1$ is the largest possible interpulse interval. This implies that there are no two consecutive $T_1$ pulses among the first three, so the second pulse is $T_2$. Moreover, since $m > 1$, the next $T_2$ spike can only occur as the fifth pulse or later. This means that the third and fourth pulses are both $T_1$ and $d = T_1$. Next, by following the rules of the algorithm, we obtain that cell 1 responds to each of the first 4 pulses. After the fourth one, the values are reset to $(0, d, d, c + d) = (0, T_1, T_1, c + T_1)$ and cell 1 remembers $T_1$. This is exactly the situation we found after the second input pulse in part 1 of Case 1 of this proof, and the rest follows.

Remark. We believe that the frequencies will also be separated for most initial configurations as long as $T_1$ does not divide $T_2$ (i.e., $R \ne 0$). To show this, all initial configurations must be considered. In some cases an incorrect pattern is initially remembered, and it takes a while for the algorithm to find the correct solution. Moreover, for smaller $m$, cell 1 can remember either $T_1$ or $T_2$, depending on the initial conditions, while for $m$ large enough ($m > 3$) cell 1 remembers $T_1$ for any initial conditions. The proofs of these statements are very technical, and we do not present them in this paper. Instead, in Figure 2 we show the results of numerical iteration of the algorithm. We ran the algorithm $10^6$ times. In each run we chose $T_1$ and $T_2$ randomly (uniformly) between 1 and 100 (in the numerical simulations we did not follow the $T_1 < T_2$ convention), and also chose the time of the first $T_i$ pulse uniformly between 1 and $T_i - 1$, $i = 1, 2$. Each run proceeded until separation of frequencies occurred or time reached 15,000, whichever came first. Overall, there were 0.1% of cases in which the frequencies were not separated (time ran to the maximum). This figure includes those (few) cases where separation of frequencies would simply take even longer, and those where a non-separated periodic pattern is reached (separation will never occur).
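This numerical experiment can be sketched as follows, building on the `merged_input` and `separate` functions defined above. The success test used here (checking that, over the second half of the run, each train's pulses are answered by a single cell and the two cells differ) is an assumption of this sketch, standing in for the paper's exact stopping criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

def trial(rng, t_max=15_000):
    """One random run mirroring the Figure 2 experiment: periods uniform
    in 1..100, first pulse of each train uniform in 1..T_i - 1."""
    T1, T2 = (int(p) for p in rng.integers(1, 101, size=2))
    t1 = int(rng.integers(1, T1)) if T1 > 1 else 1
    t2 = int(rng.integers(1, T2)) if T2 > 1 else 1
    times, coincident = merged_input(T1, T2, t1, t2, t_max)
    who = separate(times, coincident)
    in_train1 = np.isin(times, np.arange(t1, t_max, T1))
    tail = slice(len(times) // 2, None)              # inspect the second half
    resp1 = set(who[tail][in_train1[tail]]) - {0}    # 0 = coincident, both fire
    resp2 = set(who[tail][~in_train1[tail]]) - {0}
    # Success: each train owned by at most one cell, and not the same cell
    return len(resp1) <= 1 and len(resp2) <= 1 and resp1.isdisjoint(resp2)

n_ok = sum(trial(rng) for _ in range(10_000))        # the paper uses 10^6 runs
```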

Figure 2A shows the values of $T_1$ and $T_2$ in all of these unsuccessful cases. Most of them lie on the lines $T_2 = T_1$, $T_2 = 3T_1$, or $T_2 = T_1/3$ (solid lines), but there are a few other points as well. Note that each of the points in this figure corresponds to multiple failed cases, i.e. multiple sets of initial conditions. Next, we look at the distribution of times at which frequency separation occurs (Figure 2B). Most cases (99.6%) were successfully separated by $t = 600$, but there is a long tail extending (and slowly decaying) to the right. The median separation time is 131. If we eliminate the special resetting at the beginning of the separation (i.e., eliminate rule A1), the percentage of failed separations grows to 1.2%, and the corresponding values of $T_1$ and $T_2$ now cover a broad range (Figure 2C).

Fig. 2 Numerical iteration of the algorithm. A: Black dots show randomly chosen $(T_1, T_2)$ pairs that were not successfully separated for at least one set of initial conditions in a 15,000-long run (see text). Solid lines show $T_2 = T_1$, $T_2 = 3T_1$, and $T_2 = T_1/3$. B: Distribution of times taken to reach frequency separation in $10^6$ iterations of the algorithm with randomly chosen periods and initial conditions (see text). The horizontal axis is artificially cut at 2,000 for ease of viewing. C: Same results as in panel A, with rule A1 removed from the algorithm.

3 Biophysical implementation

3.1 Frequency separator unit

As explained in the Introduction, the original motivation for this work came from experimental findings in the basal ganglia [Bar-Gad et al., 2000, Bar-Gad and Bergman, 2001]. Thus, we chose to implement the algorithm in a modification of the model of the basal ganglia circuit from [Terman et al., 2002], consisting of the external segment of the globus pallidus (GPe; inhibitory cells) and the subthalamic nucleus (STN; excitatory cells). The models for the individual cells are quite minimal (reduced Hodgkin-Huxley type models [Hodgkin and Huxley, 1952], with 2 variables for the inhibitory cell and 3 for the excitatory cell) but include ionic currents that are known to be present in real cells and have been shown to be important for their function, such as the low-threshold calcium current in the E cell.

Fig. 3 Biophysical model. A: Schematic of the wiring of the model excitatory (E) and inhibitory (I) cells, and the inputs. Arrows represent excitatory connections, filled circles represent inhibitory connections, and open circles represent y-dependent inhibitory connections (see text). The basic unit of the network is a set of 4 cells (shaded box); two such units are shown. B: Example of the response of the biophysical model. The top panel shows the inputs (a black periodic pulse train and a grey periodic pulse train are combined); the middle panels show the voltages of the 4 cells from the shaded box in panel A. Inhibitory cells respond to inputs by ceasing their firing, and excitatory cells respond by spiking. The lower panel summarizes the responses of the output (excitatory) cells by showing their interspike intervals. Dotted lines show the frequencies of the input pulse trains.

It is possible that a more generic model could also successfully implement the frequency separation algorithm, but exploring this is beyond the scope of this paper.

The main unit of the model is a network of two excitatory (E) and two inhibitory (I) cells (shaded box in Figure 3A). We will call it the frequency-separator unit. Each cell is described by a biophysically based model of Hodgkin-Huxley type [Hodgkin and Huxley, 1952], in which the membrane potential (voltage) $V = V(t)$ satisfies the current-balance equation

$$C_m \frac{dV}{dt} = -I_{ion}(V) - I_{syn} + I_{input}.$$

Here $C_m$ is the membrane capacitance, $I_{ion}$ represents the cell's intrinsic ionic currents, the synaptic current $I_{syn}$ is the current due to the activity of other cells in the network, and $I_{input}$ is the incoming mixed-frequency signal. The details of the model, together with parameter values and units of the various variables, are given below.

The frequency-separator unit is the minimal network of excitatory-inhibitory connections that allows the two inhibitory cells to compete and allows the E cells to serve as read-outs of the network output. Adding another such unit, as in Figure 3A and in the simulations, provides additional lateral inhibition, making the competition between the I cells more efficient. Moreover, as explained in the Discussion, if the two blocks start at different initial conditions, or receive inputs at different initial phase shifts (due to delay lines), then the second unit may be able to separate the frequencies correctly even if the first one fails.

The model has several features that should be pointed out. First, the input to the system is inhibitory, so the cells in the first (inhibitory) layer respond to the input by temporarily stopping their firing (for a period of time longer than the typical interspike interval). The cells of the second layer (excitatory cells) are usually suppressed, and when they respond, it is by emitting a spike (or a short burst). Therefore, when we refer to a cell "responding" it can mean ceasing to fire, in the case of inhibitory cells, or firing, in the case of excitatory cells. This arrangement is not a requirement of the model; the frequency separation would work just as well if the roles of excitation and inhibition were reversed.

Second, a special feature of the inhibitory cells is the presence of $x$ and $y$ variables, analogous to $x_i$ and $y_i$ in the algorithm above. The quantities $x$ and $y$ can be thought of as the fractions of some substances X and Y in the active state, affecting both the incoming and the outgoing synapses of the cells. We discuss the roles and possible implementations and interpretations of $x$ and $y$ in Section 3.4 below. Apart from these special features, the implementation of the algorithm does not depend on the details of the particular model.

Equations for an I cell. The intrinsic current $I_{ion}$ consists of the leak, sodium, and potassium currents, and the bias current $I$:

$$I_{ion} = g_L(V - V_L) + g_{Na}(m_\infty(V))^3 h (V - V_{Na}) + g_K(1 - h)^4 (V - V_K) - I,$$
$$\frac{dh}{dt} = (h_\infty(V) - h)/\tau_h(V).$$

The functions and parameters are given in the Appendix. The synaptic current $I_{syn}$ consists of the contribution from the E cells (subscript IE; this current depends on the synaptic conductance $s_E$ of the appropriate E cell, as shown in the diagram in Figure 3A; for the definition of $s_E$ see the E-cell equations below) and the input from the neighboring I cells (subscript II; the summation is over the two neighboring I cells with periodic boundary conditions; the connection between cells 1 and 4 is not shown in the figure):

$$I_{syn} = I_{IE} + I_{II}.$$

The currents $I_{II}$ have the same parameters whether or not the two I cells belong to the same block. The synaptic currents are described in detail in Sections 3.2-3.4. The inhibitory cell also produces a gating variable $s_I$, to be used as an input in the E-cell equations below:

$$\frac{ds_I}{dt} = \alpha_I(1 - s_I)s_{\infty,I}(V) - \beta_I s_I, \qquad s_{\infty,I}(V) = 1/(1 + \exp(-(V + 45))).$$

Equations for an E cell. The intrinsic current $I_{ion}$ consists of the leak, sodium, and potassium currents, and the low-threshold T-type current (an inward current de-inactivated by hyperpolarization):

$$I_{ion} = g_L(V - V_L) + g_{Na}(m_\infty(V))^3 h(V - V_{Na}) + g_K(1 - h)^4(V - V_K) + g_T(m_{\infty,T}(V))^2 h_T V,$$

$$\frac{dh}{dt} = (h_\infty(V) - h)/\tau_h(V), \qquad \frac{dh_T}{dt} = (h_{\infty,T}(V) - h_T)/\tau_{h,T}(V),$$

where the functions and the parameter values are given in the Appendix. The external input to the E cell is zero ($I_{input} = 0$). The E cell also produces a gating variable $s_E$, to be used as an input in the I-cell equations above:

$$\frac{ds_E}{dt} = \alpha_E(1 - s_E)s_{\infty,E}(V) - \beta_E s_E, \qquad s_{\infty,E}(V) = 1/(1 + \exp(-(V + 35))).$$

3.2 Excitatory connections

The excitatory current received by an I cell with voltage $V$ is given by

$$I_{IE} = g_{IE} s_E (V - V_{IE}).$$

3.3 y-independent inhibitory connections

The synaptic current received by an E cell with voltage $V$ comes from the neighboring I cell, according to the wiring diagram in Figure 3A, and depends on the I-cell activity through the $s_I$ variable:

$$I_{EI} = g_{EI} s_I (V - V_{EI}).$$

3.4 y-dependent inhibitory connections

For each inhibitory cell, both the external input it feels ($I_{input}$) and the current it sends to its neighbors ($I_{II}$) are influenced by the cell's $y$ variable.

The dynamics of y. The dynamics of $y$ is governed by

$$\frac{dy}{dt} = -\beta_y.$$

In addition, at the start of a response (at the time $t_r$ when the I cell stops firing), $y$ is reset to a value determined by an auxiliary variable $x$:

$$y(t_r^+) = x(t_r^-), \qquad \frac{dx}{dt} = \alpha_x, \qquad x(t_r^+) = 0.$$

Both $x$ and $y$ stay constant for the duration of the response.
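For concreteness, here is a minimal sketch that integrates an isolated I cell using the gating functions from the Appendix. The membrane capacitance $C_m$, the bias current $I$, the input level, and the initial state are assumed values, since they are not fixed by Table 1; the sketch only illustrates the structure of the equations.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Gating functions for the I cell (from the Appendix)
m_inf = lambda V: 1.0/(1.0 + np.exp(-(V + 37.0)/7.0))
h_inf = lambda V: 1.0/(1.0 + np.exp((V + 41.0)/4.0))
alpha_h = lambda V: np.exp(-(46.0 + V)/18.0)
beta_h = lambda V: 4.0/(1.0 + np.exp(-(23.0 + V)/5.0))
tau_h = lambda V: 0.69/(alpha_h(V) + beta_h(V))

gL, gNa, gK = 0.05, 3.0, 5.0            # Table 1
VL, VNa, VK = -70.0, 50.0, -90.0        # Table 1
Cm, Ibias = 1.0, 0.5                    # assumed; not listed in Table 1

def rhs(t, u, I_input=0.0):
    """C_m dV/dt = -I_ion(V) - I_syn + I_input with I_syn = 0:
    an isolated I cell, tonically firing for weak input."""
    V, h = u
    I_ion = (gL*(V - VL) + gNa*m_inf(V)**3*h*(V - VNa)
             + gK*(1.0 - h)**4*(V - VK) - Ibias)
    return [(-I_ion + I_input)/Cm, (h_inf(V) - h)/tau_h(V)]

sol = solve_ivp(rhs, (0.0, 500.0), [-65.0, 0.6], max_step=0.1)
# sol.t and sol.y[0] give the voltage trace; a sufficiently strong
# hyperpolarizing input moves the cell to the quiescent branch of Figure 5.
```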

Fig. 4 Schemes for the activation and inactivation of substances X (left) and Y (right). They can switch between an active state ($x$/$y$) and an inactive one ($x_i$/$y_i$) with voltage-dependent rates $\alpha_x(V)$, $\beta_x(V)$, $\alpha_y(V)$, $\beta_y(V)$, as shown in the figure. Activation of Y is also affected by the amount of activated X and is drawn from a large pool (grey).

In general, $x$ and $y$ can be thought of as the fractions of substances X and Y in the active state. During the firing of the I cell, $x$ accumulates and $y$ is removed (Figure 4). Once the firing stops and $V$ shifts to a relatively more hyperpolarized level, $y$ accumulates quickly to the extent that $x$ is available, and $x$ is quickly removed after a short delay. As in the algorithmic toy model, $x$ keeps track of the time since the last event, and $y$ compares the time elapsed since the last event with the previous inter-response interval. The cell is ready to fire if the current time since the last event is long enough ($y$ is low enough). Similar dynamics have been used in models of synaptic depression in [Bose et al., 2001] and [Matveev et al., 2007]. There the depression variable $d$ was increased at every spike by a multiplicative factor and decayed exponentially between spikes. At the same time, the synaptic variable $s$ was increased by the spike to the value of $d$ (the amount of available synaptic resources) and decayed between spikes on its own time scale.

The amount of substance Y in an I cell affects the amount of inhibition that this cell sends to other I cells (a presynaptic effect on I-I inhibition) and also how responsive the cell is to the external input (a postsynaptic effect on the external input); both are shown in Figure 3A. Right after a response, the high concentration of Y makes the external input less efficient. At the same time, the higher value of $y$ facilitates the I-I synapses, preparing the cell's neighbors to respond more easily to the expected external (inhibitory) input. Note that the higher value of $y$ has the same effect on both neighboring I cells. As a result, neighboring cells will tend to pick out different frequencies from the original train, and every other cell will tend to pick out the same frequency. As the cell keeps on firing, its $y$ value decreases, attenuating the efficacy of the I-I inhibition and increasing the efficacy of the external input.

Both effects of $y$ in the network (decreasing sensitivity to incoming input and potentiating lateral inhibitory connections) contribute to the biophysical implementation of the main idea behind the separation algorithm: the cell with the lower $y$ responds. The third component that has a similar effect is the net-inhibitory I-E-I connection.

It is worth noting that the finer points of the algorithm, such as the advantage of cell 1 (it always responds if $y_1 \le 0$, by rule A), the setting of the special initial configuration (rules A1 and B), and the linear dependence of $x$ and $y$ on time, are all lost. This contributes to the considerably less successful frequency separation by the biophysical model compared to the algorithm. The time constants for $x$ and $y$ restrict the range of frequencies over which the frequency separation is successful.

Synaptic currents. The I-I synaptic current received by a cell is given by

$$I_{II} = g_{II} \sum_j H(y_j - \theta_1),$$

where $H(y) = 1/(1 + \exp(-y/0.02))$ is the smoothed Heaviside function, $\theta_1$ is the threshold value of $y$ for this connection, and the summation is over the two neighboring I cells with periodic boundary conditions. The external input current is

$$I_{input} = g_{input}\, s_{input} \max(1 - y/\theta_2,\, 0)\,(V - V_{input}),$$

and its gating variable $s_{input}$ follows a modified, more realistic form of the original mixed-frequency pulse train $F(t)$:

$$\frac{ds_{input}}{dt} = \alpha_{inp}(1 - s_{input})H(F(t) - 0.5) - \beta_{inp}(1 - H(F(t) - 0.5))s_{input},$$
$$F(t) = H(10\sin(2\pi t/T_1))\big(1 - H(10\sin(2\pi(t + d_p)/T_1))\big) + H(10\sin(2\pi(t + \varphi)/T_2))\big(1 - H(10\sin(2\pi(t + d_p + \varphi)/T_2))\big),$$

where $\varphi$ is the phase shift between the pulse trains and $d_p$ is the duration of a pulse. Notice that the current is zero if the postsynaptic cell is not ready to accept it ($y > \theta_2$). For the values of all parameters see Table 1.

If we consider a single I cell, it can be switched from firing continuously to quiescence and back by varying the parameter $k = s_{input} \max(1 - y/\theta_2, 0)$. In particular, for an isolated cell $k = 0$ ($s_{input} = 0$) and the cell is in the oscillatory state (firing continuously). For some larger value $k = k^*$, a steady-state voltage solution stabilizes (at a hyperpolarized $V$ value) and the firing stops (Figure 5A). In terms of $s_{input}$ and $y$ this means that for $y$ large enough the oscillatory solution persists for any value of the input $s_{input}$, while for small $y$ the input can stop the firing if $s_{input} \ge \theta_2 k^*/(\theta_2 - y)$ (Figure 5B).
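The input construction and the $y$-gating above can be sketched in code as follows. The values of $g_{input}$ and $V_{input}$ used here are assumed placeholders (they are not listed in Table 1), while $\theta_2$ and $d_p$ come from the table.

```python
import numpy as np

H = lambda y: 1.0/(1.0 + np.exp(-y/0.02))        # smoothed Heaviside

def F(t, T1, T2, phi, d_p=10.0):
    """Mixed-frequency drive F(t): each product term is ~1 during a pulse
    of width d_p once per period and ~0 otherwise."""
    s = lambda u, T: H(10.0*np.sin(2.0*np.pi*u/T))
    return (s(t, T1)*(1.0 - s(t + d_p, T1))
            + s(t + phi, T2)*(1.0 - s(t + d_p + phi, T2)))

def I_input(s_input, y, V, g_input=1.0, V_input=-85.0, theta2=2.0/3.0):
    """y-gated external current to an I cell: zero whenever y > theta2,
    full strength when y = 0 (the cell is 'ready' for an input)."""
    return g_input*s_input*max(1.0 - y/theta2, 0.0)*(V - V_input)
```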

Fig. 5 Bifurcation diagram of the inhibitory cell. Left panel: stable (solid) and unstable (dashed) steady states as a function of the bifurcation parameter $k$ (see text). The maximum and minimum of the family of periodic orbits are also shown (thick solid curves). As $k$ crosses $k^*$ from right to left, the system's dynamics changes from quiescence to spiking. The transition is replotted in the right panel in the two-parameter $(y, s_{input})$ space (see text). The solid line marks the bifurcation value $k = k^*$, and the dotted line is the asymptote of this curve. For $y > \theta_2$ there is no quiescent regime for any value of $s_{input}$.

4 Simulation results

All results below were obtained with a block of two frequency-separator units (2 E and 2 I cells each), shown schematically in Figure 3A. The input is inhibitory to the I cells, and the output is the spikes of the E cells.

Single cells. Both E and I cells fire continuously until the firing is stopped by inhibitory input, either from the external input (for I cells) or from the I population (for E cells). Cells respond to release from inhibition by firing. This is accomplished because each cell in isolation has a high firing rate, and it is aided by the presence of the low-threshold Ca2+ current, which is transiently activated as a cell is released from hyperpolarization, resulting in fast depolarization and spiking.

Frequency separation. Figure 3B shows a typical example of a numerical experiment with the model. The top panel shows the input pulse train, which is constructed as a superposition of two periodic pulse trains, in this case with interpulse intervals of 210 and 270 msec. For clarity we have colored the two original trains in different shades. The next four panels (labeled $I_1$, $I_2$, $E_1$ and $E_2$) show the voltage time courses of the four cells of the first separator unit (shaded box in Figure 3A). Let us consider what happens when an input pulse arrives shortly before the 1000 msec mark (black arrow). Both cells $I_1$ and $I_2$ receive it as an inhibitory input. At that time cell $I_2$ has the lower value of $y$; thus, as explained in Section 3.4, it is more susceptible to the external input, and it also receives more inhibition from its neighbors, with their higher $y$ values. As a result, only cell $I_2$ terminates its firing. This, in essence, implements the essential part of rules A, B and C of the algorithm: the cell with the lower $y$ value is the one to respond. Next, the excitatory cells act as readouts of these responses. Due to the pause in $I_2$'s firing, $E_2$ receives less inhibition and is able to fire a spike, aided by the presence of the low-threshold calcium current. This, in turn, provides excitation to $I_1$, which fires faster, further reducing the chances of $E_1$ to fire. The cessation of firing in $I_2$ also causes the reset of its $x$ and $y$ variables, as explained in Section 3.4. The lower panel of Figure 3B shows the interspike intervals of the output cells $E_1$ and $E_2$. One can see that each of the cells settles to firing at approximately one of the original input frequencies.

Fig. 6 Examples of frequency separation with the network from Figure 3A. Left: Each of the input periods is chosen as $T + 10r$, where $r$ is a random number between 0 and 1 and $T$ is varied from 100 to 600 msec in 10 msec steps. Each period pair is presented for 5000 msec, and the separation is judged successful if for each input train there was an output cell that maintained the correct interspike interval (within 5% of the input period) for 2000 msec. Successful trials are marked with filled circles and occurred in 65% of the trials; unsuccessful trials are marked with open circles. Right: When the range of presented periods is restricted (indicated by a frame), the success rate may increase or decrease (indicated by the percentage of successful trials). Simulation data are the same as in the left panel.

Figure 6 shows an example of the network's performance over a range of input frequency values. Each point in Figure 6 corresponds to a simulation in which the model was presented with a mixture of pulses of two given frequencies for 5,000 msec. The periods of the input trains were each chosen randomly by drawing once from a uniform distribution on every 10-by-10 msec square. The phase shift of the inputs $\varphi$ was fixed at 0, and the initial conditions of the ODE system were the same in every trial. A trial is successful (filled circle) if for each input period $T_i$ there was at least one output (E) cell with an average interspike interval (ISI) within 5% of $T_i$ for 2000 msec; a sketch of this criterion follows below. Otherwise the trial is labeled unsuccessful (open circle). Note that success is not automatic even when $T_1$ and $T_2$ lie within 5% of each other: one of the cells can take over and respond nearly every time, while the other responds only infrequently or not at all. This can be seen from the open circles that occur in the figure even near the diagonal. The overall success rate was 65%. Each panel on the right highlights a section of the main figure and lists the percentage of frequency pairs in that section that were successfully decorrelated. The success rate is high in the optimal range of frequencies (81%, upper right) but starts to decline if one of the input frequencies becomes too high or too low (58%, lower right). The range of successfully separated frequencies can be varied by adjusting the parameters of the model.
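One way to code the success criterion is sketched below. Its reading of the caption is an assumption: "maintained the correct ISI for 2000 msec" is interpreted as a contiguous run of within-tolerance ISIs totalling at least 2000 msec.

```python
import numpy as np

def maintains_period(spike_times, T_i, tol=0.05, window=2000.0):
    """True if this output cell held an inter-spike interval within
    tol*T_i of the input period T_i for at least `window` msec."""
    isi = np.diff(np.asarray(spike_times, dtype=float))
    good = np.abs(isi - T_i) <= tol*T_i
    run = 0.0
    for dt, ok in zip(isi, good):      # accumulate a contiguous run of good ISIs
        run = run + dt if ok else 0.0
        if run >= window:
            return True
    return False

# A trial succeeds if each input period is maintained by some E cell, e.g.:
# all(any(maintains_period(sp, T) for sp in e_cell_spike_trains) for T in (T1, T2))
```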

16 16 of successfully separated frequencies can be varied by adjusting parameters of the model. A B Phase shift between inputs Interpulse interval 1 Interpulse interval Interpulse interval 1 Fig. 7 Error correction by the model. A: Errors can be corrected by changing the phase shift between inputs. Interpulse interval of the first input train is shown on the horizontal axis, the interpulse interval of the second input train is marked by the square. Different rows correspond to different phase shift between the inputs as indicated. Successful separation (by the same criterion as in Figure 6) is shown with a filled circle, unsuccessful with a cross. B: Results of frequency separation in the situation when the time of each incoming spike is modified by adding a random number uniformly distributed in the range [-20 20]. In the example on top black bars correspond to the original pulses, and thin lines indicate the ±20msec range around each black pulse, from which the perturbed pulse time is drawn. White bars are the perturbed pulses. Notation on the grid is the same as in A. Most of the errors in the model s performance can be corrected by using a different phase-shift of inputs (ϕ) or the initial conditions of the network. Figure 7A shows an example in which we vary the input phase shift ϕ with fixed T 1 = 232 and several different values of T 2. For each frequency combination (and fixed initial conditions, same as in Figure 6) there are only a few phase shifts (if any) at which the trial is unsuccessful. Similar result is achieved if the phase is fixed, but the initial condition is allowed to vary (not shown). The performance of the model is robust to jitter in the input spike trains. Figure 7B shows an example in which the time of arrival of each input pulse was modified by a random amount (uniform distribution from -20 to 20 msec). We tested pairs of input frequencies in the range of msec, with 10 msec increments and fixed initial conditions. We found that 90% of frequency pairs were successfully separated (Figure 7B). 5 Discussion We described an algorithmic model that can pick out individual periodic trains from the superposition of two such trains and its biophysical implementation. We believe that this algorithm can be implemented in hardware as well.

5 Discussion

We have described an algorithmic model that can pick out the individual periodic trains from a superposition of two such trains, together with its biophysical implementation. We believe that this algorithm could be implemented in hardware as well.

It should be noted that this biophysical implementation is not unique, and it does not follow the algorithm exactly. Our results show that even with an imperfect implementation the frequency separation procedure works. This also suggests another possible role for excitatory-inhibitory networks.

One of the specialized features of the proposed biophysical network is the presence of the variables $x$ and $y$. We do not presently have specific agents in mind that could be represented in this way, but we can speculate about possibilities. As indicated in Section 3.4, they could be thought of as concentrations of some substances X and Y. It is conceivable that Ca2+ could play the role of X, as it accumulates during spiking. The substance Y would need to be related to the efficacy of synaptic function, with its activation affected by X. For example, it could represent part of a metabotropic receptor pathway that naturally weakens during spiking, potentiating the synapse; once the spiking stops, it quickly uses calcium to reactivate again. We emphasize that this is pure speculation, and more work needs to be done to identify specific biophysical identities for $x$ and $y$.

One of the strengths of the proposed algorithm is that the frequency separation works over a whole range of frequencies and does not rely on resonance with a discrete set of preferred frequencies. We have shown in numerical simulations (Figure 7A) that errors of the computation can be corrected by changing the phase shift between the inputs or the network's initial conditions. This property of the model could be exploited in a larger network in which each of the separating units starts independently at a different time, thus effectively being at a different initial condition at the time of the input's arrival. The units with the most consistent output ISIs (the successful units) could then be rewarded and reinforced.

As a possible functional role for such a network, we envision a situation in which there is a large class of possible features (attributes) that the system needs to recognize. For example, an object can be red, blue, square, triangular, etc. Suppose that each of the possible attributes is represented in the system by a different frequency [Torras, 1986, Niebur et al., 2002, Kazanovich and Borisyuk, 2006, Kuramoto, 1991]. Assume additionally that at every object presentation only two features (from a large pool of possibilities) are present (say, a red octagon). The system needs to recognize which features it is confronted with (i.e., to detect the individual frequencies in the mixed signal) and then transfer this information to higher processing areas. For example, the detected frequencies can be compared to a database stored in memory and the appropriate action retrieved.

This work is also related to studies of pattern identification, in which the system looks for a certain sequence of neuronal firings (ISIs). Several different strategies have been described, such as matching to a template [Dayhoff and Gerstein, 1983, Tetko and Villa, 2001] and a method based on the correlation integral [Christen et al., 2004]. In contrast to these studies, our work focuses on the identification of periodic sequences of ISIs only. On the other hand, it is more flexible, not requiring the presence of a template; moreover, the detecting network produces the detected patterns as its outputs, ready for transmission and further use.

Appendix

Equations for an I cell.

$$C_m \frac{dV}{dt} = -I_{ion}(V) - I_{syn} + I_{input}.$$

The intrinsic current $I_{ion}$ consists of the leak, sodium, and potassium currents, and the bias current $I$:

$$I_{ion} = g_L(V - V_L) + g_{Na}(m_\infty(V))^3 h (V - V_{Na}) + g_K(1 - h)^4(V - V_K) - I,$$

where

$$\frac{dh}{dt} = (h_\infty(V) - h)/\tau_h(V),$$
$$m_\infty(V) = 1/(1 + \exp(-(V + 37)/7)),$$
$$h_\infty(V) = 1/(1 + \exp((V + 41)/4)),$$
$$\tau_h(V) = 0.69/(\alpha_h(V) + \beta_h(V)),$$
$$\alpha_h(V) = \exp(-(46 + V)/18),$$
$$\beta_h(V) = 4/(1 + \exp(-(23 + V)/5)).$$

The parameters are given in Table 1.

Equations for an E cell.

$$C_m \frac{dV}{dt} = -I_{ion}(V) - I_{syn} + I_{input}.$$

The intrinsic current $I_{ion}$ consists of the leak, sodium, and potassium currents, and the low-threshold T-type current (an inward current de-inactivated by hyperpolarization):

$$I_{ion} = g_L(V - V_L) + g_{Na}(m_\infty(V))^3 h(V - V_{Na}) + g_K(1 - h)^4(V - V_K) + g_T(m_{\infty,T}(V))^2 h_T V,$$

where

$$\frac{dh}{dt} = (h_\infty(V) - h)/\tau_h(V),$$
$$\frac{dh_T}{dt} = (h_{\infty,T}(V) - h_T)/\tau_{h,T}(V),$$
$$m_\infty(V) = 1/(1 + \exp(-(V + 37)/7)),$$
$$m_{\infty,T}(V) = 1/(1 + \exp(-(V + 60)/6.2)),$$
$$h_\infty(V) = 1/(1 + \exp((V + 41)/4)),$$
$$h_{\infty,T}(V) = 1/(1 + \exp((V + 84)/4)),$$

$$\tau_h(V) = 0.83/(\alpha_h(V) + \beta_h(V)),$$
$$\tau_{h,T}(V) = 28 + \exp(-(V + 25)/10.5),$$
$$\alpha_h(V) = \exp(-(46 + V)/18),$$
$$\beta_h(V) = 4/(1 + \exp(-(23 + V)/5)).$$

The external input to the E cell is zero ($I_{input} = 0$). Parameter values are given in Table 1.

Table 1 Model parameters

Parameter    Value
$g_L$        0.05
$g_{Na}$     3
$g_K$        5
$g_T$        1
$V_L$        -70
$V_{Na}$     50
$V_K$        -90
$\beta_y$
$\alpha_x$
$\theta_1$
$\theta_2$   2/3
$d_p$        10

Acknowledgements This work was supported by the Mathematical Biosciences Institute and the National Science Foundation under grant DMS , NSF grant DMS (AB), NSF CAREER Award DMS (JB), and an Alfred P. Sloan Research Foundation Fellowship (JB).

References

Adrian, E.D. (1950) The electrical activity of the mammalian olfactory bulb. Electroencephalogr Clin Neurophysiol, 2.

Arieli, A., Shoham, D., Hildesheim, R. and Grinvald, A. (1995) Coherent spatiotemporal patterns of ongoing activity revealed by real-time optical imaging coupled with single-unit recording in the cat visual cortex. J Neurophysiol, 73.

Bar-Gad, I. and Bergman, H. (2001) Stepping out of the box: information processing in the neural networks of the basal ganglia. Curr Opin Neurobiol, 11.

Bar-Gad, I., Havazelet-Heimer, G., Goldberg, J.A., Ruppin, E. and Bergman, H. (2000) Reinforcement driven dimensionality reduction - a model for information processing in the basal ganglia. J Basic & Clinical Physiol & Pharm, 11.

Bose, A., Manor, Y. and Nadim, F. (2001) Bistable oscillations arising from synaptic depression. SIAM Journal on Applied Mathematics, 62.

Christen, M., Kern, A., Nikitchenko, A., Steeb, W.-W. and Stoop, R. (2004) Fast spike pattern detection using the correlation integral. Physical Review E, 70.

Dayhoff, J.E. and Gerstein, G.L. (1983) Favored patterns in spike trains. I. Detection. J Neurophysiol, 49.

Ecker, A.S., Berens, P., Keliris, G.A., Bethge, M., Logothetis, N. and Tolias, A. (2010) Decorrelated neuronal firing in cortical microcircuits. Science, 327.

Green, J.D. and Arduini, A. (1954) Hippocampal activity in arousal. J Neurophysiol, 17.

Hodgkin, A. and Huxley, A. (1952) A quantitative description of membrane current and its application to conduction and excitation in nerve. J Physiol, 117, 500-544.

Kazanovich, Ya. and Borisyuk, R. (2006) An oscillatory neural model of multiple object tracking. Neural Computation, 18.

Kuramoto, Y. (1991) Collective synchronization of pulse coupled oscillators and excitable units. Physica D, 50, 15-30.

Landolfa, M.A. and Miller, J.P. (1995) Stimulus-response properties of cricket cercal filiform receptors. J Comp Physiol, 177.

Matveev, V., Bose, A. and Nadim, F. (2007) Capturing the bursting dynamics of a two-cell inhibitory network using a one-dimensional map. J Comput Neurosci, 23.

Niebur, E., Hsiao, S.S. and Johnson, K.O. (2002) Synchrony: a neuronal mechanism for attentional selection? Current Opinion in Neurobiology, 12.

Renart, A., de la Rocha, J., Bartho, P., Hollender, L., Parga, N., Reyes, A. and Harris, K. (2010) The asynchronous state in cortical circuits. Science, 327.

Rose, J.E., Brugge, J.F., Anderson, D.J. and Hind, J.E. (1967) Phase-locked response to low-frequency tones in single auditory nerve fibers of the squirrel monkey. J Neurophysiol, 30.

Terman, D., Rubin, J.E., Yew, A.C. and Wilson, C.J. (2002) Activity patterns in a model for the subthalamopallidal network of the basal ganglia. J Neurosci, 22.

Taube, J.S., Muller, R.U. and Ranck, J.B. Jr. (1990) Head-direction cells recorded from the postsubiculum in freely moving rats. I. Description and quantitative analysis. J Neurosci, 10.

Tetko, I.V. and Villa, A.E.P. (2001) A pattern grouping algorithm for analysis of spatiotemporal patterns in neuronal spike trains. 1. Detection of repeated patterns. J Neurosci Methods, 105, 1-14.

Tetzlaff, T., Helias, M., Einevoll, G. and Diesmann, M. (2010) Decorrelation of low-frequency neural activity. BMC Neuroscience, 11, Suppl. 1.

Torras, C. (1986) Neural network model with rhythm assimilation capacity. IEEE Transactions on Systems, Man and Cybernetics, 16.


More information

Activity Driven Adaptive Stochastic. Resonance. Gregor Wenning and Klaus Obermayer. Technical University of Berlin.

Activity Driven Adaptive Stochastic. Resonance. Gregor Wenning and Klaus Obermayer. Technical University of Berlin. Activity Driven Adaptive Stochastic Resonance Gregor Wenning and Klaus Obermayer Department of Electrical Engineering and Computer Science Technical University of Berlin Franklinstr. 8/9, 187 Berlin fgrewe,obyg@cs.tu-berlin.de

More information

Computing with Inter-spike Interval Codes in Networks of Integrate and Fire Neurons

Computing with Inter-spike Interval Codes in Networks of Integrate and Fire Neurons Computing with Inter-spike Interval Codes in Networks of Integrate and Fire Neurons Dileep George a,b Friedrich T. Sommer b a Dept. of Electrical Engineering, Stanford University 350 Serra Mall, Stanford,

More information

Neurons, Synapses, and Signaling

Neurons, Synapses, and Signaling Chapter 48 Neurons, Synapses, and Signaling PowerPoint Lecture Presentations for Biology Eighth Edition Neil Campbell and Jane Reece Lectures by Chris Romero, updated by Erin Barley with contributions

More information

Phase Response Properties and Phase-Locking in Neural Systems with Delayed Negative-Feedback. Carter L. Johnson

Phase Response Properties and Phase-Locking in Neural Systems with Delayed Negative-Feedback. Carter L. Johnson Phase Response Properties and Phase-Locking in Neural Systems with Delayed Negative-Feedback Carter L. Johnson Faculty Mentor: Professor Timothy J. Lewis University of California, Davis Abstract Oscillatory

More information

Localized activity patterns in excitatory neuronal networks

Localized activity patterns in excitatory neuronal networks Localized activity patterns in excitatory neuronal networks Jonathan Rubin Amitabha Bose February 3, 2004 Abstract. The existence of localized activity patterns, or bumps, has been investigated in a variety

More information

9 Generation of Action Potential Hodgkin-Huxley Model

9 Generation of Action Potential Hodgkin-Huxley Model 9 Generation of Action Potential Hodgkin-Huxley Model (based on chapter 2, W.W. Lytton, Hodgkin-Huxley Model) 9. Passive and active membrane models In the previous lecture we have considered a passive

More information

Conductance-Based Integrate-and-Fire Models

Conductance-Based Integrate-and-Fire Models NOTE Communicated by Michael Hines Conductance-Based Integrate-and-Fire Models Alain Destexhe Department of Physiology, Laval University School of Medicine, Québec, G1K 7P4, Canada A conductance-based

More information

Exploring a Simple Discrete Model of Neuronal Networks

Exploring a Simple Discrete Model of Neuronal Networks Exploring a Simple Discrete Model of Neuronal Networks Winfried Just Ohio University Joint work with David Terman, Sungwoo Ahn,and Xueying Wang August 6, 2010 An ODE Model of Neuronal Networks by Terman

More information

Dynamical systems in neuroscience. Pacific Northwest Computational Neuroscience Connection October 1-2, 2010

Dynamical systems in neuroscience. Pacific Northwest Computational Neuroscience Connection October 1-2, 2010 Dynamical systems in neuroscience Pacific Northwest Computational Neuroscience Connection October 1-2, 2010 What do I mean by a dynamical system? Set of state variables Law that governs evolution of state

More information

80% of all excitatory synapses - at the dendritic spines.

80% of all excitatory synapses - at the dendritic spines. Dendritic Modelling Dendrites (from Greek dendron, tree ) are the branched projections of a neuron that act to conduct the electrical stimulation received from other cells to and from the cell body, or

More information

Decoding. How well can we learn what the stimulus is by looking at the neural responses?

Decoding. How well can we learn what the stimulus is by looking at the neural responses? Decoding How well can we learn what the stimulus is by looking at the neural responses? Two approaches: devise explicit algorithms for extracting a stimulus estimate directly quantify the relationship

More information

Dynamical Constraints on Computing with Spike Timing in the Cortex

Dynamical Constraints on Computing with Spike Timing in the Cortex Appears in Advances in Neural Information Processing Systems, 15 (NIPS 00) Dynamical Constraints on Computing with Spike Timing in the Cortex Arunava Banerjee and Alexandre Pouget Department of Brain and

More information

Spike-Frequency Adaptation: Phenomenological Model and Experimental Tests

Spike-Frequency Adaptation: Phenomenological Model and Experimental Tests Spike-Frequency Adaptation: Phenomenological Model and Experimental Tests J. Benda, M. Bethge, M. Hennig, K. Pawelzik & A.V.M. Herz February, 7 Abstract Spike-frequency adaptation is a common feature of

More information

Patterns of Synchrony in Neural Networks with Spike Adaptation

Patterns of Synchrony in Neural Networks with Spike Adaptation Patterns of Synchrony in Neural Networks with Spike Adaptation C. van Vreeswijky and D. Hanselz y yracah Institute of Physics and Center for Neural Computation, Hebrew University, Jerusalem, 9194 Israel

More information

arxiv: v1 [q-bio.nc] 13 Feb 2018

arxiv: v1 [q-bio.nc] 13 Feb 2018 Gain control with A-type potassium current: I A as a switch between divisive and subtractive inhibition Joshua H Goldwyn 1*, Bradley R Slabe 2, Joseph B Travers 3, David Terman 2 arxiv:182.4794v1 [q-bio.nc]

More information

How do synapses transform inputs?

How do synapses transform inputs? Neurons to networks How do synapses transform inputs? Excitatory synapse Input spike! Neurotransmitter release binds to/opens Na channels Change in synaptic conductance! Na+ influx E.g. AMA synapse! Depolarization

More information

An Efficient Method for Computing Synaptic Conductances Based on a Kinetic Model of Receptor Binding

An Efficient Method for Computing Synaptic Conductances Based on a Kinetic Model of Receptor Binding NOTE Communicated by Michael Hines An Efficient Method for Computing Synaptic Conductances Based on a Kinetic Model of Receptor Binding A. Destexhe Z. F. Mainen T. J. Sejnowski The Howard Hughes Medical

More information

Effects of Betaxolol on Hodgkin-Huxley Model of Tiger Salamander Retinal Ganglion Cell

Effects of Betaxolol on Hodgkin-Huxley Model of Tiger Salamander Retinal Ganglion Cell Effects of Betaxolol on Hodgkin-Huxley Model of Tiger Salamander Retinal Ganglion Cell 1. Abstract Matthew Dunlevie Clement Lee Indrani Mikkilineni mdunlevi@ucsd.edu cll008@ucsd.edu imikkili@ucsd.edu Isolated

More information

REAL-TIME COMPUTING WITHOUT STABLE

REAL-TIME COMPUTING WITHOUT STABLE REAL-TIME COMPUTING WITHOUT STABLE STATES: A NEW FRAMEWORK FOR NEURAL COMPUTATION BASED ON PERTURBATIONS Wolfgang Maass Thomas Natschlager Henry Markram Presented by Qiong Zhao April 28 th, 2010 OUTLINE

More information

DISCRETE EVENT SIMULATION IN THE NEURON ENVIRONMENT

DISCRETE EVENT SIMULATION IN THE NEURON ENVIRONMENT Hines and Carnevale: Discrete event simulation in the NEURON environment Page 1 Preprint of a manuscript that will be published in Neurocomputing. DISCRETE EVENT SIMULATION IN THE NEURON ENVIRONMENT Abstract

More information

Introduction to Neural Networks U. Minn. Psy 5038 Spring, 1999 Daniel Kersten. Lecture 2a. The Neuron - overview of structure. From Anderson (1995)

Introduction to Neural Networks U. Minn. Psy 5038 Spring, 1999 Daniel Kersten. Lecture 2a. The Neuron - overview of structure. From Anderson (1995) Introduction to Neural Networks U. Minn. Psy 5038 Spring, 1999 Daniel Kersten Lecture 2a The Neuron - overview of structure From Anderson (1995) 2 Lect_2a_Mathematica.nb Basic Structure Information flow:

More information

Data Mining Part 5. Prediction

Data Mining Part 5. Prediction Data Mining Part 5. Prediction 5.5. Spring 2010 Instructor: Dr. Masoud Yaghini Outline How the Brain Works Artificial Neural Networks Simple Computing Elements Feed-Forward Networks Perceptrons (Single-layer,

More information

In biological terms, memory refers to the ability of neural systems to store activity patterns and later recall them when required.

In biological terms, memory refers to the ability of neural systems to store activity patterns and later recall them when required. In biological terms, memory refers to the ability of neural systems to store activity patterns and later recall them when required. In humans, association is known to be a prominent feature of memory.

More information

A Novel Chaotic Neural Network Architecture

A Novel Chaotic Neural Network Architecture ESANN' proceedings - European Symposium on Artificial Neural Networks Bruges (Belgium), - April, D-Facto public., ISBN ---, pp. - A Novel Neural Network Architecture Nigel Crook and Tjeerd olde Scheper

More information

Effects of synaptic conductance on the voltage distribution and firing rate of spiking neurons

Effects of synaptic conductance on the voltage distribution and firing rate of spiking neurons PHYSICAL REVIEW E 69, 051918 (2004) Effects of synaptic conductance on the voltage distribution and firing rate of spiking neurons Magnus J. E. Richardson* Laboratory of Computational Neuroscience, Brain

More information

Electrophysiology of the neuron

Electrophysiology of the neuron School of Mathematical Sciences G4TNS Theoretical Neuroscience Electrophysiology of the neuron Electrophysiology is the study of ionic currents and electrical activity in cells and tissues. The work of

More information

Frequency Adaptation and Bursting

Frequency Adaptation and Bursting BioE332A Lab 3, 2010 1 Lab 3 January 5, 2010 Frequency Adaptation and Bursting In the last lab, we explored spiking due to sodium channels. In this lab, we explore adaptation and bursting due to potassium

More information

The Spike Response Model: A Framework to Predict Neuronal Spike Trains

The Spike Response Model: A Framework to Predict Neuronal Spike Trains The Spike Response Model: A Framework to Predict Neuronal Spike Trains Renaud Jolivet, Timothy J. Lewis 2, and Wulfram Gerstner Laboratory of Computational Neuroscience, Swiss Federal Institute of Technology

More information

Information processing. Divisions of nervous system. Neuron structure and function Synapse. Neurons, synapses, and signaling 11/3/2017

Information processing. Divisions of nervous system. Neuron structure and function Synapse. Neurons, synapses, and signaling 11/3/2017 Neurons, synapses, and signaling Chapter 48 Information processing Divisions of nervous system Central nervous system (CNS) Brain and a nerve cord Integration center Peripheral nervous system (PNS) Nerves

More information

High-conductance states in a mean-eld cortical network model

High-conductance states in a mean-eld cortical network model Neurocomputing 58 60 (2004) 935 940 www.elsevier.com/locate/neucom High-conductance states in a mean-eld cortical network model Alexander Lerchner a;, Mandana Ahmadi b, John Hertz b a Oersted-DTU, Technical

More information

From neuronal oscillations to complexity

From neuronal oscillations to complexity 1/39 The Fourth International Workshop on Advanced Computation for Engineering Applications (ACEA 2008) MACIS 2 Al-Balqa Applied University, Salt, Jordan Corson Nathalie, Aziz Alaoui M.A. University of

More information

MATH 3104: THE HODGKIN-HUXLEY EQUATIONS

MATH 3104: THE HODGKIN-HUXLEY EQUATIONS MATH 3104: THE HODGKIN-HUXLEY EQUATIONS Parallel conductance model A/Prof Geoffrey Goodhill, Semester 1, 2009 So far we have modelled neuronal membranes by just one resistance (conductance) variable. We

More information

Self-organized Criticality and Synchronization in a Pulse-coupled Integrate-and-Fire Neuron Model Based on Small World Networks

Self-organized Criticality and Synchronization in a Pulse-coupled Integrate-and-Fire Neuron Model Based on Small World Networks Commun. Theor. Phys. (Beijing, China) 43 (2005) pp. 466 470 c International Academic Publishers Vol. 43, No. 3, March 15, 2005 Self-organized Criticality and Synchronization in a Pulse-coupled Integrate-and-Fire

More information

THE LOCUST OLFACTORY SYSTEM AS A CASE STUDY FOR MODELING DYNAMICS OF NEUROBIOLOGICAL NETWORKS: FROM DISCRETE TIME NEURONS TO CONTINUOUS TIME NEURONS

THE LOCUST OLFACTORY SYSTEM AS A CASE STUDY FOR MODELING DYNAMICS OF NEUROBIOLOGICAL NETWORKS: FROM DISCRETE TIME NEURONS TO CONTINUOUS TIME NEURONS 1 THE LOCUST OLFACTORY SYSTEM AS A CASE STUDY FOR MODELING DYNAMICS OF NEUROBIOLOGICAL NETWORKS: FROM DISCRETE TIME NEURONS TO CONTINUOUS TIME NEURONS B. QUENET 1 AND G. HORCHOLLE-BOSSAVIT 2 1 Equipe de

More information

Processing of Time Series by Neural Circuits with Biologically Realistic Synaptic Dynamics

Processing of Time Series by Neural Circuits with Biologically Realistic Synaptic Dynamics Processing of Time Series by Neural Circuits with iologically Realistic Synaptic Dynamics Thomas Natschläger & Wolfgang Maass Institute for Theoretical Computer Science Technische Universität Graz, ustria

More information

Biological Modeling of Neural Networks

Biological Modeling of Neural Networks Week 4 part 2: More Detail compartmental models Biological Modeling of Neural Networks Week 4 Reducing detail - Adding detail 4.2. Adding detail - apse -cable equat Wulfram Gerstner EPFL, Lausanne, Switzerland

More information

Chapter 37 Active Reading Guide Neurons, Synapses, and Signaling

Chapter 37 Active Reading Guide Neurons, Synapses, and Signaling Name: AP Biology Mr. Croft Section 1 1. What is a neuron? Chapter 37 Active Reading Guide Neurons, Synapses, and Signaling 2. Neurons can be placed into three groups, based on their location and function.

More information

ACTION POTENTIAL. Dr. Ayisha Qureshi Professor MBBS, MPhil

ACTION POTENTIAL. Dr. Ayisha Qureshi Professor MBBS, MPhil ACTION POTENTIAL Dr. Ayisha Qureshi Professor MBBS, MPhil DEFINITIONS: Stimulus: A stimulus is an external force or event which when applied to an excitable tissue produces a characteristic response. Subthreshold

More information

IN THIS turorial paper we exploit the relationship between

IN THIS turorial paper we exploit the relationship between 508 IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 10, NO. 3, MAY 1999 Weakly Pulse-Coupled Oscillators, FM Interactions, Synchronization, Oscillatory Associative Memory Eugene M. Izhikevich Abstract We study

More information

Biosciences in the 21st century

Biosciences in the 21st century Biosciences in the 21st century Lecture 1: Neurons, Synapses, and Signaling Dr. Michael Burger Outline: 1. Why neuroscience? 2. The neuron 3. Action potentials 4. Synapses 5. Organization of the nervous

More information

Neurophysiology of a VLSI spiking neural network: LANN21

Neurophysiology of a VLSI spiking neural network: LANN21 Neurophysiology of a VLSI spiking neural network: LANN21 Stefano Fusi INFN, Sezione Roma I Università di Roma La Sapienza Pza Aldo Moro 2, I-185, Roma fusi@jupiter.roma1.infn.it Paolo Del Giudice Physics

More information

The Phase Response Curve of Reciprocally Inhibitory Model Neurons Exhibiting Anti-Phase Rhythms

The Phase Response Curve of Reciprocally Inhibitory Model Neurons Exhibiting Anti-Phase Rhythms The Phase Response Curve of Reciprocally Inhibitory Model Neurons Exhibiting Anti-Phase Rhythms Jiawei Zhang Timothy J. Lewis Department of Mathematics, University of California, Davis Davis, CA 9566,

More information

arxiv:physics/ v1 [physics.bio-ph] 19 Feb 1999

arxiv:physics/ v1 [physics.bio-ph] 19 Feb 1999 Odor recognition and segmentation by coupled olfactory bulb and cortical networks arxiv:physics/9902052v1 [physics.bioph] 19 Feb 1999 Abstract Zhaoping Li a,1 John Hertz b a CBCL, MIT, Cambridge MA 02139

More information

How to read a burst duration code

How to read a burst duration code Neurocomputing 58 60 (2004) 1 6 www.elsevier.com/locate/neucom How to read a burst duration code Adam Kepecs a;, John Lisman b a Cold Spring Harbor Laboratory, Marks Building, 1 Bungtown Road, Cold Spring

More information

Simulation of Cardiac Action Potentials Background Information

Simulation of Cardiac Action Potentials Background Information Simulation of Cardiac Action Potentials Background Information Rob MacLeod and Quan Ni February 7, 2 Introduction The goal of assignments related to this document is to experiment with a numerical simulation

More information

Introduction and the Hodgkin-Huxley Model

Introduction and the Hodgkin-Huxley Model 1 Introduction and the Hodgkin-Huxley Model Richard Bertram Department of Mathematics and Programs in Neuroscience and Molecular Biophysics Florida State University Tallahassee, Florida 32306 Reference:

More information

6.3.4 Action potential

6.3.4 Action potential I ion C m C m dφ dt Figure 6.8: Electrical circuit model of the cell membrane. Normally, cells are net negative inside the cell which results in a non-zero resting membrane potential. The membrane potential

More information

Dynamical Systems in Neuroscience: Elementary Bifurcations

Dynamical Systems in Neuroscience: Elementary Bifurcations Dynamical Systems in Neuroscience: Elementary Bifurcations Foris Kuang May 2017 1 Contents 1 Introduction 3 2 Definitions 3 3 Hodgkin-Huxley Model 3 4 Morris-Lecar Model 4 5 Stability 5 5.1 Linear ODE..............................................

More information

Neurophysiology. Danil Hammoudi.MD

Neurophysiology. Danil Hammoudi.MD Neurophysiology Danil Hammoudi.MD ACTION POTENTIAL An action potential is a wave of electrical discharge that travels along the membrane of a cell. Action potentials are an essential feature of animal

More information

CSE/NB 528 Final Lecture: All Good Things Must. CSE/NB 528: Final Lecture

CSE/NB 528 Final Lecture: All Good Things Must. CSE/NB 528: Final Lecture CSE/NB 528 Final Lecture: All Good Things Must 1 Course Summary Where have we been? Course Highlights Where do we go from here? Challenges and Open Problems Further Reading 2 What is the neural code? What

More information

A Model for Real-Time Computation in Generic Neural Microcircuits

A Model for Real-Time Computation in Generic Neural Microcircuits A Model for Real-Time Computation in Generic Neural Microcircuits Wolfgang Maass, Thomas Natschläger Institute for Theoretical Computer Science Technische Universitaet Graz A-81 Graz, Austria maass, tnatschl

More information

Identification of Odors by the Spatiotemporal Dynamics of the Olfactory Bulb. Outline

Identification of Odors by the Spatiotemporal Dynamics of the Olfactory Bulb. Outline Identification of Odors by the Spatiotemporal Dynamics of the Olfactory Bulb Henry Greenside Department of Physics Duke University Outline Why think about olfaction? Crash course on neurobiology. Some

More information

Basic elements of neuroelectronics -- membranes -- ion channels -- wiring

Basic elements of neuroelectronics -- membranes -- ion channels -- wiring Computing in carbon Basic elements of neuroelectronics -- membranes -- ion channels -- wiring Elementary neuron models -- conductance based -- modelers alternatives Wires -- signal propagation -- processing

More information

DEVS Simulation of Spiking Neural Networks

DEVS Simulation of Spiking Neural Networks DEVS Simulation of Spiking Neural Networks Rene Mayrhofer, Michael Affenzeller, Herbert Prähofer, Gerhard Höfer, Alexander Fried Institute of Systems Science Systems Theory and Information Technology Johannes

More information

Nervous Systems: Neuron Structure and Function

Nervous Systems: Neuron Structure and Function Nervous Systems: Neuron Structure and Function Integration An animal needs to function like a coherent organism, not like a loose collection of cells. Integration = refers to processes such as summation

More information

An algorithm for detecting oscillatory behavior in discretized data: the damped-oscillator oscillator detector

An algorithm for detecting oscillatory behavior in discretized data: the damped-oscillator oscillator detector An algorithm for detecting oscillatory behavior in discretized data: the damped-oscillator oscillator detector David Hsu, Murielle Hsu, He Huang and Erwin B. Montgomery, Jr Department of Neurology University

More information

Fast and exact simulation methods applied on a broad range of neuron models

Fast and exact simulation methods applied on a broad range of neuron models Fast and exact simulation methods applied on a broad range of neuron models Michiel D Haene michiel.dhaene@ugent.be Benjamin Schrauwen benjamin.schrauwen@ugent.be Ghent University, Electronics and Information

More information

Coarse-grained event tree analysis for quantifying Hodgkin-Huxley neuronal network dynamics

Coarse-grained event tree analysis for quantifying Hodgkin-Huxley neuronal network dynamics J Comput Neurosci (212) 32:55 72 DOI 1.17/s1827-11-339-7 Coarse-grained event tree analysis for quantifying Hodgkin-Huxley neuronal network dynamics Yi Sun Aaditya V. Rangan Douglas Zhou David Cai Received:

More information

Evolution of the Average Synaptic Update Rule

Evolution of the Average Synaptic Update Rule Supporting Text Evolution of the Average Synaptic Update Rule In this appendix we evaluate the derivative of Eq. 9 in the main text, i.e., we need to calculate log P (yk Y k, X k ) γ log P (yk Y k ). ()

More information

Neural Networks 1 Synchronization in Spiking Neural Networks

Neural Networks 1 Synchronization in Spiking Neural Networks CS 790R Seminar Modeling & Simulation Neural Networks 1 Synchronization in Spiking Neural Networks René Doursat Department of Computer Science & Engineering University of Nevada, Reno Spring 2006 Synchronization

More information

Overview Organization: Central Nervous System (CNS) Peripheral Nervous System (PNS) innervate Divisions: a. Afferent

Overview Organization: Central Nervous System (CNS) Peripheral Nervous System (PNS) innervate Divisions: a. Afferent Overview Organization: Central Nervous System (CNS) Brain and spinal cord receives and processes information. Peripheral Nervous System (PNS) Nerve cells that link CNS with organs throughout the body.

More information

Nervous Tissue. Neurons Electrochemical Gradient Propagation & Transduction Neurotransmitters Temporal & Spatial Summation

Nervous Tissue. Neurons Electrochemical Gradient Propagation & Transduction Neurotransmitters Temporal & Spatial Summation Nervous Tissue Neurons Electrochemical Gradient Propagation & Transduction Neurotransmitters Temporal & Spatial Summation What is the function of nervous tissue? Maintain homeostasis & respond to stimuli

More information

Membrane Potentials, Action Potentials, and Synaptic Transmission. Membrane Potential

Membrane Potentials, Action Potentials, and Synaptic Transmission. Membrane Potential Cl Cl - - + K + K+ K + K Cl - 2/2/15 Membrane Potentials, Action Potentials, and Synaptic Transmission Core Curriculum II Spring 2015 Membrane Potential Example 1: K +, Cl - equally permeant no charge

More information

MEMBRANE POTENTIALS AND ACTION POTENTIALS:

MEMBRANE POTENTIALS AND ACTION POTENTIALS: University of Jordan Faculty of Medicine Department of Physiology & Biochemistry Medical students, 2017/2018 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Review: Membrane physiology

More information

Limulus. The Neural Code. Response of Visual Neurons 9/21/2011

Limulus. The Neural Code. Response of Visual Neurons 9/21/2011 Crab cam (Barlow et al., 2001) self inhibition recurrent inhibition lateral inhibition - L16. Neural processing in Linear Systems: Temporal and Spatial Filtering C. D. Hopkins Sept. 21, 2011 The Neural

More information

Structured reservoir computing with spatiotemporal chaotic attractors

Structured reservoir computing with spatiotemporal chaotic attractors Structured reservoir computing with spatiotemporal chaotic attractors Carlos Lourenço 1,2 1- Faculty of Sciences of the University of Lisbon - Informatics Department Campo Grande, 1749-016 Lisboa - Portugal

More information

Deconstructing Actual Neurons

Deconstructing Actual Neurons 1 Deconstructing Actual Neurons Richard Bertram Department of Mathematics and Programs in Neuroscience and Molecular Biophysics Florida State University Tallahassee, Florida 32306 Reference: The many ionic

More information