ARTICLE
Communicated by John Rinzel

Dynamics of Membrane Excitability Determine Interspike Interval Variability: A Link Between Spike Generation Mechanisms and Cortical Spike Train Statistics

Boris S. Gutkin, G. Bard Ermentrout
Program in Neurobiology and Department of Mathematics, University of Pittsburgh, Pittsburgh, PA 15260, U.S.A.

Neural Computation 10, 1047-1065 (1998). © 1998 Massachusetts Institute of Technology

We propose a biophysical mechanism for the high interspike interval variability observed in cortical spike trains. The key lies in the nonlinear dynamics of cortical spike generation, which are consistent with type I membranes where saddle-node dynamics underlie excitability (Rinzel & Ermentrout, 1989). We present a canonical model for type I membranes, the θ-neuron. The θ-neuron is a phase model whose dynamics reflect salient features of type I membranes. This model generates spike trains with coefficient of variation (CV) above 0.6 when brought to firing by noisy inputs. This happens because the timing of spikes for a type I excitable cell is exquisitely sensitive to the amplitude of the suprathreshold stimulus pulses. A noisy input current, giving random amplitude kicks to the cell, evokes highly irregular firing across a wide range of firing rates; an intrinsically oscillating cell gives regular spike trains. We corroborate the results with simulations of the Morris-Lecar (M-L) neural model with random synaptic inputs: type I M-L yields high CVs. When this model is modified to have type II dynamics (periodicity arises via a Hopf bifurcation), however, it gives regular spike trains (CV below 0.3). Our results suggest that the high CV values such as those observed in cortical spike trains are an intrinsic characteristic of type I membranes driven to firing by random inputs. In contrast, neural oscillators or neurons exhibiting type II excitability should produce regular spike trains.

1 Introduction

The statistical nature of single neuron response has been a widely recognized feature of neural information processing. Historically, a number of preparations yielded spike trains with a large degree of variability (Burns & Webb, 1976; Kroner & Kaplan, 1993). Spike trains with high coefficients of variation (CV) have been reported for a wide range of stimulus-evoked activity of nonbursting pyramidal neurons in visual cortical areas of monkeys (Softky & Koch, 1993; Dean, 1981; McCormick, Connors, Lighthall, & Prince, 1985).

Such high in vivo interspike interval variability is contrasted with the highly reproducible in vitro responses of neurons to depolarizing current steps (Holt, Softky, Koch, & Douglas, 1996) and aperiodic stimuli (Mainen & Sejnowski, 1995), and with robust spike timing for high-contrast visual stimuli in vivo (Reich, Victor, Knight, Ozaki, & Kaplan, 1997). Softky and Koch (1993) presented an analysis of cortical spike trains, showing that neither changes in the mean firing rate nor spike frequency adaptation could account for the high CVs.

To date, several hypotheses have been proposed to explain these seemingly paradoxical findings. Classical stochastic models presented neurons as temporal integrators with spike generation as a random walk with an absorbing boundary (Stein, 1965; Ricciardi, 1994). Numerous variants of these random walk or stochastic integrate-and-fire (IF) models strove to account for nonstationary excitability of the neuron, nonlinear summation of the synaptic inputs, and multimodal output distributions (Smith, 1992; Wilbur & Rinzel, 1983). Softky and Koch (1993) suggested that the high CVs are inconsistent with temporal integration of randomly arriving excitatory postsynaptic potentials (EPSPs). Based on studies of compartmental models with Hodgkin-Huxley spike-generating currents, they proposed that several mechanisms, most notably active dendritic processes, amplify weak temporal correlations in the input and produce highly variable input currents at the soma. The cell acts as a coincidence detector and produces noisy output.

An alternative hypothesis states that sufficiently variable input currents can be generated under a balance of excitatory and inhibitory inputs. The neuron then remains close to the threshold, and firing reflects the fine temporal fluctuations in the input current. Shadlen and Newsome (1994) found high CVs for the balanced Stein model, and Bell, Mainen, Tsodyks, and Sejnowski (1995) found similar results for a Hodgkin-Huxley neuron under specific parameter choices. Networks with connectivity that ensures the excitatory-inhibitory balance also produce highly variable firing in noisy IF neurons (Usher, Stemmler, Koch, & Olami, 1994) and chaotic threshold elements (van Vreeswijk & Sompolinsky, 1996).

Recently Troyer and Miller (1997) showed that the input-output properties of the neuron can strongly influence the integration of noisy inputs. They modified the leaky IF neuron to include a partial postspike reset voltage. Using the reset as a free parameter, they fitted IF neurons to the in vitro response frequency to input current (FI) curves of real pyramidal cells. They found that such IF neurons (termed "high gain"; see footnote 1) give high CVs, and do so without a balance of inputs. Neurons whose FI gain is much lower than seen in the data did not produce high CVs. The explanation was that the high-gain IF neuron hovers near a steady state (set by the reset voltage) and remains sensitive to temporal fluctuations in the random inputs. The low-gain neuron spends much more time depolarizing toward the threshold and thus damps out the input variability.

[Footnote 1] Gain is defined as the slope of the FI curve.

The emphasis of previous studies (with the notable exception of Troyer & Miller, 1997) has been mainly on mechanisms that generate sufficiently variable input currents at the soma. By using either the simplest point neurons or a standard Hodgkin-Huxley soma, most of the authors referred to above omitted from their analysis the nonlinear spike-generating dynamics and their role in spiking statistics.

In this article, we assume that the input at the soma is variable. Focusing on the nonlinear dynamics of the spike-generating mechanism, we show how properties of neural membranes dominated by saddle-node dynamics (type I) yield the firing statistics observed in in vivo recordings and the input-output characteristics of cortical neurons measured in vitro. We argue that the key is the sensitive dependence of spike latency on the amplitude of the suprathreshold inputs evident in type I membranes. We review the salient characteristics of type I membranes and contrast these with type II membranes. In section 3 we present the canonical model for type I membranes, the θ-neuron. We show that high-CV spike trains arise for the θ-neuron in the excitable regime; the oscillating θ-neuron produces low-CV firing patterns. We follow with an example of a more detailed spiking neural model (Morris-Lecar, M-L) in section 5. Simulations of the type I Morris-Lecar model corroborate our θ-neuron findings. The type II M-L model yields low-CV spike trains, suggesting that type II membranes do not produce highly variable spike trains.

2 Type I vs. Type II Neural Membrane Dynamics

Our major assumption is type I membrane excitability for the spike-generating soma. The general idea is to classify cells by the dynamical structure that underlies the onset of autonomous periodic firing. A more complete discussion of this classification can be found in Rinzel and Ermentrout (1989). The classification, based on observations of squid axons, was proposed by Hodgkin (1948), who found arbitrarily low response frequencies and long spike latencies for some axons (type I) and a narrow range of response frequencies with no spike delay for others (type II). We use the M-L model to illustrate type I and type II characteristics.

Observationally, a type I membrane is recognized by a continuous FI curve that shows oscillations arising with arbitrarily low frequencies (see Figure 1a). It shows that the type I cell is capable of a wide range of firing frequencies and that near the threshold, the input-output gain is infinite. This suggests that in the excitable regime, a number of dynamical behaviors is possible depending on the strength of the time-dependent stimulus input. In Figure 1b we see that the spike latency for a type I neural model (here M-L) is strongly sensitive to the magnitude of the suprathreshold stimulus.

[Figure 1 appears here: three panels plotting (a) firing frequency (Hz), (b) delay to spike (msec), and (c) firing frequency (Hz) against the input current I (µA/cm²).]

Figure 1: (a) Firing frequency to input current plot for a type I membrane shows oscillations appearing with arbitrarily low frequencies. (b) The delay to spike in the type I model depends on the amplitude of the suprathreshold stimulus. (c) Firing frequency to input current plot for the type II membrane shows oscillations arising with a nonzero frequency.

Consider that temporal integration by the neural membrane acts to translate randomly timed synaptic inputs into a background DC bias (perhaps due to distant synapses) plus a current with randomly varying amplitude (perhaps due to more proximal ones). Then a type I membrane, with its high spike latency sensitivity, converts the variability in the input current to variability in output spike timing.

The phase plane for a type I membrane helps us understand why this happens (see Figure 2a). The voltage nullcline intersects the activation nullcline, forming an attracting node and a saddle that acts as a threshold for spike generation. A subthreshold stimulus would not evoke much response. Any stimulus pushing the voltage past the saddle node will result in a spike of constant shape, but with a varying delay.

This delay results from the fact that the voltage trajectory near the threshold hugs the stable manifold of the saddle, moving rather slowly away from the threshold. Also, near the threshold, the membrane is most sensitive to small inputs. As the voltage increases, the velocity of motion increases, and the membrane becomes insensitive to inputs because the active conductances dominate the dynamics. A fast spike is produced, followed by a refractory period and repolarization to the rest state. The important notion is the nonuniformity of motion around the phase plane. This is reminiscent of cortical neurons that, given a suprathreshold pulse stimulus, will slowly depolarize and then produce a fast spike (footnote 2). We also note that a type I model spends most of its time near the steady state, just like the high-gain neuron of Troyer and Miller (1997).

[Footnote 2] This is thought to be due to the A-current.

We can change the excitability of the cell by increasing the bias current, which lifts the voltage nullcline. This lowers the threshold and reduces the region in the phase space where the membrane is most sensitive to input perturbations. With still more positive bias, the rest state disappears, and a limit cycle is left behind. This limit cycle is of constant amplitude but with a period dependent on the bias.

In contrast, type II membranes are characterized by discontinuous FI curves, with the oscillations arising with a nonzero frequency. These oscillations are due to a subcritical Hopf bifurcation. The response frequency range is narrow and largely independent of the bias (see Figure 1c). There is also no true threshold for the appearance of spikes, which are not an all-or-nothing phenomenon but have an amplitude that can depend on the size of the pulse stimulus (see Figure 2c). The delay to spike is not sensitive to the size of the suprathreshold stimulus, and the long prespike depolarization is absent.

Several widely used cortical models are of type I, for example, Traub's model (Traub & Miles, 1991) and the Bower model (Wilson & Bower, 1989). Examples of type II membranes include the standard Hodgkin-Huxley model (Hodgkin & Huxley, 1952) and the FitzHugh-Nagumo reduced model (FitzHugh, 1961). The M-L model can be put into either the type I or the type II regime.

3 The θ-Neuron: A Canonical Model for Type I Membranes

We present a reduced neural phase model (θ-neuron) capable of reproducing spike-train statistics at a wide range of mean firing rates. The θ-neuron is a canonical model for type I membranes resulting from formal mathematical reduction of multidimensional neural models exhibiting type I dynamics. That is, every neural model with type I dynamics can be reduced to the θ-neuron.

Figure 2: Phase plane for type I and type II neural membranes. Here we use the Morris-Lecar model as an example, with w being the activation variable and parameters set as in Rinzel and Ermentrout (1989). (a) Phase plane for a type I membrane in the excitable regime. Note that the stimulus-induced processions around the phase plane are of constant size and profile. Such processions or periodic solutions are said to live on an invariant circle. However, the rise time of spikes depends on stimulus amplitude. Here R is the attracting rest state, T is the saddle, and U is an unstable steady state. (b) Phase plane for a type I membrane in the oscillatory regime. Once again the spikes are of constant amplitude and live on the invariant circle. The voltage nullcline has been lifted by the added constant bias current. (c) Phase plane for a type II membrane in the excitable regime. Note that the spikes are of variable amplitude.

The parameters of the θ-neuron can be quantitatively related to physiologically observable quantities, and the dynamics reflect the nonlinear properties of the neuronal membrane. We describe the neuron by a phase variable θ. This phase represents the location of the voltage and activation state vector along the spike trajectory (see Figure 3a). The dynamics of the phase under noisy input are governed by the following stochastic (Langevin) differential equation:

dθ/dτ = (1 − cos θ) + (1 + cos θ)(β + σ W_τ), θ ∈ [0, 2π], (3.1)

for a white noise input W_τ with intensity σ (footnote 3). Here β is the bias parameter, which controls the excitability of the cell. The critical value is at β = 0. In the excitable regime, where β is negative, the model has an attracting rest state and a repelling threshold (footnote 4). In case of a subthreshold stimulus, the phase returns passively to rest, while a suprathreshold pulse causes the phase to rotate around the circle and produce a spike. In fact, if we plot the time evolution of ν = 1 − cos(θ), we can readily see the fast spike (see Figure 3b).

[Footnote 3] W_τ is constructed by generating gaussian random deviates with variance proportional to dτ; the integral of W_τ gives a Wiener process.

[Footnote 4] These are given by θ_rest = −arccos[(1 + β)/(1 − β)] and θ_threshold = arccos[(1 + β)/(1 − β)], respectively.

Figure 3: (a) Phase evolution on a circle and its analog in the state of the membrane voltage. Note that the spike occupies a small region near π, and the model spends most of its time near the threshold. A suprathreshold stimulus pushes θ past the threshold and into the excited region. Here the regenerative dynamics that summarize the active conductances carry the phase through the spike. (b) A representative spike train for the phase model excited by a random stimulus. Here we plot not θ but ν = 1 − cos(θ), which makes the spikes apparent.
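For readers who wish to experiment with equation 3.1, the following is a minimal Euler-Maruyama sketch in Python (not the authors' XPPAUT code). The time step, the default values of β and σ, and the convention of counting a spike each time θ advances past π are illustrative choices.

```python
import numpy as np

def theta_neuron_spikes(beta=-0.3, sigma=1.0, dt=0.05, t_max=5000.0, seed=0):
    """Euler-Maruyama integration of equation 3.1,
    d(theta) = (1 - cos theta) dtau + (1 + cos theta) (beta dtau + sigma dW),
    returning spike times; a spike is counted each time theta advances past pi."""
    rng = np.random.default_rng(seed)
    # Start at the rest state of the excitable cell (footnote 4), or at 0 if beta >= 0.
    theta = -np.arccos((1 + beta) / (1 - beta)) if beta < 0 else 0.0
    next_peak = np.pi                      # phase value whose upward crossing marks a spike
    spikes = []
    for step in range(int(t_max / dt)):
        dW = rng.normal(0.0, np.sqrt(dt))  # Wiener increment, variance dt
        theta += (1 - np.cos(theta)) * dt + (1 + np.cos(theta)) * (beta * dt + sigma * dW)
        if theta >= next_peak:             # phase swept through the spike region
            spikes.append(step * dt)
            next_peak += 2 * np.pi         # require a full rotation before the next spike
    return np.asarray(spikes)

if __name__ == "__main__":
    isis = np.diff(theta_neuron_spikes(beta=-0.3, sigma=1.0))
    print(f"{isis.size + 1} spikes, mean ISI {isis.mean():.1f} ms, CV {isis.std() / isis.mean():.2f}")
```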

3.1 Reduction to the θ-Neuron. We present an outline of the mathematical reduction process; a more detailed description has been published in Ermentrout (1996b), and a complete mathematical treatment is given in Hoppensteadt and Izhikevich (1997). The reduction relies on perturbation methods for the saddle-node bifurcation inherent in type I membranes. Heuristically, we can describe the behavior by a phase variable because the oscillations in the type I neuron are of invariant amplitude (i.e., the cell produces spikes of constant shape).

Let us consider a generic conductance model:

dV/dt = F_0(V) + ε² N(V). (3.2)

Here V is the vector of dynamical variables of the model (e.g., membrane voltage, activation variables), F_0(V) is the nonlinear function that includes the membrane properties of the conductance model, and N(V) is the input; ε is small. We assume that when ε = 0, there exists an invariant circle around a single fixed point, which persists on both sides of the bifurcation. Then let the saddle node appear at the critical value V*. We linearize F_0(V) around that value and note that the Jacobian of F_0(V) at V* has a zero eigenvalue. Letting V = V* + εz e, where e is the eigenvector corresponding to the zero eigenvalue, the dynamics of equation 3.2 near the bifurcation are governed by

dz/dt = ε(η + qz²) + h.o.t., (3.3)

which is the normal form for saddle-node dynamical systems. We now make a change of coordinates τ = εt and z = tan(θ/2), and setting q to unity without loss of generality, we arrive at

dθ/dτ = (1 − cos θ) + η(1 + cos θ), θ ∈ [0, 2π], θ(0) = θ(2π), (3.4)

where η is proportional to the inputs in the original model (footnote 5).

[Footnote 5] In general, η and q can depend on time and phase and can be calculated directly from the original neural model; see Ermentrout (1996b).

The reduction method determines how we can include a noise term to model the influence of a large number of positive and negative inputs of random strength arriving at random times. The additive input N(V) in the full model (equation 3.2) is reduced to η in the θ-neuron. Letting η = (β + σ W_τ), where β is the bias and W_τ models white noise, we arrive at the appropriate model for the random inputs. The o.d.e. for the phase in equation 3.4 then becomes the Langevin equation:

dθ/dτ = (1 − cos θ) + (1 + cos θ)(β + σ W_τ), θ ∈ [0, 2π], θ(0) = θ(2π). (3.5)

Note that this noise model is based solely on the mathematics of the reduction and reflects several important characteristics of how a neuron responds to inputs. The neuron is most sensitive to its inputs when at rest and most insensitive during the spike, when the voltage is dominated by the spike-generating currents, and during the refractory period that follows. The inputs in the θ-neuron are scaled by the factor (1 + cos θ) and have the most effect on the phase θ when the cell is close to the resting potential, and little or no effect when the cell is traversing through the spike.
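The substitution z = tan(θ/2) used above can be verified with a short symbolic computation. The sympy script below is only a sanity check of this algebra (with q = 1), not part of the original derivation.

```python
import sympy as sp

eta, z = sp.symbols('eta z', real=True)

# Weierstrass substitution z = tan(theta/2):
#   cos(theta) = (1 - z**2) / (1 + z**2)  and  dtheta/dz = 2 / (1 + z**2).
cos_theta = (1 - z**2) / (1 + z**2)
dtheta_dz = 2 / (1 + z**2)

# Normal form (equation 3.3 with q = 1): dz/dtau = eta + z**2.
# Chain rule: dtheta/dtau = (dtheta/dz) * (dz/dtau).
dtheta_dtau = dtheta_dz * (eta + z**2)

# Equation 3.4 expressed in the same variable:
target = (1 - cos_theta) + eta * (1 + cos_theta)

print(sp.simplify(dtheta_dtau - target))   # prints 0
```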

In this work, we present results for the white noise inputs, although similar arguments can lead to an appropriate model for Poisson-timed excitatory and inhibitory inputs (footnote 6). The deterministic behavior of the model has been described in detail in Ermentrout (1996b). The main point is that the θ-neuron reflects all the salient characteristics of the dynamics of the original full model. Since the θ-neuron is a canonical model for type I neural models, its dynamics reflect the saddle-node-based spike-generating behavior of any type I neural model, including the spike latency sensitivity to stimulus amplitude. The θ-neuron exhibits both excitable and tonically oscillating regimes, depending on the bias β. Figure 4 summarizes the behavior of the model for different bias values.

Figure 4: Saddle-node dynamics on an invariant circle. The upper trace shows the location of critical points on the invariant circle, the middle trace shows the behavior of the phase variable, and the lower trace is the trace of 1 − cos(θ) showing the spikes. The horizontal axis in the lower two traces gives time in ms. (a) Excitable regime with β = −0.3; the spike is evoked by a suprathreshold pulse stimulus marked by the triangle. (b) Bifurcation with a saddle-node point, β = 0. The homoclinic trajectory (one that joins a critical point to itself) has an infinite period. (c) Oscillatory regime with β = 0.3. Autonomous periodic processions in the phase variable and spikes in 1 − cos(θ) are present.

[Footnote 6] η = (β + g_ex dN_ex + g_in dN_in), where dN_ex and dN_in are unit events with arrival times given by Poisson processes with intensities λ_ex and λ_in, respectively. The amplitudes of EPSPs and IPSPs are given by g_ex and g_in. We should note that the bias plus white noise input model would not work in the limit of low EPSP amplitudes, long EPSP duration, and high arrival rates, where the net effect would be a mean DC current. However, for this work, we start by assuming that the inputs to the soma carry a significant degree of variability.
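As a companion to footnote 6, here is a hedged Python sketch of the same phase equation driven by Poisson-timed EPSPs and IPSPs instead of white noise. The rates and kick amplitudes are arbitrary illustrative values (not taken from the paper), and inhibition is given an explicit negative sign as a convention choice.

```python
import numpy as np

def theta_neuron_poisson(beta=-0.3, g_ex=0.5, g_in=0.5, rate_ex=0.5, rate_in=0.1,
                         dt=0.05, t_max=5000.0, seed=1):
    """theta-neuron driven by Poisson-timed inputs (footnote 6):
    d(theta) = (1 - cos theta) dtau + (1 + cos theta) (beta dtau + g_ex dN_ex - g_in dN_in),
    where dN_ex, dN_in are Bernoulli approximations of Poisson counting increments.
    Rates are in events per ms; g_ex, g_in are dimensionless kick amplitudes."""
    rng = np.random.default_rng(seed)
    theta = -np.arccos((1 + beta) / (1 - beta)) if beta < 0 else 0.0
    next_peak, spikes = np.pi, []
    for step in range(int(t_max / dt)):
        dN_ex = float(rng.random() < rate_ex * dt)   # excitatory event this step?
        dN_in = float(rng.random() < rate_in * dt)   # inhibitory event this step?
        theta += (1 - np.cos(theta)) * dt + (1 + np.cos(theta)) * (
            beta * dt + g_ex * dN_ex - g_in * dN_in)
        if theta >= next_peak:                       # theta advanced through the spike
            spikes.append(step * dt)
            next_peak += 2 * np.pi
    return np.asarray(spikes)

if __name__ == "__main__":
    isis = np.diff(theta_neuron_poisson())
    print(f"mean ISI {isis.mean():.1f} ms, CV {isis.std() / isis.mean():.2f}")
```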

4 Results of Numerical Experiments for the θ-Neuron

To study the stochastic dynamics of model 3.1, we carried out numerical simulations using the XPPAUT differential equation exploration package (Ermentrout, 1996a). The equations were integrated on a circle with a period of 2π using a stochastic version of the Euler method with a time step of 0.05 ms. The noise was generated by XPP using a standard algorithm for the construction of a Wiener process (see footnote 3 and Kloeden & Platen, 1994). The voltage time series was converted into spike-train data, for which we computed ISI CVs and histograms. The interspike interval (ISI) data were examined to ensure stationarity.

4.1 The Excitable Regime: Noise-Induced High-CV Firing. In the excitable regime (β below 0), firing is induced purely by the noise inputs. ISI histograms of the noise-driven excitable θ-neuron show a characteristic peak. That is, for a given β, the noise induces a characteristic mean ISI. The mean ISI is controlled by both the constant bias and the amplitude of the noise process. As β becomes more positive or σ increases, the peak in the ISI histogram moves to the left (see Figures 5a and 5b), increasing the mean firing rate.

The noise and bias have similar effects on the firing rate. However, they have differential effects on the ISI variability. Increasing the amplitude of the noise inputs while holding the bias constant has comparatively little effect on the scale of the ISI histogram. For β = −0.3, the high-noise histogram has qualitatively as much mass in the tail as the low-noise one (see Figure 5a). Consequently, when the firing rate is controlled by the noise intensity, the CV remains high and does so across a wide range of mean ISIs, with a slight downward trend toward the shorter ISIs (see Figure 6c).

We propose the following explanation. With a constant negative bias, the distance to the threshold and the region where the motion is slow are held constant. The firing rate then depends on the mean frequency of random crossings of the threshold. Because of the spike latency characteristics of type I membranes, the variance of the ISIs depends on the threshold crossings, and also on the amplitudes of the suprathreshold inputs. As the variance of the noise input goes up, the variability in the amplitude of the suprathreshold shocks increases, thereby driving the variability of the spike latencies up. Thus, the CV remains largely invariant for a wide range of firing rates. As the firing rate becomes very large, the refractory period exerts a regularizing influence, and the CV begins to decrease.

On the other hand, when we hold the noise amplitude fixed, increasing β gives spike trains with lower CVs and ISI histograms with progressively shorter tails (see Figure 5b). We suggest that this happens because as the rest state and threshold approach each other, the active spike currents (here the intrinsic regenerative behavior of the phase) are much easier to activate. These currents then drown out the variability in the inputs. To put it another way, the more excitable cell is much less dominated by the slow dynamics near the rest state, and the range of inputs that would cause highly variable spikes is decreased.

Figure 5: Normalized ISI histograms for the θ-neuron, with ISIs plotted on a log scale. (a) β = −0.3 and noise amplitude varied. Note that the left tail of the histogram appears to shorten as σ increases, partially due to the log scale and also because with higher input variance the mode moves closer to the smallest possible ISI. (b) Noise amplitude σ = 1 and β varied. The mean ISI of the deterministic oscillations for the β = 1 case is 3.14 ms.
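The 3.14 ms value quoted in the caption follows from the deterministic model: for β > 0, the period of equation 3.4 with constant η = β is π/√β, which is π ≈ 3.14 ms for β = 1. The short script below is an illustrative numerical check of that closed form, not part of the original analysis.

```python
import numpy as np

def theta_period(beta, n=200000):
    """Deterministic period of dtheta/dtau = (1 - cos t) + beta*(1 + cos t) for beta > 0,
    computed as the integral of dtheta / velocity over one rotation (midpoint rule)."""
    t = (np.arange(n) + 0.5) * (2 * np.pi / n)
    velocity = (1 - np.cos(t)) + beta * (1 + np.cos(t))
    return np.sum((2 * np.pi / n) / velocity)

for beta in (0.25, 1.0, 4.0):
    print(f"beta = {beta}: period {theta_period(beta):.4f} ms vs pi/sqrt(beta) = {np.pi / np.sqrt(beta):.4f} ms")
```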

4.2 The Oscillatory Regime: Noise-Modulated Periodic Firing. At supercritical bias values (β above 0), the model fires periodically even without the noisy inputs. In this regime, the noise modulates the mean frequency of firing, while the firing is comparatively regular, with low CVs and short-tailed ISI histograms. Once again our explanation holds: the dynamics of the oscillating cell are dominated by the active currents, and the noise inputs exert a comparatively weak influence on the firing behavior.

Figure 6: CV results for the θ-model. (a) CV remains high across a wide range of noise parameters. (b) CV decreases linearly with β. (c) CV remains high across a wide range of firing rates when these are controlled by the noise. Here β = −1 and σ is varied.
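A minimal reconstruction of the kind of parameter sweep summarized in Figure 6 is sketched below; it repeats the integrator from the earlier sketch so the block runs on its own, and the β and σ grids are illustrative rather than the authors' values. Sweeping σ at fixed negative β should leave the CV high while the mean ISI changes, whereas sweeping β toward and past zero at fixed σ should lower the CV.

```python
import numpy as np

def isi_stats(beta, sigma, dt=0.05, t_max=20000.0, seed=0):
    """Mean ISI and CV of the noise-driven theta-neuron (equation 3.1), Euler-Maruyama."""
    rng = np.random.default_rng(seed)
    theta = -np.arccos((1 + beta) / (1 - beta)) if beta < 0 else 0.0
    next_peak, spikes = np.pi, []
    for step in range(int(t_max / dt)):
        dW = rng.normal(0.0, np.sqrt(dt))
        theta += (1 - np.cos(theta)) * dt + (1 + np.cos(theta)) * (beta * dt + sigma * dW)
        if theta >= next_peak:
            spikes.append(step * dt)
            next_peak += 2 * np.pi
    isis = np.diff(spikes)
    return isis.mean(), isis.std() / isis.mean()

# Firing rate controlled by the noise at fixed negative bias (compare Figures 6a and 6c).
for sigma in (1.0, 2.0, 4.0):
    mean_isi, cv = isi_stats(beta=-0.3, sigma=sigma)
    print(f"beta = -0.3, sigma = {sigma}: mean ISI {mean_isi:7.1f} ms, CV {cv:.2f}")

# Firing rate controlled by the bias at fixed noise (compare Figure 6b).
for beta in (-0.3, 0.0, 0.3, 1.0):
    mean_isi, cv = isi_stats(beta=beta, sigma=1.0)
    print(f"beta = {beta:+.1f}, sigma = 1.0: mean ISI {mean_isi:7.1f} ms, CV {cv:.2f}")
```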

5 Numerical Simulation of Stochastic Morris-Lecar

To corroborate our findings in the θ-neuron, we studied the M-L model. The M-L model has the advantage of being a conductance-based model that can be put into both the type I and type II regimes depending on the chosen parameters. We examined the hypothesis that for equivalent excitability, the type I model yields high CVs under a variety of random input conditions, while the type II model gives a regular firing pattern. For our simulations, both the type I and type II models were parameterized by the deviation from the critical input bias current, (I_bias − I_critical), to fix the excitability conditions (footnote 7). The equations and exact parameter values for both models are given in the appendix. For both models, a negative deviation indicates the excitable regime, while a positive one corresponds to an intrinsic oscillator.

[Footnote 7] For the type I Morris-Lecar model, I_critical = 40 µA/cm²; for the type II model, I_critical = 100 µA/cm².

The inputs consist of instantaneous excitatory and inhibitory postsynaptic currents (EPSCs and IPSCs, respectively) with Poisson-distributed arrival times. The amplitudes of the EPSCs were set to ensure integration of many inputs to generate a spike. Both inputs were modeled as simple exponentials with time constants of 1 ms. The IPSC arrival rate for all simulations was kept constant, while the rate of EPSCs was varied (footnote 8). The random arrival times were generated by using two-state Markov chains with transition probabilities set to give Poisson arrival processes for inhibition and excitation with the desired means (this feature is built into XPP).

[Footnote 8] The ratio of inhibition to excitation, as defined in Troyer and Miller (1997), R = [g_i (V − V_i) rate_i] / [g_e (V − V_e) rate_e], varied from 0.1 to 2.
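The input model just described can be sketched as follows; this is a hedged reconstruction in Python rather than the authors' XPP implementation, and the amplitudes and rates are placeholders. Each Poisson-timed event (approximated per time step by a Bernoulli draw) adds a jump to a current that then decays exponentially with a 1 ms time constant.

```python
import numpy as np

def synaptic_current(t_max=1000.0, dt=0.05, rate_ex=2.0, rate_in=0.3,
                     a_ex=2.0, a_in=2.0, tau_syn=1.0, seed=0):
    """Net EPSC - IPSC input current (arbitrary units).  rate_ex and rate_in are
    Poisson event rates in events per ms; a_ex and a_in are the jump amplitudes;
    each jump decays exponentially with time constant tau_syn = 1 ms."""
    rng = np.random.default_rng(seed)
    n = int(t_max / dt)
    i_ex = np.zeros(n)
    i_in = np.zeros(n)
    decay = np.exp(-dt / tau_syn)
    for k in range(1, n):
        i_ex[k] = i_ex[k - 1] * decay + a_ex * (rng.random() < rate_ex * dt)
        i_in[k] = i_in[k - 1] * decay + a_in * (rng.random() < rate_in * dt)
    return i_ex - i_in          # current to add to the bias driving the Morris-Lecar soma

if __name__ == "__main__":
    current = synaptic_current()
    print(f"mean {current.mean():.2f}, std {current.std():.2f} (arbitrary units)")
```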

5.1 Morris-Lecar in the Type I Regime. The type I Morris-Lecar model shows high-CV behavior. The type I model yields a wide range of firing rates for different excitability and input parameters. There is a strong dependence of the CV on the excitability of the cell (see Figure 7, upper trace). In the excitable regime, CV values close to unity are observed, with the CV clearly decreasing as the model passes into the oscillatory regime, yet the CV is always above 0.3. In the oscillatory regime, the inputs are dominated by the intrinsic dynamics of the oscillating membrane, and the mean ISI reflects the intrinsic frequency of oscillation. On the other hand, for a model in the excitable regime, the CV value is comparatively insensitive to the rate of arrival of EPSPs. Consequently the CV remains consistently high for a wide range of firing rates when these are controlled by the variability of the inputs. Also, the CVs for the type I M-L spike trains are largely independent of the inhibition-to-excitation ratio.

5.2 Morris-Lecar in the Type II Regime. We now modify the M-L model to reflect type II excitability. Compared to type I, this model in the excitable regime exhibits a narrow range of ISIs, which are in fact close to the period of the oscillations at criticality. The CV is not sensitive to the bias current (see Figure 7, lower trace), and at no value of the EPSP rate does the model exhibit CVs above 0.5. In fact, the only way to achieve a high CV value for this model is to resize the amplitude of the EPSPs so that the threshold is within reach of a single event, with the model doing no integration.

Figure 7: CV results for the Morris-Lecar models. The upper trace shows type I behavior, and the lower trace is type II. Note that the IPSC rate was 0.3 for all simulations; the EPSC rate was varied.

6 Summary

In this work we asked whether dynamics of spike generation consistent with cortical neuron data can account for the statistics of cortical spike trains. Using a canonical model, we show that type I cells driven to firing by noisy inputs give highly variable spike trains. On the other hand, intrinsic oscillators have a much more regular spiking behavior. Furthermore, the type I M-L results clearly show high CV levels, with excitation dominating the random inputs, while type II models give regular firing.

Our results are in contrast to the suggestions by Softky and Koch (1993), which imply a rather specific synaptic organization of inputs and a restrictive parameter regime for the dendritic spikes, and to the Bell et al. (1995) model, which requires not only a balance of inhibitory and excitatory inputs but also a very narrow subset of the parameter space. We are also able to reproduce neural variability without reliance on network dynamics. We provide a formal mathematical method to derive the θ-neuron from conductance-based models, and thus can give specific physiological meaning to its parameters.

High-CV spike trains are observed for the excitable θ-neuron and the type I Morris-Lecar model. The saddle-node bifurcation characteristic of such membranes underlies the observed results. In particular, we suspect that the CVs are high because the noise randomly samples long-period orbits, leading to spike latencies that are strongly dependent on the size of the time-dependent stimulus. Thus a stimulus that provides suprathreshold shocks of random amplitudes will evoke strongly variable ISIs for type I neurons. In contrast are membranes with type II dynamics, where the oscillations arise with a nonzero frequency through a Hopf bifurcation. In such a system there are no long-period orbits for the noise to sample, and the spike latency for suprathreshold stimuli is bounded above. We observe generically low CV values for the type II M-L model even with inhibition and excitation balanced in the random inputs.

We expect that type II neurons in general do not exhibit high CVs except under some very special conditions that ensure bistability (Bell et al., 1995). We also observe that type I neurons that are driven to firing by random inputs exhibit high CV values generically, without a balance of excitation and inhibition. We suggest that the high physiological gain condition proposed by Troyer and Miller (1997) is a natural consequence of type I membrane dynamics. In fact, for type I models, the gain is infinite near the threshold for the onset of oscillations. The key, we suggest, is not the gain as such, but the range of input amplitudes that leads to significantly variable spike latencies. Furthermore, just like the high-gain IF neuron, type I models spend most of their time near a steady state and not depolarizing toward the threshold. Essentially, the dynamics of the saddle-node-induced firing imply a highly nonuniform motion for the voltage-activation trajectory, with very slow motion near the rest state (footnote 9). The cell then spends most of its time near the rest state and is pushed to firing by the fast swings in the input current. In this way, the spike-generating mechanism acts as a de facto high-pass filter.

[Footnote 9] Such a slowdown near the rest state can result, in a real neuron or a biophysical model, from a sodium current that is slow to activate at the beginning of spike generation, with a steeply increasing voltage-dependent time constant. Alternatively, the same effect can be generated by an interaction of a slow potassium current that is partially activated at rest (e.g., I_M) and a fast sodium spike-generating current, or an inactivating potassium current (e.g., I_A).

It is particularly interesting to note that Hodgkin in his 1948 paper observed that type I spikers (axons in his case) produced a much more variable firing pattern than the type II spikers. Arbitrary spike delay latencies were also reported in the same work.

At the same time, the θ-neuron and the M-L model are not coincidence detectors in the sense that both code the mean input rate with a mean output rate. This means that high CVs cannot be used as a litmus test to resolve the rate versus coincidence coding dilemma. However, the delay-to-spike characteristics of type I neurons suggest that cortical neurons can act as amplitude-to-spike-latency converters and perhaps pass information about the temporal structure of the stimulus not only in the firing rate but also in the relative timing of individual spikes. In this way, the spike train as a whole would look very noisy, yet the information about the stimulus would be encoded quite precisely by the timing of spikes. In order for this coding mechanism to work, the cells must respond robustly to aperiodic inputs, and in fact data from Mainen and Sejnowski (1995) show highly reproducible responses to repeated noiselike stimuli in vitro. Robust spike timing despite noisy firing was also recently reported by Reich et al. (1997) in visually stimulated in vivo cat retinal ganglion cells and lateral geniculate nucleus for high-contrast stimuli, while low-contrast stimuli seemed to lead to less robust responses.

We found behavior similar to that reported in both of the above studies for the θ-neuron in our preliminary simulations. These results will be published elsewhere.

The θ-neuron model, like the integrate-and-fire model, is a one-dimensional caricature of a real neuron. Both models have arbitrarily low firing rates as the input current is lowered to the threshold. However, the frequency goes to zero like 1/log(I − I*) in the integrate-and-fire model rather than following the square-root law, √(I − I*), that type I models and the θ-model obey. Similarly, the slope at criticality is infinite in both cases. The main difference lies in the latency to firing due to a suprathreshold stimulus. There is no notion of latency to firing in an integrate-and-fire model; either a stimulus is above threshold, in which case firing is instantaneous, or it is below threshold and no firing occurs. In the θ-model and in type I membranes, the latency is due to the saddle point. This enables cells to respond at arbitrarily long latencies after receiving a suprathreshold stimulus. This in fact was noted by Hodgkin (1948).

Our work suggests that cells with high-CV spike trains are in the excitable regime as opposed to being intrinsic oscillators. This may mean that cortical neurons are not intrinsic oscillators but are driven to firing by the inputs. Then coherent oscillations such as those observed in cortical networks depend critically on the presence and characteristics of the afferent and efferent inputs.

One interesting finding in this study is the dependence of the CV on the excitability of the cells. The excitability in the simple models we present is set by the DC bias. In in vitro experiments, the bias current is the step current applied by the experimenter, but in in vivo neurons, any slow depolarizing or hyperpolarizing currents, or the effects of inputs impinging on distal dendrites and filtered by the dendritic tree, can be viewed as bias currents. Some suggestions for changing bias currents include NMDA activation and muscarinic modulation of the M-current. Our work then implies an interesting effect of slow modulation: slow modulatory synaptic currents can upregulate the spike rate by making the cell easier to fire and reducing accommodation, while also changing the overall variability of the spike train. The effects of changing excitability on spike-train variability can be studied experimentally in pyramidal neurons by manipulating the slow modulatory currents (e.g., by applying NMDA agonists or muscarinic antagonists).

Finally, since our models become noisier as the input mean moves away from the repetitive firing threshold, the mechanism for generating highly noisy firing is not a balancing of inhibition and excitation. As we have pointed out before, the key is the high-pass filtering property of the spike-generating mechanism when the spike is caused by a fast procession in the random input current. On the other hand, in the intrinsically oscillating cell, the inputs are dominated by the intrinsic currents that generate the periodic firing, with the random inputs having less influence on the statistics of the spike train. Thus, the further the cell is from being an intrinsic oscillator, the more it is driven by the synaptic currents (relative to the intrinsic currents) and the noisier is its output.

In summary, by focusing on the dynamics of spike generation by a soma receiving random inputs, we propose that the dynamical mechanism of spike generation has a strong effect on the stochastic properties of neuronal activity. The dynamical mechanism in our work implies that spike generation in cortical pyramidal cells is consistent with saddle-node dynamics. This suggestion can be tested in in vitro experiments that examine spike latency curves and experimentally construct phase response curves for cortical pyramidal neurons. In fact, we already know that arbitrarily low firing rates are observed experimentally. The saddle-node spike-generating dynamics can be caused by a number of biophysical mechanisms, and further experiments should be designed to pinpoint the precise combination of conductances that forms the substrate in a particular class of neurons. Furthermore, we expect that cells that tonically oscillate (due to slow depolarization) and are given a noisy current input would not show high CVs. At the same time, experiments where a noisy current is injected into cells exhibiting type II dynamics should corroborate the idea that such spike-generating dynamics cannot produce highly irregular firing except in a rather specific parameter regime (e.g., where the cell is bistable).

Appendix: Morris-Lecar Equations

The Morris-Lecar equations that we used are based on the model that appeared in Rinzel and Ermentrout (1989). They have the form:

C dV/dt = −g_Ca m_∞(V)(V − V_Ca) − g_K w (V − V_K) − g_L (V − V_L) + I
dw/dt = φ (w_∞(V) − w) / τ_w(V)
m_∞(V) = 0.5 (1 + tanh((V − V_1)/V_2))
w_∞(V) = 0.5 (1 + tanh((V − V_3)/V_4))
τ_w(V) = 1 / cosh((V − V_3)/(2 V_4))

Standard values for the type I membrane are V_K = −80 mV, V_L = −60 mV, V_Ca = 120 mV, C = 20 µF/cm², g_L = 2 mS/cm², g_K = 8 mS/cm², V_1 = −1.2 mV, V_2 = 18 mV, V_3 = 12 mV, V_4 = 17.4 mV, φ = 0.067, and g_Ca = 4.0 mS/cm². For the type II membrane simulations, the parameters are the same except V_3 = 2 mV, V_4 = 30 mV, φ = 0.04, and g_Ca = 4.4 mS/cm².
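For convenience, the appendix equations can be transcribed directly into code. The Python sketch below uses the parameter values as restored above (the signs of the reversal potentials and the conductance units are editorial reconstructions of the standard Rinzel-Ermentrout parameter set), so treat it as a starting point rather than a copy of the authors' XPP file.

```python
import numpy as np

# Appendix parameter sets (type I and type II Morris-Lecar).
TYPE_I = dict(C=20.0, g_Ca=4.0, g_K=8.0, g_L=2.0, V_Ca=120.0, V_K=-80.0, V_L=-60.0,
              V1=-1.2, V2=18.0, V3=12.0, V4=17.4, phi=0.067)
TYPE_II = dict(TYPE_I, g_Ca=4.4, V3=2.0, V4=30.0, phi=0.04)

def morris_lecar_rhs(V, w, I, p):
    """Right-hand side of the appendix equations.
    V in mV, w dimensionless, I in uA/cm^2; returns (dV/dt in mV/ms, dw/dt in 1/ms)."""
    m_inf = 0.5 * (1 + np.tanh((V - p['V1']) / p['V2']))
    w_inf = 0.5 * (1 + np.tanh((V - p['V3']) / p['V4']))
    tau_w = 1.0 / np.cosh((V - p['V3']) / (2 * p['V4']))
    dV = (-p['g_Ca'] * m_inf * (V - p['V_Ca'])
          - p['g_K'] * w * (V - p['V_K'])
          - p['g_L'] * (V - p['V_L']) + I) / p['C']
    dw = p['phi'] * (w_inf - w) / tau_w
    return dV, dw

def simulate(I, p, dt=0.05, t_max=500.0, V0=-60.0, w0=0.0):
    """Forward-Euler integration with a constant bias current; returns the voltage trace."""
    n = int(t_max / dt)
    V, w = V0, w0
    trace = np.empty(n)
    for k in range(n):
        dV, dw = morris_lecar_rhs(V, w, I, p)
        V, w = V + dt * dV, w + dt * dw
        trace[k] = V
    return trace

if __name__ == "__main__":
    trace = simulate(I=45.0, p=TYPE_I)   # slightly above the type I critical current (footnote 7)
    print(f"type I, I = 45: V ranges from {trace.min():.1f} to {trace.max():.1f} mV")
```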

Acknowledgments

We thank Dr. Satish Iyengar and Dr. John Horn for fruitful discussions and Dr. Todd Troyer for valuable comments on earlier versions of this article. Work on this project was supported by NSF and NIMH.

References

Bell, A., Mainen, Z., Tsodyks, M., & Sejnowski, T. (1995). Balancing of conductances may explain irregular cortical spiking (Tech. Rep. No. INC-9502). San Diego: Institute for Neural Computation, University of California at San Diego.
Burns, B. D., & Webb, A. C. (1976). The spontaneous activity of neurons in the cat's visual cortex. Proc. R. Soc. London (Biol.), 194.
Dean, A. (1981). The variability of discharge of simple cells in the cat striate cortex. Exp. Brain Res., 44.
Ermentrout, G. B. (1996a). XPPAUT1.8: The differential equations tool.
Ermentrout, G. B. (1996b). Type I membranes, phase resetting curves, and synchrony. Neural Computation, 8(5).
FitzHugh, R. (1961). Impulses and physiological states in models of nerve membrane. Biophys. J., 1.
Hodgkin, A. L. (1948). The local changes associated with repetitive action in a non-medullated axon. J. Physiol. (London), 107.
Hodgkin, A. L., & Huxley, A. F. (1952). A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. (London), 117.
Holt, G. R., Softky, W. R., Koch, C., & Douglas, R. J. (1996). A comparison of discharge variability in vitro and in vivo in cat visual cortex neurons. J. Neurophysiol., 75(5).
Hoppensteadt, F. C., & Izhikevich, E. M. (1997). Weakly connected neural networks. New York: Springer-Verlag.
Kloeden, P. E., & Platen, E. (1994). Numerical solutions to stochastic differential equations. New York: Springer-Verlag.
Kroner, L., & Kaplan, E. (1993). Response variability in retinal ganglion cells in primates. Proc. Natl. Acad. Sci. USA, 86.
Mainen, Z. F., & Sejnowski, T. J. (1995). Reliability of spike timing in neocortical neurons. Science, 268.
McCormick, D. A., Connors, B. W., Lighthall, J. W., & Prince, D. A. (1985). Comparative electrophysiology of pyramidal and sparsely spiny stellate neurons of the neocortex. J. Neurophysiol., 54.
Reich, D. S., Victor, J. D., Knight, B. W., Ozaki, T., & Kaplan, E. (1997). Response variability and timing precision of neuronal spike trains in vivo. J. Neurophysiol., 77.
Ricciardi, L. M. (1994). Diffusion models of single neurons. In F. Ventriglia (Ed.), Neural modeling and neural networks. Oxford: Pergamon Press.
Rinzel, J., & Ermentrout, G. B. (1989). Analysis of neural excitability and oscillations. In C. Koch & I. Segev (Eds.), Methods in neuronal modeling. Cambridge, MA: MIT Press.
Shadlen, M. N., & Newsome, W. T. (1994). Noise, neural codes and cortical organization. Curr. Op. Neurobiol., 4.
Smith, C. E. (1992). A heuristic approach to stochastic models of single neurons. In T. McKenna, J. Davis, & S. Zornetzer (Eds.), Single neuron computation. San Diego: Academic Press.

Softky, W., & Koch, C. (1993). The highly irregular firing of cortical cells is inconsistent with temporal integration of random EPSPs. J. Neurosci., 13(1).
Stein, R. B. (1965). A theoretical analysis of neuronal variability. Biophys. J., 5.
Traub, R. D., & Miles, R. (1991). Neuronal networks of the hippocampus. New York: Cambridge University Press.
Troyer, T. W., & Miller, K. D. (1997). Physiological gain leads to high ISI variability in a simple model of a cortical regular spiking cell. Neural Computation, 9(5).
Usher, M., Stemmler, M., Koch, C., & Olami, Z. (1994). Network amplification of local fluctuations causes high spike rate variability, fractal firing patterns and oscillatory local field potentials. Neural Computation, 6(5).
van Vreeswijk, C., & Sompolinsky, H. (1996). Irregular spiking in cortex through inhibition/excitation balance. Poster presented at the Computational Neural Systems Conference, Cambridge, MA.
Wilbur, W. J., & Rinzel, J. (1983). A theoretical basis for large coefficient of variation and bimodality in neuronal interspike interval distributions. J. Theo. Biol., 105.
Wilson, M. A., & Bower, J. M. (1989). The simulation of large-scale neural networks. In C. Koch & I. Segev (Eds.), Methods in neuronal modeling. Cambridge, MA: MIT Press.

Received May 9, 1997; accepted November 14, 1997.


More information

Exercises. Chapter 1. of τ approx that produces the most accurate estimate for this firing pattern.

Exercises. Chapter 1. of τ approx that produces the most accurate estimate for this firing pattern. 1 Exercises Chapter 1 1. Generate spike sequences with a constant firing rate r 0 using a Poisson spike generator. Then, add a refractory period to the model by allowing the firing rate r(t) to depend

More information

Dendritic computation

Dendritic computation Dendritic computation Dendrites as computational elements: Passive contributions to computation Active contributions to computation Examples Geometry matters: the isopotential cell Injecting current I

More information

How Spike Generation Mechanisms Determine the Neuronal Response to Fluctuating Inputs

How Spike Generation Mechanisms Determine the Neuronal Response to Fluctuating Inputs 628 The Journal of Neuroscience, December 7, 2003 23(37):628 640 Behavioral/Systems/Cognitive How Spike Generation Mechanisms Determine the Neuronal Response to Fluctuating Inputs Nicolas Fourcaud-Trocmé,

More information

Voltage-clamp and Hodgkin-Huxley models

Voltage-clamp and Hodgkin-Huxley models Voltage-clamp and Hodgkin-Huxley models Read: Hille, Chapters 2-5 (best) Koch, Chapters 6, 8, 9 See also Clay, J. Neurophysiol. 80:903-913 (1998) (for a recent version of the HH squid axon model) Rothman

More information

Neuronal Dynamics: Computational Neuroscience of Single Neurons

Neuronal Dynamics: Computational Neuroscience of Single Neurons Week 4 part 5: Nonlinear Integrate-and-Fire Model 4.1 From Hodgkin-Huxley to 2D Neuronal Dynamics: Computational Neuroscience of Single Neurons Week 4 Recing detail: Two-dimensional neuron models Wulfram

More information

9 Generation of Action Potential Hodgkin-Huxley Model

9 Generation of Action Potential Hodgkin-Huxley Model 9 Generation of Action Potential Hodgkin-Huxley Model (based on chapter 12, W.W. Lytton, Hodgkin-Huxley Model) 9.1 Passive and active membrane models In the previous lecture we have considered a passive

More information

Effects of Betaxolol on Hodgkin-Huxley Model of Tiger Salamander Retinal Ganglion Cell

Effects of Betaxolol on Hodgkin-Huxley Model of Tiger Salamander Retinal Ganglion Cell Effects of Betaxolol on Hodgkin-Huxley Model of Tiger Salamander Retinal Ganglion Cell 1. Abstract Matthew Dunlevie Clement Lee Indrani Mikkilineni mdunlevi@ucsd.edu cll008@ucsd.edu imikkili@ucsd.edu Isolated

More information

Math 345 Intro to Math Biology Lecture 20: Mathematical model of Neuron conduction

Math 345 Intro to Math Biology Lecture 20: Mathematical model of Neuron conduction Math 345 Intro to Math Biology Lecture 20: Mathematical model of Neuron conduction Junping Shi College of William and Mary November 8, 2018 Neuron Neurons Neurons are cells in the brain and other subsystems

More information

A FINITE STATE AUTOMATON MODEL FOR MULTI-NEURON SIMULATIONS

A FINITE STATE AUTOMATON MODEL FOR MULTI-NEURON SIMULATIONS A FINITE STATE AUTOMATON MODEL FOR MULTI-NEURON SIMULATIONS Maria Schilstra, Alistair Rust, Rod Adams and Hamid Bolouri Science and Technology Research Centre, University of Hertfordshire, UK Department

More information

arxiv: v1 [q-bio.nc] 13 Feb 2018

arxiv: v1 [q-bio.nc] 13 Feb 2018 Gain control with A-type potassium current: I A as a switch between divisive and subtractive inhibition Joshua H Goldwyn 1*, Bradley R Slabe 2, Joseph B Travers 3, David Terman 2 arxiv:182.4794v1 [q-bio.nc]

More information

Mathematical Analysis of Bursting Electrical Activity in Nerve and Endocrine Cells

Mathematical Analysis of Bursting Electrical Activity in Nerve and Endocrine Cells Mathematical Analysis of Bursting Electrical Activity in Nerve and Endocrine Cells Richard Bertram Department of Mathematics and Programs in Neuroscience and Molecular Biophysics Florida State University

More information

Interspike interval distributions of spiking neurons driven by fluctuating inputs

Interspike interval distributions of spiking neurons driven by fluctuating inputs J Neurophysiol 106: 361 373, 2011. First published April 27, 2011; doi:10.1152/jn.00830.2010. Interspike interval distributions of spiking neurons driven by fluctuating inputs Srdjan Ostojic Center for

More information

Characterizing the firing properties of an adaptive

Characterizing the firing properties of an adaptive Characterizing the firing properties of an adaptive analog VLSI neuron Daniel Ben Dayan Rubin 1,2, Elisabetta Chicca 2, and Giacomo Indiveri 2 1 Department of Bioengineering, Politecnico di Milano, P.zza

More information

Electrophysiology of the neuron

Electrophysiology of the neuron School of Mathematical Sciences G4TNS Theoretical Neuroscience Electrophysiology of the neuron Electrophysiology is the study of ionic currents and electrical activity in cells and tissues. The work of

More information

Neurophysiology of a VLSI spiking neural network: LANN21

Neurophysiology of a VLSI spiking neural network: LANN21 Neurophysiology of a VLSI spiking neural network: LANN21 Stefano Fusi INFN, Sezione Roma I Università di Roma La Sapienza Pza Aldo Moro 2, I-185, Roma fusi@jupiter.roma1.infn.it Paolo Del Giudice Physics

More information

Title. Author(s)Yanagita, T. CitationPhysical Review E, 76(5): Issue Date Doc URL. Rights. Type.

Title. Author(s)Yanagita, T. CitationPhysical Review E, 76(5): Issue Date Doc URL. Rights. Type. Title Input-output relation of FitzHugh-Nagumo elements ar Author(s)Yanagita, T. CitationPhysical Review E, 76(5): 5625--5625-3 Issue Date 27- Doc URL http://hdl.handle.net/25/32322 Rights Copyright 27

More information

IN THIS turorial paper we exploit the relationship between

IN THIS turorial paper we exploit the relationship between 508 IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 10, NO. 3, MAY 1999 Weakly Pulse-Coupled Oscillators, FM Interactions, Synchronization, Oscillatory Associative Memory Eugene M. Izhikevich Abstract We study

More information

Synaptic dynamics. John D. Murray. Synaptic currents. Simple model of the synaptic gating variable. First-order kinetics

Synaptic dynamics. John D. Murray. Synaptic currents. Simple model of the synaptic gating variable. First-order kinetics Synaptic dynamics John D. Murray A dynamical model for synaptic gating variables is presented. We use this to study the saturation of synaptic gating at high firing rate. Shunting inhibition and the voltage

More information

Voltage-clamp and Hodgkin-Huxley models

Voltage-clamp and Hodgkin-Huxley models Voltage-clamp and Hodgkin-Huxley models Read: Hille, Chapters 2-5 (best Koch, Chapters 6, 8, 9 See also Hodgkin and Huxley, J. Physiol. 117:500-544 (1952. (the source Clay, J. Neurophysiol. 80:903-913

More information

9 Generation of Action Potential Hodgkin-Huxley Model

9 Generation of Action Potential Hodgkin-Huxley Model 9 Generation of Action Potential Hodgkin-Huxley Model (based on chapter 2, W.W. Lytton, Hodgkin-Huxley Model) 9. Passive and active membrane models In the previous lecture we have considered a passive

More information

High-conductance states in a mean-eld cortical network model

High-conductance states in a mean-eld cortical network model Neurocomputing 58 60 (2004) 935 940 www.elsevier.com/locate/neucom High-conductance states in a mean-eld cortical network model Alexander Lerchner a;, Mandana Ahmadi b, John Hertz b a Oersted-DTU, Technical

More information

Phase Locking. 1 of of 10. The PRC's amplitude determines which frequencies a neuron locks to. The PRC's slope determines if locking is stable

Phase Locking. 1 of of 10. The PRC's amplitude determines which frequencies a neuron locks to. The PRC's slope determines if locking is stable Printed from the Mathematica Help Browser 1 1 of 10 Phase Locking A neuron phase-locks to a periodic input it spikes at a fixed delay [Izhikevich07]. The PRC's amplitude determines which frequencies a

More information

Methods for Estimating the Computational Power and Generalization Capability of Neural Microcircuits

Methods for Estimating the Computational Power and Generalization Capability of Neural Microcircuits Methods for Estimating the Computational Power and Generalization Capability of Neural Microcircuits Wolfgang Maass, Robert Legenstein, Nils Bertschinger Institute for Theoretical Computer Science Technische

More information

Single neuron models. L. Pezard Aix-Marseille University

Single neuron models. L. Pezard Aix-Marseille University Single neuron models L. Pezard Aix-Marseille University Biophysics Biological neuron Biophysics Ionic currents Passive properties Active properties Typology of models Compartmental models Differential

More information

This script will produce a series of pulses of amplitude 40 na, duration 1ms, recurring every 50 ms.

This script will produce a series of pulses of amplitude 40 na, duration 1ms, recurring every 50 ms. 9.16 Problem Set #4 In the final problem set you will combine the pieces of knowledge gained in the previous assignments to build a full-blown model of a plastic synapse. You will investigate the effects

More information

Computational Explorations in Cognitive Neuroscience Chapter 2

Computational Explorations in Cognitive Neuroscience Chapter 2 Computational Explorations in Cognitive Neuroscience Chapter 2 2.4 The Electrophysiology of the Neuron Some basic principles of electricity are useful for understanding the function of neurons. This is

More information

Dynamical phase transitions in periodically driven model neurons

Dynamical phase transitions in periodically driven model neurons Dynamical phase transitions in periodically driven model neurons Jan R. Engelbrecht 1 and Renato Mirollo 2 1 Department of Physics, Boston College, Chestnut Hill, Massachusetts 02467, USA 2 Department

More information

The Phase Response Curve of Reciprocally Inhibitory Model Neurons Exhibiting Anti-Phase Rhythms

The Phase Response Curve of Reciprocally Inhibitory Model Neurons Exhibiting Anti-Phase Rhythms The Phase Response Curve of Reciprocally Inhibitory Model Neurons Exhibiting Anti-Phase Rhythms Jiawei Zhang Timothy J. Lewis Department of Mathematics, University of California, Davis Davis, CA 9566,

More information

Chaos in the Hodgkin Huxley Model

Chaos in the Hodgkin Huxley Model SIAM J. APPLIED DYNAMICAL SYSTEMS Vol. 1, No. 1, pp. 105 114 c 2002 Society for Industrial and Applied Mathematics Chaos in the Hodgkin Huxley Model John Guckenheimer and Ricardo A. Oliva Abstract. The

More information

Localized activity patterns in excitatory neuronal networks

Localized activity patterns in excitatory neuronal networks Localized activity patterns in excitatory neuronal networks Jonathan Rubin Amitabha Bose February 3, 2004 Abstract. The existence of localized activity patterns, or bumps, has been investigated in a variety

More information

This document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore.

This document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore. This document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore. Title Small-signal neural models and their applications Author(s) Basu, Arindam Citation Basu, A. (01). Small-signal

More information

Effects of Noisy Drive on Rhythms in Networks of Excitatory and Inhibitory Neurons

Effects of Noisy Drive on Rhythms in Networks of Excitatory and Inhibitory Neurons Effects of Noisy Drive on Rhythms in Networks of Excitatory and Inhibitory Neurons Christoph Börgers 1 and Nancy Kopell 2 1 Department of Mathematics, Tufts University, Medford, MA 2155 2 Department of

More information

Overview Organization: Central Nervous System (CNS) Peripheral Nervous System (PNS) innervate Divisions: a. Afferent

Overview Organization: Central Nervous System (CNS) Peripheral Nervous System (PNS) innervate Divisions: a. Afferent Overview Organization: Central Nervous System (CNS) Brain and spinal cord receives and processes information. Peripheral Nervous System (PNS) Nerve cells that link CNS with organs throughout the body.

More information

Modeling Action Potentials in Cell Processes

Modeling Action Potentials in Cell Processes Modeling Action Potentials in Cell Processes Chelsi Pinkett, Jackie Chism, Kenneth Anderson, Paul Klockenkemper, Christopher Smith, Quarail Hale Tennessee State University Action Potential Models Chelsi

More information

Limitations of the Hodgkin-Huxley Formalism: Effects of Single Channel Kinetics on Transmembrane Voltage Dynamics

Limitations of the Hodgkin-Huxley Formalism: Effects of Single Channel Kinetics on Transmembrane Voltage Dynamics ARTICLE Communicated by Idan Segev Limitations of the Hodgkin-Huxley Formalism: Effects of Single Channel Kinetics on Transmembrane Voltage Dynamics Adam F. Strassberg Computation and Neural Systems Program,

More information

Two dimensional synaptically generated traveling waves in a theta-neuron neuralnetwork

Two dimensional synaptically generated traveling waves in a theta-neuron neuralnetwork Neurocomputing 38}40 (2001) 789}795 Two dimensional synaptically generated traveling waves in a theta-neuron neuralnetwork Remus Osan*, Bard Ermentrout Department of Mathematics, University of Pittsburgh,

More information

Lecture 11 : Simple Neuron Models. Dr Eileen Nugent

Lecture 11 : Simple Neuron Models. Dr Eileen Nugent Lecture 11 : Simple Neuron Models Dr Eileen Nugent Reading List Nelson, Biological Physics, Chapter 12 Phillips, PBoC, Chapter 17 Gerstner, Neuronal Dynamics: from single neurons to networks and models

More information

John Rinzel, x83308, Courant Rm 521, CNS Rm 1005 Eero Simoncelli, x83938, CNS Rm 1030

John Rinzel, x83308, Courant Rm 521, CNS Rm 1005 Eero Simoncelli, x83938, CNS Rm 1030 Computational Modeling of Neuronal Systems (Advanced Topics in Mathematical Biology: G63.2852.001, G80.3042.001) Tuesday, 9:30-11:20am, WWH Rm 1314. Prerequisites: familiarity with linear algebra, applied

More information

Activity Driven Adaptive Stochastic. Resonance. Gregor Wenning and Klaus Obermayer. Technical University of Berlin.

Activity Driven Adaptive Stochastic. Resonance. Gregor Wenning and Klaus Obermayer. Technical University of Berlin. Activity Driven Adaptive Stochastic Resonance Gregor Wenning and Klaus Obermayer Department of Electrical Engineering and Computer Science Technical University of Berlin Franklinstr. 8/9, 187 Berlin fgrewe,obyg@cs.tu-berlin.de

More information

Modeling neural oscillations

Modeling neural oscillations Physiology & Behavior 77 (2002) 629 633 Modeling neural oscillations G. Bard Ermentrout*, Carson C. Chow Department of Mathematics, University of Pittsburgh, Pittsburgh, PA 15260, USA Received 30 July

More information

Topics in Neurophysics

Topics in Neurophysics Topics in Neurophysics Alex Loebel, Martin Stemmler and Anderas Herz Exercise 2 Solution (1) The Hodgkin Huxley Model The goal of this exercise is to simulate the action potential according to the model

More information

Spike Frequency Adaptation Affects the Synchronization Properties of Networks of Cortical Oscillators

Spike Frequency Adaptation Affects the Synchronization Properties of Networks of Cortical Oscillators LETTER Communicated by John Rinzel Spike Frequency Adaptation Affects the Synchronization Properties of Networks of Cortical Oscillators Sharon M. Crook Center for Computational Biology, Montana State

More information

Evaluating Bistability in a Mathematical Model of Circadian Pacemaker Neurons

Evaluating Bistability in a Mathematical Model of Circadian Pacemaker Neurons International Journal of Theoretical and Mathematical Physics 2016, 6(3): 99-103 DOI: 10.5923/j.ijtmp.20160603.02 Evaluating Bistability in a Mathematical Model of Circadian Pacemaker Neurons Takaaki Shirahata

More information

Propagation& Integration: Passive electrical properties

Propagation& Integration: Passive electrical properties Fundamentals of Neuroscience (NSCS 730, Spring 2010) Instructor: Art Riegel; email: Riegel@musc.edu; Room EL 113; time: 9 11 am Office: 416C BSB (792.5444) Propagation& Integration: Passive electrical

More information

NIH Public Access Author Manuscript Neural Comput. Author manuscript; available in PMC 2010 October 13.

NIH Public Access Author Manuscript Neural Comput. Author manuscript; available in PMC 2010 October 13. NIH Public Access Author Manuscript Published in final edited form as: Neural Comput. 2009 March ; 21(3): 704 718. doi:10.1162/neco.2008.12-07-680. A Generalized Linear Integrate-and-Fire Neural Model

More information

Peripheral Nerve II. Amelyn Ramos Rafael, MD. Anatomical considerations

Peripheral Nerve II. Amelyn Ramos Rafael, MD. Anatomical considerations Peripheral Nerve II Amelyn Ramos Rafael, MD Anatomical considerations 1 Physiologic properties of the nerve Irritability of the nerve A stimulus applied on the nerve causes the production of a nerve impulse,

More information

Synchronization in two neurons: Results for a two-component dynamical model with time-delayed inhibition

Synchronization in two neurons: Results for a two-component dynamical model with time-delayed inhibition Physica D 4 (998 47 7 Synchronization in two neurons: Results for a two-component dynamical model with time-delayed inhibition Gilles Renversez Centre de Physique Théorique, UPR 76 du Centre National de

More information

A Generalized Linear Integrate-And-Fire Neural. Model Produces Diverse Spiking Behaviors

A Generalized Linear Integrate-And-Fire Neural. Model Produces Diverse Spiking Behaviors A Generalized Linear Integrate-And-Fire Neural Model Produces Diverse Spiking Behaviors Ştefan Mihalaş and Ernst Niebur Zanvyl Krieger Mind/Brain Institute and Department of Neuroscience Johns Hopkins

More information

Phase Response Properties and Phase-Locking in Neural Systems with Delayed Negative-Feedback. Carter L. Johnson

Phase Response Properties and Phase-Locking in Neural Systems with Delayed Negative-Feedback. Carter L. Johnson Phase Response Properties and Phase-Locking in Neural Systems with Delayed Negative-Feedback Carter L. Johnson Faculty Mentor: Professor Timothy J. Lewis University of California, Davis Abstract Oscillatory

More information

Action Potentials and Synaptic Transmission Physics 171/271

Action Potentials and Synaptic Transmission Physics 171/271 Action Potentials and Synaptic Transmission Physics 171/271 Flavio Fröhlich (flavio@salk.edu) September 27, 2006 In this section, we consider two important aspects concerning the communication between

More information

LIMIT CYCLE OSCILLATORS

LIMIT CYCLE OSCILLATORS MCB 137 EXCITABLE & OSCILLATORY SYSTEMS WINTER 2008 LIMIT CYCLE OSCILLATORS The Fitzhugh-Nagumo Equations The best example of an excitable phenomenon is the firing of a nerve: according to the Hodgkin

More information

Neurons, Synapses, and Signaling

Neurons, Synapses, and Signaling Chapter 48 Neurons, Synapses, and Signaling PowerPoint Lecture Presentations for Biology Eighth Edition Neil Campbell and Jane Reece Lectures by Chris Romero, updated by Erin Barley with contributions

More information

Rhythms in the gamma range (30 80 Hz) and the beta range

Rhythms in the gamma range (30 80 Hz) and the beta range Gamma rhythms and beta rhythms have different synchronization properties N. Kopell, G. B. Ermentrout, M. A. Whittington, and R. D. Traub Department of Mathematics and Center for BioDynamics, Boston University,

More information