High-conductance states in a mean-field cortical network model


Neurocomputing 58-60 (2004) 935-940 www.elsevier.com/locate/neucom

High-conductance states in a mean-field cortical network model

Alexander Lerchner a,*, Mandana Ahmadi b, John Hertz b
a Oersted-DTU, Technical University of Denmark, Oersted Plads Bldg 348, Lyngby 2800, Denmark
b Nordita, Blegdamsvej 17, Copenhagen 2100, Denmark

Abstract

Measured responses from visual cortical neurons show that spike times tend to be correlated rather than exactly Poisson distributed. Fano factors vary and are usually greater than 1, indicating a tendency toward spikes being clustered. We show that this behavior emerges naturally in a balanced cortical network model with random connectivity and conductance-based synapses. We employ mean-field theory with correctly colored noise to describe temporal correlations in the neuronal activity. Our results illuminate the connection between two independent experimental findings: high-conductance states of cortical neurons in their natural environment, and variable non-Poissonian spike statistics with Fano factors greater than 1.
© 2004 Elsevier B.V. All rights reserved.

Keywords: Synaptic conductances; Response variability; Cortical dynamics

1. Introduction

Neurons in primary visual cortex show a large increase in input conductance during visual activation: in vivo recordings (see, e.g., [2]) show that the conductance can rise to more than three times that of the resting state. Such high-conductance states lead to faster neuronal dynamics than would be expected from the value of the passive membrane time constant, as pointed out by Shelley et al. [9]. Here we use mean-field theory to study the firing statistics of a model network with balanced excitation and inhibition, and we consistently observe such high-conductance states during stimulation.

* Corresponding author. Tel.: +45-2680-1971.
E-mail addresses: al@oersted.dtu.dk (A. Lerchner), ahmadi@nordita.dk (M. Ahmadi), hertz@nordita.dk (J. Hertz).

0925-2312/$ - see front matter © 2004 Elsevier B.V. All rights reserved. doi:10.1016/j.neucom.2004.01.149

In our study, we characterize the irregularity of firing by the Fano factor F, defined as the ratio of the variance of the spike count to its mean. For temporally uncorrelated spike trains (i.e., Poisson processes) F = 1, while F > 1 indicates a tendency for spike clustering (bursts), and F < 1 indicates more regular firing with well-separated spikes. Observed Fano factors for spike trains of primary cortical neurons during stimulation are usually greater than 1 and vary over an entire order of magnitude (see, e.g., [6]). We find the same dynamics in our model and are able to pinpoint some relevant mechanisms: synaptic filtering leads to spike clustering in states of high conductance (thus F > 1), and Fano factors depend sensitively on variations in both threshold and synaptic time constants.

2. The model

We investigate a cortical network model that exhibits self-consistently balanced excitation and inhibition. The model consists of two populations of neurons, an excitatory and an inhibitory one, with dilute random connectivity. The model neurons are governed by leaky integrate-and-fire subthreshold dynamics with conductance-based synapses. The membrane potential of neuron i in population a (a = 1, 2 for excitatory and inhibitory, respectively) obeys

\frac{du_a^i(t)}{dt} = -g_L u_a^i(t) - \sum_{b=0}^{2} \sum_{j=1}^{N_b} g_{ab}^{ij}(t)\,\big(u_a^i(t) - V_b\big). \qquad (1)

The first sum runs over all populations b, including the excitatory input population representing input from the LGN and indexed by 0. The second sum runs over all neurons j in population b of size N_b. The reversal potential V_b for the excitatory inputs (b = 0, 1) is higher than the firing threshold, and the one for the inhibitory inputs (V_2) is below the reset value. The constant leakage conductance g_L is the inverse of the membrane time constant τ_m.
The time-dependent conductance g_{ab}^{ij}(t) from neuron j in population b to neuron i in population a is taken as

g_{ab}^{ij}(t) = \frac{g_{ab}^0}{\sqrt{K_b}} \sum_s \exp\!\big(-(t - t_s^j)/\tau_b\big)\,\Theta(t - t_s^j) \qquad (2)

if there is a connection between those two neurons, and zero otherwise. The sum runs over all spikes s emitted by neuron j, τ_b is the synaptic time constant for the synapse of type b (excitatory or inhibitory), and Θ is the Heaviside step function. K_b denotes the average number of presynaptic neurons in population b. We followed van Vreeswijk and Sompolinsky [10] in scaling the conductances like 1/√K_b so that the fluctuations in the total conductance are of order 1, independent of network size and connectivity.
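The subthreshold dynamics (1) with exponentially filtered synaptic conductances (2) can be integrated with a simple Euler scheme. The sketch below simulates a single conductance-based integrate-and-fire neuron driven by Poisson input spike trains; the input rates, conductance jumps, and synaptic time constant are illustrative placeholders, not the network's self-consistent values.

```python
import numpy as np

def simulate_lif_conductance(T=1.0, dt=5e-4, seed=0):
    """Euler integration of a single conductance-based LIF neuron, cf. Eqs. (1)-(2).

    Two summed input streams: excitatory (reversal above threshold) and
    inhibitory (reversal below reset). Rates and weights are illustrative.
    """
    rng = np.random.default_rng(seed)
    g_L = 1.0 / 0.010                  # leak conductance = 1 / tau_m (tau_m = 10 ms)
    theta, reset = 1.0, 0.9            # firing threshold and reset value
    V_e, V_i = 14.0 / 3.0, -2.0 / 3.0  # reversal potentials as in the paper
    tau_s = 0.003                      # synaptic time constant (3 ms, assumed)
    w_e, w_i = 10.0, 12.0              # conductance jump per input spike (assumed)
    rate_e, rate_i = 2000.0, 1000.0    # summed presynaptic rates in Hz (assumed)

    u, g_e, g_i = 0.0, 0.0, 0.0
    spikes = []
    for k in range(int(T / dt)):
        # Exponentially decaying conductances driven by Poisson spike counts, cf. Eq. (2)
        g_e += -dt * g_e / tau_s + w_e * rng.poisson(rate_e * dt)
        g_i += -dt * g_i / tau_s + w_i * rng.poisson(rate_i * dt)
        # Membrane dynamics, cf. Eq. (1)
        u += dt * (-g_L * u - g_e * (u - V_e) - g_i * (u - V_i))
        if u >= theta:                 # threshold crossing: record spike and reset
            spikes.append(k * dt)
            u = reset
    return np.array(spikes)

spikes = simulate_lif_conductance()
```

With these placeholder numbers the mean synaptic conductance roughly doubles the leak conductance, so the neuron sits in a moderately high-conductance state and fires irregularly.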

3. Mean-field theory

We use mean-field theory to reduce the full network problem to two neurons: one for each population, each receiving a Gaussian noisy input, the mean and covariance of which depend self-consistently on the firing statistics. This method is exact in the limit of large populations with homogeneous connection probabilities [5]. Here we consider stationary firing only, for simplicity. Then the time-dependent conductance described in (2) can be replaced by a realization of a Gaussian distributed random variable g_{ab} with mean

\bar{g}_{ab} = g_{ab}^0 \sqrt{K_b}\, r_b \qquad (3)

and covariance

\langle \delta g_{ab}(t)\, \delta g_{ab}(t') \rangle = (g_{ab}^0)^2 (1 - K_b/N_b)\, C_b(t - t'). \qquad (4)

Here r_b is the firing rate of the presynaptic neuron b and C_b(t - t') is the autocorrelation function of its synaptically filtered spike trains. A simple approximation for the autocorrelation, analogous to that used by Amit and Brunel [1] and Brunel [3], is to take the firing to be temporally uncorrelated. The term (1 - K_b/N_b) is a correction for the finite connection concentration K_b/N_b and can be derived using the methods of [8]. The self-consistent balance condition is obtained by setting the net current in (1) to zero when the membrane potential is at threshold θ_a and the conductances have their mean values (3). In the large-K_b limit, it reads

\sum_{b=0}^{2} g_{ab}^0 \sqrt{K_b}\, r_b\, (\theta_a - V_b) = 0. \qquad (5)

The distribution of the variables g_{ab} can be calculated numerically using an iterative approach [4]. One starts with a guess based on the balance equation (5) for the means and the white-noise approximation for the covariances. One then generates a large sample of specific realizations of g_{ab}(t), which are used to integrate (1) and produce a large sample of spike trains. The latter can then be used to calculate new estimates of the means and covariances by applying (3) and (4) and correcting the initial guesses, moving them toward the new values. These steps are repeated until convergence.

4. Results

For the above-described model, we chose parameters corresponding to population sizes of 16,000 excitatory neurons and 4000 inhibitory neurons, representing a small patch of layer IV cat visual cortex. The neurons were assumed to be connected randomly, with 10% connection probability between any two neurons. The firing threshold was fixed to 1, excitatory and inhibitory reversal potentials were set to +14/3 and -2/3, respectively, and the membrane time constant τ_m = g_L^{-1} was taken to be 10 ms. For the results presented here, the integration time step was 0.5 ms. Fig. 1 illustrates the importance of coloring the noise produced by intra-cortical activity. The white noise approximation underestimates both the correlation times and

the strength of the correlations in the neuron's firing: its autocorrelation (dotted line) is both narrower and weaker than the one for colored noise (solid line). Fano factors vary systematically with both the distance between reset and threshold and the synaptic time constant τ_s. Non-zero synaptic time constants consistently produced Fano factors greater than 1. Varying the reset between 0.8 and 0.94 and τ_s between 2 and 6 ms resulted in values for F spanning an entire order of magnitude, from slightly above 1 to about 10 (see Fig. 2).

Fig. 1. Autocorrelation functions for white noise (dotted line) and colored noise (solid line). The white noise approximation underestimates the amount of temporal correlation in the neuronal firing.

Fig. 2. Fano factors for a range of reset values and synaptic time constants τ_s. Longer synaptic time constants lead to increased clustering (bursts) of spikes, which is reflected in higher Fano factors.
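The Fano factor used throughout the Results is straightforward to estimate from simulated spike trains by counting spikes in fixed windows. The sketch below illustrates the definition on two synthetic trains (a Poisson train and an artificially bursty one, both invented for illustration rather than taken from the model):

```python
import numpy as np

def fano_factor(spike_times, t_max, window):
    """Fano factor F = Var(N) / Mean(N) of spike counts in fixed windows."""
    edges = np.arange(0.0, t_max + window, window)
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts.var() / counts.mean()

rng = np.random.default_rng(1)
t_max, rate = 100.0, 20.0   # seconds, Hz (illustrative)

# Poisson spike train: F should be close to 1
poisson_spikes = np.sort(rng.uniform(0.0, t_max, rng.poisson(rate * t_max)))

# Clustered ("bursty") train: 4 spikes around each Poisson burst center -> F > 1
burst_centers = rng.uniform(0.0, t_max, rng.poisson(rate / 4 * t_max))
bursty_spikes = np.sort(np.concatenate(
    [c + rng.exponential(0.005, 4) for c in burst_centers]))

F_poisson = fano_factor(poisson_spikes, t_max, window=1.0)
F_bursty = fano_factor(bursty_spikes, t_max, window=1.0)
```

Both trains have the same mean rate, but the clustered one has a markedly larger count variance, hence the larger Fano factor, mirroring the bursting mechanism described above.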

5. Discussion

In all our simulations, we observed that the membrane potential changed on a considerably faster time scale than the membrane time constant τ_m = 10 ms. This behavior is observed only if conductance-based synapses are included in the integrate-and-fire neuron model. To understand this phenomenon, it is convenient to follow Shelley et al. [9] and rewrite the equation for the membrane potential dynamics (1) in the following form:

\frac{du_a(t)}{dt} = -g_T(t)\,\big(u_a(t) - V_S(t)\big), \qquad (6)

with the total conductance g_T(t) = g_L + \sum_b g_{ab}(t) and the effective reversal potential V_S(t) = g_T(t)^{-1} \sum_b g_{ab}(t) V_b. The membrane potential u_a(t) follows the effective reversal potential with the input-dependent effective membrane time constant g_T(t)^{-1}. The effective reversal potential changes on the time scale of the synaptic time constants, which are up to five times shorter than τ_m in our simulations. Moreover, if the effective membrane time constant is shorter than the synaptic time constant (due to a large enough total conductance), then the neuron will fire repeatedly during intervals when V_S(t) stays above threshold, as observed in our simulations. In high-conductance states, the firing statistics are strongly influenced by synaptic dynamics (see Fig. 2). This is in contrast with strictly current-based models, where the neuron reacts too slowly to reflect fast synaptic dynamics in its firing. Here, the synaptic filtering of arriving spikes leads to temporal correlations in V_S(t) and thus to temporal correlations (in the form of spike clustering) in firing. These correlations, in turn, lead to further correlations in the input, producing still more strongly correlated firing. For this reason, in mean-field models with conductance-based dynamics, coloring the noise is especially important in order to describe accurately the temporal correlation in firing statistics (see Fig. 1).
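A quick back-of-the-envelope calculation based on Eq. (6), with assumed conductance values, shows how a threefold increase of the total conductance (as reported in vivo [2]) shortens the effective membrane time constant threefold and can push the effective reversal potential above threshold:

```python
# Illustrative numbers only; conductances in 1/s, so tau = 1/g is in seconds.
tau_m = 0.010                          # passive membrane time constant: 10 ms
g_L = 1.0 / tau_m                      # leak conductance, cf. Eq. (1)
g_e, g_i = 150.0, 50.0                 # mean synaptic conductances (assumed)
V_e, V_i = 14.0 / 3.0, -2.0 / 3.0      # reversal potentials from Section 4

g_T = g_L + g_e + g_i                  # total conductance g_T, Eq. (6)
tau_eff = 1.0 / g_T                    # effective membrane time constant
V_S = (g_e * V_e + g_i * V_i) / g_T    # effective reversal potential
```

Here g_T is three times g_L, so tau_eff is about 3.3 ms, shorter than the 2-6 ms synaptic time constants considered above, and V_S sits above the firing threshold of 1, the regime in which repeated firing (spike clustering) occurs.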
We confirmed these considerations by running simulations without synaptic filtering (τ_s = 0). As expected, intra-cortical activity became uncorrelated, and the white noise approximation produced the same result as coloring the noise correctly. In that case, Fano factors stayed close to 1 (see Fig. 2), i.e., no tendency toward spike clustering was observed. Previous investigations showed that varying the distance between threshold and reset in balanced integrate-and-fire networks has a strong effect on the irregularity of the firing [7]. By including a conductance-based description of synapses, we were now able to show a strong effect of synaptic time constants on firing statistics, even if they are several times smaller than the passive membrane time constant: synaptic filtering facilitates clustering of spikes in states of high conductance.

References

[1] D.J. Amit, N. Brunel, Model of global spontaneous activity and local structured delay activity during delay periods in the cerebral cortex, Cereb. Cortex 7 (1997) 237-252.
[2] L.J. Borg-Graham, C. Monier, Y. Fregnac, Visual input evokes transient and strong shunting inhibition in visual cortical neurons, Nature 393 (1998) 369-374.

[3] N. Brunel, Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons, J. Comput. Neurosci. 8 (2000) 183-208.
[4] H. Eisfeller, M. Opper, New method for studying the dynamics of disordered spin systems without finite-size effects, Phys. Rev. Lett. 68 (1992) 2094-2097.
[5] C. Fulvi Mari, Random networks of spiking neurons: instability in the Xenopus tadpole moto-neural pattern, Phys. Rev. Lett. 85 (2000) 210-213.
[6] E.D. Gershon, M.C. Wiener, P.E. Latham, B.J. Richmond, Coding strategies in monkey V1 and inferior temporal cortices, J. Neurophysiol. 79 (1998) 1135-1144.
[7] J. Hertz, B.J. Richmond, K. Nilsen, Anomalous response variability in a balanced cortical network model, Neurocomputing 52-54 (2003) 787-792.
[8] R. Kree, A. Zippelius, Continuous-time dynamics of asymmetrically diluted neural networks, Phys. Rev. A 36 (1987) 4421-4427.
[9] M. Shelley, D. McLaughlin, R. Shapley, J. Wielaard, States of high conductance in a large-scale model of the visual cortex, J. Comput. Neurosci. 13 (2002) 93-109.
[10] C. van Vreeswijk, H. Sompolinsky, Chaotic balanced state in a model of cortical circuits, Neural Comput. 10 (1998) 1321-1371.