
Delay Adaptation in the Nervous System

Christian W. Eurich (a), Klaus Pawelzik (a), Udo Ernst (b), Andreas Thiel (a), Jack D. Cowan (c) and John G. Milton (d)

(a) Institut für Theoretische Physik, Universität Bremen, D-28334 Bremen
(b) Max-Planck-Institut für Strömungsforschung, D-37073 Göttingen
(c) Departments of Mathematics and Neurology, The University of Chicago, Chicago IL 60637
(d) Department of Neurology and Committee on Neurobiology, The University of Chicago, Chicago IL 60637

To be published in Neurocomputing, 20 December 1999.

Abstract

Time delays are ubiquitous in the nervous system. Empirical findings suggest that time delays are adapted so as to synchronize the activity of neurons. We introduce a framework for studying the dynamics of self-organized delay adaptation in systems which optimize the coincidence of their inputs. The framework comprises two families of delay adaptation mechanisms: delay shift and delay selection. For the important case of periodically modulated input we derive conditions for the existence and stability of solutions, which constrain learning rules for reliable delay adaptation. Delay adaptation is also applicable in the case of several spatio-temporal neuronal input patterns.

Key words: delay adaptation; Hebb rule; learning; spiking neurons.

1 Introduction

Interactions in the nervous system are associated with time delays. Postsynaptic potentials have a finite rise time, delays arise from signal integration in the dendritic tree, and there is a considerable conduction time for action potentials running down an axon. Precise neuronal signal integration for the purpose of target localization requires an adaptation of such time delays. Examples include the auditory system of barn owls, echolocation in bats, and the lateral line system of weakly electric fish [8,1]. Time delays and their putative adaptation must also be considered for synchronization phenomena associated with the binding of sensory information in the neocortex [2]. A number of observations suggest that time delays in the nervous system are adaptive: time delays in the optic nerve are equalized [10], signals in visual callosal axons arrive simultaneously at all axonal endings [7], internodal distances in the barn owl auditory system are short, resulting in slow signal conduction [1], and neurons in vitro can inhibit the formation of a myelin sheath by firing at a low frequency [11].

Two mechanisms have been proposed for the self-organized adaptation of transmission delays in the nervous system. One mechanism ("delay shift") assumes that the transmission delays themselves are altered [6,4]. This mechanism is possible because transmission velocities in the nervous system can be altered, for example, by changing the length and thickness of dendrites and axons, the extent of myelination of axons, or the density and type of ion channels. The second mechanism ("delay selection") supposes that a broad range of delay lines is present in the beginning, from which appropriate subsets become selected during development [5]. Here we introduce a novel framework which describes the dynamics of self-organized delay adaptation in the form of integro-differential equations and permits the mechanisms of delay adaptation to be explored in a precise manner.

2 Model

Consider a neural network consisting of a large number of presynaptic neurons and one postsynaptic neuron which receives its input via delay lines $\tau_i$ (Fig. 1a). The input $I$ of the postsynaptic neuron at time $t$ reads
$$I(t) = \sum_{i,k} \omega_i\,\delta\bigl(t - (kT + \tau_i)\bigr) * E(t),$$
where $\omega_i$ denotes the efficacy of synapse $i$, $\delta$ is the Dirac delta distribution, and $*$ denotes convolution with an excitatory postsynaptic potential (EPSP), $E(t)$. For our analysis, we assume that the presynaptic neurons fire synchronously at times $kT$ ($k \in \mathbb{Z}$); however, our results apply to more general input patterns as well [4]. The postsynaptic neuron fires depending on this input and its internal dynamics. Delay adaptation corresponds to changing the delays $\tau_i$ (delay shift) and the weights $\omega_i$ (delay selection) such that the postsynaptic potentials align at the soma. We use a Hebbian learning rule depending on correlations between pre- and postsynaptic activity within a certain time window, $x$. Delays are shifted according to $\Delta\tau_i \propto W_\tau(x)$, and they are selected according to $\Delta\omega_i \propto W_\omega(x)$, where $W_\tau$ and $W_\omega$ denote learning functions for delay shift and delay selection, respectively (Fig. 1b).
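To make the discrete input model concrete, the following is a minimal sketch that builds $I(t)$ on a time grid; the number of delay lines, the period, and the alpha-function EPSP kernel are illustrative assumptions, not choices made in the paper.

```python
import numpy as np

# Minimal sketch of the input model: I(t) is a sum over synapses i and input
# volleys k of w_i * delta(t - (k*T + tau_i)), convolved with an EPSP E(t).
# N, T, dt and the alpha-function EPSP are illustrative assumptions.

N, T, dt = 50, 1.0, 0.001
t = np.arange(0.0, 5.0, dt)

rng = np.random.default_rng(0)
tau = rng.uniform(0.0, T, N)        # transmission delays tau_i
w = np.ones(N)                      # synaptic efficacies omega_i

# alpha-function EPSP, one common modeling choice for E(t)
t_E = np.arange(0.0, 0.2, dt)
E = (t_E / 0.01) * np.exp(1.0 - t_E / 0.01)

# unit-area delta spikes at the arrival times k*T + tau_i
spikes = np.zeros_like(t)
for i in range(N):
    arrival = tau[i]                # presynaptic volleys at k*T, k = 0, 1, ...
    while arrival < t[-1]:
        spikes[int(round(arrival / dt))] += w[i] / dt
        arrival += T

# total postsynaptic input I(t) as the convolution with E(t)
I = np.convolve(spikes, E)[: len(t)] * dt
```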

Fig. 1. (a) Overview of the neural network: $N$ presynaptic neurons project to the postsynaptic neuron via delay lines. (b) Schematic examples of the window functions $W_\omega$ and $W_\tau$ corresponding to delay selection (top) and delay shift (bottom), respectively. In case of a finite rise time of the postsynaptic potential $E$, both functions have to be shifted slightly to the left on the abscissa [5].

We now introduce a continuous formalism for the adaptation dynamics in order to derive conditions on the learning functions. $I(t)$ is replaced by an input density, $J(\tau; t) = \omega(\tau; t)\,\rho(\tau; t)$, where $\rho(\tau; t)$ denotes the density and $\omega(\tau; t)$ the average weight of connections with delay $\tau$. The dynamics of the input are governed by two simultaneous equations: a balance equation for the input density,
$$\frac{\partial}{\partial t} J(\tau; t) = -\frac{\partial}{\partial \tau}\bigl(J(\tau; t)\,v(\tau; t)\bigr) + Q(\tau; t), \qquad (1)$$
and a continuity equation for $\rho(\tau; t)$ expressing the conservation of the number of neural connections,
$$\frac{\partial}{\partial t} \rho(\tau; t) = -\frac{\partial}{\partial \tau}\bigl(\rho(\tau; t)\,v(\tau; t)\bigr). \qquad (2)$$
The drift velocity $v(\tau; t)$ and the source term $Q(\tau; t)$ are defined according to Hebbian principles. While in general $\tau$ and $\omega$ will be modified simultaneously, the two limiting cases of delay shift and delay selection serve to elucidate the basic mechanisms. To keep the formalism simple, we assume $E(t) = \delta(t)$ in the analytical calculations.

3 Delay shift

In this case, the weights are not modified and the source term $Q(\tau; t)$ on the right-hand side of (1) vanishes. The dynamics are governed by (2), where the drift velocity $v = d\tau/dt$ of the delays realizes the Hebbian adaptation,
$$v(\tau; t) := \epsilon_\tau \int_{-\infty}^{\infty} W_\tau(\tau - \tau')\,P(\tau'; t)\,d\tau', \qquad (3)$$
and $\epsilon_\tau$ denotes the learning rate. For delays where $\rho(\tau; 0) \neq 0$ we assume $\omega(\tau; 0) = 1$ without loss of generality, and (1) and (2) imply that $\omega(\tau; t) = 1$ for all $t$ if $\rho(\tau; t) \neq 0$. For the distribution of spike times we assume a linear neural response, $P(\tau; t) = J(\tau; t)$. It has been shown that adding a small amount of noise to the input approximately linearizes the neural behavior [12]. Therefore, our approximation is valid if the input is sufficiently strong and there is some weak random background activity. Linear neural behavior may also occur even without background noise.

Equation (2) has two equilibrium solutions. The first is the homogeneous solution $\rho(\tau; t) \equiv \rho_0$, around which a linear stability analysis yields eigenvalues $\lambda_n$ with $\mathrm{Re}(\lambda_n) = (2\pi)^{3/2}\,\epsilon_\tau\,\rho_0\,n\,\mathrm{Im}\bigl(\tilde{W}_\tau(-2\pi n/T)\bigr)/T$, where $\tilde{W}_\tau$ is the Fourier transform of the window function $W_\tau$. For an antisymmetric window function like the one in Fig. 1b, at least one of the $\mathrm{Re}(\lambda_n)$ exceeds zero, and the solution is unstable. The second equilibrium solution is given by $\rho(\tau; t) = \delta(\tau - \tau_0)$, provided that $\sum_{n=-\infty}^{\infty} W_\tau(nT) = 0$, which is the case for antisymmetric window functions. These solutions form a one-dimensional manifold described by a parameter $\tau_0 \in [0, T]$, a delay offset common to all input neurons. The Lyapunov functional $L[\rho] = \int \rho(\tau; t)\,\bigl(\tau - \int \rho(\tau'; t)\,\tau'\,d\tau'\bigr)^2\,d\tau$ yields the result that the equilibrium solutions are marginally stable in the $\tau_0$ direction and stable in all other directions, provided that $W_\tau(x) > 0$ for $x < 0$ and $W_\tau(x) < 0$ for $x > 0$. For a numerical example, see Fig. 2. The above results also hold for the more general case of nonperiodic and unreliable input patterns which are superimposed on background activity [4].

Fig. 2. Numerical iteration of (2). (a) Learning function $W_\tau(t)$ (solid line), shifted on the abscissa according to an EPSP function $E(t)$ (dashed line). (b) Initial delay distribution $\rho(\tau; 0) = 1 + \xi(\tau)$, where $\xi(\tau)$ is Gaussian white noise with zero mean and a variance of 0.1. (c)-(d) $\rho(\tau; t)$ for $t = 11.0$ and $14.0$, respectively. A single peak emerges and evolves into a delta peak which corresponds to the equilibrium solution. $T = 1.0$, $\epsilon_\tau = 0.1$, $c = 0.2$.
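For readers who wish to reproduce the qualitative behavior of Fig. 2, here is a minimal sketch (not the authors' code) of a conservative upwind iteration of (2) with the drift velocity (3), using the linear response $P = J = \rho$; the grid resolution, learning rate, and the particular antisymmetric window function are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: iterate the continuity equation (2) with the Hebbian drift
# velocity (3), assuming the linear response P = J = rho (all weights 1).
# Grid sizes, the learning rate, and the antisymmetric window W_tau are
# illustrative assumptions, not the values used in the paper.

M = 200                        # grid points on the periodic delay axis [0, T)
T = 1.0                        # period of the presynaptic input
dtau = T / M
tau = np.arange(M) * dtau
dt = 1e-3                      # integration time step
eps_tau = 1.0                  # learning rate epsilon_tau

def W_tau(x):
    """Antisymmetric window: W > 0 for x < 0, W < 0 for x > 0."""
    x = (x + T / 2) % T - T / 2            # wrap differences to [-T/2, T/2)
    return -x * np.exp(-(x / 0.2) ** 2)

rng = np.random.default_rng(0)
rho = 1.0 + 0.1 * rng.standard_normal(M)   # rho(tau, 0) = 1 + noise, cf. Fig. 2b
rho = np.clip(rho, 0.0, None)

Wmat = W_tau(tau[:, None] - tau[None, :])  # W_tau(tau - tau') on the grid

for _ in range(30000):
    v = eps_tau * (Wmat @ rho) * dtau      # drift velocity, Eq. (3)
    v_face = 0.5 * (v + np.roll(v, -1))    # velocity at the cell faces
    # first-order upwind flux, periodic boundaries; conserves sum(rho)*dtau
    flux = np.where(v_face > 0.0, rho, np.roll(rho, -1)) * v_face
    rho = rho - dt / dtau * (flux - np.roll(flux, 1))

# rho concentrates into a single narrow peak, cf. Fig. 2(c)-(d)
print("peak:", rho.max(), "mass:", rho.sum() * dtau)
```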

4 Delay selection

For pure delay selection, the drift velocity of the delays, $v(\tau; t)$, vanishes, and the total input of the postsynaptic neuron is not conserved. Equations (1) and (2) result in
$$\rho(\tau; t)\,\frac{\partial \omega(\tau; t)}{\partial t} = Q(\tau; t). \qquad (4)$$
From a straightforward generalization of the Hebb rule we obtain the source density
$$Q(\tau; t) = \epsilon_\omega\,\omega(\tau; t)\,\rho(\tau; t) \int_{-\infty}^{\infty} W_\omega(\tau - \tau')\,P(\tau'; t)\,d\tau', \qquad (5)$$
with $\epsilon_\omega$ denoting the corresponding learning rate. Without loss of generality we assume $\rho(\tau; 0) \equiv 1$, which implies $\rho(\tau; t) \equiv 1$ for arbitrary $t$ because $v(\tau; t) \equiv 0$. Equation (4) has an equilibrium solution $\omega(\tau; t) \equiv \omega_0$, provided that $\int_{-\infty}^{\infty} W_\omega(x)\,dx = 0$. The real parts of the eigenvalues are given by $\mathrm{Re}(\lambda_n) = \sqrt{2\pi}\,\epsilon_\omega\,\omega_0\,\tilde{W}_\omega(-2\pi n/T)$, where $\tilde{W}_\omega$ is the Fourier transform of the window function $W_\omega$. For a symmetric window function like the one shown in Fig. 1b, the homogeneous solution is unstable. In contrast to the delay shift case, there is no stable solution: weight distributions $\omega(\tau; t) = A(t)\,\delta(\tau - \tau_0)$ retain their shape but explode in size, i.e., $A(t)$ diverges in finite time.
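A minimal sketch of this divergence, iterating (4)-(5) with $\rho \equiv 1$ and $P = J = \omega$, follows; the symmetric zero-integral window and all parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch of pure delay selection, Eqs. (4)-(5), with rho == 1, so that
# P = J = omega. The symmetric zero-integral window W_omega and all parameters
# are illustrative assumptions.

M, T = 200, 1.0
dtau = T / M
tau = np.arange(M) * dtau
dt = 1e-3
eps_w = 5.0                               # learning rate epsilon_omega

def W_omega(x):
    """Symmetric window whose integral vanishes (Mexican-hat-like)."""
    x = (x + T / 2) % T - T / 2
    g = np.exp(-(x / 0.1) ** 2)
    return g - g.mean()                   # subtract the mean => zero integral

Wmat = W_omega(tau[:, None] - tau[None, :])

rng = np.random.default_rng(1)
w = 1.0 + 0.05 * rng.standard_normal(M)   # omega(tau, 0) near omega_0 = 1

for step in range(20000):
    dw = eps_w * w * (Wmat @ w) * dtau    # d(omega)/dt = Q / rho, Eqs. (4)-(5)
    w = w + dt * dw
    if w.max() > 1e6:                     # finite-time divergence of A(t)
        print(f"A(t) blows up by t = {step * dt:.2f}")
        break
```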

This situation commonly arises in networks with Hebbian learning of synaptic weights [5].

5 Discussion

Due to the delay adaptation mechanism, the postsynaptic neuron becomes sensitive to a certain spatio-temporal input pattern. The idea of ensemble coding in the nervous system comprises the notion that neurons are involved in multiple tasks. For temporal coding this requires that they be sensitive to more than one spatio-temporal pattern. In a numerical study we now demonstrate that the delay shift learning rule is capable of adjusting the delays for two input patterns presented to the same postsynaptic neuron. Two populations of presynaptic model neurons generate two different temporal patterns of action potentials: the first population spikes simultaneously, while spikes are generated successively in the second population. The patterns are presented in random order and are disturbed by additional random spikes (Fig. 3a). The action potentials are received by a single postsynaptic cell via delayed connections. All model neurons are of the leaky integrate-and-fire type with a dynamic threshold [3]. Delays are randomly initialized (Fig. 3b) and then adapted during a training process by applying the window function $W_\tau$ shown in Fig. 1 to the time differences between pre- and postsynaptic activity. After training, the synchronous activity of the first presynaptic population yields connections that are delayed by the same amount (peak in Fig. 3c-d). The second temporal pattern produces connections with increasing delays (baseline in Fig. 3d), which results in synchronous spike arrival, making the postsynaptic neuron responsive to the second pattern as well.

Fig. 3. Delay shift in response to different temporal patterns. (a) Spike trains of 81 presynaptic neurons generating two patterns. (b)-(d) Delay distributions prior to learning, and after 100 and 600 presentations of each pattern, respectively.
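The following is a minimal sketch of a spike-based delay-shift update in this spirit. It simplifies the setup considerably: a plain leaky integrate-and-fire neuron without the dynamic threshold of [3], no background spikes, and an assumed antisymmetric window; all parameters are illustrative, not those of the reported simulation.

```python
import numpy as np

# Minimal sketch of spike-based delay shift: after each trial, every arrival
# time is compared with the postsynaptic spike time and its delay is nudged by
# W_tau(arrival - t_post). Plain LIF neuron (the paper uses a dynamic
# threshold [3]) and illustrative parameters throughout.

rng = np.random.default_rng(2)
N = 81                           # presynaptic neurons, cf. Fig. 3a
eta, sigma = 0.02, 0.02          # learning rate and window width (s)
tau_m, thresh, dt = 0.01, 1.0, 0.0005

def W_tau(x):
    # antisymmetric window: advance late arrivals, retard early ones
    return -x / sigma * np.exp(-(x / sigma) ** 2)

delays = rng.uniform(0.0, 0.03, N)        # random initial delays, cf. Fig. 3b

patterns = [np.zeros(N),                  # pattern 1: synchronous spikes
            np.linspace(0.0, 0.02, N)]    # pattern 2: successive spikes

for trial in range(1200):                 # 600 presentations of each pattern
    pre = patterns[rng.integers(2)]
    arrivals = pre + delays
    v, t, t_post = 0.0, 0.0, None         # run the LIF cell to its first spike
    while t < 0.1 and t_post is None:
        inp = np.sum((arrivals >= t) & (arrivals < t + dt)) * 0.08
        v += dt / tau_m * (-v) + inp
        if v >= thresh:
            t_post = t
        t += dt
    if t_post is not None:
        delays += eta * W_tau(arrivals - t_post)   # Hebbian delay shift
        delays = np.clip(delays, 0.0, 0.05)

for name, pre in zip(("synchronous", "successive"), patterns):
    print(name, "arrival jitter after training:", (pre + delays).std())
```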

Self-organized delay adaptation in sensory neural systems regulates signals carried along separate axons such that they arrive at a postsynaptic neuron simultaneously. Our stability analysis yields conditions on the learning functions $W_\tau$ and $W_\omega$ for delay shift and delay selection, respectively, thus placing constraints that ensure that stable solutions exist for arbitrary temporal inputs. Recent experimental estimates of Hebbian learning windows [9] are compatible with our results, suggesting that delay adaptation is possible and may be an important mechanism of signal processing in the brain.

References

[1] C. E. Carr, Processing of temporal information in the brain, Annu. Rev. Neurosci. 16 (1993) 223-243.

[2] R. Eckhorn, R. Bauer, W. Jordan, M. Brosch, W. Kruse, M. Munk and H. J. Reitboeck, Coherent oscillations: a mechanism of feature linking in the visual cortex? Biol. Cybern. 60 (1988) 121-130; C. M. Gray, P. König, A. K. Engel and W. Singer, Oscillatory responses in cat visual cortex exhibit inter-columnar synchronization which reflects global stimulus properties, Nature 338 (1989) 334-337.

[3] R. Eckhorn, H. J. Reitboeck, M. Arndt and P. Dicke, Feature linking via synchronization among distributed assemblies: simulations of results from cat visual cortex, Neural Comput. 2 (1990) 293-307.

[4] C. W. Eurich, U. Ernst, K. Pawelzik, J. D. Cowan and J. G. Milton, Dynamics of self-organized delay adaptation, Phys. Rev. Lett. 82 (1999) 1594-1597.

[5] W. Gerstner, R. Kempter, J. L. van Hemmen and H. Wagner, A neuronal learning rule for sub-millisecond temporal coding, Nature 383 (1996) 76-78.

[6] H. Hüning, H. Glünder and G. Palm, Synaptic delay learning in pulse-coupled neurons, Neural Comput. 10 (1998) 555-565.

[7] G. M. Innocenti, P. Lehmann and J.-C. Houzel, Computational structure of visual callosal axons, Eur. J. Neurosci. 6 (1994) 918-935.

[8] L. A. Jeffress, A place theory of sound localization, J. Comp. Physiol. Psychol. 41 (1948) 35-39.

[9] H. Markram, J. Lübke, M. Frotscher and B. Sakmann, Regulation of synaptic efficacy by coincidence of postsynaptic APs and EPSPs, Science 275 (1997) 213-215.

[10] L. R. Stanford, Conduction velocity variations minimize conduction time differences among retinal ganglion cell axons, Science 238 (1987) 358-360.

[11] B. Stevens, S. Tanner and R. D. Fields, Control of myelination by specific patterns of neural impulses, J. Neurosci. 18 (1998) 9303-9311.

[12] X. Yu and E. R. Lewis, Studies with spike initiators: linearization by noise allows continuous signal modulation in neural networks, IEEE Trans. Biomed. Eng. 36 (1989) 36-43.

6 Biosketches

Christian W. Eurich got his PhD in Theoretical Physics in 1995 from the University of Bremen (Germany). As a postdoc, he worked with John Milton and Jack Cowan at the University of Chicago, and he spent some time at the Max-Planck Institute for Fluid Dynamics in Göttingen (Germany). In 1997, he returned to the Department of Theoretical Neurophysics at the University of Bremen. His research interests include neural networks with time delays, visuomotor behavior in amphibians, information processing in neural populations, avalanche phenomena in neural networks, and motor control problems such as balancing tasks and postural sway.

Klaus Pawelzik finished his PhD in Theoretical Physics in 1990 at the J. W. Goethe University (Frankfurt, Germany). He became a postdoctoral fellow at the Max-Planck-Institute for Brain Research in 1991 and joined the Nonlinear Dynamics Group of Prof. Theo Geisel at the Institute for Theoretical Physics in Frankfurt. He worked at the Computational Neurobiology Lab headed by Terry Sejnowski at the Salk Institute, San Diego, in 1994/1995. He continued his work on theoretical aspects of dynamics and coding in neural systems in 1996 at the Max-Planck-Institut für Strömungsforschung (Göttingen, Germany) until, in 1998, he became Professor for Theoretical Physics and Biophysics at the University of Bremen. His interests include models of the visual system and the hippocampus, networks of spiking neurons, dynamics of synapses, neural coding, data analysis, artificial neural networks, and robotics.

Udo Ernst is currently finishing his PhD in Theoretical Physics at the J. W. Goethe University (Frankfurt, Germany). Since 1997 he has also worked at the Max-Planck-Institut für Strömungsforschung (Göttingen, Germany). His interests cover temporal coding and nonlinear dynamics in neuronal systems, synchronization and oscillation phenomena, and the dynamics and organization of receptive fields in the visual cortex.

Andreas Thiel studied Physics at the University of Marburg (Germany). In 1998, he finished his Diploma thesis on self-organizing connections between orientation detectors. Since 1999, he has been a PhD student at the University of Bremen.

Jack D. Cowan finished his PhD in Electrical Engineering in 1967 at the Imperial College of Science and Technology in London. In 1967, he became Professor of Mathematical Biology at the University of Chicago. Since then, he has held several professorial positions there, including positions in the Collegiate Division of Biology, the Department of Biophysics and Theoretical Biology, and the Department of Neurology. In 1989, he became External Professor at the Santa Fe Institute. Jack Cowan is currently Professor of Applied Mathematics and Theoretical Biology at the University of Chicago. His research in neurobiology focuses on the development and regeneration of eye-brain connections, the architecture of primate visual cortex, and on hallucinations, epilepsies and visual migraines. His interests in applied mathematics include local bifurcation theory, bifurcation in the presence of symmetry, and stochastic nonlinear processes with applications to neurobiology.

John G. Milton got his PhD in Biophysical Chemistry in 1975 from McGill University (Montreal, Canada). After working in Japan and France as a postdoc, he received his MDCM from McGill University in 1982. From 1987 until 1988 he was Assistant Professor in the Department of Physiology at McGill University. After various guest faculty positions in Canada and the USA, in 1989 he became Adjunct Professor at the Center for Nonlinear Dynamics in Physiology and Medicine at McGill University. Since 1996, John Milton has also been Associate Professor in the Department of Neurology at the University of Chicago. His research interests include biophysical systems, especially those with delays such as the pupil light reflex and postural sway, dynamical diseases, and spiking neurons.