Delay Adaptation in the Nervous System

Christian W. Eurich a, Klaus Pawelzik a, Udo Ernst b, Andreas Thiel a, Jack D. Cowan c and John G. Milton d

a Institut für Theoretische Physik, Universität Bremen, Bremen, Germany
b Max-Planck-Institut für Strömungsforschung, Göttingen, Germany
c Departments of Mathematics and Neurology, The University of Chicago, Chicago, IL
d Department of Neurology and Committee on Neurobiology, The University of Chicago, Chicago, IL

To be published in Neurocomputing, 20 December 1999

Abstract

Time delays are ubiquitous in the nervous system. Empirical findings suggest that time delays are adapted so as to synchronize the activity of neurons. We introduce a framework for studying the dynamics of self-organized delay adaptation in systems which optimize the coincidence of their inputs. The framework comprises two families of delay adaptation mechanisms, delay shift and delay selection. For the important case of periodically modulated input we derive conditions for the existence and stability of solutions, which constrain learning rules for reliable delay adaptation. Delay adaptation is also applicable in the case of several spatio-temporal neuronal input patterns.

Key words: delay adaptation; Hebb rule; learning; spiking neurons.

1 Introduction

Interactions in the nervous system are associated with time delays. Postsynaptic potentials have a finite rise time, delays arise from signal integration in the dendritic tree, and there is a considerable conduction time for action potentials running down an axon. Precise neuronal signal integration for the purpose of target localization requires an adaptation of such time delays. Examples include the auditory system of barn owls, echolocation in bats, and the lateral line system of weakly electric fish [8,1]. Time delays and their putative adaptation must also be considered for synchronization phenomena associated with the binding of sensory information in the neocortex [2]. A number
of observations suggest that time delays in the nervous system are adaptive: time delays in the optic nerve are equalized [10], signals in visual callosal axons arrive simultaneously at all axonal endings [7], internodal distances in the barn owl auditory system are short, resulting in slow signal conduction [1], and neurons in vitro can inhibit the formation of a myelin sheath by firing at a low frequency [11].

Two mechanisms have been proposed for the self-organized adaptation of transmission delays in the nervous system. One mechanism ("delay shift") assumes that the transmission delays themselves are altered [6,4]. This mechanism is possible because transmission velocities in the nervous system can be altered, for example, by changing the length and thickness of dendrites and axons, the extent of myelination of axons, or the density and type of ion channels. The second mechanism ("delay selection") supposes that a broad range of delay lines is present initially, from which appropriate subsets become selected during development [5]. Here we introduce a novel framework that describes the dynamics of self-organized delay adaptation in the form of integro-differential equations, permitting the mechanisms of delay adaptation to be explored in a precise manner.

2 Model

Consider a neural network consisting of a large number of presynaptic neurons and one postsynaptic neuron which receives its input via delay lines with delays τ_i (Fig. 1a). The input I of the postsynaptic neuron at time t reads

I(t) = Σ_{i,k} ω_i δ(t − (kT + τ_i)) * E(t),

where ω_i denotes the efficacy of synapse i, δ is the Dirac delta distribution, and * denotes convolution with an excitatory postsynaptic potential (EPSP), E(t). For our analysis, we assume that the presynaptic neurons fire synchronously at times kT (k ∈ ℤ); however, our results apply to more general input patterns as well [4]. The postsynaptic neuron fires depending on this input and its internal dynamics.
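As an illustration, the input I(t) can be evaluated numerically. The following is a minimal sketch, not the authors' implementation: the alpha-shaped EPSP kernel, the helper names epsp and total_input, and all parameter values are assumptions chosen for demonstration.

```python
import numpy as np

# Minimal sketch of the input model: N presynaptic neurons fire
# synchronously at times k*T; each spike reaches the soma after an
# individual delay tau_i and evokes an EPSP E(t). Parameters assumed.
rng = np.random.default_rng(0)
N, T = 20, 1.0                       # number of delay lines, firing period
tau = rng.uniform(0.0, T, size=N)    # transmission delays tau_i
omega = np.ones(N)                   # synaptic efficacies omega_i

def epsp(t, tau_e=0.05):
    """Alpha-function EPSP kernel (an assumed shape for E(t))."""
    return np.where(t > 0.0, (t / tau_e) * np.exp(1.0 - t / tau_e), 0.0)

def total_input(t, n_periods=5):
    """I(t) = sum over synapses i and periods k of omega_i * E(t - (k*T + tau_i))."""
    return sum(np.sum(omega * epsp(t - (k * T + tau))) for k in range(n_periods))

ts = np.linspace(0.0, 3.0 * T, 301)
I = np.array([total_input(t) for t in ts])
```

With aligned delays (all τ_i equal) the EPSPs superimpose into one large peak per period; this is the configuration that delay adaptation favors.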
Delay adaptation corresponds to changing the delays τ_i (delay shift) and the weights ω_i (delay selection) such that the postsynaptic potentials align at the soma. We use a Hebbian learning rule depending on correlations between pre- and postsynaptic activity within a certain time window; the time difference is denoted by x. Delays are shifted according to Δτ_i ∝ W_τ(x), and they are selected according to Δω_i ∝ W_ω(x), where W_τ and W_ω denote the learning functions for delay shift and delay selection, respectively (Fig. 1b). We now introduce a continuous formalism for the adaptation dynamics in order to derive conditions on the learning functions. I(t) is replaced by an input density J(τ, t) = ω(τ, t) ρ(τ, t), where ρ(τ, t) denotes the density and
Fig. 1. (a) Overview of the neural network. (b) Schematic examples of the window functions W_ω and W_τ corresponding to delay selection (top) and delay shift (bottom), respectively. In case of a finite rise time of the postsynaptic potential E, both functions have to be shifted slightly to the left on the abscissa [5].

ω(τ, t) denotes the average weight of connections with delay τ. The dynamics of the input are governed by two simultaneous equations: a balance equation for the input density,

∂J(τ, t)/∂t = −∂/∂τ (J(τ, t) v(τ, t)) + Q(τ, t),    (1)

and a continuity equation for ρ(τ, t), expressing the conservation of the number of connections,

∂ρ(τ, t)/∂t = −∂/∂τ (ρ(τ, t) v(τ, t)).    (2)

The drift velocity v(τ, t) and the source term Q(τ, t) are defined according to Hebbian principles. While in general τ and ω will be modified simultaneously, the two limiting cases of delay shift and delay selection serve to elucidate the basic mechanisms. For simplicity of the formalism, we assume E(t) = δ(t) in the analytical calculations.

3 Delay shift

In this case, the weights are not modified and the source term Q(τ, t) on the right-hand side of (1) vanishes. The dynamics are governed by (2), where the
drift velocity v = dτ/dt of the delays realizes the Hebbian adaptation,

v(τ, t) := η_τ ∫_{−∞}^{∞} W_τ(τ − τ′) P(τ′, t) dτ′,    (3)

where η_τ denotes the learning rate. For delays τ where ρ(τ, 0) ≠ 0 we assume ω(τ, 0) = 1 without loss of generality; (1) and (2) then imply that ω(τ, t) = 1 for all t if ρ(τ, t) ≠ 0. For the distribution of spike times we assume a linear neural response, P(τ, t) = c J(τ, t). It has been shown that adding a small amount of noise to the input approximately linearizes the neural behavior [12]. Therefore, our approximation is valid if the input is sufficiently high and if there is some weak random background activity. Linear neural behavior may also occur even without background noise.

Equation (2) has two equilibrium solutions. The first is the homogeneous solution ρ(τ, t) ≡ ρ_0, around which a linear stability analysis yields eigenvalues λ_n with Re(λ_n) = (2π)^{3/2} η_τ c ρ_0 n Im(W̃_τ(−2πn/T))/T, where W̃_τ is the Fourier transform of the window function W_τ. For an antisymmetric window function like the one in Fig. 1b, at least one of the Re(λ_n) exceeds zero, and the solution is unstable. The second equilibrium solution is given by ρ(τ, t) = δ(τ − τ_0), provided that Σ_{n=−∞}^{∞} W_τ(nT) = 0, which is the case for antisymmetric window functions. These solutions form a one-dimensional manifold described by a parameter τ_0 ∈ [0, T], which is a delay offset common to all input neurons. The Lyapunov functional L[ρ] = ∫ ρ(τ, t) (τ − ∫ ρ(τ′, t) τ′ dτ′)² dτ yields the result that the equilibrium solutions are marginally stable in the τ_0 direction and stable in all other directions, provided that W_τ(x) > 0 for x < 0 and W_τ(x) < 0 for x > 0. For a numerical example, see Fig. 2. The above results also hold for the more general case of nonperiodic and unreliable input patterns which are superimposed on background activity [4].

4 Delay selection

For pure delay selection, the drift velocity of the delays, v(τ, t), vanishes and the total input of the postsynaptic neuron is not conserved.
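The delay-shift dynamics of Section 3 can be sketched as a particle simulation of rule (3), each delay drifting under the window function evaluated at the pairwise delay differences. The Gaussian-derivative window and all parameter values below are assumptions chosen only to satisfy the stability condition W_τ(x) > 0 for x < 0 and W_τ(x) < 0 for x > 0:

```python
import numpy as np

# Particle sketch of the delay-shift rule (3): each delay tau_i drifts
# with velocity eta * mean_j W_tau(tau_i - tau_j). Because W_tau is
# antisymmetric (positive for x < 0, negative for x > 0), delays attract
# each other and collapse onto a common offset tau_0.
rng = np.random.default_rng(1)
N, eta, sigma = 100, 0.05, 0.1
tau = rng.uniform(0.4, 0.6, size=N)      # initial delays, away from the edges

def W_tau(x):
    """Assumed antisymmetric learning window."""
    return -x * np.exp(-x**2 / (2.0 * sigma**2))

for _ in range(200):
    diff = tau[:, None] - tau[None, :]    # pairwise differences tau_i - tau_j
    tau += eta * W_tau(diff).mean(axis=1)

spread = tau.std()                        # shrinks toward zero
```

Because W_τ is odd, the pairwise contributions cancel in the mean, so the common offset τ_0 is conserved while the spread contracts, mirroring the marginally stable one-dimensional manifold of equilibria.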
Equations (1) and (2) then result in

ρ(τ, t) ∂ω(τ, t)/∂t = Q(τ, t).    (4)
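A discretized sketch of these weight dynamics illustrates the behavior derived below: the homogeneous weight distribution is unstable, and the winning weights grow without bound. The Mexican-hat form of the symmetric, zero-integral window and all parameter values are assumptions for illustration, not taken from the paper.

```python
import numpy as np

# Discretized sketch of the delay-selection dynamics (4): with rho = 1
# and v = 0, weights follow d(omega)/dt = eta * omega * drive, where the
# Hebbian drive couples omega(tau) to its neighbors through a symmetric
# window W_omega with vanishing integral. All parameters are assumed.
rng = np.random.default_rng(2)
M, eta = 200, 0.5
tau = np.linspace(0.0, 1.0, M, endpoint=False)
omega = 1.0 + 0.01 * rng.standard_normal(M)   # near-homogeneous initial weights

def W_omega(x, sigma=0.05):
    """Assumed Mexican-hat window: symmetric, integral approximately zero."""
    return (1.0 - (x / sigma) ** 2) * np.exp(-x**2 / (2.0 * sigma**2))

for _ in range(300):
    diff = tau[:, None] - tau[None, :]
    drive = (W_omega(diff) * omega[None, :]).mean(axis=1)   # coincidence term
    omega = np.maximum(omega + eta * omega * drive, 0.0)    # weights stay >= 0
    if omega.max() > 50.0:   # growth is unbounded (A(t) diverges); stop early
        break

contrast = omega.max() / omega.mean()   # > 1 once a winning delay range emerges
```

The run ends with a narrow band of large weights surrounded by suppressed ones, the discrete counterpart of ω(τ, t) = A(t) δ(τ − τ_0) with diverging A(t).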
Fig. 2. Numerical iteration of (2). (a) Learning function W_τ(t) (solid line), shifted on the abscissa according to an EPSP function E(t) (dashed line). (b) Initial delay distribution ρ(τ, 0) = 1 + ξ(τ), where ξ(τ) is Gaussian white noise with zero mean and a variance of 0.1. (c)-(d) ρ(τ, t) for t = 11.0 and 14.0, respectively. A single peak emerges and evolves into a delta peak which corresponds to the equilibrium solution. T = 1.0, η_τ = 0.1, c = 0.2.

From a straightforward generalization of the Hebb rule we obtain the source density

Q(τ, t) = η_ω ω(τ, t) ρ(τ, t) ∫_{−∞}^{∞} W_ω(τ − τ′) P(τ′, t) dτ′,    (5)

with η_ω denoting the corresponding learning rate. Without loss of generality we assume ρ(τ, 0) ≡ 1, which implies ρ(τ, t) ≡ 1 for arbitrary t because v(τ, t) ≡ 0. Equation (4) has an equilibrium solution ω(τ, t) ≡ ω_0, provided that ∫_{−∞}^{∞} W_ω(x) dx = 0. The real parts of the eigenvalues are given by Re(λ_n) = √(2π) η_ω ω_0 W̃_ω(−2πn/T), where W̃_ω is the Fourier transform of the window function W_ω. For a symmetric window function like the one shown in Fig. 1b, the homogeneous solution is unstable. In contrast to the delay-shift case, there is no stable solution: weight distributions ω(τ, t) = A(t) δ(τ − τ_0) retain their shape
but explode in size, i.e., A(t) diverges in finite time. This situation commonly arises in networks with Hebbian learning of synaptic weights [5].

5 Discussion

Due to the delay adaptation mechanism, the postsynaptic neuron becomes sensitive to a certain spatio-temporal input pattern. The idea of ensemble coding in the nervous system comprises the notion that neurons are involved in multiple tasks. For temporal coding this requires that they be sensitive to more than one spatio-temporal pattern. In a numerical study we now demonstrate that the delay shift learning rule is capable of adjusting to two input patterns for the same postsynaptic neuron. Two populations of presynaptic model neurons generate two different temporal patterns of action potentials: the first population spikes simultaneously, while spikes are generated successively in the second population. The patterns are presented in random order and are disturbed by additional random spikes (Fig. 3a). The action potentials are received by a single postsynaptic cell via delayed connections. All model neurons are of the leaky integrate-and-fire type with a dynamic threshold [3]. Delays are randomly initialized (Fig. 3b) and then adapted during a training process by applying the window function W_τ shown in Fig. 1 to the differences between pre- and postsynaptic activity. After training, the synchronous activity of the first presynaptic population yields connections that are delayed by the same amount (peak in Fig. 3c-d). The second temporal pattern produces connections with increasing delays (baseline in Fig. 3d), which results in synchronous spike arrival, making the postsynaptic neuron responsive to the second pattern as well.

Self-organized delay adaptation in sensory neural systems regulates signals carried along separate axons such that they arrive at a postsynaptic neuron simultaneously. Our stability analysis yields conditions on the learning functions W_τ and W_ω
for delay shift and delay selection, respectively, thereby placing constraints that ensure that stable solutions exist for arbitrary temporal inputs. Recent experimental estimates of Hebbian learning windows [9] are compatible with our results, suggesting that delay adaptation is possible and may be an important mechanism of signal processing in the brain.

References

[1] C. E. Carr, Processing of temporal information in the brain, Annu. Rev. Neurosci. 16 (1993) 223–243.
Fig. 3. Delay shift in response to different temporal patterns. (a) Spike trains of 81 presynaptic neurons generating two patterns. (b-d) Delay distributions prior to learning, and after 100 and 600 presentations of each pattern, respectively.

[2] R. Eckhorn, R. Bauer, W. Jordan, M. Brosch, W. Kruse, M. Munk and H. J. Reitboeck, Coherent oscillations: a mechanism of feature linking in the visual cortex? Biol. Cybern. 60 (1988) 121–130; C. M. Gray, P. König, A. K. Engel and W. Singer, Oscillatory responses in cat visual cortex exhibit inter-columnar synchronization which reflects global stimulus properties, Nature 338 (1989) 334–337.

[3] R. Eckhorn, H. J. Reitboeck, M. Arndt and P. Dicke, Feature linking via synchronization among distributed assemblies: Simulations of results from cat visual cortex, Neural Comput. 2 (1990) 293–307.

[4] C. W. Eurich, U. Ernst, K. Pawelzik, J. D. Cowan and J. G. Milton, Dynamics of self-organized delay adaptation, Phys. Rev. Lett. 82 (1999) 1594–1597.

[5] W. Gerstner, R. Kempter, J. L. van Hemmen and H. Wagner, A neuronal learning rule for sub-millisecond temporal coding, Nature 383 (1996) 76–78.

[6] H. Hüning, H. Glünder and G. Palm, Synaptic delay learning in pulse-coupled neurons, Neural Comput. 10 (1998) 555–565.

[7] G. M. Innocenti, P. Lehmann and J.-C. Houzel, Computational structure of visual callosal axons, Eur. J. Neurosci. 6 (1994) 918–935.

[8] L. A. Jeffress, A place theory of sound localization, J. Comp. Physiol. Psychol. 41 (1948) 35–39.

[9] H. Markram, J. Lübke, M. Frotscher and B. Sakmann, Regulation of synaptic efficacy by coincidence of postsynaptic APs and EPSPs, Science 275 (1997) 213–215.

[10] L. R. Stanford, Conduction velocity variations minimize conduction time differences among retinal ganglion cell axons, Science 238 (1987) 358–360.
[11] B. Stevens, S. Tanner and R. D. Fields, Control of myelination by specific patterns of neural impulses, J. Neurosci. 18 (1998) 9303–9311.

[12] X. Yu and E. R. Lewis, Studies with spike initiators: linearization by noise allows continuous signal modulation in neural networks, IEEE Trans. Biomed. Eng. 36 (1989) 36–43.

6 Biosketches

Christian W. Eurich got his PhD in Theoretical Physics in 1995 from the University of Bremen (Germany). As a postdoc, he worked with John Milton and Jack Cowan at the University of Chicago, and he spent some time at the Max Planck Institute for Fluid Dynamics in Göttingen (Germany). In 1997, he returned to the Department of Theoretical Neurophysics at the University of Bremen. His research interests include neural networks with time delays, visuomotor behavior in amphibians, information processing in neural populations, avalanche phenomena in neural networks, and motor control problems such as balancing tasks and postural sway.

Klaus Pawelzik finished his PhD in Theoretical Physics in 1990 at the J. W. Goethe University (Frankfurt, Germany). He became a postdoctoral fellow at the Max Planck Institute for Brain Research in 1991 and joined the Nonlinear Dynamics Group of Prof. Theo Geisel at the Institute for Theoretical Physics in Frankfurt. He worked at the Computational Neurobiology Lab headed by Terry Sejnowski at the Salk Institute, San Diego, in 1994/1995. He continued his work on theoretical aspects of dynamics and coding in neural systems in 1996 at the Max-Planck-Institut für Strömungsforschung (Göttingen, Germany) until, in 1998, he became Professor for Theoretical Physics and Biophysics at the University of Bremen. His interests include models of the visual system and the hippocampus, networks of spiking neurons, dynamics of synapses, neural coding, data analysis, artificial neural networks, and robotics.

Udo Ernst is currently finishing his PhD in Theoretical Physics at the J. W. Goethe University (Frankfurt, Germany).
Since 1997 he has also worked at the Max-Planck-Institut für Strömungsforschung (Göttingen, Germany). His interests cover temporal coding and nonlinear dynamics in neuronal systems, synchronization and oscillation phenomena, and the dynamics and organization of receptive fields in the visual cortex.

Andreas Thiel studied Physics at the University of Marburg (Germany). In 1998, he finished his Diploma thesis on self-organizing connections between orientation detectors. Since 1999, he has been a PhD student at the University of Bremen.
Jack D. Cowan finished his PhD in Electrical Engineering in 1967 at the Imperial College of Science and Technology in London. In 1967, he became Professor of Mathematical Biology at the University of Chicago. Since then, he has held several professorial positions there, including positions at the Collegiate Division of Biology, the Department of Biophysics and Theoretical Biology, and the Department of Neurology. In 1989, he became External Professor at the Santa Fe Institute. Jack Cowan is currently Professor of Applied Mathematics and Theoretical Biology at the University of Chicago. His research in neurobiology focuses on the development and regeneration of eye-brain connections, the architecture of primate visual cortex, and on hallucinations, epilepsies and visual migraines. His interests in applied mathematics include local bifurcation theory, bifurcation in the presence of symmetry, and stochastic nonlinear processes with applications to neurobiology.

John G. Milton got his PhD in Biophysical Chemistry in 1975 from McGill University (Montreal, Canada). After working in Japan and France as a postdoc, he received his MDCM from McGill University. From 1987 until 1988 he was Assistant Professor at the Department of Physiology at McGill University. After various guest faculty positions in Canada and the USA, he became Adjunct Professor at the Center for Nonlinear Dynamics in Physiology and Medicine at McGill University in 1989. Since 1996, John Milton has also been Associate Professor at the Department of Neurology at the University of Chicago. His research interests include biophysical systems, especially those with delays such as the pupil light reflex or postural sway, dynamical diseases, and spiking neurons.
More informationRoles of Fluctuations. in Pulsed Neural Networks
Ph.D. Thesis Roles of Fluctuations in Pulsed Neural Networks Supervisor Professor Yoichi Okabe Takashi Kanamaru Department of Advanced Interdisciplinary Studies, Faculty of Engineering, The University
More informationCOMP304 Introduction to Neural Networks based on slides by:
COMP34 Introduction to Neural Networks based on slides by: Christian Borgelt http://www.borgelt.net/ Christian Borgelt Introduction to Neural Networks Motivation: Why (Artificial) Neural Networks? (Neuro-)Biology
More informationNervous Tissue. Neurons Neural communication Nervous Systems
Nervous Tissue Neurons Neural communication Nervous Systems What is the function of nervous tissue? Maintain homeostasis & respond to stimuli Sense & transmit information rapidly, to specific cells and
More informationNeurons and Nervous Systems
34 Neurons and Nervous Systems Concept 34.1 Nervous Systems Consist of Neurons and Glia Nervous systems have two categories of cells: Neurons, or nerve cells, are excitable they generate and transmit electrical
More informationOn the Computational Complexity of Networks of Spiking Neurons
On the Computational Complexity of Networks of Spiking Neurons (Extended Abstract) Wolfgang Maass Institute for Theoretical Computer Science Technische Universitaet Graz A-80lO Graz, Austria e-mail: maass@igi.tu-graz.ac.at
More informationActivity Driven Adaptive Stochastic. Resonance. Gregor Wenning and Klaus Obermayer. Technical University of Berlin.
Activity Driven Adaptive Stochastic Resonance Gregor Wenning and Klaus Obermayer Department of Electrical Engineering and Computer Science Technical University of Berlin Franklinstr. 8/9, 187 Berlin fgrewe,obyg@cs.tu-berlin.de
More informationModeling retinal high and low contrast sensitivity lters. T. Lourens. Abstract
Modeling retinal high and low contrast sensitivity lters T. Lourens Department of Computer Science University of Groningen P.O. Box 800, 9700 AV Groningen, The Netherlands E-mail: tino@cs.rug.nl Abstract
More informationChapter 48 Neurons, Synapses, and Signaling
Chapter 48 Neurons, Synapses, and Signaling Concept 48.1 Neuron organization and structure reflect function in information transfer Neurons are nerve cells that transfer information within the body Neurons
More informationEffects of Interactive Function Forms and Refractoryperiod in a Self-Organized Critical Model Based on Neural Networks
Commun. Theor. Phys. (Beijing, China) 42 (2004) pp. 121 125 c International Academic Publishers Vol. 42, No. 1, July 15, 2004 Effects of Interactive Function Forms and Refractoryperiod in a Self-Organized
More informationVisual Selection and Attention Shifting Based on FitzHugh-Nagumo Equations
Visual Selection and Attention Shifting Based on FitzHugh-Nagumo Equations Haili Wang, Yuanhua Qiao, Lijuan Duan, Faming Fang, Jun Miao 3, and Bingpeng Ma 3 College of Applied Science, Beijing University
More informationDynamic Stochastic Synapses as Computational Units
LETTER Communicated by Laurence Abbott Dynamic Stochastic Synapses as Computational Units Wolfgang Maass Institute for Theoretical Computer Science, Technische Universität Graz, A 8010 Graz, Austria Anthony
More information(Feed-Forward) Neural Networks Dr. Hajira Jabeen, Prof. Jens Lehmann
(Feed-Forward) Neural Networks 2016-12-06 Dr. Hajira Jabeen, Prof. Jens Lehmann Outline In the previous lectures we have learned about tensors and factorization methods. RESCAL is a bilinear model for
More informationEffects of synaptic conductance on the voltage distribution and firing rate of spiking neurons
PHYSICAL REVIEW E 69, 051918 (2004) Effects of synaptic conductance on the voltage distribution and firing rate of spiking neurons Magnus J. E. Richardson* Laboratory of Computational Neuroscience, Brain
More informationEfficient temporal processing with biologically realistic dynamic synapses
INSTITUTE OF PHYSICS PUBLISHING NETWORK: COMPUTATION IN NEURAL SYSTEMS Network: Comput. Neural Syst. 12 (21) 75 87 www.iop.org/journals/ne PII: S954-898X(1)2361-3 Efficient temporal processing with biologically
More informationDendritic cable with active spines: a modelling study in the spike-diffuse-spike framework
Dendritic cable with active spines: a modelling study in the spike-diffuse-spike framework Yulia Timofeeva a, Gabriel Lord a and Stephen Coombes b a Department of Mathematics, Heriot-Watt University, Edinburgh,
More informationStochastic Oscillator Death in Globally Coupled Neural Systems
Journal of the Korean Physical Society, Vol. 52, No. 6, June 2008, pp. 19131917 Stochastic Oscillator Death in Globally Coupled Neural Systems Woochang Lim and Sang-Yoon Kim y Department of Physics, Kangwon
More informationSignal, donnée, information dans les circuits de nos cerveaux
NeuroSTIC Brest 5 octobre 2017 Signal, donnée, information dans les circuits de nos cerveaux Claude Berrou Signal, data, information: in the field of telecommunication, everything is clear It is much less
More informationChaotic Balanced State in a Model of Cortical Circuits C. van Vreeswijk and H. Sompolinsky Racah Institute of Physics and Center for Neural Computatio
Chaotic Balanced State in a Model of Cortical Circuits C. van Vreeswij and H. Sompolinsy Racah Institute of Physics and Center for Neural Computation Hebrew University Jerusalem, 91904 Israel 10 March
More informationLecture 14 Population dynamics and associative memory; stable learning
Lecture 14 Population dynamics and associative memory; stable learning -Introduction -Associative Memory -Dense networks (mean-ield) -Population dynamics and Associative Memory -Discussion Systems or computing
More informationWhen is an Integrate-and-fire Neuron like a Poisson Neuron?
When is an Integrate-and-fire Neuron like a Poisson Neuron? Charles F. Stevens Salk Institute MNL/S La Jolla, CA 92037 cfs@salk.edu Anthony Zador Salk Institute MNL/S La Jolla, CA 92037 zador@salk.edu
More informationArtificial Neural Network and Fuzzy Logic
Artificial Neural Network and Fuzzy Logic 1 Syllabus 2 Syllabus 3 Books 1. Artificial Neural Networks by B. Yagnanarayan, PHI - (Cover Topologies part of unit 1 and All part of Unit 2) 2. Neural Networks
More informationNerve Signal Conduction. Resting Potential Action Potential Conduction of Action Potentials
Nerve Signal Conduction Resting Potential Action Potential Conduction of Action Potentials Resting Potential Resting neurons are always prepared to send a nerve signal. Neuron possesses potential energy
More informationNervous System Organization
The Nervous System Chapter 44 Nervous System Organization All animals must be able to respond to environmental stimuli -Sensory receptors = Detect stimulus -Motor effectors = Respond to it -The nervous
More informationEffects of Interactive Function Forms in a Self-Organized Critical Model Based on Neural Networks
Commun. Theor. Phys. (Beijing, China) 40 (2003) pp. 607 613 c International Academic Publishers Vol. 40, No. 5, November 15, 2003 Effects of Interactive Function Forms in a Self-Organized Critical Model
More informationQuantitative Electrophysiology
ECE 795: Quantitative Electrophysiology Notes for Lecture #4 Wednesday, October 4, 2006 7. CHEMICAL SYNAPSES AND GAP JUNCTIONS We will look at: Chemical synapses in the nervous system Gap junctions in
More informationEmergence of network structure due to spike-timing-dependent plasticity in recurrent neuronal networks IV
Biol Cybern (2009) 101:427 444 DOI 10.1007/s00422-009-0346-1 ORIGINAL PAPER Emergence of network structure due to spike-timing-dependent plasticity in recurrent neuronal networks IV Structuring synaptic
More informationThe Bayesian Brain. Robert Jacobs Department of Brain & Cognitive Sciences University of Rochester. May 11, 2017
The Bayesian Brain Robert Jacobs Department of Brain & Cognitive Sciences University of Rochester May 11, 2017 Bayesian Brain How do neurons represent the states of the world? How do neurons represent
More informationTime-Skew Hebb Rule in a Nonisopotential Neuron
Time-Skew Hebb Rule in a Nonisopotential Neuron Barak A. Pearlmutter To appear (1995) in Neural Computation, 7(4) 76 712 Abstract In an isopotential neuron with rapid response, it has been shown that the
More informationMath in systems neuroscience. Quan Wen
Math in systems neuroscience Quan Wen Human brain is perhaps the most complex subject in the universe 1 kg brain 10 11 neurons 180,000 km nerve fiber 10 15 synapses 10 18 synaptic proteins Multiscale
More informationNonlinear reverse-correlation with synthesized naturalistic noise
Cognitive Science Online, Vol1, pp1 7, 2003 http://cogsci-onlineucsdedu Nonlinear reverse-correlation with synthesized naturalistic noise Hsin-Hao Yu Department of Cognitive Science University of California
More informationAction Potentials & Nervous System. Bio 219 Napa Valley College Dr. Adam Ross
Action Potentials & Nervous System Bio 219 Napa Valley College Dr. Adam Ross Review: Membrane potentials exist due to unequal distribution of charge across the membrane Concentration gradients drive ion
More informationREAL-TIME COMPUTING WITHOUT STABLE
REAL-TIME COMPUTING WITHOUT STABLE STATES: A NEW FRAMEWORK FOR NEURAL COMPUTATION BASED ON PERTURBATIONS Wolfgang Maass Thomas Natschlager Henry Markram Presented by Qiong Zhao April 28 th, 2010 OUTLINE
More information2 1. Introduction. Neuronal networks often exhibit a rich variety of oscillatory behavior. The dynamics of even a single cell may be quite complicated
GEOMETRIC ANALYSIS OF POPULATION RHYTHMS IN SYNAPTICALLY COUPLED NEURONAL NETWORKS J. Rubin and D. Terman Dept. of Mathematics; Ohio State University; Columbus, Ohio 43210 Abstract We develop geometric
More informationAbstract. Author Summary
1 Self-organization of microcircuits in networks of spiking neurons with plastic synapses Gabriel Koch Ocker 1,3, Ashok Litwin-Kumar 2,3,4, Brent Doiron 2,3 1: Department of Neuroscience, University of
More informationPlasticity and Learning
Chapter 8 Plasticity and Learning 8.1 Introduction Activity-dependent synaptic plasticity is widely believed to be the basic phenomenon underlying learning and memory, and it is also thought to play a
More informationNOTES: CH 48 Neurons, Synapses, and Signaling
NOTES: CH 48 Neurons, Synapses, and Signaling A nervous system has three overlapping functions: 1) SENSORY INPUT: signals from sensory receptors to integration centers 2) INTEGRATION: information from
More informationComputing with Inter-spike Interval Codes in Networks of Integrate and Fire Neurons
Computing with Inter-spike Interval Codes in Networks of Integrate and Fire Neurons Dileep George a,b Friedrich T. Sommer b a Dept. of Electrical Engineering, Stanford University 350 Serra Mall, Stanford,
More informationIN THIS turorial paper we exploit the relationship between
508 IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 10, NO. 3, MAY 1999 Weakly Pulse-Coupled Oscillators, FM Interactions, Synchronization, Oscillatory Associative Memory Eugene M. Izhikevich Abstract We study
More informationNeurons, Synapses, and Signaling
LECTURE PRESENTATIONS For CAMPBELL BIOLOGY, NINTH EDITION Jane B. Reece, Lisa A. Urry, Michael L. Cain, Steven A. Wasserman, Peter V. Minorsky, Robert B. Jackson Chapter 48 Neurons, Synapses, and Signaling
More informationof the dynamics. There is a competition between the capacity of the network and the stability of the
Special Issue on the Role and Control of Random Events in Biological Systems c World Scientic Publishing Company LEARNING SYNFIRE CHAINS: TURNING NOISE INTO SIGNAL JOHN HERTZ and ADAM PRUGEL-BENNETT y
More informationIntroduction Principles of Signaling and Organization p. 3 Signaling in Simple Neuronal Circuits p. 4 Organization of the Retina p.
Introduction Principles of Signaling and Organization p. 3 Signaling in Simple Neuronal Circuits p. 4 Organization of the Retina p. 5 Signaling in Nerve Cells p. 9 Cellular and Molecular Biology of Neurons
More informationNeurophysiology. Danil Hammoudi.MD
Neurophysiology Danil Hammoudi.MD ACTION POTENTIAL An action potential is a wave of electrical discharge that travels along the membrane of a cell. Action potentials are an essential feature of animal
More informationPhysiology Unit 2. MEMBRANE POTENTIALS and SYNAPSES
Physiology Unit 2 MEMBRANE POTENTIALS and SYNAPSES Neuron Communication Neurons are stimulated by receptors on dendrites and cell bodies (soma) Ligand gated ion channels GPCR s Neurons stimulate cells
More information