Coherence detection in a spiking neuron via Hebbian learning


Neurocomputing 44–46 (2002) 133–139
www.elsevier.com/locate/neucom

L. Perrinet*, M. Samuelides
ONERA-DTIM, 2 Av. E. Belin, BP 4025, 31055 Toulouse, France
*Corresponding author. E-mail address: perrinet@cert.fr (L. Perrinet). URL: http://laurent.perrinet.free.fr/app/app.html

Abstract

It is generally assumed that neurons communicate through temporal firing patterns. As a first step, we study the learning of a layer of realistic neurons in the particular case where the relevant messages are formed by temporally correlated patterns, or synfire patterns. The model is a layer of integrate-and-fire neurons with synaptic current dynamics that adapts by minimizing a cost according to a gradient descent scheme. The cost we define leads to a rule similar to spike-time dependent Hebbian plasticity. Moreover, our results show that the rule we derive is biologically plausible and leads to the detection of the coherence in the input in an unsupervised way. An application to shape recognition is shown as an illustration. © 2002 Published by Elsevier Science B.V.

Keywords: Spiking neural networks; Hebb rule; Spike time dependent Hebbian plasticity; Gradient descent

1. Description of the model

1.1. Coding scheme

We represent (as in [2]) the signal S_i at synapse i by the sum of Dirac pulses located at the spiking times t_i^k drawn from the list of spikes of synapse i (see Fig. 1, left):

$$S_i(t) = \sum_k \delta(t - t_i^k). \quad (1)$$

Synfire patterns are generated in analogy with the response of a retina to flashed binary images. The input of the synapses is characterized as the output of single-synapse IF neurons responding to a specific binary input. This response may be described as the sum of two random point processes with different time scales. At a narrow time scale, the input is the spontaneous activity, i.e. a background noise independent of time and synapses, which may be described by a Poisson point process of rate 1/τ_noise. At a larger time scale, the synfire pattern activates a given subset M of synapses once per flash, with a temporal correlation defined by its jitter τ_jitter (see Fig. 1, left).
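The input model above is easy to prototype. Below is a minimal Python sketch of the two-time-scale point process of Section 1.1, i.e. Poisson background spikes of rate 1/τ_noise plus one pattern spike per flash, jittered by τ_jitter, on a subset M of synapses. It is our illustration, not the authors' code (the paper reports a MATLAB implementation), and every name and parameter value is an assumption.

```python
# Sketch (not the authors' code) of the input model of Sec. 1.1: Poisson
# background at rate 1/tau_noise, plus one jittered synfire spike per flash
# on a subset M of synapses. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def synfire_input(n_syn=50, t_max=0.1, tau_noise=0.05, tau_jitter=0.001,
                  flash_time=0.025, pattern=None):
    """Return a list of sorted spike-time arrays, one per synapse (seconds)."""
    if pattern is None:                       # the correlated subset M
        pattern = np.arange(n_syn // 2)
    trains = []
    for i in range(n_syn):
        # background: homogeneous Poisson process of rate 1/tau_noise
        n_bg = rng.poisson(t_max / tau_noise)
        spikes = rng.uniform(0.0, t_max, n_bg)
        if i in pattern:                      # one jittered pattern spike
            spikes = np.append(spikes,
                               flash_time + tau_jitter * rng.standard_normal())
        trains.append(np.sort(spikes))
    return trains

trains = synfire_input()
print(f"synapse 0 spikes (ms): {np.round(trains[0] * 1e3, 1)}")
```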

Fig. 1. Neuron model: (left) input spikes (with a synfire pattern at t = 25 ms) are (middle) modulated in time and amplitude, forming postsynaptic current pulses, and are finally (right) integrated at the soma. When the potential (solid line) reaches the threshold (dotted line), a spike is emitted and the potential is decreased. A sample PSP (synapse 1) is also shown.

1.2. Integrate-and-fire layer

We consider N_1 synapses (indexed by i) connected to a layer of N_2 neurons (indexed by j). These are a generalized version of IF neurons with synaptic current dynamics: a one-compartment model with no delay, whose synaptic contacts are characterized by their weights w_ji. The state variables are the N_1 × N_2 synaptic driving conductances g_ji and the N_2 membrane potentials V_j. Incoming spikes trigger these conductances by opening the driving gates with time constant τ_g:

$$\frac{dg_{ji}}{dt} = -\frac{1}{\tau_g} g_{ji} + w_{ji} S_i \quad (2)$$

and the potential V_j at the soma integrates, with time constant τ_V, the driving currents and the leaking current g_leak (with a resting potential V_rest ≈ −70 mV) (see Fig. 1, middle):

$$\tau_V \frac{dV_j}{dt} = g_{leak}(V_{rest} - V_j) + \sum_{1 \le i \le N_1} g_{ji}. \quad (3)$$

When V_j reaches the threshold potential V_threshold ≈ −54 mV from below, the target neuron fires (see Fig. 1, right) and is shunted (V_j is set e.g. to V_reset ≈ −75 mV).
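Section 3 states that the model was simulated with a forward-Euler discretization of the differential equations. A hypothetical Python transcription of Eqs. (2)–(3) with the membrane constants quoted above could look as follows; the time constants, time step, and current scale are our assumptions.

```python
# Forward-Euler sketch of Eqs. (2)-(3); a translation under assumed
# parameter values, not the authors' MATLAB code.
import numpy as np

def simulate_if_layer(trains, w, dt=1e-4, t_max=0.1,
                      tau_g=5e-3, tau_v=10e-3, g_leak=1.0,
                      v_rest=-70e-3, v_thresh=-54e-3, v_reset=-75e-3):
    """trains: list of N1 spike-time arrays; w: (N2, N1) weight matrix."""
    n2, n1 = w.shape
    steps = int(t_max / dt)
    # bin the Dirac input: S[i, t] = number of spikes of synapse i in bin t
    S = np.zeros((n1, steps))
    for i, ts in enumerate(trains):
        idx = np.clip((ts / dt).astype(int), 0, steps - 1)
        np.add.at(S[i], idx, 1.0)
    g = np.zeros((n2, n1))                    # driving conductances g_ji
    v = np.full(n2, v_rest)                   # membrane potentials V_j
    out_spikes = [[] for _ in range(n2)]
    for t in range(steps):
        # Eq. (2): g decays with tau_g; each spike kicks g_ji by w_ji
        g += dt * (-g / tau_g) + w * S[:, t]
        # Eq. (3): leaky integration of the summed driving currents
        v += (dt / tau_v) * (g_leak * (v_rest - v) + g.sum(axis=1))
        fired = v >= v_thresh
        v[fired] = v_reset                    # shunt after a spike
        for j in np.flatnonzero(fired):
            out_spikes[j].append(t * dt)
    return out_spikes
```

Feeding the trains from the previous sketch into simulate_if_layer with a random non-negative weight matrix should reproduce the qualitative picture of Fig. 1; the absolute firing rates depend on the arbitrary current scale chosen here.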

1.3. Reduced equations

We introduce reduced equations for this IF layer to study its dynamical behavior and simplify its implementation. This reduction follows the concept of the spike-response model (SRM), which was extensively studied in [2,3]. It is similar to [4], which aimed at reducing the STDHP equations to a set of first-order equations. Let

$$\frac{dc_i}{dt} = -\frac{1}{\tau_g} c_i + S_i, \qquad \tau_V \frac{dp_i}{dt} = -p_i + c_i; \quad (4)$$

then V_j = V_rest + Σ_{1≤i≤N_1} w_ji p_i verifies the equation system (2), (3), with g_ji = w_ji c_i. To account for the threshold mechanism at a spiking time t_j^k, we may then add a resetting value to V_j by setting η_j(t_j^k) = V_reset − V_threshold and then letting it relax:

$$\tau_V \frac{d\eta_j(t)}{dt} = -\eta_j(t). \quad (5)$$

So that, finally, an equivalent version of the IF layer consists of (4), (5) and

$$V_j(t) = V_{rest} + \sum_{1 \le i \le N_1} w_{ji}\, p_i(t) + \eta_j(t). \quad (6)$$

This formulation depends only on the present state and not on past values. It is therefore biologically more plausible and computationally cheaper. Integrating these equations after the emission of a presynaptic spike at t_i or a postsynaptic spike at t_j leads to

$$c_i(t) = \exp\left(-\frac{t - t_i}{\tau_g}\right), \quad (7)$$

$$p_i(t) = \frac{\tau_g}{\tau_V - \tau_g}\left(\exp\left(-\frac{t - t_i}{\tau_V}\right) - \exp\left(-\frac{t - t_i}{\tau_g}\right)\right), \quad (8)$$

$$\eta_j(t) = (V_{reset} - V_{threshold}) \exp\left(-\frac{t - t_j}{\tau_V}\right). \quad (9)$$

Equations (6)–(9) are the equivalent SRM version of our IF model. More precisely, Eq. (7) represents the postsynaptic current (PSC), see Fig. 1, middle, and Eq. (8) the postsynaptic potential (PSP), see Fig. 1, right, which may be experimentally observed.
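These kernels can be transcribed directly. The sketch below (again ours, assuming τ_g ≠ τ_V) evaluates Eqs. (7)–(9); together with Eq. (6) it gives the SRM form of the potential without integrating any differential equation.

```python
# Closed-form kernels of the reduced (SRM) model, Eqs. (7)-(9).
# A sketch assuming tau_g != tau_v; variable names are ours.
import numpy as np

def psc(t, t_i, tau_g):
    """Eq. (7): postsynaptic current after a presynaptic spike at t_i."""
    s = t - t_i
    return np.where(s >= 0, np.exp(-s / tau_g), 0.0)

def psp(t, t_i, tau_g, tau_v):
    """Eq. (8): postsynaptic potential after a presynaptic spike at t_i."""
    s = t - t_i
    a = tau_g / (tau_v - tau_g)
    return np.where(s >= 0,
                    a * (np.exp(-s / tau_v) - np.exp(-s / tau_g)), 0.0)

def reset_kernel(t, t_j, tau_v, v_reset=-75e-3, v_thresh=-54e-3):
    """Eq. (9): after-spike relaxation following an output spike at t_j."""
    s = t - t_j
    return np.where(s >= 0,
                    (v_reset - v_thresh) * np.exp(-s / tau_v), 0.0)

# Eq. (6): V_j(t) = v_rest + sum_i w_ji * psp(t, t_i) + reset_kernel(t, t_j)
```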

2. The learning mechanism

2.1. Definition of the cost function

Based on neurophysiological studies, we set the following principles: (1) learning is associated with a spiking response: the nth learning step occurs at the nth output firing time t^n; (2) to discriminate between the different input patterns, the output voltage should be close to a winner-take-all configuration: the potential of the winning neuron (which we index j = j_n) should be above threshold, whereas the other neurons should be hyperpolarized; (3) economy of the total synaptic efficacy and current use should be respected.

A possible cost function is therefore the squared distance to the target potentials of the neurons at the firing time t^n, added to a smoothness term and the total sum of the squared weights:

$$2E = \sum_{1 \le j \le N_2} (V_j - V_j^t)^2 + \mu \sum_{1 \le j \le N_2} \left(\frac{\partial V_j}{\partial t}\right)^2 + \nu \sum_{1 \le i \le N_1,\ 1 \le j \le N_2} w_{ji}^2, \quad (10)$$

$$V_j^t = V_{rest} \quad \text{for } j \ne j_n, \quad (11)$$

$$V_{j_n}^t = V_{threshold} + \Delta V, \quad (12)$$

where μ and ν are scaling parameters and we set ΔV ≈ 5 mV.

2.2. Gradient descent

It follows from Eqs. (10) and (6) that

$$\frac{\partial E}{\partial w_{ji}} = (V_j - V_j^t)\frac{\partial V_j}{\partial w_{ji}} + \mu\, \frac{\partial V_j}{\partial t}\,\frac{\partial}{\partial w_{ji}}\!\left(\frac{\partial V_j}{\partial t}\right) + \nu\, w_{ji} = (V_j - V_j^t)\, p_i + \mu\, \frac{\partial V_j}{\partial t}\,\frac{dp_i}{dt} + \nu\, w_{ji}.$$

We may therefore formulate the gradient descent algorithm in our model as

$$w_{ji}^{n+1} = w_{ji}^n - \gamma_n \frac{\partial E^n}{\partial w_{ji}} = (1 - \nu_n)\, w_{ji}^n + \eta_n (V_j^t - V_j)\frac{\partial V_j}{\partial w_{ji}} - \mu_n\, \frac{\partial V_j}{\partial t}\,\frac{\partial}{\partial w_{ji}}\!\left(\frac{\partial V_j}{\partial t}\right)$$

with learning factors η_n, μ_n and ν_n which satisfy (e.g. for η_n) $\sum_{n \ge 1} \eta_n = \infty$ and $\sum_{n \ge 1} \eta_n^2 < \infty$. Finally, writing $\dot p_i = dp_i/dt$,

$$w_{ji}^{n+1} = (1 - \nu_n)\, w_{ji}^n + \eta_n (V_j^t - V_j)\, p_i - \mu_n \left(\sum_{1 \le i' \le N_1} w_{ji'}\, \dot p_{i'}\right) \dot p_i. \quad (13)$$
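One learning step of Eq. (13) is a few lines of array code. In the sketch below, the learning-factor values are illustrative assumptions, and ∂V_j/∂t is taken as Σ_i w_ji ṗ_i, as in Eq. (13) (the contribution of the reset kernel is neglected).

```python
# Sketch of one learning step of Eq. (13), applied at an output firing time.
# eta_n, mu_n, nu_n are our labels and the default values are illustrative.
import numpy as np

def learning_step(w, p, p_dot, v, winner,
                  eta_n=1.0, mu_n=1e-8, nu_n=1e-4,
                  v_rest=-70e-3, v_thresh=-54e-3, dv=5e-3):
    """w: (N2, N1) weights; p, p_dot: (N1,) PSP traces and their derivatives;
    v: (N2,) potentials at the firing time; winner: index j_n."""
    v_target = np.full_like(v, v_rest)        # losers pulled to rest, Eq. (11)
    v_target[winner] = v_thresh + dv          # winner above threshold, Eq. (12)
    hebb = np.outer(v_target - v, p)          # (V_j^t - V_j) p_i term
    smooth = -np.outer(w @ p_dot, p_dot)      # -(dV_j/dt) p_dot_i term
    return (1.0 - nu_n) * w + eta_n * hebb + mu_n * smooth
```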

Fig. 2. STDHP: relative weight change versus the time difference between input and output spikes, in a single neuron modeled with our rule under the experimental conditions of [1].

2.3. Spike-time dependent plasticity

A closer look at Eq. (13) shows that the direction of the change of w_ji is proportional to (V_j^t − V_j) p_i. This is the Hebbian part of the rule: when a neuron j fires after the firing of synapse i, there is a mechanism that strengthens the connection. The strengthening therefore depends on the relative timing of the pre- and postsynaptic spikes (see Fig. 2), as is observed in biological systems [1].
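The window of Fig. 2 can be probed numerically by pairing a single presynaptic spike with a forced postsynaptic spike at a varying delay and recording the relative weight change, in the spirit of the pairing protocol of [1]. The sketch below reuses psp() and learning_step() from the previous sketches; the constants are again our assumptions, chosen so that the Hebbian term is visible.

```python
# Probe the STDP window of Fig. 2: pair one presynaptic spike with a forced
# postsynaptic spike at delay dt_pair and record the relative weight change
# given by Eq. (13). Reuses psp() and learning_step() defined above.
import numpy as np

tau_g, tau_v = 5e-3, 10e-3
v_thresh = -54e-3

for dt_pair in np.arange(-20e-3, 21e-3, 5e-3):    # dt_pair = t_post - t_pre
    t_post = 50e-3
    t_pre = t_post - dt_pair
    w = np.full((1, 1), 0.5)
    p = np.array([psp(t_post, t_pre, tau_g, tau_v)])
    # finite-difference estimate of p_dot at the postsynaptic firing time
    p_dot = np.array([(psp(t_post + 1e-5, t_pre, tau_g, tau_v) - p[0]) / 1e-5])
    v = np.array([v_thresh])                      # the neuron fires at t_post
    w_new = learning_step(w, p, p_dot, v, winner=0)
    dw = (w_new[0, 0] - w[0, 0]) / w[0, 0]
    print(f"dt = {dt_pair * 1e3:+5.1f} ms   dw/w = {dw:+.5f}")
```

With these constants the printout shows potentiation that decays with the pre-to-post delay and a small uniform decay (from the ν_n term) for acausal pairings, i.e. the asymmetric shape of Fig. 2 in caricature.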

3. Numerical results

We implemented this model using discrete versions of the differential equations (forward Euler method) in MATLAB.

3.1. Response to synfire patterns

In this experiment we presented synfire patterns to the layer. The weights were initialized at random so that the network could fire in response to all inputs. The patterns were presented at random, sufficiently separated times. This unsupervised learning converges quickly and, as may be observed in neuromuscular connectivity, the synapses tend to sparsify and each neuron tends to respond to only one input (see Fig. 3).

Fig. 3. Coherence detection: (left) different input patterns (firing times t = 100, 300, 500, 700, 900 ms) are (right) learnt by the system: only one neuron fires per input (100 learning steps).

3.2. Response to oriented bars

The next experiment consisted in applying these results to a basic retina whose input consists of centered, rotated lines. A fixed analog contrast layer (ON and OFF radial cells) then sends spikes to the learning layer, which adapts with the rule we presented. We observe the unsupervised emergence of V1-like receptive fields sensitive to orientation (see Fig. 4). Further experiments with lateral interactions and accounting for dendritic delay show even more realistic filters and a columnar architecture.

Fig. 4. Oriented-bar detection: after learning, the weights show sensitivity to orientation (black: OFF; white: ON; gray: neutral).

4. Conclusion

We have presented an original gradient descent method to find a learning rule for a layer of spiking neurons. The simplicity of the rule gives new insight into the mechanism behind the observed STDHP. Further work is in progress on the detection of asynchronous patterns.

However, this study should be extended to more realistic spike trains (e.g. bursts), account for more complex behavior (e.g. facilitation and depression), and may be extended to populations of neurons and to recurrent systems.

Acknowledgements

This work was initiated during the EU Advanced Course in Computational Neuroscience. LP wishes to thank its organizers, the teachers, the course-mates and his tutor, S. Panzeri.

References

[1] G.-Q. Bi, M.-M. Poo, Synaptic modifications in cultured hippocampal neurons: dependence on spike timing, synaptic strength, and postsynaptic cell type, J. Neurosci. 18 (1998) 10464–10472.

[2] W. Gerstner, Spiking neurons, in: W. Maass, C.M. Bishop (Eds.), Pulsed Neural Networks, MIT Press, Cambridge, MA, 1999, pp. 3–54 (Chapter 1).

[3] R. Kempter, W. Gerstner, J. van Hemmen, Hebbian learning and spiking neurons, Phys. Rev. E 59 (4) (1999) 4498–4514.

[4] L. Perrinet, A. Delorme, S. Thorpe, Network of integrate-and-fire neurons using rank order coding A: how to implement spike timing dependent plasticity, Neurocomputing 38–40 (2001) 817–822.

Laurent Perrinet is a Ph.D. student in computational neuroscience under the direction of Manuel Samuelides at CERT-ONERA, Toulouse, in close collaboration with the team of Simon Thorpe at CERCO-CNRS, Toulouse. He works on theoretical and simulated aspects of neural coding, especially on the implications of fast-categorization visual experiments. His working areas span learning (especially spike-timing dependent plasticity) and the statistics of natural images.