Neural spike statistics modify the impact of background noise
Neurocomputing 38–40 (2001) 445–450

Neural spike statistics modify the impact of background noise

Stefan D. Wilke*, Christian W. Eurich
Institut für Theoretische Physik, Universität Bremen, Germany

* Corresponding author. E-mail addresses: swilke@physik.uni-bremen.de (S.D. Wilke), eurich@physik.uni-bremen.de (C.W. Eurich).

Abstract

Neural populations in the neocortex typically encode multiple stimulus features, e.g., position, brightness, contrast, and orientation of a visual stimulus in the case of cells in area 17. Here, we perform a Fisher information analysis of the encoding accuracy of a neural population which is sensitive to D stimulus features. The neurons are assumed to exhibit a non-vanishing level of baseline activity. It is shown that the encoding accuracy decreases drastically with D if the spike count variance depends on the mean spike count, as is the case for Poissonian spike statistics. The need to reduce the susceptibility to background noise thus poses severe restrictions on the neural firing statistics or the number of encoded stimulus features. The results hold for uncorrelated as well as for correlated activity in the neural population. © 2001 Elsevier Science B.V. All rights reserved.

Keywords: Population coding; Response variability; Baseline activity; Fisher information

1. Introduction

The neocortex consists of dozens of areas whose neurons process different, although sometimes overlapping, features of sensory stimuli. This diversification of signal processing poses the problem of binding the different features belonging to a single object in order to obtain an appropriate cortical object representation [8]. This problem could be avoided by neurons that are equally sensitive to all, or at least to many, stimulus features. The observation that this is not the case leads to the
question why neurons in many areas are specialized in a restricted number of stimulus features. Part of the answer lies in the fact that the localization of a stimulus defined by a large number of features requires the neural population to cover a stimulus space of correspondingly high dimension. The number of neurons necessary for the representation of D features scales as a^D, where a is a measure for the acuity to be obtained in each dimension. Thus, there is a combinatorial explosion in the population size. However, the fact that large receptive fields yield a high representational accuracy [10,4,13] may restrict the number of required neurons.

Here, we relate the problem of the number of encoded dimensions to the spontaneous activity present in most cortical neurons. A Fisher information analysis [9,13,11,5,6,12] suggests that the neural population meets with a drastic loss of information content as D is increased. Satisfactory representational accuracy can be achieved through a restriction in D or through the presence of additive spike count noise.

2. Theoretical framework

Suppose that a neural population of N neurons encodes a stimulus x = (x_1, ..., x_D) consisting of D features. Since the neurons fire in a stochastic manner, the response to a given stimulus is characterized by the joint probability distribution P(n|x), where n = (n_1, ..., n_N) is the vector of spike counts measured within an observation time τ. In order to assess the coding accuracy achieved by the neural population, we employ the total Fisher information J(x),

J(x) := Σ_{i=1}^{D} J_i(x),    J_i(x) := E_n[ (∂ ln P(n|x) / ∂x_i)^2 ],    (1)

where E_n[·] denotes the expectation value over the distribution P(n|x), and J_i(x) is the Fisher information for the individual stimulus feature x_i.
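For independent Poisson neurons, Eq. (1) reduces to the familiar expression J(x) = τ Σ_k f_k′(x)²/f_k(x) [9]. As a minimal numerical sketch of this one-dimensional special case (the function name, population layout, and parameter values below are illustrative, not taken from the paper):

```python
import math

def fisher_info_poisson(x, centers, sigma, F, tau):
    """J(x) = tau * sum_k f_k'(x)^2 / f_k(x) for independent Poisson
    neurons with 1-D Gaussian tuning curves of width sigma and peak rate F."""
    J = 0.0
    for c in centers:
        f = F * math.exp(-0.5 * ((x - c) / sigma) ** 2)  # mean rate f_k(x)
        df = -f * (x - c) / sigma ** 2                   # derivative df_k/dx
        J += tau * df ** 2 / f
    return J

# hypothetical population: 101 neurons tiling the interval [0, 10]
centers = [0.1 * k for k in range(101)]
J_mid = fisher_info_poisson(5.0, centers, sigma=1.0, F=50.0, tau=0.5)
J_alt = fisher_info_poisson(4.0, centers, sigma=1.0, F=50.0, tau=0.5)
```

For a dense, uniform population the result is essentially translation invariant away from the edges of the covered interval, which is the regime assumed in the analysis below.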
In the present context, the total Fisher information is especially suitable since its inverse is a lower bound for the squared error of an unbiased stimulus reconstruction (Cramér–Rao inequality, see e.g. [3]),

E_n[ (x̂(n) − x)^2 ] ≥ 1/J(x).

This inequality holds for any unbiased spike-count based estimate x̂(n) of the stimulus x. Correlations between the neural spike counts are described by assuming that P(n|x) has the form of a Gaussian probability distribution with covariance matrix Q(x) [1]. A general factorization ansatz for this matrix was made by Zhang and Sejnowski [13],

Q_kl(x) = [δ_kl + (1 − δ_kl) q] ψ[f_k(x)] ψ[f_l(x)]    (2)

with a parameter 1 > q ≥ 0 measuring the correlation strength, and an arbitrary function ψ. The vector f(x) = (f_1(x), ..., f_N(x)) combines the mean firing rates, i.e., f(x) = ⟨n⟩/τ. As special cases, Eq. (2) covers additive noise (ψ(z) ≡ s with an arbitrary constant s > 0) [7], multiplicative noise (ψ(z) = s z), and independent Poissonian spiking in the
limit Fτ ≫ 1 (ψ(z) = √(z/τ) and q = 0). Note that, even in the absence of correlations (q = 0), these three noise models are substantially different in terms of their variance Q_kk(x): the additive noise has constant variance Q_kk(x) = s^2, while multiplicative and Poissonian noise have variances that increase with the mean, Q_kk(x) = s^2 f_k(x)^2 and Q_kk(x) = f_k(x)/τ, respectively.

The tuning functions f_k(x), k = 1, ..., N, are assumed to be of the form f_k(x) = F I(ξ_k^2), with ξ_k^2 := Σ_{i=1}^{D} ξ_{k,i}^2 and ξ_{k,i} := (x_i − c_{k,i})/σ_i, where σ_i denotes the neuronal tuning width for feature i. This form implies that the mean firing rate only depends on the sum of the squares of the rescaled distances ξ_{k,i} between the stimulus x and the tuning curve center c_k. The tuning curve shape function I is normalized to unity, so that F denotes the maximal firing rate. Note that the tuning widths σ_i need not be equal, thus allowing broad tuning for some and narrow tuning for other stimulus features.

If there is sufficient tuning curve overlap and N is large, the Fisher information for the model population is given by [11,13]

J = η (Π_{j=1}^{D} σ_j) (Σ_{i=1}^{D} 1/σ_i^2) K_ψ(F, τ, D, q)    (3)

with

K_ψ(F, τ, D, q) := [1/(1 − q)] A_ψ(F, τ, D) + [(2 − q)/(1 − q)] B_ψ(F, τ, D),    (4)

where η is the (fixed) number of tuning curves per unit volume in the stimulus space, and the functions A_ψ(F, τ, D) and B_ψ(F, τ, D) are factors that depend on the shape of the tuning curve and on the correlation model function ψ, but not on the tuning widths σ_i. They read

A_ψ(F, τ, D) := (4F^2/D) [π^{D/2}/Γ(1 + (D/2))] ∫_0^∞ dξ I′(ξ^2)^2 ξ^{D+1} / ψ[F I(ξ^2)]^2,    (5)

B_ψ(F, τ, D) := (4F^2/D) [π^{D/2}/Γ(1 + (D/2))] ∫_0^∞ dξ ψ′[F I(ξ^2)]^2 I′(ξ^2)^2 ξ^{D+1} / ψ[F I(ξ^2)]^2.    (6)

As noted in [1,13], the effect of uniform correlations of the form of Eq. (2) on the population's Fisher information is a factor 1/(1 − q) that increases the accuracy in the presence of positive correlations (q > 0).
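The 1/(1 − q) factor can be checked directly in a small example. The sketch below (illustrative function names and parameter values, not the authors' code) builds the covariance matrix of Eq. (2) for additive noise, ψ(z) ≡ s, and evaluates the Gaussian-model Fisher information J = f′ᵀ Q⁻¹ f′ for a derivative vector f′ whose components sum to zero:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= fac * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fisher_correlated(df, psi_values, q):
    """J = df^T Q^{-1} df with Q_kl = [delta_kl + (1 - delta_kl) q] psi_k psi_l,
    the covariance ansatz of Eq. (2)."""
    n = len(df)
    Q = [[(1.0 if k == l else q) * psi_values[k] * psi_values[l]
          for l in range(n)] for k in range(n)]
    y = solve(Q, df)
    return sum(d * yi for d, yi in zip(df, y))

s = 2.0                        # additive noise: psi(z) = s for every neuron
df = [1.0, -2.0, 1.0]          # tuning-curve derivatives, summing to zero
J0 = fisher_correlated(df, [s, s, s], q=0.0)
Jq = fisher_correlated(df, [s, s, s], q=0.5)
# Jq / J0 = 1 / (1 - q): positive correlations increase the accuracy
```

For any zero-sum derivative vector, Q f′ = s²(1 − q) f′, so the ratio Jq/J0 equals 1/(1 − q) exactly, in line with the statement above.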
3. Results and discussion

In order to incorporate background activity in the present model, suppose that the neurons' tuning functions f_k(x) consist of a stimulus response proportional to F I (where I is normalized to unity) and a constant baseline firing rate level Fζ (0 ≤ ζ ≤ 1),

f(z) = Fζ + F(1 − ζ) I(z).    (7)
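As a concrete sketch of Eq. (7) (illustrative names and values; the Gaussian shape I(z) = exp(−z/2) is the one used in the next paragraph):

```python
import math

def tuning_rate(z, F, zeta, shape=lambda z: math.exp(-0.5 * z)):
    """f(z) = F*zeta + F*(1 - zeta)*I(z), Eq. (7): a baseline F*zeta plus a
    stimulus response.  z is the squared rescaled distance xi^2; I(0) = 1."""
    return F * zeta + F * (1.0 - zeta) * shape(z)

F, zeta = 50.0, 0.1
peak = tuning_rate(0.0, F, zeta)    # at the tuning-curve center: f = F
far = tuning_rate(100.0, F, zeta)   # far from the center: f -> F*zeta
```

At the center the baseline and stimulus terms add up to the maximal rate F, while far from the center only the baseline F·ζ survives; it is this residual floor that carries no stimulus information but still contributes noise.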
Fig. 1. Total Fisher information J as a function of the baseline activity level ζ for different numbers D of encoded features. The curves have been normalized to unity at ζ = 0 in order to make them comparable. The curves for D = 1, 2, 3, 10 were obtained assuming independent Poisson neurons, Eq. (8); the dot-dashed curve results from the additive noise model ψ = const., Eq. (9).

For the population described in the previous section, the background activity implies that the terms A_ψ(F, τ, D) and B_ψ(F, τ, D) become functions of the baseline firing level ζ. Assuming independent Poissonian firing at high spike counts, Fτ ≫ 1, and Gaussian tuning curves, I(z) = exp(−z/2), one finds A_ψ ≫ B_ψ and

K_ψ(F, τ, D, q=0) = (Fτ/D) (2π)^{D/2} [1 − ζ + ζ Li_{D/2+1}(1 − 1/ζ)],    (8)

where Li_ν(z) := Σ_{k=1}^∞ z^k/k^ν is the polylogarithm function. The plot of this function in Fig. 1 demonstrates that the baseline activity (as measured by ζ) degrades the encoding accuracy, as expected. An interesting finding is that the degree to which this happens strongly depends on the number of stimulus features D. For large D, background activity appears to be disastrous for the encoding: at D = 10, for example, a noise level of ζ = 0.1 already leads to a 92% decrease of the Fisher information, which is equivalent to a more than threefold increase of the minimal reconstruction error. The underlying reason for this behavior is that, for high dimensionality, most of the volume of the receptive field falls into the "border region" of the tuning curve (I ≪ 1), where the majority of spikes belong to the background activity. A qualitatively similar dependence on ζ and D results for other tuning curve shapes, e.g. cosine tuning.
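The D-dependence can be reproduced numerically without the polylogarithm. In the sketch below (illustrative code, assuming Gaussian tuning and dense uniform tuning-curve coverage), the population sum behind the Fisher information is approximated by a radial integral over the rescaled distance ξ; the ratio to the ζ = 0 value gives the normalized curves of Fig. 1. For comparison, the same integral with a constant ("additive") noise variance is included, for which the loss factors out as (1 − ζ)² independently of D, as discussed in the next section:

```python
import math

def degradation(zeta, D, poisson=True, xi_max=10.0, steps=20000):
    """Fisher information at baseline level zeta relative to zeta = 0, for a
    dense population with Gaussian tuning I(z) = exp(-z/2) in D dimensions.
    The sum over neurons is approximated by a radial integral over xi.
    poisson=True : variance = mean (Poisson-like), integrand ~ f'^2 / f
    poisson=False: constant additive variance,     integrand ~ f'^2 / s^2"""
    h = xi_max / steps
    num = den = 0.0
    for i in range(1, steps + 1):
        xi = i * h
        g = math.exp(-0.5 * xi * xi)            # tuning shape I(xi^2)
        w = xi ** (D + 1)                       # radial measure and xi_i^2 term
        d2 = (1.0 - zeta) ** 2 * g * g          # squared tuning-curve derivative
        if poisson:
            num += w * d2 / (zeta + (1.0 - zeta) * g)
            den += w * g                        # the same integrand at zeta = 0
        else:
            num += w * d2
            den += w * g * g
    return num / den

r1 = degradation(0.1, 1)                    # D = 1: moderate loss
r10 = degradation(0.1, 10)                  # D = 10: ~92% loss, cf. the text
a10 = degradation(0.1, 10, poisson=False)   # additive noise: (1 - 0.1)^2 = 0.81
```

The Poisson ratio collapses from roughly 0.6 at D = 1 to below 0.1 at D = 10 for ζ = 0.1, matching the 92% figure quoted above, while the additive-noise ratio stays at 0.81 for every D.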
Eqs. (5)–(7) imply that the impact of background activity on the Fisher information depends on the correlation model ψ that is adopted. Assuming additive noise (ψ(z) ≡ s), for example, one finds

K_ψ(F, τ, D, q) = [4F^2 (1 − ζ)^2 / ((1 − q) s^2 D)] [π^{D/2}/Γ(1 + (D/2))] ∫_0^∞ dξ I′(ξ^2)^2 ξ^{D+1}.    (9)

The above equation shows that the ζ-dependence for this kind of firing rate variance is always proportional to (1 − ζ)^2, regardless of the dimension D. The corresponding normalized Fisher information is included in Fig. 1 (dot-dashed line). Apparently, it does not show the increase in susceptibility to background activity at large D, as is obvious from Eq. (9). In addition, its ζ-robustness is always superior to that of the Poisson model, even at D = 1. Thus, a different firing rate variance can remedy the drastic decrease of Fisher information with the level of background activity at large D.

4. Conclusion

Our Fisher information calculation of the encoding accuracy of a spontaneously active neural population suggests that system performance rapidly decreases with the number D of encoded stimulus features. This holds for the usual types of noisy responses, in which the variance of the spike count increases with the mean spike count. The problem can be avoided through a restriction in D, and it does not occur for special types of noise, e.g. noise with constant variance. The analysis of empirical data from cortical neurons will shed further light on the connection between encoding accuracy, stimulus-uncorrelated background activity, and the number of relevant stimulus features. For example, the measurement of local field potentials in cat area 17 suggests the existence of background activity related to the dynamics of the underlying cortical network [2]. The information-theoretic consequences of this behavior remain to be elucidated.

References

[1] L.F. Abbott, P. Dayan, The effect of correlated variability on the accuracy of a population code, Neural Comput. 11 (1999) 91–101.
[2] A. Arieli, A.
Sterkin, A. Grinvald, A. Aertsen, Dynamics of ongoing activity: explanation of the large variability in evoked cortical responses, Science 273 (1996) 1868–1871.
[3] T.M. Cover, J.A. Thomas, Elements of Information Theory, Wiley, New York, 1991.
[4] C.W. Eurich, H. Schwegler, Coarse coding: calculation of the resolution achieved by a population of large receptive field neurons, Biol. Cybernet. 76 (1997) 357–363.
[5] C.W. Eurich, S.D. Wilke, Multi-dimensional encoding strategy of spiking neurons, Neural Comput. 12 (2000) 1519–1529.
[6] C.W. Eurich, S.D. Wilke, H. Schwegler, Neural representation of multi-dimensional stimuli, in: S.A. Solla, T.K. Leen, K.-R. Müller (Eds.), Advances in Neural Information Processing Systems, Vol. 12, MIT Press, Cambridge, MA, 2000, pp. 115–121.
[7] K.O. Johnson, Sensory discrimination: neural processes preceding discrimination decision, J. Neurophysiol. 43 (1980) 1793–1815.
[8] A.L. Roskies, The binding problem, Neuron 24 (1999) 7–9.
[9] H.S. Seung, H. Sompolinsky, Simple models for reading neuronal population codes, Proc. Natl. Acad. Sci. USA 90 (1993) 10749–10753.
[10] H.P. Snippe, J.J. Koenderink, Discrimination thresholds for channel-coded systems, Biol. Cybernet. 66 (1992) 543–551.
[11] S.D. Wilke, C.W. Eurich, What does a neuron talk about?, in: M. Verleysen (Ed.), Proc. ESANN '99, D-Facto, Bruxelles, 1999, pp. 435–440.
[12] S.D. Wilke, C.W. Eurich, Representational accuracy of stochastic neural populations, Neural Comput. (2000), submitted.
[13] K. Zhang, T.J. Sejnowski, Neuronal tuning: to sharpen or broaden?, Neural Comput. 11 (1999) 75–84.

Stefan D. Wilke studied physics at the University of Konstanz (Germany) and at the Freie Universität Berlin (Germany). In 1998, he finished his Diploma thesis on mode-coupling theory of the glass transition in ionic fluids. Since 1998, he has been a Ph.D. student at the Department of Neurophysics at the University of Bremen. His research interests are theoretical aspects of neural coding and early visual processing.

Christian W. Eurich received his Ph.D. in Theoretical Physics in 1995 from the University of Bremen (Germany). As a postdoc, he worked with John Milton and Jack Cowan at the University of Chicago, and he was a guest researcher at the Max-Planck-Institut für Strömungsforschung in Göttingen (Germany) and at the RIKEN Brain Institute in Tokyo. Since 1997, he has been working at the Department of Theoretical Neurophysics at the University of Bremen. His research interests include neural networks with time delays, neural information processing and neural coding, visuomotor behavior, and psychophysical modelling.
More informationTransformation of stimulus correlations by the retina
Transformation of stimulus correlations by the retina Kristina Simmons (University of Pennsylvania) and Jason Prentice, (now Princeton University) with Gasper Tkacik (IST Austria) Jan Homann (now Princeton
More informationComparison of receptive fields to polar and Cartesian stimuli computed with two kinds of models
Supplemental Material Comparison of receptive fields to polar and Cartesian stimuli computed with two kinds of models Motivation The purpose of this analysis is to verify that context dependent changes
More informationDetection of spike patterns using pattern ltering, with applications to sleep replay in birdsong
Neurocomputing 52 54 (2003) 19 24 www.elsevier.com/locate/neucom Detection of spike patterns using pattern ltering, with applications to sleep replay in birdsong Zhiyi Chi a;, Peter L. Rauske b, Daniel
More informationHopfield Neural Network and Associative Memory. Typical Myelinated Vertebrate Motoneuron (Wikipedia) Topic 3 Polymers and Neurons Lecture 5
Hopfield Neural Network and Associative Memory Typical Myelinated Vertebrate Motoneuron (Wikipedia) PHY 411-506 Computational Physics 2 1 Wednesday, March 5 1906 Nobel Prize in Physiology or Medicine.
More informationMembrane equation. VCl. dv dt + V = V Na G Na + V K G K + V Cl G Cl. G total. C m. G total = G Na + G K + G Cl
Spiking neurons Membrane equation V GNa GK GCl Cm VNa VK VCl dv dt + V = V Na G Na + V K G K + V Cl G Cl G total G total = G Na + G K + G Cl = C m G total Membrane with synaptic inputs V Gleak GNa GK
More informationTHE retina in general consists of three layers: photoreceptors
CS229 MACHINE LEARNING, STANFORD UNIVERSITY, DECEMBER 2016 1 Models of Neuron Coding in Retinal Ganglion Cells and Clustering by Receptive Field Kevin Fegelis, SUID: 005996192, Claire Hebert, SUID: 006122438,
More informationUsing Gaussian Processes for Variance Reduction in Policy Gradient Algorithms *
Proceedings of the 8 th International Conference on Applied Informatics Eger, Hungary, January 27 30, 2010. Vol. 1. pp. 87 94. Using Gaussian Processes for Variance Reduction in Policy Gradient Algorithms
More informationRate- and Phase-coded Autoassociative Memory
Rate- and Phase-coded Autoassociative Memory Máté Lengyel Peter Dayan Gatsby Computational Neuroscience Unit, University College London 7 Queen Square, London WCN 3AR, United Kingdom {lmate,dayan}@gatsby.ucl.ac.uk
More informationcorrelations on the Mutual Information of the system. Our analysis provides an estimate of the effective number of statistically independent degrees o
Population Coding in Neuronal Systems with Correlated Noise Haim Sompolinsky, Hyoungsoo Yoon, Kukjin Kang, Maoz Shamir Racah Institute of Physics and Center for Neural Computation, The Hebrew University
More informationA Monte Carlo Sequential Estimation for Point Process Optimum Filtering
2006 International Joint Conference on Neural Networks Sheraton Vancouver Wall Centre Hotel, Vancouver, BC, Canada July 16-21, 2006 A Monte Carlo Sequential Estimation for Point Process Optimum Filtering
More informationBayesian probability theory and generative models
Bayesian probability theory and generative models Bruno A. Olshausen November 8, 2006 Abstract Bayesian probability theory provides a mathematical framework for peforming inference, or reasoning, using
More information2 Information transmission is typically corrupted by noise during transmission. Various strategies have been adopted for reducing or eliminating the n
Finite size eects and error-free communication in Gaussian channels Ido Kanter and David Saad # Minerva Center and Department of Physics, Bar-Ilan University, Ramat-Gan 52900, Israel. # The Neural Computing
More informationLinear Combinations of Optic Flow Vectors for Estimating Self-Motion a Real-World Test of a Neural Model
Linear Combinations of Optic Flow Vectors for Estimating Self-Motion a Real-World Test of a Neural Model Matthias O. Franz MPI für biologische Kybernetik Spemannstr. 38 D-72076 Tübingen, Germany mof@tuebingen.mpg.de
More informationPERFORMANCE STUDY OF CAUSALITY MEASURES
PERFORMANCE STUDY OF CAUSALITY MEASURES T. Bořil, P. Sovka Department of Circuit Theory, Faculty of Electrical Engineering, Czech Technical University in Prague Abstract Analysis of dynamic relations in
More informationResearch Article Hidden Periodicity and Chaos in the Sequence of Prime Numbers
Advances in Mathematical Physics Volume 2, Article ID 5978, 8 pages doi:.55/2/5978 Research Article Hidden Periodicity and Chaos in the Sequence of Prime Numbers A. Bershadskii Physics Department, ICAR,
More informationEstimation of information-theoretic quantities
Estimation of information-theoretic quantities Liam Paninski Gatsby Computational Neuroscience Unit University College London http://www.gatsby.ucl.ac.uk/ liam liam@gatsby.ucl.ac.uk November 16, 2004 Some
More informationEmpirical Bayes interpretations of random point events
INSTITUTE OF PHYSICS PUBLISHING JOURNAL OF PHYSICS A: MATHEMATICAL AND GENERAL J. Phys. A: Math. Gen. 38 (25) L531 L537 doi:1.188/35-447/38/29/l4 LETTER TO THE EDITOR Empirical Bayes interpretations of
More informationAn Introductory Course in Computational Neuroscience
An Introductory Course in Computational Neuroscience Contents Series Foreword Acknowledgments Preface 1 Preliminary Material 1.1. Introduction 1.1.1 The Cell, the Circuit, and the Brain 1.1.2 Physics of
More informationarxiv: v1 [q-bio.nc] 9 Mar 2016
Temporal code versus rate code for binary Information Sources Agnieszka Pregowska a, Janusz Szczepanski a,, Eligiusz Wajnryb a arxiv:1603.02798v1 [q-bio.nc] 9 Mar 2016 a Institute of Fundamental Technological
More informationReal and Modeled Spike Trains: Where Do They Meet?
Real and Modeled Spike Trains: Where Do They Meet? Vasile V. Moca 1, Danko Nikolić,3, and Raul C. Mureşan 1, 1 Center for Cognitive and Neural Studies (Coneural), Str. Cireşilor nr. 9, 4487 Cluj-Napoca,
More informationAdaptive Velocity Tuning for Visual Motion Estimation
Adaptive Velocity Tuning for Visual Motion Estimation Volker Willert 1 and Julian Eggert 2 1- Darmstadt University of Technology Institute of Automatic Control, Control Theory and Robotics Lab Landgraf-Georg-Str.
More informationOptimal quantization for energy-efficient information transfer in a population of neuron-like devices
Optimal quantization for energy-efficient information transfer in a population of neuron-like devices Mark D. McDonnell a, Nigel G. Stocks b, Charles E. M. Pearce c, Derek Abbott a a Centre for Biomedical
More informationStochastic Learning in a Neural Network with Adapting. Synapses. Istituto Nazionale di Fisica Nucleare, Sezione di Bari
Stochastic Learning in a Neural Network with Adapting Synapses. G. Lattanzi 1, G. Nardulli 1, G. Pasquariello and S. Stramaglia 1 Dipartimento di Fisica dell'universita di Bari and Istituto Nazionale di
More informationInfluence of Criticality on 1/f α Spectral Characteristics of Cortical Neuron Populations
Influence of Criticality on 1/f α Spectral Characteristics of Cortical Neuron Populations Robert Kozma rkozma@memphis.edu Computational Neurodynamics Laboratory, Department of Computer Science 373 Dunn
More informationEects of domain characteristics on instance-based learning algorithms
Theoretical Computer Science 298 (2003) 207 233 www.elsevier.com/locate/tcs Eects of domain characteristics on instance-based learning algorithms Seishi Okamoto, Nobuhiro Yugami Fujitsu Laboratories, 1-9-3
More informationGaussian Process Regression: Active Data Selection and Test Point. Rejection. Sambu Seo Marko Wallat Thore Graepel Klaus Obermayer
Gaussian Process Regression: Active Data Selection and Test Point Rejection Sambu Seo Marko Wallat Thore Graepel Klaus Obermayer Department of Computer Science, Technical University of Berlin Franklinstr.8,
More informationHow to do backpropagation in a brain
How to do backpropagation in a brain Geoffrey Hinton Canadian Institute for Advanced Research & University of Toronto & Google Inc. Prelude I will start with three slides explaining a popular type of deep
More informationSUPPLEMENTARY MATERIAL. Authors: Alan A. Stocker (1) and Eero P. Simoncelli (2)
SUPPLEMENTARY MATERIAL Authors: Alan A. Stocker () and Eero P. Simoncelli () Affiliations: () Dept. of Psychology, Uniersity of Pennsylania 34 Walnut Street 33C Philadelphia, PA 94-68 U.S.A. () Howard
More informationModel neurons!!poisson neurons!
Model neurons!!poisson neurons! Suggested reading:! Chapter 1.4 in Dayan, P. & Abbott, L., heoretical Neuroscience, MI Press, 2001.! Model neurons: Poisson neurons! Contents: Probability of a spike sequence
More information