Encoding and estimation: bottleneck and limits to visual fidelity


[Figure: cross-section of the retina. Light passes through the ganglion cell and interneuron layers before reaching the photoreceptors; ganglion cell axons form the optic nerve. Caption: encoding and estimation, bottleneck and limits to visual fidelity.]

The Neural Coding Problem  s(t) → {t_i}

Central goals for today:
- important properties of the coding process: to be complete, we must deal with spikes (not rates)
- desiderata and comparison of different approaches: how do we know when we are done? would like the 'answer' to provide functional and mechanistic insights
- reliability and coding efficiency: does the code approach fundamental limits?

[Figure: motion stimulus, velocity (degrees/sec, -80 to +80) vs. time (2000 to 2500 msec). Stimuli of interest change on a time scale comparable to the interval between spikes.]

WHY DO WE NEED MORE THAN RATE CODING?  r(t)

Most of what we know about how sensory signals are represented comes from measuring the dependence of the mean firing rate on some parameter of the stimulus (receptive fields, tuning curves, ...). While this has clearly been fruitful, some quantitative issues require a complete understanding of the code:
- the relation between neural activity and behavior
- comparison of fidelity at different stages of the system

In some systems behavior is clearly influenced by a small number of spikes, making the rate not a particularly useful quantity. In many systems the time scale of a behavioral decision is similar to the interval between spikes, so one cannot average over time to get a rate.

CODING IS PROBABILISTIC  s(t) ↔ {t_i}

encoding: $P[\{t_i\} \mid s(t)]$
decoding: $P[s(t) \mid \{t_i\}]$

Bayes' rule: $P[\{t_i\} \mid s(t)]\, P[s(t)] = P[s(t) \mid \{t_i\}]\, P[\{t_i\}]$

Any approach to coding must deal with its probabilistic nature!
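As a concreteness check, here is a minimal Python sketch of Bayes' rule on a made-up joint distribution over a binary stimulus and a binary response; the numbers are invented, and only the identity itself comes from the slide.

```python
import numpy as np

# Hypothetical joint distribution P[stimulus, response]: rows are two stimuli,
# columns are (no spike, spike). All values are made up for illustration.
joint = np.array([[0.45, 0.05],
                  [0.20, 0.30]])

p_stim = joint.sum(axis=1)                      # P[s(t)]
p_spike = joint.sum(axis=0)                     # P[{t_i}]
p_spike_given_stim = joint / p_stim[:, None]    # encoding: P[{t_i} | s(t)]
p_stim_given_spike = joint / p_spike[None, :]   # decoding: P[s(t) | {t_i}]

# Bayes' rule: both factorizations reconstruct the same joint distribution.
assert np.allclose(p_spike_given_stim * p_stim[:, None],
                   p_stim_given_spike * p_spike[None, :])
```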

PRACTICAL CONSIDERATIONS  s(t) → {t_i}

$P[\{t_i\} \mid s(t)]$ and $P[s(t) \mid \{t_i\}]$ are high dimensional:
- {t_i}: a ~100 msec window with 1 msec bins allows 2^100 possible spike words
- s(t): time dimensions plus others (space, color, ...)
- a typical experiment collects only ~10,000 spikes

It is therefore impossible in practice to measure the entire distribution; instead, try to capture the structure of $P[\{t_i\} \mid s(t)]$ or $P[s(t) \mid \{t_i\}]$ with finite data.

TAYLOR SERIES APPROXIMATION

$$f(x_1) = f(x_0) + (x_1 - x_0)\left.\frac{df}{dx}\right|_{x_0} + \frac{1}{2}(x_1 - x_0)^2 \left.\frac{d^2 f}{dx^2}\right|_{x_0} + \ldots$$

KEY: $f(x_0) \gg (x_1 - x_0)\left.\frac{df}{dx}\right|_{x_0}$, i.e. expand about a small parameter.

STRUCTURE OF CONDITIONAL DISTRIBUTIONS  s(t) → {t_i}

encoding, $P[\{t_i\} \mid s(t)]$: ambiguous; one stimulus maps onto many possible spike trains {t_i}
decoding, $P[s(t) \mid \{t_i\}]$: nearly single-valued in s(t)

Although encoding and decoding are in principle equivalent (Bayes' rule), in practice they may not be.

ENCODING: WIENER AND VOLTERRA APPROACHES  s(t) → r(t)

What are the statistical properties of the stimuli leading to a spike?
(1) Collect the stimuli preceding each spike
(2) Measure their mean, covariance, ...

FIRST-ORDER FIRING RATE ESTIMATE  s(t) → {t_i}

Dashed: measured r(t). Solid: $r_{est}(t) = \int d\tau\, K_1(\tau)\, s(t - \tau)$
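A minimal Python sketch of this first-order estimate, under assumed conditions: a hypothetical linear-nonlinear Poisson model cell driven by Gaussian white noise, with K_1 recovered (up to normalization) from the spike-triggered average. The kernel shape, rates, and durations are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, T, lags = 0.001, 200.0, 100              # 1 ms bins, 200 s recording, 100 ms kernel
n = int(T / dt)
s = rng.standard_normal(n)                   # Gaussian white-noise stimulus

# Hypothetical LN-Poisson cell: filter the stimulus, rectify, draw Poisson spikes.
taus = np.arange(lags) * dt
true_k = np.exp(-taus / 0.02) * np.sin(taus / 0.01)
drive = np.convolve(s, true_k, mode="full")[:n]
drive = (drive - drive.mean()) / drive.std()
rate = 40.0 * np.maximum(drive, 0.0)         # firing rate in spikes/s
spikes = rng.poisson(rate * dt)              # spike counts per bin

# Spike-triggered average: mean stimulus in the window preceding each spike bin.
spike_bins = np.nonzero(spikes)[0]
spike_bins = spike_bins[spike_bins >= lags]
sta = np.mean([s[t - lags:t][::-1] for t in spike_bins], axis=0)

# For Gaussian white noise K1 is proportional to the STA; scale it so the
# linear estimate r_est(t) = rbar + sum_tau K1(tau) s(t - tau) is in spikes/s.
rbar = spikes.sum() / T
k1 = sta * rbar / s.var()
r_est = rbar + np.convolve(s, k1, mode="full")[:n]
```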

ENCODING: WIENER AND VOLTERRA APPROACHES  K_1(τ), K_2(τ, τ'), ... → r(t)

PROS:
- systematic
- often easily interpreted

CONS:
- typically does not converge quickly: there is no small parameter, and one usually runs out of data before the terms stop contributing
- estimates the rate, not the spike train: hard to say how well it works, and it effectively assumes spikes are independent

Is the explosion of terms due to the assumption of independent spikes? If so, deal with spike interactions directly.

EJC (2000): "We are all losing, the only question is how badly."

DECODING: THE LOOK-UP-TABLE APPROACH (de Ruyter van Steveninck and Bialek, 1987)  {t_i} → s_est(t)

- Estimate the stimulus by replacing each spike with the spike-triggered average (a sketch of this single-spike version follows).
- Estimate the stimulus by replacing each interval with the appropriate interval-triggered average.
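A minimal sketch of the single-spike entry of the table, reusing `s`, `spikes`, `sta`, `lags`, and `n` from the encoding sketch above; a full look-up table would also include interval-triggered entries for multi-spike patterns.

```python
import numpy as np

s_est = np.zeros(n)
for t in np.nonzero(spikes)[0]:
    # The STA describes the stimulus *preceding* a spike, so paste a copy
    # of it backwards in time from each spike bin and sum the copies.
    start = max(t - lags, 0)
    s_est[start:t] += sta[:t - start][::-1]
```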

ENCODING:
- determine what stimulus features trigger a spike; use these to estimate the firing rate
- the expansion is in statistical aspects of the stimulus (mean, covariance, ...) prior to a spike

DECODING (look-up-table approach):
- determine what each spike sequence 'stands for'
- consider only a linear relation between the stimulus and each spike sequence
- the expansion is in spike sequences

ENTRIES IN THE LOOK-UP TABLE

The structure of the interval-triggered averages is not simply predicted from the spike-triggered average. (de Ruyter van Steveninck and Bialek, 1987)

STIMULUS ESTIMATES USING THE LOOK-UP TABLE

- complex patterns (at least up to 3 spikes) continue to help
- not enough data to go to 4-spike patterns (despite ~10^6 spikes in this experiment!)

(de Ruyter van Steveninck and Bialek, 1987)

LOOK-UP-TABLE APPROACH  {t_i} → s_est(t)

PROS:
- systematic
- easy to compare stimulus and estimate; simple to evaluate accuracy
- deals with the non-independence of spikes directly

CONS:
- typically does not converge quickly: again no small parameter; one runs out of data before clearly done
- implementation: resolving overlapping patterns is difficult

Possible reasons complex spike patterns continue to help:
- each spike sequence stands for something special
- refractoriness introduces another time scale, and we capture it only slowly by considering each spike sequence

DECODING AS FILTERING  {t_i} → s_est(t)  (functional approach)

$$s_{est}(t) = \sum_i F_1(t - t_i) + \sum_{i,j} F_2(t - t_i,\, t - t_j) + \ldots$$

Choose the F's to minimize $\chi^2 = \langle\, |s(t) - s_{est}(t)|^2\, \rangle$.
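A minimal sketch of the first-order term, assuming `s` and `spikes` from the sketches above: for long recordings the least-squares filter is $F_1(f) = \langle S(f) R^*(f)\rangle / \langle |R(f)|^2\rangle$, with spectra averaged over segments so the filter generalizes rather than reproducing the training stimulus exactly. The segment length is arbitrary.

```python
import numpy as np

rho = spikes - spikes.mean()                   # zero-mean spike train
seg = 4096                                     # arbitrary segment length
nseg = len(s) // seg
S_sr = np.zeros(seg // 2 + 1, dtype=complex)   # averaged cross-spectrum <S R*>
S_rr = np.zeros(seg // 2 + 1)                  # averaged spike-train power <|R|^2>
for k in range(nseg):
    Sk = np.fft.rfft(s[k * seg:(k + 1) * seg])
    Rk = np.fft.rfft(rho[k * seg:(k + 1) * seg])
    S_sr += Sk * np.conj(Rk)
    S_rr += np.abs(Rk) ** 2
F1 = S_sr / (S_rr + 1e-12)                     # least-squares decoding filter

# s_est(t) = sum_i F1(t - t_i): apply the filter to the spike train, segment by segment.
s_est = np.concatenate([
    np.fft.irfft(F1 * np.fft.rfft(rho[k * seg:(k + 1) * seg]), n=seg)
    for k in range(nseg)
])
```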

DIRECT DECODING IN A FLY MOTION-SENSITIVE NEURON (Bialek et al., 1991)

[Figure: linear estimation filter compared with the spike-triggered average.]

The quality of the estimate with the linear term alone is similar to that of 3-spike patterns in the look-up-table approach, i.e. the expansion seems to be converging.

DIRECT DECODING SUMMARY  {t_i} → s_est(t)

PROS:
- systematic
- easily interpreted
- seems to converge
- easy to evaluate accuracy

CONS:
- still no small parameter
- mechanistic interpretation?

ENCODING: determine what stimulus features trigger a spike.

DECODING:
- Look-up-table approach: determine what each spike sequence 'stands for'
- Direct approach: determine what each spike 'stands for', correcting for the bias of spiking dynamics

The Neural Coding Problem (recap)  s(t) → {t_i}

Central goals for today:
- important properties of the coding process: to be complete, we must deal with spikes (not rates)
- desiderata and comparison of different approaches: how do we know when we are done? would like the 'answer' to provide functional and mechanistic insights
- reliability and coding efficiency: does the code approach fundamental limits?

CODING EFFICIENCY: HOW DO YOU KNOW WHEN YOU ARE DONE?

coding efficiency = (information provided by spikes about the stimulus) / (spike train entropy)

[Figure: mappings between stimuli and spike trains, high efficiency vs. low efficiency.]

coding efficiency = (information provided by spikes about the stimulus) / (spike train entropy)

Numerator: how much does observation of the spike train reduce uncertainty about the stimulus?
- Prior: P[stimulus]; conditional distribution: P[stimulus | spike train]
- Measure the reduction in uncertainty on a logarithmic scale (entropy):
  - additivity: information from two independent measurements should add
  - relative measure: can compare information about different aspects of the stimulus (e.g. color and orientation)

Information theory (Shannon)

Observing the spikes takes the prior P[s] to the posterior P[s | {t_i}]:

$$I = -\int \mathcal{D}s\, P[s] \log_2 P[s] + \left\langle \int \mathcal{D}s\, P[s \mid \{t_i\}] \log_2 P[s \mid \{t_i\}] \right\rangle_{\{t_i\}}$$

For a Gaussian signal and noise:

$$R_{info} = \int df\, \log_2[1 + SNR(f)]$$
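A minimal sketch of the Gaussian-channel formula; the SNR(f) curve below is invented, and in an experiment it would be measured from signal and noise spectra (as on the next slides).

```python
import numpy as np

f = np.linspace(0.0, 200.0, 2001)            # frequency axis in Hz
snr = 10.0 * np.exp(-f / 50.0)               # hypothetical SNR(f) curve
df = f[1] - f[0]

# R_info = integral df log2[1 + SNR(f)], evaluated on the grid.
r_info = np.sum(np.log2(1.0 + snr)) * df     # bits per second
print(f"R_info = {r_info:.0f} bits/s")
```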

NOISE IN THE ESTIMATED STIMULUS

The difference between stimulus and estimate contains both random and systematic errors; separate them by measuring the correlation at each temporal frequency:

$$s_{est}(f) = g(f)\,[\,s(f) + n(f)\,]$$

slope g(f): systematic bias; scatter n(f): effective input noise
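A minimal sketch of this decomposition, assuming `s`, `s_est`, and `seg` from the decoding sketch above: the gain is the frequency-by-frequency regression slope ⟨S_est S*⟩ / ⟨|S|²⟩, and the residual scatter gives the effective input noise and hence SNR(f).

```python
import numpy as np

m = (min(len(s), len(s_est)) // seg) * seg       # use whole segments only
specs = [(np.fft.rfft(s[k:k + seg]), np.fft.rfft(s_est[k:k + seg]))
         for k in range(0, m, seg)]

Ssr = sum(Ek * np.conj(Sk) for Sk, Ek in specs)  # <S_est S*>
Sss = sum(np.abs(Sk) ** 2 for Sk, Ek in specs)   # <|S|^2>
g = Ssr / Sss                                    # slope: systematic bias g(f)

# Scatter around the regression line: effective input noise n(f) = s_est/g - s.
Snn = sum(np.abs(Ek / g - Sk) ** 2 for Sk, Ek in specs)
snr = Sss / (Snn + 1e-12)                        # SNR(f) for R_info = ∫ df log2[1 + SNR]
```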

NOISE IS APPROXIMATELY GAUSSIAN

[Figure: reconstruction of vibration amplitude using an afferent from the frog sacculus; the distribution of random errors in the estimate is approximately Gaussian.]

Information theory (Shannon), recap: observing the spikes takes the prior P[s] to the posterior P[s | {t_i}]; for a Gaussian signal and noise, $R_{info} = \int df\, \log_2[1 + SNR(f)]$.

SIGNAL-TO-NOISE IN A SACCULUS AFFERENT

$$R_{info} = \int df\, \log_2[1 + SNR(f)] = 150 \text{ bits/sec}$$

coding efficiency = (information provided by spikes about the stimulus) / (spike train entropy)

Denominator: how many distinguishable spike trains can the cell generate? P[spikes]

Spike train entropy (MacKay and McCulloch)

Binarize the spike train (e.g. 0 1 0 0 1 0 1 0 1 0) with bin size Δt and spike probability p per bin:

$$S(\Delta t) = -\frac{1}{\Delta t}\left[\, p \log_2 p + (1 - p)\log_2(1 - p)\,\right]$$

Coding efficiency: (1) measure $R_{info}(\Delta t)$; (2) measure $S(\Delta t)$; then $\varepsilon(\Delta t) = R_{info}(\Delta t) / S(\Delta t)$.
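A minimal sketch of this bookkeeping; the firing rate is a placeholder and the 150 bits/sec is the sacculus estimate from above, used here as a fixed number even though in an experiment R_info would itself be measured as a function of Δt.

```python
import numpy as np

def entropy_rate(rate_hz, dt):
    """MacKay-McCulloch entropy rate (bits/s) of a binary spike train:
    S(dt) = -(1/dt) [p log2 p + (1-p) log2(1-p)], with p = rate_hz * dt."""
    p = rate_hz * dt
    return -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p)) / dt

rate_hz = 50.0                     # hypothetical mean firing rate
r_info = 150.0                     # bits/s, e.g. the sacculus estimate above
for dt in (0.001, 0.002, 0.005):   # timing precision in seconds
    S = entropy_rate(rate_hz, dt)
    print(f"dt = {1e3 * dt:.0f} ms: S = {S:.0f} bits/s, efficiency = {r_info / S:.2f}")
```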

coding efficiency = (information provided by spikes about the stimulus) / (spike train entropy)

[Figure: left panel, entropy and information (bits/sec) vs. timing precision (msec); right panel, efficiency vs. timing precision (msec).]

CODING EFFICIENCY IS HIGHER FOR 'NATURAL' SIGNALS (Rieke, Bodnar and Bialek, 1995)

[Figure: ε(Δt) vs. timing precision (0 to 1.5 ms); efficiency for naturalistic sounds is substantially higher than for broad-band noise.]

Nonlinearities in the peripheral auditory system are matched to the statistical properties of natural signals.

Summary of Coding Efficiency Measurements

system                     efficiency
cricket mechanoreceptors   >50%
frog mechanoreceptors      >50%
frog auditory afferents    10-30% for broad-band noise; 50-90% for naturalistic sounds
retinal ganglion cells     ~20% for white-noise inputs (Warland, Reinagel and Meister, 1997)

EJC (2000): "We are all losing, the only question is how badly."

SOME OPEN QUESTIONS

- Coding in cell populations: distributed coding (e.g. correlations)
- Adaptive codes: how is efficient coding maintained when the properties of the input signals change?
- Statistics of natural images: efficiency of coding in ganglion cells
- Optimal coding: if coding is efficient, can we predict the dynamics?