Lateral organization & computation

Review: population encoding & decoding; lateral organization; efficient representations that reduce or exploit redundancy.

First order: retinotopic maps. During a fixation task, the retina-to-cortex mapping is well described by a log-polar model (see: smallretinacortexmap.nb). [Figure: log-polar retina-to-cortex map.]

Other maps? Grouping of what? See http://gallantlab.org/publications/huth-etal-212.html and http://gallantlab.org/semanticmovies/

[Figure: intensity histogram and response histogram.]

Second order, linear: PCA and sparse coding theories (dictionary methods), but a modified Oja rule is needed to capture all the components.

[Figure: firing rate (Hz) over time for real vs. illusory stimuli, static vs. flickering.]
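The log-polar mapping can be sketched in a few lines of code. This is a minimal illustration assuming the standard complex-log form w = k·log(z + a); the constants k and a below are made up and need not match those used in smallretinacortexmap.nb.

```python
import numpy as np

# Complex-log (log-polar) retina-to-cortex mapping sketch.
# k (cortical scale) and a (foveal offset) are illustrative values only.
k, a = 15.0, 0.5

def retina_to_cortex(ecc_deg, angle_rad):
    """Map a retinal location (eccentricity in deg, polar angle in rad)
    to a cortical location (x, y in mm) via w = k * log(z + a)."""
    z = ecc_deg * np.exp(1j * angle_rad)   # retinal position as a complex number
    w = k * np.log(z + a)
    return w.real, w.imag

# Doubling the eccentricity adds a roughly constant distance in cortex,
# which is the signature of the log-polar map.
for ecc in [0.5, 1, 2, 4, 8, 16, 32]:
    x, _ = retina_to_cortex(ecc, 0.0)
    print(f"eccentricity {ecc:5.1f} deg -> cortical x = {x:6.2f} mm")
```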

Efficient representations that reduce or exploit redundancy: autoencoder networks.

Second order: PCA is a linear transform that decorrelates the coefficients: E(s_i s_j) = E(s_i) E(s_j). ICA finds a linear decomposition such that p(s_i, s_j) = p(s_i) p(s_j), e.g., by minimizing a sparse-coding cost of the form

Σ_{x,y} [L(x,y) − Σ_i s_i A_i(x,y)]² + λ Σ_i B(s_i)

Hyvärinen, A. (2010). Statistical Models of Natural Images and Cortical Visual Representation. Topics in Cognitive Science, 2(2), 251–264.

PCA vs. linear discriminant analysis.

Higher-order structure? From Lecture 18: the responses of linear model neurons with receptive fields that are close in space, preferred orientation, or spatial frequency are not statistically independent. Schwartz, O., & Simoncelli, E. P. (2001). Natural signal statistics and sensory gain control. Nature Neuroscience, 4(8), 819–825.
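As a concrete check of the second-order claim, the short sketch below (my own illustration, not code from the lecture) runs PCA on patches of a synthetic correlated image and verifies that the coefficients are decorrelated, E(s_i s_j) ≈ E(s_i) E(s_j) for i ≠ j.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic correlated "image"; natural image patches would be used in practice.
img = rng.normal(size=(256, 256))
img = np.cumsum(np.cumsum(img, axis=0), axis=1)   # integration induces strong correlations

# Extract non-overlapping 8x8 patches and flatten them into vectors.
P = 8
patches = np.array([img[i:i + P, j:j + P].ravel()
                    for i in range(0, 256 - P, P)
                    for j in range(0, 256 - P, P)])
patches -= patches.mean(axis=0)

# PCA: project onto the eigenvectors of the patch covariance matrix.
C = np.cov(patches, rowvar=False)
_, evecs = np.linalg.eigh(C)
s = patches @ evecs                                # PCA coefficients s_i per patch

# Off-diagonal covariances of s are ~0, i.e. the coefficients are decorrelated.
Cs = np.cov(s, rowvar=False)
off_diag = Cs - np.diag(np.diag(Cs))
print("largest |off-diagonal covariance|:", np.abs(off_diag).max())
print("largest diagonal variance       :", np.diag(Cs).max())
```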

Higher-order structure? More on decorrelation: divisive normalization accounts for neurophysiological responses of neurons in V1. Schwartz, O., & Simoncelli, E. P. (2001). Natural signal statistics and sensory gain control. Nature Neuroscience, 4(8), 819–825.

Divisive normalization (from Heeger): the output of a linear spatial filter is squared and divided by the pooled outputs of other cortical cells, giving a firing rate

R_i = (Σ_{j=1}^{n} w_ij L_j)² / (σ + Σ_{k∈N_i} R_k)

This is a non-orthogonal decorrelation.

[Figure: the middle disks have the same physical luminance variance, but the one on the right appears more contrasty, i.e. to have higher variance.] This may be a behavioral consequence of an underlying nonlinearity in the spatial filtering properties of V1 neurons: divisive normalization derived from measures of the activity of other nearby neurons.

Contingent adaptation: McCollough, C. (1965, 3 September). Color Adaptation of Edge-Detectors in the Human Visual System. Science, 149, 1115–1116. [Figure: green/horizontal adapting patterns and B&W test gratings; R/G channels with Hebbian and anti-Hebbian connections.] Decorrelation due to adaptation: Barlow, H. B., & Földiák, P. (1989). Adaptation and decorrelation in the cortex. In C. Miall, R. Durbin, & G. J. Mitchison (Eds.), The Computing Neuron. Addison-Wesley. Mathematica notebooks: ContingentAdaptation.nb, Lect_24b_VisualRepCode.nb.

Lateral organization & neural codes: how do neural populations represent information? Working assumptions: lateral organization involves a population of neurons representing features at the same level of abstraction, with receptive fields organized along a topographically mapped dimension and overlapping selectivities. Decoding, i.e., inferring a world property from spikes, requires extracting information from the population.
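A small numerical sketch of divisive normalization in the form written above: squared linear drive divided by σ plus the pooled responses of neighboring cells, solved here by damped fixed-point iteration. The filter outputs, pool weights, and σ are illustrative choices, not the values used by Heeger or by Schwartz & Simoncelli (2001).

```python
import numpy as np

rng = np.random.default_rng(1)

n_cells, sigma = 16, 1.0
L0 = rng.gamma(shape=2.0, scale=1.0, size=n_cells)     # toy linear filter outputs
pool = np.ones((n_cells, n_cells)) - np.eye(n_cells)   # pool N_i = all other cells

def normalize(drive, n_iter=500):
    """Solve R_i = drive_i / (sigma + sum_{k in N_i} R_k) by damped fixed-point iteration."""
    R = drive / (sigma + drive.sum())                  # rough starting point
    for _ in range(n_iter):
        R = 0.5 * R + 0.5 * drive / (sigma + pool @ R)
    return R

# Scaling stimulus "contrast" makes the squared linear drive grow quadratically,
# while the normalized responses grow far more slowly.
for contrast in [0.5, 1.0, 2.0, 4.0]:
    drive = (contrast * L0) ** 2
    R = normalize(drive)
    print(f"contrast {contrast:3.1f}: mean drive = {drive.mean():8.2f}, "
          f"mean normalized response = {R.mean():6.2f}")
```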

Neural Implementations of Bayesian Inference. Lecture notes adapted from Alexandre Pouget, http://cms.unige.ch/neurosciences/recherche/

Perceptual encoding: learning to represent world properties in terms of firing patterns. Zemel, R. S., Dayan, P., & Pouget, A. (1998). Probabilistic interpretation of population codes. Neural Computation, 10(2), 403–430.

Perceptual decoding: interpretation of the encoded pattern by subsequent neural processes. Ma, W. J., Beck, J. M., Latham, P. E., & Pouget, A. (2006). Bayesian inference with probabilistic population codes. Nature Neuroscience, 9(11), 1432–1438. doi:10.1038/nn1790. Pouget, A., Beck, J., Ma, W. J., & Latham, P. (2013). Probabilistic brains: knowns and unknowns. Nature Neuroscience, 16, 1170–1178.

Poisson noise. Imagine the following process: we bin time into small intervals δt, and for each interval we toss a coin with probability P(head) = p; if we get a head, we record a spike. This is the Bernoulli process of PS #1. For small p, the number of spikes per second follows a Poisson distribution with mean p/δt spikes per second (e.g., p = 0.01 and δt = 1 ms give a mean of 10 spikes/s).

Properties of a Poisson process: the variance equals the mean, and the process does not care about the past, i.e., at a given time step the outcome of the coin toss is independent of the past (a renewal process). As a result, the inter-event intervals follow an exponential distribution. (Caution: an exponential ISI distribution is not by itself a good marker of a Poisson process.)
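The coin-tossing construction is easy to simulate. A short sketch (mine, using the illustrative values p = 0.01 and δt = 1 ms) checks that the spike count is approximately Poisson, with variance close to the mean, and that the inter-spike intervals are approximately exponential:

```python
import numpy as np

rng = np.random.default_rng(2)

p, dt, T = 0.01, 0.001, 1.0          # P(spike) per bin, bin width (s), trial duration (s)
n_trials = 5000
n_bins = int(T / dt)

# Bernoulli process: one coin toss per bin, a spike on every head.
spikes = rng.random((n_trials, n_bins)) < p
counts = spikes.sum(axis=1)           # spike count per 1-s trial

print("expected rate :", p / dt, "spikes/s")
print("mean count    :", counts.mean())
print("count variance:", counts.var())        # close to the mean, as for a Poisson process

# Inter-spike intervals from one long trial are ~exponential with mean dt/p.
long_trial = rng.random(500_000) < p           # 500 s at 1-ms resolution
isis = np.diff(np.flatnonzero(long_trial)) * dt
print("mean ISI      :", round(isis.mean(), 4), "s  (dt/p =", dt / p, "s)")
```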

Poisson process and spiking. The inter-spike interval (ISI) distribution is close to an exponential, except at short intervals (refractory period) and for bursting neurons.

The variance in the spike count is proportional to the mean, but the constant of proportionality can be higher than 1, and the variance can be a power-law function of the mean: log σ² = β log a + log α, i.e., σ² = α aᵝ, where a is the mean count. [Figure: actual data vs. a simulated Poisson process, with the Poisson model for comparison.]

Is Poisson variability really noise? Where could it come from? Neurons embedded in a recurrent network with sparse connectivity tend to fire with statistics close to Poisson (van Vreeswijk and Sompolinsky; Brunel; Banerjee). Could Poisson variability be useful for probabilistic computations, i.e., ones in which knowledge of uncertainty is represented and used? Poisson-like representations can be used for Bayesian integration of information.

To illustrate population coding, we return to orientation selectivity.
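The log-log relation above can be fit by linear regression in log space. The sketch below generates synthetic spike counts whose variance deliberately follows σ² = α aᵝ, with made-up values of α and β (not fits to real data), and recovers them:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic counts with variance sigma^2 = alpha * mean^beta (illustrative ground truth).
alpha_true, beta_true = 1.5, 1.2
means = np.linspace(2.0, 40.0, 20)
counts = [rng.normal(m, np.sqrt(alpha_true * m ** beta_true), size=2000) for m in means]

sample_mean = np.array([c.mean() for c in counts])
sample_var  = np.array([c.var() for c in counts])

# Fit log(sigma^2) = beta * log(a) + log(alpha) by least squares.
beta_hat, log_alpha_hat = np.polyfit(np.log(sample_mean), np.log(sample_var), 1)
print("beta estimate :", round(beta_hat, 3), " (true:", beta_true, ")")
print("alpha estimate:", round(float(np.exp(log_alpha_hat)), 3), " (true:", alpha_true, ")")
```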

The decoding problem. Given a stimulus with unknown orientation (or direction) s, what can one say about s given a vector r representing the pattern of neural activity? [Figure: tuning curves as a function of direction (deg), and the pattern of activity r as a function of preferred direction (deg).]

Estimation theory: come up with a single-value estimate of s from r. Bayesian approach: estimate the posterior p(s|r).

Advantages of a probabilistic representation: cue integration, e.g., visuo-tactile integration (Ernst and Banks, Nature, 2002). The bimodal posterior is p(s|Vision, Touch) = α p(s|Vision) p(s|Touch), where α is a normalization constant. Recall Ex 3 in PS #3: derive the optimal rule for integrating two noisy measurements to estimate the mean,

μ = (μ_Vision/σ²_Vision + μ_Touch/σ²_Touch) / (1/σ²_Vision + 1/σ²_Touch)

[Figure: probability as a function of s (width) for the visual, tactile, and bimodal distributions.]
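The integration rule above is a reliability-weighted average of the two cues. The sketch below (illustrative numbers, not Ernst & Banks' data) shows that multiplying the two Gaussian likelihoods on a grid gives the same mean, with a smaller variance than either cue alone:

```python
import numpy as np

# Two noisy cues about the same width s (illustrative values).
mu_v, sigma_v = 5.2, 1.0     # vision
mu_t, sigma_t = 4.0, 2.0     # touch

# Reliability-weighted combination, optimal for independent Gaussian cues.
w_v, w_t = 1 / sigma_v**2, 1 / sigma_t**2
mu_c  = (w_v * mu_v + w_t * mu_t) / (w_v + w_t)
var_c = 1 / (w_v + w_t)

# Same answer by multiplying the likelihoods: p(s|V,T) = alpha * p(s|V) * p(s|T).
s = np.linspace(-10.0, 20.0, 30001)
ds = s[1] - s[0]
gauss = lambda x, m, sd: np.exp(-0.5 * ((x - m) / sd) ** 2)
post = gauss(s, mu_v, sigma_v) * gauss(s, mu_t, sigma_t)
post /= post.sum() * ds
mean_grid = (s * post).sum() * ds
var_grid  = ((s - mean_grid) ** 2 * post).sum() * ds

print("closed form : mean =", round(mu_c, 4), " variance =", round(var_c, 4))
print("grid product: mean =", round(mean_grid, 4), " variance =", round(var_grid, 4))
```

Note that the combined variance (0.8) is smaller than either single-cue variance (1.0 and 4.0), which is the behavioral signature Ernst and Banks tested.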

Population codes. Standard approach: estimate the stimulus from the pattern of activity, e.g., with the population vector. Underlying assumption: population codes encode single values. [Figure: spike count r as a function of preferred stimulus, with the population vector estimate.]

Probabilistic population codes. Alternative: compute a posterior distribution p(s|r) from the population activity (Földiák, 1993; Sanger, 1996) with a Bayesian decoder, exploiting the fact that variability in neural responses for a constant stimulus is Poisson-like. [Figure: two population responses with gains g1 and g2 and their sum g = g1 + g2, together with the corresponding posteriors P(s|r1), P(s|r2), and P(s|r1+r2).]
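A minimal decoding sketch, with my own toy Gaussian tuning curves and gains rather than those of the figure: for independent Poisson neurons, log p(s|r) = Σ_i [r_i log f_i(s) − f_i(s)] + const, so summing two response vectors (with the gains adding) corresponds to multiplying, and therefore sharpening, the posteriors.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy Gaussian tuning curves with preferred stimuli on a linear axis.
pref = np.linspace(-40.0, 40.0, 17)
width, s_true = 10.0, 5.0
f = lambda s, gain: gain * np.exp(-0.5 * ((s - pref) / width) ** 2)

def posterior(r, gain, s_grid):
    """p(s|r) for independent Poisson neurons with known tuning curves (flat prior)."""
    logp = np.array([np.sum(r * np.log(f(s, gain) + 1e-12) - f(s, gain))
                     for s in s_grid])
    p = np.exp(logp - logp.max())
    return p / p.sum()

s_grid = np.linspace(-40.0, 40.0, 401)
r1 = rng.poisson(f(s_true, gain=5.0))     # population response with gain g1
r2 = rng.poisson(f(s_true, gain=10.0))    # population response with gain g2

for name, r, g in [("p(s|r1)   ", r1, 5.0),
                   ("p(s|r2)   ", r2, 10.0),
                   ("p(s|r1+r2)", r1 + r2, 15.0)]:
    p = posterior(r, g, s_grid)
    mean = (s_grid * p).sum()
    sd = np.sqrt(((s_grid - mean) ** 2 * p).sum())
    print(f"{name}: mean = {mean:6.2f}, sd = {sd:5.2f}")
```

The posterior based on the summed activity (gain g1 + g2) is narrower than either single-population posterior, which is the sense in which Poisson-like variability supports Bayesian integration of information.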