SPIKE TRIGGERED APPROACHES. Odelia Schwartz Computational Neuroscience Course 2017


LINEAR NONLINEAR MODELS o Often constrain to some form of linear, nonlinear computations, e.g., visual receptive fields or filters, followed by nonlinear interactions

LINEAR NONLINEAR MODELS What type of nonlinearities?

DESCRIPTIVE MODELS: DIVISIVE NORMALIZATION o Canonical computation (Carandini, Heeger, 2013) o Has been applied to primary visual cortex (V1) o More broadly, to other systems and modalities, multimodal processing, value encoding, etc

DESCRIPTIVE MODELS: COMPLEX CELLS AND INVARIANCE o after Adelson & Bergen, 1985

FITTING DESCRIPTIVE MODELS TO DATA Linear Nonlinear Poisson
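As a concrete illustration of the Linear-Nonlinear-Poisson (LNP) structure, here is a minimal simulation sketch in Python/NumPy. The filter shape, the rectifying nonlinearity, and all parameter values are illustrative assumptions, not fitted models from the lecture.

```python
# Minimal LNP simulation sketch (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(0)
D, T = 16, 100_000                      # stimulus dimensionality, number of frames

# Linear stage: one receptive-field filter (a Gabor-like profile, chosen for illustration)
x = np.linspace(-2, 2, D)
k = np.exp(-x**2) * np.cos(3 * x)
k /= np.linalg.norm(k)

# Gaussian white-noise stimulus, one frame per row
stim = rng.standard_normal((T, D))

# Nonlinear stage: half-wave rectification of the filter response (firing rates are positive)
rate = 5.0 * np.maximum(stim @ k, 0.0)  # expected spike count per frame (arbitrary scale)

# Poisson stage: spike counts per frame
spikes = rng.poisson(rate)
```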

ROADMAP o Simple cell traditional approach o Simple cell (STA) o When STA fails o Complex cell (STC) o Another example (STC) o More generic model with multiple filters

REMINDER: RECEPTIVE FIELD (Hubel and Wiesel, 1959) (figure: stimuli and spikes)

REMINDER: RECEPTIVE FIELD Primary Visual Cortex (V1)

RECEPTIVE FIELD o Response of a filter = inner/dot product/projection of filter with stimulus
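In code, this projection is a single dot product; a tiny sketch with a hypothetical filter and stimulus:

```python
import numpy as np

rng = np.random.default_rng(1)
k = rng.standard_normal(16)   # hypothetical receptive-field filter
s = rng.standard_normal(16)   # one stimulus frame of the same dimensionality
response = k @ s              # inner/dot product = projection of the stimulus onto the filter
```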

ROADMAP o Simple cell traditional approach o Simple cell (STA) o When STA fails o Complex cell (STC) o Another example (STC) o More generic model with multiple filters

SPIKE-TRIGGERED AVERAGE (STA)

EFFECT OF NONLINEARITY IN MODEL? o Nonlinearity sets negative filter responses to zero (firing rates are positive)

SPIKE-TRIGGERED AVERAGE (STA) o Stimuli that are more similar to filter are more likely to elicit a spike

Model: SPIKE-TRIGGERED AVERAGE (STA)

SPIKE-TRIGGERED AVERAGE (STA) (figure: STA response vs. random filter response)

SPIKE-TRIGGERED AVERAGE (STA) Geometrical view: change in the mean. A large filter response is likely to elicit a spike (figure: spike stimuli vs. raw stimuli; STA response vs. random filter response)

SPIKE-TRIGGERED AVERAGE (STA) o We can also recover the nonlinearity

STEPS 1. Assume a model (filter(s), nonlinearity) (we assumed one filter and an asymmetric nonlinearity) 2. Estimate model components (filter(s), nonlinearity) (we looked for changes in the mean: STA)
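A minimal sketch of step 2 for the STA case, using the same kind of simulated LNP cell as in the earlier sketch (filter shape, scaling, and bin count are illustrative assumptions): the filter estimate is the spike-weighted mean stimulus, and the nonlinearity is recovered as a ratio of histograms along the STA direction.

```python
# STA sketch: estimate the filter as the mean spike-triggered stimulus,
# then recover the nonlinearity along that direction (illustrative simulation).
import numpy as np

rng = np.random.default_rng(0)
D, T = 16, 100_000
x = np.linspace(-2, 2, D)
k = np.exp(-x**2) * np.cos(3 * x)
k /= np.linalg.norm(k)

stim = rng.standard_normal((T, D))                  # Gaussian white-noise frames
rate = 5.0 * np.maximum(stim @ k, 0.0)              # asymmetric (rectifying) nonlinearity
spikes = rng.poisson(rate)

# STA = spike-weighted average of the stimuli
sta = spikes @ stim / spikes.sum()
sta /= np.linalg.norm(sta)
print("overlap with true filter:", float(k @ sta))  # close to 1 with enough spikes

# Recover the nonlinearity as a ratio of histograms along the STA direction,
# i.e., mean spike count as a function of the filter response
proj = stim @ sta
bins = np.linspace(proj.min(), proj.max(), 30)
raw_counts, _ = np.histogram(proj, bins)
spike_counts, _ = np.histogram(proj, bins, weights=spikes)
nonlinearity = spike_counts / np.maximum(raw_counts, 1)
```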

ROADMAP o Simple cell traditional approach o Simple cell (STA) o When STA fails o Complex cell (STC) o Another example (STC) o More generic model with multiple filters

BUT STA DOES NOT ALWAYS WORK STA filter??

BUT STA DOES NOT ALWAYS WORK STA filter!

WHAT HAPPENED?? Nonlinearity maps negative filter responses to positive firing (firing rates are positive)

WHAT HAPPENED?? A large or small (negative) filter response is likely to elicit a spike, so the mean of the stimuli eliciting spikes = 0 (figure: spike stimuli vs. raw stimuli; model filter vs. STA filter; random filter response)

CHANGE IN THE VARIANCE A large or small filter response is likely to elicit a spike: the spike-eliciting stimuli show a change in variance along the model filter, not a change in the mean (figure: spike stimuli vs. raw stimuli; model filter response positive/0/negative; STA filter; random filter response)

SPIKE-TRIGGERED COVARIANCE (STC) Standard linear algebra techniques (eigenvector analysis) recover the changes in variance (figure: spike stimuli vs. raw stimuli; model filter response positive/0/negative; random filter response)

SPIKE-TRIGGERED COVARIANCE (STC) o We can also recover the nonlinearity (figure: model filter response vs. random filter response)

STEPS 1. Assume a model (filter(s), nonlinearity) (we assumed one filter and a symmetric nonlinearity) 2. Estimate model components (filter(s), nonlinearity) (STA failed) (we looked for changes in variance: STC)

SPIKE-TRIGGERED COVARIANCE (STC) o Figure from Schwartz et al. 2006; see also Rust et al. 2005, de Ruyter & Bialek 1988 o Approach estimates linear subspace and nonlinearity
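A minimal STC sketch for a complex-cell-like (symmetric) model, where the STA is near zero: the recovered filters are the eigenvectors of the spike-triggered covariance (relative to the raw covariance) with the largest changes in variance. The quadrature-pair filters, energy nonlinearity, and parameter values are illustrative assumptions.

```python
# STC sketch for a symmetric (energy-model) cell, where STA fails (illustrative simulation).
import numpy as np

rng = np.random.default_rng(0)
D, T = 16, 200_000
x = np.linspace(-2, 2, D)
k1 = np.exp(-x**2) * np.cos(3 * x); k1 /= np.linalg.norm(k1)
k2 = np.exp(-x**2) * np.sin(3 * x); k2 /= np.linalg.norm(k2)   # quadrature pair

stim = rng.standard_normal((T, D))
rate = 2.0 * ((stim @ k1) ** 2 + (stim @ k2) ** 2)             # symmetric nonlinearity
spikes = rng.poisson(rate)

sta = spikes @ stim / spikes.sum()                 # ~0 here: STA fails for this cell

# Spike-triggered covariance (around the STA), compared with the raw stimulus covariance
resid = stim - sta
C_spike = (resid * spikes[:, None]).T @ resid / spikes.sum()
C_raw = np.cov(stim, rowvar=False)

# Eigenvectors with the largest |eigenvalue| of the difference capture the changes in variance
evals, evecs = np.linalg.eigh(C_spike - C_raw)
order = np.argsort(np.abs(evals))[::-1]
recovered = evecs[:, order[:2]]   # spans (approximately) the same subspace as k1, k2
```

Note that the recovered eigenvectors need not equal k1 and k2 individually; they only span the same subspace (one of the caveats discussed below).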

ROADMAP o Simple cell traditional approach o Simple cell (STA) o When STA fails o Complex cell (STC) o Another example (STC) o More generic model with multiple filters

SPIKE-TRIGGERED COVARIANCE (STC)

CHANGE IN VARIANCE (STC) (figure: spike stimuli vs. raw stimuli)

CHANGE IN VARIANCE (STC) STA filter!

CHANGE IN VARIANCE (STC)

STEPS 1. Assume a model (filter(s), nonlinearity) (we assumed more than one filter and a symmetric nonlinearity) 2. Estimate model components (filter(s), nonlinearity) (STA failed) (we looked for changes in variance: STC)

ROADMAP o Simple cell traditional approach o Simple cell (STA) o When STA fails o Complex cell (STC) o Another example (STC) o More generic model with multiple filters

SECOND FILTER SUPPRESSIVE (E.G., DIVISIVE)

SECOND FILTER SUPPRESSIVE (E.G., DIVISIVE) The second filter brings about a reduction in variance! (figure: spike stimuli vs. raw stimuli)

STEPS 1. Assume a model (filter(s), nonlinearity) (we assumed more than one filter and a symmetric nonlinearity) 2. Estimate model components (filter(s), nonlinearity) (we looked for changes in variance, this time reduced variance: STC)
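A minimal sketch of the suppressive case, assuming (purely for illustration) a divisive model in which one filter drives the cell and a second filter divides it: the suppressive filter appears as an axis of reduced variance, i.e., a negative eigenvalue of the covariance difference.

```python
# Suppressive-filter sketch: reduced variance shows up as a negative STC eigenvalue
# (the divisive model form and all parameters are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(0)
D, T = 16, 200_000
x = np.linspace(-2, 2, D)
k_exc = np.exp(-x**2) * np.cos(3 * x); k_exc /= np.linalg.norm(k_exc)
k_sup = np.exp(-x**2) * np.sin(3 * x); k_sup /= np.linalg.norm(k_sup)

stim = rng.standard_normal((T, D))
rate = 5.0 * (stim @ k_exc) ** 2 / (1.0 + (stim @ k_sup) ** 2)   # divisive suppression
spikes = rng.poisson(rate)

C_spike = (stim * spikes[:, None]).T @ stim / spikes.sum()
evals, evecs = np.linalg.eigh(C_spike - np.cov(stim, rowvar=False))
exc_axis = evecs[:, -1]   # largest positive eigenvalue: increased variance (excitatory)
sup_axis = evecs[:, 0]    # most negative eigenvalue: reduced variance (suppressive)
```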

SPIKE TRIGGERED APPROACHES Change in the mean (STA); changes in the variance (STC): complex cell, divisive normalization

ROADMAP o Simple cell traditional approach o Simple cell (STA) o When STA fails o Complex cell (STC) o Another example (STC) o More generic model with multiple filters

MORE GENERAL CLASS OF MODEL Look for changes in both the mean and the variance
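One simple way to combine the two in the same framework (a sketch under the assumptions of the simulations above, not a specific published estimator): take the STA as the change-in-mean direction, project it out, and add the STC eigenvectors with the largest changes in variance.

```python
# Sketch: combine STA (change in mean) and STC (changes in variance) into one subspace.
# Assumes stim is a (T, D) stimulus array and spikes a length-T spike-count array.
import numpy as np

def sta_stc_subspace(stim, spikes, n_stc=2):
    """Orthonormal basis containing the STA and the top-|eigenvalue| STC axes."""
    sta = spikes @ stim / spikes.sum()
    resid = stim - sta                                  # remove the change in mean
    C_spike = (resid * spikes[:, None]).T @ resid / spikes.sum()
    evals, evecs = np.linalg.eigh(C_spike - np.cov(stim, rowvar=False))
    order = np.argsort(np.abs(evals))[::-1][:n_stc]     # largest changes in variance
    basis = np.column_stack([sta / np.linalg.norm(sta), evecs[:, order]])
    q, _ = np.linalg.qr(basis)                          # orthonormalize the combined basis
    return q
```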

APPLICATION: V1 EXPERIMENT

V1 NEURAL DATA: SPIKE-TRIGGERED COVARIANCE o Example V1 neuron: estimated filters from Rust et al. 2005

V1 NEURAL DATA: RECALL THE STANDARD MODELS But data show multiple filters (excitatory and suppressive) for both. Are these really two different classes of neurons, or is there a continuum?

STC ISSUES: HOW MANY SPIKES? Filter estimate depends on: o spatial and temporal dimensionality of the input stimulus (smaller = better estimate) o number of spikes (more = better estimate)

STC CAVEATS Analysis forces filters that are 90 degrees apart! Filters should not be taken literally as physiological mechanisms

STC CAVEATS But the true filters are linear combinations of the original ones (they span the same subspace)

STC CAVEATS o Analysis forces filters that are 90 degrees apart! (filters should not be taken literally as physiological mechanisms) o Spiking in the neuron may be non-Poisson (bursts; refractory period; etc.) (filters should not be taken literally as physiological mechanisms) o There might be more filters affecting the neural response than the analysis finds o STC is guaranteed to work only for Gaussian stimuli o There might be changes that are not in the mean or variance (other approaches; e.g., information theory)

EXAMPLE: FITTING LN-LN MODEL o Figure from Pagan et al. 2015 describing retina and V1 with subunits (see Rust et al. 2005; Vintch et al. 2015) o Pagan et al. 2015 address higher-level brain areas o See also Rowekamp et al. 2017 addressing area V2

EXAMPLE: GENERALIZED LINEAR MODEL (figure: stimulus filter, nonlinearity e^x, stochastic spiking, post-spike filter, and coupling filters between Neuron 1 and Neuron 2) o Figure from Pillow et al., 2008, describing retina
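A minimal sketch of the conditional intensity in this figure for two coupled neurons: each cell's rate is exp(stimulus drive + its own spike-history drive + coupling from the other cell), with Poisson spiking per time bin. All filters and parameter values here are illustrative assumptions, not the fitted retinal filters of Pillow et al., 2008.

```python
# GLM simulation sketch: rate = exp(stimulus filter + post-spike filter + coupling filter)
# for two coupled neurons (all filters and parameters are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(0)
T, D, L = 5_000, 16, 20                 # time bins, stimulus dimensions, history length

k = [0.1 * rng.standard_normal(D) for _ in range(2)]        # stimulus filters
h = [-np.exp(-np.arange(L) / 3.0) for _ in range(2)]        # post-spike filters (refractoriness)
c = [0.2 * np.exp(-np.arange(L) / 5.0) for _ in range(2)]   # coupling filters from the other cell

stim = rng.standard_normal((T, D))
spikes = np.zeros((2, T))
for t in range(T):
    hist = spikes[:, max(0, t - L):t][:, ::-1]              # recent spike history, newest first
    for i in range(2):
        drive = stim[t] @ k[i]
        drive += hist[i] @ h[i][:hist.shape[1]]             # own spike history (post-spike filter)
        drive += hist[1 - i] @ c[i][:hist.shape[1]]         # coupling from the other neuron
        spikes[i, t] = rng.poisson(min(np.exp(drive), 10.0))  # e^x nonlinearity, Poisson spiking
```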