Flexible Gating of Contextual Influences in Natural Vision. Odelia Schwartz University of Miami Oct 2015
1 Flexible Gating of Contextual Influences in Natural Vision Odelia Schwartz University of Miami Oct 2015
2
3 Contextual influences Perceptual illusions: no man is an island. Review paper on context: Schwartz, Hsu, Dayan, Nature Reviews Neuroscience 2007
5 Contextual influences Perceptual illusions
6 What about neurons? Cortical neural processing: record from a neuron
7 What about neurons? Computer science / Engineering: visual receptive field or filter
8 Contextual influences Cortical visual neurons (V1)?
9 Motivation Spatial context plays a critical role in object grouping, recognition, and segmentation. It is key to everyday behavior; deficits have been implicated in neurological and developmental disorders and in aging. There is a range of existing experimental data on spatial context (neural; perceptual), but a principled explanation is lacking, and we poorly understand how we (and our cortical neurons) process complex, natural images
10 Outline Experimental data on cortical responses to natural images Computational neural model that captures contextual regularities in natural images Interplay of modeling with biological neural and psychology data (focus on natural images data)
11 Spatial context and natural scenes: suppression index (SI) of cortical neurons [figure: A, example neuron response versus stimulus size (deg); B, distribution of suppression indices] Data: Adam Kohn lab (Coen-Cagli, Kohn, Schwartz, 2015; in press)
12 Cortical Neurons. Spatial context and natural scenes [figure: modulation ratio (MR; facilitation versus suppression) and suppression index (SI) per image ID, with the surround inference OFF or ON; stimulus size in deg] Data: Adam Kohn lab (Coen-Cagli, Kohn, Schwartz, 2015; in press)
13 Cortical Neurons Spatial context and natural scenes Can we capture data with canonical divisive normalization? (descriptive model)
14 Divisive normalization. Standard normalization: the RF response is divided by a surround pool. A descriptive model and a canonical computation (Carandini, Heeger, Nature Reviews Neuroscience, 2012). It has been applied to visual cortex, as well as to other systems and modalities, multimodal processing, value encoding, etc.
15 Cortical Neurons. Canonical divisive normalization: R_c = x_c^2 / (sigma^2 + x_c^2 + x_s^2), where x_c is the center (receptive field) drive and x_s the surround drive of a V1 neuron. Data: Kohn lab
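As a minimal sketch of the canonical computation (the exponent, the constant sigma, and the specific drive values are illustrative assumptions, not the fitted values from the data):

```python
def canonical_normalization(x_center, x_surround, sigma=0.5):
    """Canonical divisive normalization (Heeger-style energy form):
    the squared center drive is divided by a pool that includes the
    surround drive plus a stabilizing constant."""
    return x_center**2 / (sigma**2 + x_center**2 + x_surround**2)

# Same center stimulus, with and without surround drive:
r_alone = canonical_normalization(1.0, 0.0)       # center alone
r_suppressed = canonical_normalization(1.0, 1.0)  # surround suppresses the response
print(r_alone, r_suppressed)
```

Adding surround drive always lowers the response in this standard form, which is exactly the rigidity the flexible model later relaxes.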
16 Cortical responses to natural images [figure: modulation ratio (MR; facilitation versus suppression) and suppression index (SI) per image ID; data versus standard model] We fit the standard normalization model to neural data: poor prediction quality. Data: Adam Kohn lab. Coen-Cagli, Kohn, Schwartz, 2015 (in press)
17 Cortical responses to natural images [figure: MR and SI per image ID; data versus standard model] Can we explain the data as a strategy to encode natural images optimally, based on expected contextual regularities? Data: Adam Kohn lab. Coen-Cagli, Kohn, Schwartz, 2015 (in press)
18 Outline Experimental data on cortical responses to natural images (the standard descriptive model can't explain them) Computational neural model that captures contextual regularities in natural images Interplay of modeling with biological neural and psychology data (focus on natural images data)
19 Two overarching computational principles Sensory processing as inference of properties of the input (can be formalized via probabilistic Bayesian inference) Sensory systems aim to form an efficient code by reducing redundancies of the input (Barlow; also Attneave); influenced by information theory in the 1950s
20 Contextual dependencies across space
25 Contextual dependencies across space Schwartz, Simoncelli, Nature Neuroscience 2001
26 Generative model framework Hypothesize that cortical neurons aim to reduce statistical dependencies, so as to highlight what is salient (Schwartz, Simoncelli 2001; for salience: Zhaoping Li, 2002). Formally, we build a generative model of the dependencies and invert the model (Bayesian inference), giving a richer representation (Andrews, Mallows, 1974; Wainwright, Simoncelli, 2000; Schwartz, Sejnowski, Dayan 2006). Generating the dependencies is a multiplicative process, and to undo the dependencies we divide
27 Modeling statistical dependencies: Gaussian Scale Mixture (GSM). Filter activations x1, x2 (Andrews & Mallows, 1974; Wainwright & Simoncelli, 2000)
28 Modeling statistical dependencies: Gaussian Scale Mixture (GSM). Filter activations x1 = v g1, x2 = v g2, with local Gaussians g1, g2 and a global (shared) mixer v (Andrews & Mallows, 1974; Wainwright & Simoncelli, 2000)
29 Modeling statistical dependencies: Gaussian Scale Mixture (GSM). Local Gaussians g1, g2; global (shared) mixer v; filter activations x1, x2. Infer the local Gaussian: E(g1 | x1, x2) = model neuron activity
30 Modeling statistical dependencies: Gaussian Scale Mixture (GSM). Filter activations x1 = v g1, x2 = v g2, with local Gaussians and a global (shared) mixer. EFFICIENT CODING
31 Modeling statistical dependencies: Gaussian Scale Mixture (GSM). Computed via Bayes' rule: E(g1 | x1, x2) is proportional to x1 / lambda, with lambda = sqrt(x1^2 + x2^2). DIVISIVE NORMALIZATION
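A minimal numerical sketch of this inference (the toy filter pool size, mixer distribution, and constants are assumptions for illustration): multiplying independent Gaussians by a shared mixer creates dependencies between filter magnitudes, and dividing by the pooled magnitude largely removes them.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_filters = 100_000, 16

# GSM generative model: each filter activation is x_i = v * g_i, with
# independent local Gaussians g_i and one shared positive mixer v
g = rng.standard_normal((n_samples, n_filters))
v = np.abs(rng.standard_normal(n_samples)) + 0.1
x = v[:, None] * g

# The shared mixer makes the magnitudes dependent, although the g_i are not
corr_raw = np.corrcoef(np.abs(x[:, 0]), np.abs(x[:, 1]))[0, 1]

# Inverting the model (inferring g) amounts to dividing by a pooled magnitude:
# divisive normalization, which largely removes the dependency
lam = np.sqrt(0.1**2 + (x**2).sum(axis=1))
g_hat = x / lam[:, None]
corr_norm = np.corrcoef(np.abs(g_hat[:, 0]), np.abs(g_hat[:, 1]))[0, 1]

print(f"magnitude correlation: raw {corr_raw:.2f}, normalized {corr_norm:.2f}")
```

The raw magnitudes are substantially correlated; after divisive normalization the correlation is close to zero, illustrating the efficient-coding reading of normalization.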
32 Divisive Normalization Canonical Model Divisive normalization descriptive models have been applied in many neural systems. Here we provide a principled explanation. We will next show that it also leads to a richer model based on image statistics and makes predictions
33 Non-homogeneity of images: homogeneous image patches, where center and surround are dependent (one shared mixer: x_c = v_c g_c, x_s = v_c g_s). Schwartz, Sejnowski, Dayan, 2009; Coen-Cagli, Dayan, Schwartz, PLoS Comp Biology 2012
34 Non-homogeneity of images: non-homogeneous image patches, where center and surround are independent (separate mixers: x_c = v_c g_c, x_s = v_s g_s). Schwartz, Sejnowski, Dayan, 2009; Coen-Cagli, Dayan, Schwartz, PLoS Comp Biology 2012
35 Non-homogeneity of images [figure: GSM equations (x = vg and E[g|x] proportional to x/lambda) alongside example image patches; homogeneous patches share one mixer across center x_c and surround x_s, heterogeneous patches have separate mixers v_c, v_s] Schwartz, Sejnowski, Dayan, 2009; Coen-Cagli, Dayan, Schwartz, PLoS Comp Biology 2012
36 Non-homogeneity of images [figure: the same model; homogeneous patches imply divisive normalization ON, heterogeneous patches imply divisive normalization OFF] Schwartz, Sejnowski, Dayan, 2009; Coen-Cagli, Dayan, Schwartz, PLoS Comp Biology 2012
37 Non-homogeneity of images [figure: further example image patches with the inferred center-surround configuration] Schwartz, Sejnowski, Dayan, 2009; Coen-Cagli, Dayan, Schwartz, PLoS Comp Biology 2012
38 Model: optimizing over an image ensemble
- 3x3 spatial positions, 6px separation
- 4 orientations in the center
- 4 orientations in the surround
- phases in quadrature
- model parameters (the prior probability of a shared mixer, and the linear covariance matrices) optimized to maximize the likelihood of a database of natural images using Expectation Maximization
Coen-Cagli, Dayan, Schwartz, PLoS Comp Biology 2012; Schwartz, Sejnowski, Dayan, 2006
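A rough sketch of this kind of oriented, quadrature-pair filter bank (the Gabor form, patch size, frequency, and bandwidth below are illustrative assumptions, not the exact filters used in the paper):

```python
import numpy as np

def gabor(size, freq, theta, phase, sigma):
    """2-D Gabor filter: an oriented sinusoid under a Gaussian envelope.
    (Parameter values are illustrative assumptions.)"""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    xr = xx * np.cos(theta) + yy * np.sin(theta)
    return np.cos(2 * np.pi * freq * xr + phase) * np.exp(-(xx**2 + yy**2) / (2 * sigma**2))

def oriented_energy(patch, freq=0.15, sigma=3.0, n_orient=4):
    """Quadrature-pair energy (phases 0 and pi/2) at each of n_orient orientations."""
    energies = []
    for k in range(n_orient):
        theta = k * np.pi / n_orient
        even = (patch * gabor(patch.shape[0], freq, theta, 0.0, sigma)).sum()
        odd = (patch * gabor(patch.shape[0], freq, theta, np.pi / 2, sigma)).sum()
        energies.append(even**2 + odd**2)
    return np.array(energies)

# A grating varying along x drives the matching (0 rad) orientation channel most strongly
grating = np.tile(np.cos(2 * np.pi * 0.15 * (np.arange(15) - 7)), (15, 1))
print(oriented_energy(grating).argmax())
```

Summing the squared even- and odd-phase responses gives a phase-invariant energy, which is the form the center and surround drives take in the normalization equations.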
39 Outline Experimental data on cortical responses to natural images Computational neural model that captures contextual regularities in natural images Interplay of modeling with biological neural and psychology data (focus on natural images data)
40 Cortical predictions for natural images In the past, we have tested the modeling with simple stimuli (e.g., Coen-Cagli, Dayan, Schwartz, 2012; Schwartz, Sejnowski, Dayan, 2009). Here, we make predictions for natural images (Coen-Cagli, Kohn, Schwartz, 2015, in press)
41 Flexible Divisive Normalization
42 Model predictions for natural images Homogeneous and heterogeneous are determined by the model! Expect more suppression in neurons for homogeneous surrounds. Related to salience (e.g., Zhaoping). Coen-Cagli, Kohn, Schwartz, 2015; in press
43 Model summary [figure: RF response with surround; surround inference ON or OFF, determined by the model]
44 Model Predictions for Natural Scenes homogeneous versus heterogeneous determined by the model
45 Model Predictions for Natural Scenes. Cortical V1 data [figure: MR for heterogeneous versus MR for homogeneous images; n=6 neurons] Coen-Cagli, Kohn, Schwartz, 2015, in press
46 Model Predictions for Natural Scenes. Cortical V1 data [figure: MR for heterogeneous versus MR for homogeneous images; n=6 neurons] Not explained by: firing rate with small frames; surround energy. Coen-Cagli, Kohn, Schwartz, 2015, in press
47 Model predictions for natural images Per image, across neurons: homogeneous versus heterogeneous. Coen-Cagli, Kohn, Schwartz, 2015; in press
48 Model predictions for natural images. Testing predictions with cortical data [figure: NMR when heterogeneous versus NMR when homogeneous; n=39 images] Coen-Cagli, Kohn, Schwartz, 2015; in press
49 Natural scenes data [figure: modulation ratio (MR; facilitation versus suppression) and suppression index (SI) per image ID, with the surround inference OFF or ON; stimulus size in deg] Coen-Cagli, Kohn, Schwartz, 2015, in press
50 Natural scenes data [figure: modulation ratio (MR; facilitation versus suppression) per image ID for data, standard model, and flexible model, with the surround inference OFF or ON] Coen-Cagli, Kohn, Schwartz, 2015, in press
51 Model predictions for natural images. Comparing model performance for cortical data.
Standard divisive normalization: R_i = alpha * E_{c,phi_pref}^n / (epsilon + beta*E_c + gamma*E_s)^n
Flexible divisive normalization: R_i = alpha * E_{c,phi_pref}^n / (epsilon + beta*E_c + q(c,s)*gamma*E_s)^n
where q(c,s) = 1 if the model infers a shared mixer from p(xi | c,s), and 0 otherwise. The gate q is determined by the model (not fit!); results are similar if q is non-binary.
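A hedged sketch of the two response equations above (the parameter values and the exact exponent placement are assumptions for illustration; in the actual model the gate q comes from GSM inference on the image, not from a hand-set flag):

```python
def normalization_response(e_pref, e_center, e_surround, q,
                           alpha=1.0, beta=1.0, gamma=1.0, eps=0.1, n=2.0):
    """Flexible divisive normalization. q = 1 recovers the standard model
    (surround always in the normalization pool); q = 0 switches the surround
    term off, as when center and surround are inferred to be heterogeneous."""
    return alpha * e_pref**n / (eps + beta * e_center + q * gamma * e_surround)**n

# Same image drive: the surround suppresses the response only when the gate is on
r_on = normalization_response(1.0, 1.0, 1.0, q=1.0)   # homogeneous: suppression
r_off = normalization_response(1.0, 1.0, 1.0, q=0.0)  # heterogeneous: no surround term
print(r_off > r_on)
```

The single gating factor is the only difference between the two models, which is why the flexible model's extra explanatory power comes from image statistics rather than from extra fitted parameters.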
52 Natural scenes data. Cross-validated prediction quality (there are many standard model versions). Prediction quality: 1 = oracle (observed mean for each image); 0 = null (mean response across all images). Coen-Cagli, Kohn, Schwartz, 2015, in press
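One plausible reading of this normalization (a hypothetical implementation; the paper's exact cross-validated metric may differ) scores a model's mean-squared error on a scale where the per-image oracle is 1 and the grand-mean null is 0:

```python
import numpy as np

def prediction_quality(responses, image_ids, predictions):
    """Normalized prediction quality: 1 = oracle (observed per-image mean),
    0 = null (mean response across all images)."""
    responses = np.asarray(responses, dtype=float)
    predictions = np.asarray(predictions, dtype=float)
    _, inv = np.unique(image_ids, return_inverse=True)
    per_image_mean = np.array([responses[inv == k].mean() for k in range(inv.max() + 1)])
    mse = lambda pred: np.mean((responses - pred) ** 2)
    null, oracle = responses.mean(), per_image_mean[inv]
    return (mse(null) - mse(predictions)) / (mse(null) - mse(oracle))

# Sanity checks: predicting each image's mean scores 1; predicting the grand mean scores 0
resp, ids = [1, 3, 2, 4], [0, 0, 1, 1]
print(prediction_quality(resp, ids, [2, 2, 3, 3]))  # 1.0
print(prediction_quality(resp, ids, [2.5] * 4))     # 0.0
```

Anchoring the scale to the oracle and the null makes scores comparable across neurons with very different firing rates and trial-to-trial variability.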
53 Divisive normalization: Model Mechanisms Feedback inhibition Distal dendrite inhibition Depressing synapses Internal biochemical adjustments Non-Poisson spike generation
54 Flexible Normalization Mechanism? Adjusting gain by circuit or postsynaptic mechanisms? Distinct classes of inhibitory interneurons? (e.g., Adesnik, Scanziani et al. 2012; Pfeffer, Scanziani et al. 2013; Pi, Kepecs et al. 2013; Lee, Rudy et al. 2013) [figure: circuit schematic with input, normalization pool, SOM and VIP interneurons, and pyramidal (Pyr) output; labels: surround suppression, gating]
55 Key take-home points New approach to understanding cortical processing of natural images. Rather than fitting more complicated models, use insights from scene statistics Connects to neural computations that are ubiquitous, but enriches the standard model Our results suggest flexibility of contextual influences in natural vision, depending on whether center and surround are deemed statistically homogeneous Next/currently: hierarchical representations; adaptation
56 Acknowledgments Albert Einstein College of Medicine: Ruben Coen-Cagli, Adam Kohn. Gatsby Unit, UCL: Peter Dayan. Salk Institute: Terry Sejnowski. Funding: NIH (Collaborative Research in Computational Neuroscience), Army Research Office, Alfred P. Sloan Foundation, Google
Comparison of objective functions for estimating linear-nonlinear models Tatyana O. Sharpee Computational Neurobiology Laboratory, the Salk Institute for Biological Studies, La Jolla, CA 937 sharpee@salk.edu
More informationScholarOne, 375 Greenbrier Drive, Charlottesville, VA, 22901
Page 1 of 43 Information maximization as a principle for contrast gain control Journal: Manuscript ID: Manuscript Type: Manuscript Section: Conflict of Interest: Date Submitted by the Author: Keywords:
More informationBiosciences in the 21st century
Biosciences in the 21st century Lecture 1: Neurons, Synapses, and Signaling Dr. Michael Burger Outline: 1. Why neuroscience? 2. The neuron 3. Action potentials 4. Synapses 5. Organization of the nervous
More informationHigher Order Statistics
Higher Order Statistics Matthias Hennig Neural Information Processing School of Informatics, University of Edinburgh February 12, 2018 1 0 Based on Mark van Rossum s and Chris Williams s old NIP slides
More informationProcessing of Time Series by Neural Circuits with Biologically Realistic Synaptic Dynamics
Processing of Time Series by Neural Circuits with iologically Realistic Synaptic Dynamics Thomas Natschläger & Wolfgang Maass Institute for Theoretical Computer Science Technische Universität Graz, ustria
More informationProbabilistic Inference of Hand Motion from Neural Activity in Motor Cortex
Probabilistic Inference of Hand Motion from Neural Activity in Motor Cortex Y Gao M J Black E Bienenstock S Shoham J P Donoghue Division of Applied Mathematics, Brown University, Providence, RI 292 Dept
More informationExperimental design of fmri studies
Experimental design of fmri studies Sandra Iglesias With many thanks for slides & images to: Klaas Enno Stephan, FIL Methods group, Christian Ruff SPM Course 2015 Overview of SPM Image time-series Kernel
More informationChoice Set Effects. Mark Dean. Behavioral Economics G6943 Autumn 2018
Choice Set Effects Mark Dean Behavioral Economics G6943 Autumn 2018 Context Effects We will now think about context effects that are more general than simple reference dependence Standard Model: Adding
More informationModel neurons!!poisson neurons!
Model neurons!!poisson neurons! Suggested reading:! Chapter 1.4 in Dayan, P. & Abbott, L., heoretical Neuroscience, MI Press, 2001.! Model neurons: Poisson neurons! Contents: Probability of a spike sequence
More informationPhenomenological Models of Neurons!! Lecture 5!
Phenomenological Models of Neurons!! Lecture 5! 1! Some Linear Algebra First!! Notes from Eero Simoncelli 2! Vector Addition! Notes from Eero Simoncelli 3! Scalar Multiplication of a Vector! 4! Vector
More informationA Biologically-Inspired Model for Recognition of Overlapped Patterns
A Biologically-Inspired Model for Recognition of Overlapped Patterns Mohammad Saifullah Department of Computer and Information Science Linkoping University, Sweden Mohammad.saifullah@liu.se Abstract. In
More informationBayesian inference for low rank spatiotemporal neural receptive fields
published in: Advances in Neural Information Processing Systems 6 (03), 688 696. Bayesian inference for low rank spatiotemporal neural receptive fields Mijung Park Electrical and Computer Engineering The
More informationEfficient Likelihood-Free Inference
Efficient Likelihood-Free Inference Michael Gutmann http://homepages.inf.ed.ac.uk/mgutmann Institute for Adaptive and Neural Computation School of Informatics, University of Edinburgh 8th November 2017
More informationSupporting Online Material for
www.sciencemag.org/cgi/content/full/331/6013/83/dc1 Supporting Online Material for Spontaneous Cortical Activity Reveals Hallmarks of an Optimal Internal Model of the Environment Pietro Berkes,* Gergő
More informationCausality and communities in neural networks
Causality and communities in neural networks Leonardo Angelini, Daniele Marinazzo, Mario Pellicoro, Sebastiano Stramaglia TIRES-Center for Signal Detection and Processing - Università di Bari, Bari, Italy
More informationComparing integrate-and-fire models estimated using intracellular and extracellular data 1
Comparing integrate-and-fire models estimated using intracellular and extracellular data 1 Liam Paninski a,b,2 Jonathan Pillow b Eero Simoncelli b a Gatsby Computational Neuroscience Unit, University College
More informationGatsby Theoretical Neuroscience Lectures: Non-Gaussian statistics and natural images Parts I-II
Gatsby Theoretical Neuroscience Lectures: Non-Gaussian statistics and natural images Parts I-II Gatsby Unit University College London 27 Feb 2017 Outline Part I: Theory of ICA Definition and difference
More informationStructured hierarchical models for neurons in the early visual system
Structured hierarchical models for neurons in the early visual system by Brett Vintch A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy Center for
More informationImplementing a Bayes Filter in a Neural Circuit: The Case of Unknown Stimulus Dynamics
Implementing a Bayes Filter in a Neural Circuit: The Case of Unknown Stimulus Dynamics arxiv:1512.07839v4 [cs.lg] 7 Jun 2017 Sacha Sokoloski Max Planck Institute for Mathematics in the Sciences Abstract
More informationNeural Spike Train Analysis 1: Introduction to Point Processes
SAMSI Summer 2015: CCNS Computational Neuroscience Summer School Neural Spike Train Analysis 1: Introduction to Point Processes Uri Eden BU Department of Mathematics and Statistics July 27, 2015 Spikes
More informationBayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity
Bayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity Bernhard Nessler 1,, Michael Pfeiffer 2,1, Lars Buesing 1, Wolfgang Maass 1 nessler@igi.tugraz.at,
More informationHow to do backpropagation in a brain
How to do backpropagation in a brain Geoffrey Hinton Canadian Institute for Advanced Research & University of Toronto & Google Inc. Prelude I will start with three slides explaining a popular type of deep
More informationExperimental design of fmri studies
Experimental design of fmri studies Zurich SPM Course 2016 Sandra Iglesias Translational Neuromodeling Unit (TNU) Institute for Biomedical Engineering (IBT) University and ETH Zürich With many thanks for
More informationThe homogeneous Poisson process
The homogeneous Poisson process during very short time interval Δt there is a fixed probability of an event (spike) occurring independent of what happened previously if r is the rate of the Poisson process,
More information
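As a rough illustration of the descriptive model the talk asks about ("Can we capture data with canonical divisive normalization?"), the sketch below implements the standard divisive normalization form, in which each unit's driven response is divided by a weighted sum of pooled activity plus a semisaturation constant. The parameter names (`sigma`, `n`, `weights`) and defaults are assumptions for illustration; this is not the model fit in Coen-Cagli, Kohn & Schwartz (2015).

```python
import math


def divisive_normalization(drives, sigma=1.0, n=2.0, weights=None):
    """Canonical divisive normalization (illustrative sketch).

    Each unit's driven response is raised to an exponent n and divided by
    the semisaturation constant sigma^n plus a weighted sum of the pooled
    (rectified) drives. Parameter names and values are assumed, not taken
    from the talk.
    """
    if weights is None:
        # Uniform normalization pool over all units, an assumed default.
        weights = [1.0] * len(drives)
    pool = sum(w * abs(d) ** n for w, d in zip(weights, drives))
    return [abs(d) ** n / (sigma ** n + pool) for d in drives]
```

Under this sketch, adding surround drive to the pool divides down the center unit's response, which is one way a descriptive model can produce the surround suppression (positive suppression index) seen in the V1 data.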