STA and the encoding and decoding problems. NEU 466M Instructor: Professor Ila R. Fiete Spring 2016


1 STA and the encoding and decoding problems NEU 466M Instructor: Professor Ila R. Fiete Spring 2016

2 Last time: stimulus–spike cross-correlation. [Figure: cross-correlation C(stim, rho) for a single H1 neuron (one H1 neuron in the lobula plate per hemisphere); x-axis: sample number (500 Hz), ×10⁵.] When the stimulus goes up (more positive velocity), the response increases (more spikes): positive peak. The response lags the stimulus (peak to the right of zero shift). Cross-correlation uncovers relationships between time series. What specifically does it tell us about the stimulus → spike response?
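To make the cross-correlation concrete, here is a minimal numpy sketch (not from the course materials); the synthetic stimulus, the 20 ms response lag, and all parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 500.0                      # sampling rate (Hz), matching the slide's axis label
T = 20_000                      # number of samples
stim = rng.standard_normal(T)   # stimulus time series s(t), e.g. motion velocity

# Toy response: spike probability follows the stimulus with a 10-sample (20 ms) lag.
lag = 10
p_spike = 0.05 * (1.0 + np.tanh(np.roll(stim, lag)))
rho = (rng.random(T) < p_spike).astype(float)        # binary spike train rho(t)

# Cross-correlation C(tau) = (1/T) * sum_t rho(t) * stim(t - tau) for tau = -50 ... 50.
stim0, rho0 = stim - stim.mean(), rho - rho.mean()
lags = np.arange(-50, 51)
C = np.array([np.sum(rho0 * np.roll(stim0, tau)) for tau in lags]) / T

# The peak sits at a positive shift: the response lags the stimulus.
print("peak shift:", lags[np.argmax(C)], "samples =", lags[np.argmax(C)] / fs * 1000, "ms")
```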

3 Back to original goal: modeling. WHAT DOES IT MEAN TO BUILD A MODEL OF OBSERVATIONS OF A STIMULUS AND RESPONSE?

4 Modeling spike train data. Model: simple, predictive description. But what is it we want to describe/predict? Option 1) Given stimulus, predict spikes? Option 2) Given spikes, predict stimulus?

5 Modeling spike train data. Model: simple, predictive description. But what is it we want to describe/predict? Option 1): Given stimulus, predict spikes? Encoding model. Option 2): Given spikes, predict stimulus?

6 Modeling spike train data. Model: simple, predictive description. But what is it we want to describe/predict? Option 1): Given stimulus, predict spikes? Encoding model. Option 2): Given spikes, predict stimulus? Decoding model.

7 Modeling spike train data. Model: simple, predictive description. But what is it we want to describe/predict? Option 1): Given stimulus, predict spikes? Encoding model. Option 2): Given spikes, predict stimulus? Decoding model. Both are good and closely related modeling goals!

8 Decoding problem. Given a spike, what was the stimulus?

9 The spike-triggered average (figure from Spikes, Rieke et al.). Given a spike, what was the mean stimulus that led up to it?

10 The spike-triggered average (figure from Spikes, Rieke et al.). STA: the (average) stimulus feature to which the cell responds.

11 The spike-triggered average. Stimulus $s(t)$; $N$ spikes at times $t_i$ ($i = 1, \dots, N$):
$$\mathrm{STA}(\tau) = \frac{1}{N} \sum_{i=1}^{N} s(t_i - \tau)$$
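A direct implementation of this definition might look like the following numpy sketch (illustrative only; the function and variable names are my own, not from the course).

```python
import numpy as np

def spike_triggered_average(stim, spike_times, window):
    """Average the window of stimulus preceding each spike: STA(tau), tau = window ... 1.

    stim        : 1-D array, stimulus sampled at discrete time steps
    spike_times : integer indices t_i into stim at which spikes occurred
    window      : number of time steps of stimulus history to average over
    """
    valid = [t for t in spike_times if t >= window]            # need a full window of history
    snippets = np.array([stim[t - window:t] for t in valid])   # one row per spike
    return snippets.mean(axis=0)   # entry 0 is the earliest time, entry -1 is just before the spike
```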

12 The spike-triggered average as a correlation.
$$\mathrm{STA}(\tau) = \frac{1}{N} \sum_{i=1}^{N} s(t_i - \tau) = \frac{1}{N} \sum_{t} \rho(t)\, s(t - \tau) = \frac{1}{N}\, C_{\rho s}(-\tau)$$
where $\rho(t)$ is the spike vector of 0's and 1's. The STA is the correlation between the spike train and the stimulus at negative (earlier) times: reverse correlation.
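The identity above can be checked numerically: averaging stimulus snippets over spikes gives the same curve as correlating the 0/1 spike vector with the stimulus at earlier times. A self-contained sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)
T, window = 5_000, 30
stim = rng.standard_normal(T)
rho = (rng.random(T) < 0.05).astype(float)            # spike vector of 0's and 1's
spike_times = np.flatnonzero(rho)
spike_times = spike_times[spike_times >= window]      # keep spikes with a full history
N = len(spike_times)

# Form 1: average the stimulus snippets preceding each spike.
sta_avg = np.array([stim[t - window:t] for t in spike_times]).mean(axis=0)

# Form 2: reverse correlation, (1/N) * sum_t rho(t) * s(t - tau) for tau = window ... 1.
taus = np.arange(window, 0, -1)
sta_corr = np.array([np.sum(rho[window:] * stim[window - tau:T - tau]) for tau in taus]) / N

assert np.allclose(sta_avg, sta_corr)                 # the two forms agree
```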

13 STA and reverse correlation. STA implies that the response is a binary spike train. Reverse correlation: the response can be any time-varying signal. Also called white-noise analysis (we will see why later).

14 STA and the decoding problem. Decoding problem: infer the stimulus given the spike train. "Mindreading": read the spike output and infer what the brain saw. STA: given that the cell fired a spike, the STA returns the average of the preceding stimulus.

15 Decoding problem. Volterra series expansion: stimulus $s(t)$; $N$ spikes at times $t_i$ ($i = 1, \dots, N$):
$$s_{\mathrm{est}}(t) = \sum_{i} F_1(t - t_i) + \sum_{i,j} F_2(t - t_i, t - t_j) + \dots$$
First term: each spike is an independent event and contributes independently to the stimulus reconstruction. Second term: spike pairs in a specific configuration carry information about the stimulus beyond that contained in their individual occurrences; each spike pair is a separate event contributing to the reconstruction.

16 Decoding problem. Volterra series expansion:
$$s_{\mathrm{est}}(t) = \sum_{i} F_1(t - t_i) + \sum_{i,j} F_2(t - t_i, t - t_j) + \dots$$
First term (related to the STA): each spike is an independent event given the stimulus and contributes independently to the stimulus reconstruction. Second term: spike pairs in a specific configuration carry information about the stimulus beyond that contained in their individual occurrences; each spike pair is an independent event contributing to the reconstruction.
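As a hedged sketch of what the first-order (STA-like) term of this reconstruction looks like in code, the kernel F1 below is invented; in practice F1 would be estimated from data (e.g. by least squares), which is not shown.

```python
import numpy as np

def linear_decode(spike_times, F1, T, first_offset):
    """First-order Volterra reconstruction: s_est(t) = sum_i F1(t - t_i).

    F1           : 1-D array of kernel values at offsets (t - t_i) = first_offset, first_offset+1, ...
    first_offset : typically negative, so each spike "explains" stimulus that preceded it
    """
    s_est = np.zeros(T)
    for t_i in spike_times:
        start = t_i + first_offset
        lo, hi = max(start, 0), min(start + len(F1), T)
        if lo < hi:
            s_est[lo:hi] += F1[lo - start:hi - start]
    return s_est

# Hypothetical kernel peaking ~15 time steps before each spike.
offsets = np.arange(-40, 11)
F1 = np.exp(-0.5 * ((offsets + 15) / 8.0) ** 2)
s_est = linear_decode(np.array([100, 250, 260, 400]), F1, T=500, first_offset=offsets[0])
```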

17 Geometric view. The length-$T$ stimulus vector preceding time point $t$: $\{s(t-T), \dots, s(t-2), s(t-1)\}$. [Diagram: stimulus space with axes $s(t-1), s(t-2), \dots, s(t-T)$.]

18 Geometric view. The length-$T$ stimulus vector preceding time point $t$: $\{s(t-T), \dots, s(t-2), s(t-1)\}$. Any possible stimulus time series is one point in stimulus space.

19 Geometric view of STA. [Diagram: presented stimuli plotted as points in stimulus space, axes $s(t-1), s(t-2), \dots, s(t-T)$.]

20 Geometric view of STA. [Diagram: presented stimuli and effective stimuli (those that evoked a spike) in stimulus space.]

21 Geometric view of STA. [Diagram: the STA vector among the presented and effective stimuli.] The STA picks out a single direction in stimulus space.
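One way to see this geometrically is to project every stimulus snippet onto the unit-normalized STA vector; in the made-up linear-threshold example below (all parameters are assumptions), the spike-triggered stimuli have systematically larger projections along that direction than the full stimulus set.

```python
import numpy as np

rng = np.random.default_rng(2)
T, window = 20_000, 25
stim = rng.standard_normal(T)

# Toy LN cell: hidden linear filter followed by a threshold nonlinearity.
k_true = np.exp(-np.arange(window)[::-1] / 5.0)
drive = np.array([k_true @ stim[t - window:t] for t in range(window, T)])
spike_times = window + np.flatnonzero(drive > 2.0)

# The STA, normalized, defines one direction in stimulus space.
sta = np.array([stim[t - window:t] for t in spike_times]).mean(axis=0)
sta_dir = sta / np.linalg.norm(sta)

all_proj = np.array([stim[t - window:t] @ sta_dir for t in range(window, T)])
spk_proj = np.array([stim[t - window:t] @ sta_dir for t in spike_times])
print("mean projection, all stimuli:      ", all_proj.mean())   # near 0
print("mean projection, effective stimuli:", spk_proj.mean())   # clearly positive
```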

22 Geometric view of STA. [Diagram: STA vectors in stimulus space.]

23 Geometric view of STA: when does it fail? [Diagram: STA vector relative to the effective stimuli.] The STA points in a direction where stimuli were actually ineffective in producing spikes.

24 Geometric view of STA: when does it fail? Same caution as with correlation: the STA measures the linear relationship between stimulus and response. If the response is a specific nonlinear function of the stimulus, the STA may not be informative (it can even vanish: STA = 0). Example: the motion energy model for complex cells in V1.
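A toy illustration of this failure mode (not the motion-energy model itself; the filter and thresholds are invented): when spiking depends on the squared filter output, stimuli on opposite sides of the filter are equally effective, and the STA averages to roughly zero even though the cell is strongly stimulus-selective.

```python
import numpy as np

rng = np.random.default_rng(3)
T, window = 50_000, 25
stim = rng.standard_normal(T)
k_true = np.sin(np.linspace(0, np.pi, window))          # hidden filter (made up)

drive = np.array([k_true @ stim[t - window:t] for t in range(window, T)])
# Symmetric, energy-like nonlinearity: +drive and -drive are equally effective.
spike_times = window + np.flatnonzero(drive ** 2 > 25.0)

sta = np.array([stim[t - window:t] for t in spike_times]).mean(axis=0)
print("||STA|| / ||filter|| =", np.linalg.norm(sta) / np.linalg.norm(k_true))   # close to 0
```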

25 Summary: STA. Simple/compact description of the data. Extracts a single feature of the data. Linear feature; first term in the Volterra expansion. Test: prediction of the response (encoding). Homework.
