Structure of Correlations in Neuronal Networks


1 Structure of Correlations in Neuronal Networks Krešimir Josić University of Houston James Trousdale (UH), Yu Hu (UW) Eric Shea-Brown (UW)

2 The Connectome Van J. Wedeen, MGH/Harvard

3 The Connectome Van J. Wedeen, MGH/Harvard Song, et al. 2004

4 Recordings from neurons

7 Recordings from neurons Are neuronal responses dependent or independent? Tolias, Dragoi, Smirnakis, Angelaki,...

9 Recordings from neurons How are structure and dynamics related in neuronal networks? Are neuronal responses dependent or independent? Tolias, Dragoi, Smirnakis, Angelaki,...

10 Recordings from neurons How are structure and dynamics related in neuronal networks? Synchrony is probably atypical. Are neuronal responses dependent or independent? Tolias, Dragoi, Smirnakis, Angelaki,...

11 Correlation - a measure of dependence Spike Count Correlation neuron 1 neuron 2

16 Correlations. Let n_i be the (random) number of spikes of neuron i during a time window of length T. The correlation coefficient of the output is ρ_T = Cov(n_1, n_2) / √(Var(n_1) Var(n_2)).

17 Correlations. Scatter plot of spikes fired by cell 1 vs. spikes fired by cell 2: low correlation.

18 Correlations. Scatter plot of spikes fired by cell 1 vs. spikes fired by cell 2: high correlation.
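As a quick numerical illustration of the definition of ρ_T (not part of the original model), the sketch below draws toy Poisson spike counts with a shared component and computes the spike count correlation directly; the rates and the shared fraction are made-up values chosen only so that the answer is easy to predict.

```python
import numpy as np

# Toy spike counts: each window's count is a shared Poisson component plus an
# independent one, so the counts of the two "neurons" are correlated by
# construction.  All parameter values are illustrative assumptions.
rng = np.random.default_rng(0)

T = 0.1            # counting window (s)
n_windows = 20000  # number of windows observed
rate = 10.0        # firing rate of each cell (Hz); rate * T = 1 expected spike

shared = rng.poisson(0.3 * rate * T, n_windows)       # common drive
n1 = shared + rng.poisson(0.7 * rate * T, n_windows)  # counts of neuron 1
n2 = shared + rng.poisson(0.7 * rate * T, n_windows)  # counts of neuron 2

# rho_T = Cov(n1, n2) / sqrt(Var(n1) Var(n2)), exactly as on the slide
rho_T = np.cov(n1, n2)[0, 1] / np.sqrt(np.var(n1, ddof=1) * np.var(n2, ddof=1))
print(rho_T)  # close to 0.3: the shared fraction of the count variance
```

For this construction Cov(n_1, n_2) = Var(shared) = 0.3·rate·T while each Var(n_i) = rate·T, so ρ_T ≈ 0.3 up to sampling noise.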

19 Short vs long timescale correlations

20 Cross-Correlation Function. Spike trains A and B with input correlation c; long-timescale correlation vs. short-timescale correlation (synchrony, ~10 ms). The conditional rate h_{B,A}(t) of a spike in B at lag t, given a spike in A, is related to the cross-covariance density by c_{A,B}(t) = r_B h_{A,B}(t) − r_A r_B, where r_A and r_B are the rates of the two processes. The quantity c_{A,B}(T)(Δt)² can be interpreted as the covariance of N_A(t+T, t+T+Δt) and N_B(t, t+Δt).

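In practice these quantities are estimated by histogramming spike-time differences. The sketch below estimates the conditional rate and the cross-covariance density for two artificially correlated Poisson trains; the jittered-copy construction and every parameter value are illustrative assumptions, not part of the talk.

```python
import numpy as np

# Train A is Poisson; train B contains jittered copies of 30% of A's spikes
# plus independent Poisson spikes, giving a correlogram with a central peak.
rng = np.random.default_rng(1)
duration, r_A = 500.0, 20.0   # seconds, Hz

t_A = np.sort(rng.uniform(0, duration, rng.poisson(r_A * duration)))
mask = rng.random(t_A.size) < 0.3
t_B = np.sort(np.concatenate([
    t_A[mask] + rng.normal(0.0, 0.002, mask.sum()),               # jittered copies
    rng.uniform(0, duration, rng.poisson(0.7 * r_A * duration)),  # independent part
]))

bin_w = 0.001
lags = np.arange(-0.05, 0.05 + bin_w, bin_w)   # bin edges, +/- 50 ms
counts = np.zeros(lags.size - 1)
for t in t_A:                                   # histogram of spike-time differences
    lo, hi = np.searchsorted(t_B, [t + lags[0], t + lags[-1]])
    counts += np.histogram(t_B[lo:hi] - t, bins=lags)[0]

# Normalizing by (number of A spikes) * (bin width) turns the raw histogram
# into the conditional rate of B given a spike in A (in Hz).
h_BA = counts / (t_A.size * bin_w)
r_B = t_B.size / duration
c_BA = r_A * (h_BA - r_B)   # cross-covariance density estimate
print(h_BA[50], r_B)        # central bin well above the baseline rate
```

The flat tails of h_BA sit near r_B, while the central bins rise well above it, so c_BA has a narrow positive peak at zero lag.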
22 Measures of spike time correlation. The N×N matrix J contains the synaptic kernels, while the matrix W contains the synaptic weights, and hence defines the network architecture: W_ij is the weight of a synaptic connection from cell j to cell i. We quantify dependencies between the responses of cells in the network using cross-correlation functions [Gabbiani and Cox, 2010]. For a pair of spike trains, C_ij(τ) = cov(y_i(t+τ), y_j(t)). The auto-correlation function C_ii(τ) is the cross-correlation of a spike train with itself, and C(τ) is the matrix of cross-correlation functions. Denoting by N_{y_i}(t_1, t_2) the number of spikes over a time window [t_1, t_2], the spike count correlation ρ_ij(τ) is defined as
ρ_ij(τ) = cov(N_{y_i}(t, t+τ), N_{y_j}(t, t+τ)) / √(var(N_{y_i}(t, t+τ)) var(N_{y_j}(t, t+τ)))
We assume stationarity of the spiking processes so that ρ_ij does not depend on t. The spike count covariance is related to the cross-correlation function by [Bair et al., 2001]
cov(N_{y_i}(t, t+τ), N_{y_j}(t, t+τ)) = ∫_{−τ}^{τ} C_ij(s)(τ − |s|) ds

23 Correlations Impact Neural Computation. [Figure, panels A-E: reliability of a readout of binary input as a function of the input covariance Cov(h).] Tkačik, et al. 2010

24 Models of Neurons - Integrate and Fire
Subthreshold membrane potential: dV/dt = −V/τ_m + Ψ(V) + µ + √(2D) η(t)
Fire and reset: when V(t) = V_θ, set V(t+) = V_reset
Rate r - the number of spikes per second
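A minimal Euler-Maruyama simulation of this model, taking Ψ ≡ 0 (a plain leaky integrate-and-fire neuron) and illustrative parameter values, looks like:

```python
import numpy as np

# Leaky integrate-and-fire: dV/dt = -V/tau_m + mu + sqrt(2D) eta(t),
# with fire-and-reset at threshold.  All parameter values are made up;
# mu * tau_m = 1.2 > V_theta, so the neuron fires even without noise.
rng = np.random.default_rng(2)

tau_m = 0.02                 # membrane time constant (s)
mu = 60.0                    # mean input current
D = 5.0                      # noise intensity
V_theta, V_reset = 1.0, 0.0  # threshold and reset
dt, T_total = 1e-4, 20.0

n_steps = int(T_total / dt)
noise = np.sqrt(2 * D * dt) * rng.standard_normal(n_steps)

V = V_reset
spike_times = []
for k in range(n_steps):
    V += dt * (-V / tau_m + mu) + noise[k]   # Euler-Maruyama step
    if V >= V_theta:                          # fire...
        spike_times.append(k * dt)
        V = V_reset                           # ...and reset

r = len(spike_times) / T_total               # output rate (spikes/s)
print(r)
```

With these numbers the deterministic inter-spike interval is τ_m ln(µτ_m / (µτ_m − V_θ)) ≈ 36 ms, so the simulated rate lands in the tens of hertz.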

25 Linear response kernel, A(t),X(t)

29 Linear response kernel, A(t), X(t): r(t) = r_0 + (A ∗ X)(t), where r_0 is the firing rate in the absence of the signal X(t).

30 Structure of correlations in networks
dV_i/dt = −V_i/τ_m + Ψ(V_i) + µ + √(2D) η_i(t) + f_i(t)
Here Ψ is the spike-generating current, and f_i collects the synaptic connections:
f_i(t) = Σ_j (J_ij ∗ y_j)(t), where y_j(t) = Σ_i δ(t − t_i^j) is the output spike train of cell j.
Synaptic coupling: J_ij(t) = W_ij ((t − τ_D,j)/τ²_S,j) exp(−(t − τ_D,j)/τ_S,j) for t ≥ τ_D,j, and J_ij(t) = 0 for t < τ_D,j.
The matrix J contains the synaptic kernels, while the matrix W contains the synaptic weights. Nykamp

31 Can we estimate the correlation structure? The output of a model neuron is a spike train y_j(t) = Σ_i δ(t − t_i^j). Linear response gives the output rate as r(t) = r_0 + (A ∗ X)(t), where r_0 is the rate in the absence of the signal X(t).

32 Can we estimate the correlation structure? The output of a model neuron is a spike train y_j(t) = Σ_i δ(t − t_i^j). Linear response gives the output rate as r(t) = r_0 + (A ∗ X)(t), where r_0 is the rate in the absence of the signal X(t). How do we use this to compute the cross-correlation? The idea goes back to Lindner, Doiron, Longtin, 2005.

33 Can we estimate the correlation structure? dV/dt = −V/τ_m + Ψ(V) + µ_0 + √(2D) η(t); y_0(t) = Σ_i δ(t − t_i^0)

34 Can we estimate the correlation structure? dV/dt = −V/τ_m + Ψ(V) + µ_0 + √(2D) η(t) + X(t); y_0(t) = Σ_i δ(t − t_i^0)

35 Can we estimate the correlation structure? dV/dt = −V/τ_m + Ψ(V) + µ_0 + √(2D) η(t) + X(t)

36 Can we estimate the correlation structure? dV/dt = −V/τ_m + Ψ(V) + µ_0 + √(2D) η(t) + X(t); y(t) = Σ_i δ(t − t_i)

37 Can we estimate the correlation structure? dV/dt = −V/τ_m + Ψ(V) + µ_0 + √(2D) η(t) + X(t); y(t) = Σ_i δ(t − t_i). Use linear response to obtain a mixed point/continuous approximation of the spike train generated: y(t) ≈ y^1(t) = y^0(t) + (A ∗ X)(t).

38 Can we estimate the correlation structure? dV/dt = −V/τ_m + Ψ(V) + µ_0 + √(2D) η(t) + X(t); y(t) = Σ_i δ(t − t_i). Use linear response to obtain a mixed point/continuous approximation of the spike train generated: y(t) ≈ y^1(t) = y^0(t) + (A ∗ X)(t), which averages out to the right thing: ⟨y^1(t)⟩ = r(t) ≈ r_0 + (A ∗ X)(t).

39 Approximate network correlations
The linear response approximation now takes the form
y_i^1(t) = y_i^0(t) + Σ_j (K_ij ∗ [y_j^0 − r_j])(t)    (sum over all inputs)
with interaction kernels K_ij(t) = (A_i ∗ J_ij)(t). We can use this to approximate the cross-covariances:
C_ij(τ) ≈ C^1_ij(τ) = E[(y_i^1(t+τ) − r_i)(y_j^1(t) − r_j)]
= δ_ij C^0_ii(τ) + (K_ij ∗ C^0_jj)(τ) + (K^−_ji ∗ C^0_ii)(τ) + Σ_k (K_ik ∗ K^−_jk ∗ C^0_kk)(τ),
where K^−(t) = K(−t).
Ostojic, Brunel, Hakim, 2009; Trousdale, Yu, Shea-Brown, Josić, 2011

40 Impact of non-immediate neighbors
We use an iterative construction:
y^{n+1}(t) = y^0(t) + (K ∗ [y^n − r])(t) = y^0(t) + Σ_{k=1}^{n+1} (K^{(k)} ∗ [y^0 − r])(t),
which gives the n-th approximation to the cross-correlation. After taking the Fourier transform, and the limit n → ∞,
C^∞(ω) = lim_{n→∞} C^n(ω) = (I − K(ω))^{−1} C^0(ω) (I − K*(ω))^{−1}.

41 Impact of non-immediate neighbors (as above), with K_ij(t) = (A_i ∗ J_ij)(t).
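At a fixed frequency the limit formula is plain linear algebra. The sketch below evaluates it for a hypothetical 3-cell network; the scalar gain A, the weights W, and the baseline spectra C0 are made-up numbers chosen only to illustrate the computation.

```python
import numpy as np

# Evaluate C(w) = (I - K(w))^{-1} C0(w) (I - K*(w))^{-1} at one frequency
# for a toy 3-cell network.  Here the linear response at this frequency is
# lumped into a single scalar gain A -- an illustrative simplification.
N = 3
W = np.array([[0.0, 0.2, 0.0],
              [0.1, 0.0, 0.3],
              [0.0, 0.2, 0.0]])   # W[i, j]: weight of connection j -> i (toy)
A = 0.5                           # linear response gain at this frequency
K = A * W                         # interaction matrix K(w)
C0 = np.diag([1.0, 0.8, 1.2])     # uncoupled (baseline) spectra, diagonal

M = np.linalg.inv(np.eye(N) - K)
C = M @ C0 @ M.conj().T           # full cross-spectral matrix
print(np.round(C, 4))
```

Note that C[0, 2] is nonzero even though cells 0 and 2 are not directly connected: both receive input from cell 1, the common-input (diverging motif) effect quantified later in the talk.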

42 The iterative construction Rangan 2009 Pernice, Staube, Cardanobile, Rotter 2011 Trousdale, Yu, Shea-Brown, Josić, 2011

45 The approximation works well Cross-correlation between two excitatory cells as we shift from balance to excess inhibition

46 Expansion in terms of paths through the graph
C^∞(τ) = lim_{n→∞} Σ_{k,l=0}^{n} (K^{(k)} ∗ C^0 ∗ K^{(l)T})(τ)
Each term corresponds to a pair of directed paths through the graph: a chain j → a_1 → ··· → a_{n−1} → i contributes K_{i a_{n−1}} ∗ K_{a_{n−1} a_{n−2}} ∗ ··· ∗ K_{a_1 j} ∗ C^0_jj, while a pair of paths from a common source a_0, one to i and one to j, contributes K_{i a_{n−1}} ∗ ··· ∗ K_{a_1 a_0} ∗ C^0_{a_0 a_0} ∗ K_{b_1 a_0} ∗ ··· ∗ K_{j b_{m−1}}.

47 How does local structure determine correlations? Song, et al. 2005

48 How do small motifs impact the correlation structure?
Diverging motifs (common input): q_div = Σ_{i,j,k} W^0_{i,k} W^0_{j,k} / N³ − p²
Sporns and Kötter, 2004

49 How do small motifs impact the correlation structure?
Diverging, converging, and chain motifs:
q_div = Σ_{i,j,k} W^0_{i,k} W^0_{j,k} / N³ − p², and analogously q_con for converging motifs and q_ch for chains.
Sporns and Kötter, 2004
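These motif frequencies are easy to compute from an adjacency matrix. The sketch below does so for a toy Erdős-Rényi graph, where all three quantities should be near zero by construction; N and p are illustrative assumptions.

```python
import numpy as np

# Empirical second-order motif frequencies of a binary adjacency matrix,
# following the slide's definition of q_div and its analogues.
rng = np.random.default_rng(3)
N, p = 300, 0.1
W0 = (rng.random((N, N)) < p).astype(float)   # W0[i, j] = 1 if j -> i
np.fill_diagonal(W0, 0)                       # no self-connections

p_hat = W0.sum() / N**2                       # empirical connection probability
q_div = (W0 @ W0.T).sum() / N**3 - p_hat**2   # diverging: common source k -> i, k -> j
q_con = (W0.T @ W0).sum() / N**3 - p_hat**2   # converging: common target i -> k, j -> k
q_ch  = (W0 @ W0).sum() / N**3 - p_hat**2     # chains: j -> k -> i
print(q_div, q_con, q_ch)
```

For an Erdős-Rényi graph the motif counts match the chance level p² up to finite-size fluctuations, so all three q's vanish; structured networks (e.g. Song et al. 2005) show an excess.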

50 Mean correlations in structured networks

51 How do small motifs impact the correlation structure? (Diverging, converging, and chain motifs.)

52 Correlations with homogeneity
C^∞(ω) = (I − K(ω))^{−1} ỹ^0(ω) ỹ^0*(ω) (I − K*(ω))^{−1}
Assuming homogeneity of the uncoupled cells, and evaluating at ω = 0:
C^∞(0) = C^0(0) (I − ÃW)^{−1} (I − ÃW^T)^{−1}
After expanding and truncating at second order in connection strength, writing W = wW^0:
C^∞/C^0 ≈ I + ÃwW^0 + ÃwW^0T + Ã²w²W^0W^0T + Ã²w²(W^0)² + Ã²w²(W^0T)²
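The truncation can be sanity-checked numerically: for weak coupling, the product of the two resolvents is close to its second-order expansion. The toy Erdős-Rényi matrix and the value of Ãw below are assumptions chosen to sit well inside the convergence region.

```python
import numpy as np

# Compare (I - x W0)^{-1} (I - x W0^T)^{-1}, with x = A-tilde * w, against its
# second-order expansion I + x(W0 + W0^T) + x^2 (W0 W0^T + W0^2 + (W0^T)^2).
rng = np.random.default_rng(4)
N, p = 50, 0.1
W0 = (rng.random((N, N)) < p).astype(float)   # toy binary adjacency matrix
np.fill_diagonal(W0, 0)

x = 0.3 * 0.02    # illustrative A-tilde * w; N * x * p << 1, so the series converges
I = np.eye(N)
full = np.linalg.inv(I - x * W0) @ np.linalg.inv(I - x * W0.T)
second = (I + x * (W0 + W0.T)
            + x**2 * (W0 @ W0.T + W0 @ W0 + W0.T @ W0.T))
err = np.abs(full - second).max()
print(err)    # residual is third order and higher in x
```

The residual scales like x³, so halving the coupling strength shrinks it roughly eightfold.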

53 Averaged network correlations
C^∞/C^0 ≈ I + ÃwW^0 + ÃwW^0T + Ã²w²W^0W^0T + Ã²w²(W^0)² + Ã²w²(W^0T)²
Averaging over the network:
C̄^∞/C^0 ≈ 1/N + 2Ãwp + 3NÃ²w²p² + NÃ²w² q_div + 2NÃ²w² q_ch.

54 Resumming

55 Resumming
C̄^∞/C^0 = (1/N) [1 + Σ_{n=1}^∞ (NÃw)ⁿ Lᵀ⟨Wⁿ⟩L^0] [1 + Σ_{m=1}^∞ (NÃw)ᵐ Lᵀ⟨Wᵐ⟩L^0T] + (1/N) Σ_{n,m=1}^∞ (NÃw)^{n+m} Lᵀ⟨W^{n,m}⟩L^0
Keeping the contribution of second-order motifs and resumming:
C̄^∞/C^0 ≈ (1/N) [1 + (NÃw)² q_div / (1 − NÃwp)² + 2NÃw q_ch / (1 − NÃwp)].

57 Theory extends to networks of excitatory and inhibitory (E-I) cells.

58 Conclusion
- Linear response theory can be used to understand the statistical structure of population activity.
- Cross-correlation functions can be understood in terms of contributions from paths through the network; thus architecture and population activity can be related.
- This local theory applies to any network where interactions can be linearized.
- There is a lot more to do - see Bullmore and Sporns, Nat Rev Neurosci, 2009.

59 University of Houston Manisha Bhardwaj Becky Chen Manuel Lopez Robert Rosenbaum (Pitt) James Trousdale University of Washington, Seattle Eric Shea-Brown Yu Hu University of Pittsburgh Brent Doiron


More information

A LOWER BOUND ON BLOWUP RATES FOR THE 3D INCOMPRESSIBLE EULER EQUATION AND A SINGLE EXPONENTIAL BEALE-KATO-MAJDA ESTIMATE. 1.

A LOWER BOUND ON BLOWUP RATES FOR THE 3D INCOMPRESSIBLE EULER EQUATION AND A SINGLE EXPONENTIAL BEALE-KATO-MAJDA ESTIMATE. 1. A LOWER BOUND ON BLOWUP RATES FOR THE 3D INCOMPRESSIBLE EULER EQUATION AND A SINGLE EXPONENTIAL BEALE-KATO-MAJDA ESTIMATE THOMAS CHEN AND NATAŠA PAVLOVIĆ Abstract. We prove a Beale-Kato-Majda criterion

More information

The Spike Response Model: A Framework to Predict Neuronal Spike Trains

The Spike Response Model: A Framework to Predict Neuronal Spike Trains The Spike Response Model: A Framework to Predict Neuronal Spike Trains Renaud Jolivet, Timothy J. Lewis 2, and Wulfram Gerstner Laboratory of Computational Neuroscience, Swiss Federal Institute of Technology

More information

LINEAR SYSTEMS. J. Elder PSYC 6256 Principles of Neural Coding

LINEAR SYSTEMS. J. Elder PSYC 6256 Principles of Neural Coding LINEAR SYSTEMS Linear Systems 2 Neural coding and cognitive neuroscience in general concerns input-output relationships. Inputs Light intensity Pre-synaptic action potentials Number of items in display

More information

Autoassociative Memory Retrieval and Spontaneous Activity Bumps in Small-World Networks of Integrate-and-Fire Neurons

Autoassociative Memory Retrieval and Spontaneous Activity Bumps in Small-World Networks of Integrate-and-Fire Neurons Autoassociative Memory Retrieval and Spontaneous Activity Bumps in Small-World Networks of Integrate-and-Fire Neurons Anastasia Anishchenko Department of Physics and Brain Science Program Brown University,

More information

Exercises. Chapter 1. of τ approx that produces the most accurate estimate for this firing pattern.

Exercises. Chapter 1. of τ approx that produces the most accurate estimate for this firing pattern. 1 Exercises Chapter 1 1. Generate spike sequences with a constant firing rate r 0 using a Poisson spike generator. Then, add a refractory period to the model by allowing the firing rate r(t) to depend

More information

Correlated noise and the retina s population code for direction

Correlated noise and the retina s population code for direction Correlated noise and the retina s population code for direction Eric Shea-Brown Joel Zylberberg Jon Cafaro Max Turner Greg Schwartz Fred Rieke University of Washington 1 DS cell responses are noisy Stimulus

More information

Probabilistic Models in Theoretical Neuroscience

Probabilistic Models in Theoretical Neuroscience Probabilistic Models in Theoretical Neuroscience visible unit Boltzmann machine semi-restricted Boltzmann machine restricted Boltzmann machine hidden unit Neural models of probabilistic sampling: introduction

More information

(Feed-Forward) Neural Networks Dr. Hajira Jabeen, Prof. Jens Lehmann

(Feed-Forward) Neural Networks Dr. Hajira Jabeen, Prof. Jens Lehmann (Feed-Forward) Neural Networks 2016-12-06 Dr. Hajira Jabeen, Prof. Jens Lehmann Outline In the previous lectures we have learned about tensors and factorization methods. RESCAL is a bilinear model for

More information

Chapter 4 The Fourier Series and Fourier Transform

Chapter 4 The Fourier Series and Fourier Transform Chapter 4 The Fourier Series and Fourier Transform Fourier Series Representation of Periodic Signals Let x(t) be a CT periodic signal with period T, i.e., xt ( + T) = xt ( ), t R Example: the rectangular

More information

Math in systems neuroscience. Quan Wen

Math in systems neuroscience. Quan Wen Math in systems neuroscience Quan Wen Human brain is perhaps the most complex subject in the universe 1 kg brain 10 11 neurons 180,000 km nerve fiber 10 15 synapses 10 18 synaptic proteins Multiscale

More information

EXAMINING HETEROGENEOUS WEIGHT PERTURBATIONS IN NEURAL NETWORKS WITH SPIKE-TIMING-DEPENDENT PLASTICITY

EXAMINING HETEROGENEOUS WEIGHT PERTURBATIONS IN NEURAL NETWORKS WITH SPIKE-TIMING-DEPENDENT PLASTICITY EXAMINING HETEROGENEOUS WEIGHT PERTURBATIONS IN NEURAL NETWORKS WITH SPIKE-TIMING-DEPENDENT PLASTICITY by Colin Bredenberg University of Pittsburgh, 2017 Submitted to the Graduate Faculty of the Department

More information

arxiv: v3 [q-bio.nc] 18 Feb 2015

arxiv: v3 [q-bio.nc] 18 Feb 2015 1 Inferring Synaptic Structure in presence of eural Interaction Time Scales Cristiano Capone 1,2,5,, Carla Filosa 1, Guido Gigante 2,3, Federico Ricci-Tersenghi 1,4,5, Paolo Del Giudice 2,5 arxiv:148.115v3

More information

Discretization of SDEs: Euler Methods and Beyond

Discretization of SDEs: Euler Methods and Beyond Discretization of SDEs: Euler Methods and Beyond 09-26-2006 / PRisMa 2006 Workshop Outline Introduction 1 Introduction Motivation Stochastic Differential Equations 2 The Time Discretization of SDEs Monte-Carlo

More information

Single neuron models. L. Pezard Aix-Marseille University

Single neuron models. L. Pezard Aix-Marseille University Single neuron models L. Pezard Aix-Marseille University Biophysics Biological neuron Biophysics Ionic currents Passive properties Active properties Typology of models Compartmental models Differential

More information

Answer Key b c d e. 14. b c d e. 15. a b c e. 16. a b c e. 17. a b c d. 18. a b c e. 19. a b d e. 20. a b c e. 21. a c d e. 22.

Answer Key b c d e. 14. b c d e. 15. a b c e. 16. a b c e. 17. a b c d. 18. a b c e. 19. a b d e. 20. a b c e. 21. a c d e. 22. Math 20580 Answer Key 1 Your Name: Final Exam May 8, 2007 Instructor s name: Record your answers to the multiple choice problems by placing an through one letter for each problem on this answer sheet.

More information

GMM, HAC estimators, & Standard Errors for Business Cycle Statistics

GMM, HAC estimators, & Standard Errors for Business Cycle Statistics GMM, HAC estimators, & Standard Errors for Business Cycle Statistics Wouter J. Den Haan London School of Economics c Wouter J. Den Haan Overview Generic GMM problem Estimation Heteroskedastic and Autocorrelation

More information

Will Penny. 21st April The Macroscopic Brain. Will Penny. Cortical Unit. Spectral Responses. Macroscopic Models. Steady-State Responses

Will Penny. 21st April The Macroscopic Brain. Will Penny. Cortical Unit. Spectral Responses. Macroscopic Models. Steady-State Responses The The 21st April 2011 Jansen and Rit (1995), building on the work of Lopes Da Sliva and others, developed a biologically inspired model of EEG activity. It was originally developed to explain alpha activity

More information

Phase-locking in weakly heterogeneous neuronal networks

Phase-locking in weakly heterogeneous neuronal networks Physica D 118 (1998) 343 370 Phase-locking in weakly heterogeneous neuronal networks Carson C. Chow 1 Department of Mathematics and Center for BioDynamics, Boston University, Boston, MA 02215, USA Received

More information

A DYNAMICAL STATE UNDERLYING THE SECOND ORDER MAXIMUM ENTROPY PRINCIPLE IN NEURONAL NETWORKS

A DYNAMICAL STATE UNDERLYING THE SECOND ORDER MAXIMUM ENTROPY PRINCIPLE IN NEURONAL NETWORKS COMMUN. MATH. SCI. Vol. 15, No. 3, pp. 665 692 c 2017 International Press A DYNAMICAL STATE UNDERLYING THE SECOND ORDER MAXIMUM ENTROPY PRINCIPLE IN NEURONAL NETWORKS ZHI-QIN JOHN XU, GUOQIANG BI, DOUGLAS

More information

Spike Count Correlation Increases with Length of Time Interval in the Presence of Trial-to-Trial Variation

Spike Count Correlation Increases with Length of Time Interval in the Presence of Trial-to-Trial Variation NOTE Communicated by Jonathan Victor Spike Count Correlation Increases with Length of Time Interval in the Presence of Trial-to-Trial Variation Robert E. Kass kass@stat.cmu.edu Valérie Ventura vventura@stat.cmu.edu

More information

Introduction to Neural Networks

Introduction to Neural Networks Introduction to Neural Networks What are (Artificial) Neural Networks? Models of the brain and nervous system Highly parallel Process information much more like the brain than a serial computer Learning

More information

Dynamical systems in neuroscience. Pacific Northwest Computational Neuroscience Connection October 1-2, 2010

Dynamical systems in neuroscience. Pacific Northwest Computational Neuroscience Connection October 1-2, 2010 Dynamical systems in neuroscience Pacific Northwest Computational Neuroscience Connection October 1-2, 2010 What do I mean by a dynamical system? Set of state variables Law that governs evolution of state

More information

Signals & Systems. Lecture 5 Continuous-Time Fourier Transform. Alp Ertürk

Signals & Systems. Lecture 5 Continuous-Time Fourier Transform. Alp Ertürk Signals & Systems Lecture 5 Continuous-Time Fourier Transform Alp Ertürk alp.erturk@kocaeli.edu.tr Fourier Series Representation of Continuous-Time Periodic Signals Synthesis equation: x t = a k e jkω

More information

ENGIN 211, Engineering Math. Fourier Series and Transform

ENGIN 211, Engineering Math. Fourier Series and Transform ENGIN 11, Engineering Math Fourier Series and ransform 1 Periodic Functions and Harmonics f(t) Period: a a+ t Frequency: f = 1 Angular velocity (or angular frequency): ω = ππ = π Such a periodic function

More information

First Order Initial Value Problems

First Order Initial Value Problems First Order Initial Value Problems A first order initial value problem is the problem of finding a function xt) which satisfies the conditions x = x,t) x ) = ξ 1) where the initial time,, is a given real

More information

Factors affecting phase synchronization in integrate-and-fire oscillators

Factors affecting phase synchronization in integrate-and-fire oscillators J Comput Neurosci (26) 2:9 2 DOI.7/s827-6-674-6 Factors affecting phase synchronization in integrate-and-fire oscillators Todd W. Troyer Received: 24 May 25 / Revised: 9 November 25 / Accepted: November

More information

2.161 Signal Processing: Continuous and Discrete Fall 2008

2.161 Signal Processing: Continuous and Discrete Fall 2008 MIT OpenCourseWare http://ocw.mit.edu 2.6 Signal Processing: Continuous and Discrete Fall 2008 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. MASSACHUSETTS

More information

Lesson 2: Analysis of time series

Lesson 2: Analysis of time series Lesson 2: Analysis of time series Time series Main aims of time series analysis choosing right model statistical testing forecast driving and optimalisation Problems in analysis of time series time problems

More information

Robust regression and non-linear kernel methods for characterization of neuronal response functions from limited data

Robust regression and non-linear kernel methods for characterization of neuronal response functions from limited data Robust regression and non-linear kernel methods for characterization of neuronal response functions from limited data Maneesh Sahani Gatsby Computational Neuroscience Unit University College, London Jennifer

More information

Event-driven simulations of nonlinear integrate-and-fire neurons

Event-driven simulations of nonlinear integrate-and-fire neurons Event-driven simulations of nonlinear integrate-and-fire neurons A. Tonnelier, H. Belmabrouk, D. Martinez Cortex Project, LORIA, Campus Scientifique, B.P. 239, 54 56 Vandoeuvre-lès-Nancy, France Abstract

More information

Radial-Basis Function Networks

Radial-Basis Function Networks Radial-Basis Function etworks A function is radial () if its output depends on (is a nonincreasing function of) the distance of the input from a given stored vector. s represent local receptors, as illustrated

More information

Solving TSP Using Lotka-Volterra Neural Networks without Self-Excitatory

Solving TSP Using Lotka-Volterra Neural Networks without Self-Excitatory Solving TSP Using Lotka-Volterra Neural Networks without Self-Excitatory Manli Li, Jiali Yu, Stones Lei Zhang, Hong Qu Computational Intelligence Laboratory, School of Computer Science and Engineering,

More information

Decoding. How well can we learn what the stimulus is by looking at the neural responses?

Decoding. How well can we learn what the stimulus is by looking at the neural responses? Decoding How well can we learn what the stimulus is by looking at the neural responses? Two approaches: devise explicit algorithms for extracting a stimulus estimate directly quantify the relationship

More information

EE102 Homework 2, 3, and 4 Solutions

EE102 Homework 2, 3, and 4 Solutions EE12 Prof. S. Boyd EE12 Homework 2, 3, and 4 Solutions 7. Some convolution systems. Consider a convolution system, y(t) = + u(t τ)h(τ) dτ, where h is a function called the kernel or impulse response of

More information

OHSx XM521 Multivariable Differential Calculus: Homework Solutions 14.1

OHSx XM521 Multivariable Differential Calculus: Homework Solutions 14.1 OHSx XM5 Multivariable Differential Calculus: Homework Solutions 4. (8) Describe the graph of the equation. r = i + tj + (t )k. Solution: Let y(t) = t, so that z(t) = t = y. In the yz-plane, this is just

More information

REPEATED MEASURES. Copyright c 2012 (Iowa State University) Statistics / 29

REPEATED MEASURES. Copyright c 2012 (Iowa State University) Statistics / 29 REPEATED MEASURES Copyright c 2012 (Iowa State University) Statistics 511 1 / 29 Repeated Measures Example In an exercise therapy study, subjects were assigned to one of three weightlifting programs i=1:

More information

arxiv: v2 [q-bio.nc] 7 Nov 2013

arxiv: v2 [q-bio.nc] 7 Nov 2013 arxiv:1307.5728v2 [q-bio.nc] 7 Nov 2013 How adaptation currents change threshold, gain and variability of neuronal spiking Josef Ladenbauer 1,2, Moritz Augustin 1,2, Klaus Obermayer 1,2 1 Neural Information

More information

Information Theory. Mark van Rossum. January 24, School of Informatics, University of Edinburgh 1 / 35

Information Theory. Mark van Rossum. January 24, School of Informatics, University of Edinburgh 1 / 35 1 / 35 Information Theory Mark van Rossum School of Informatics, University of Edinburgh January 24, 2018 0 Version: January 24, 2018 Why information theory 2 / 35 Understanding the neural code. Encoding

More information