Supplementary information for: Synaptic polarity of the interneuron circuit controlling C. elegans locomotion

Franciszek Rakowski^a, Jagan Srinivasan^b, Paul W. Sternberg^b, and Jan Karbowski^{c,d}

a Interdisciplinary Center for Mathematical and Computational Modeling, University of Warsaw, Warsaw, Poland
b California Institute of Technology, Division of Biology, Pasadena, CA 91125, USA
c Institute of Applied Mathematics and Mechanics, University of Warsaw, Warsaw, Poland
d Institute of Biocybernetics and Biomedical Engineering, Polish Academy of Sciences, Warsaw, Poland
Activity equations for interneurons. The dynamics of the interneurons in the locomotory circuit are based on the derived Eq. (4) in the main text. They are represented by the following set of differential equations:

\[
\tau \frac{dAVB}{dt} = -AVB + \epsilon_{ASH} w_{AVB,ASH} H(ASH) + \epsilon_{PVC} w_{AVB,PVC} H(PVC)
+ \epsilon_{AVA} w_{AVB,AVA} H(AVA) + \epsilon_{AVD} w_{AVB,AVD} H(AVD)
+ \epsilon_{DVA} w_{AVB,DVA} H(DVA) + \epsilon_{AVB}^{2} \epsilon_{DVA}^{2}\, g_{AVB,DVA}(DVA - AVB)
+ \epsilon_{AVB}^{2}\, g_{AVB,F}(E_f - AVB) + \epsilon_{AVB}^{2}\, g_{AVB,B}(E_b - AVB) + X_{AVB}, \tag{1}
\]

\[
\tau \frac{dPVC}{dt} = -PVC + \epsilon_{AVA} w_{PVC,AVA} H(AVA) + \epsilon_{DVA} w_{PVC,DVA} H(DVA)
+ \epsilon_{AVD} w_{PVC,AVD} H(AVD) + \epsilon_{AVE} w_{PVC,AVE} H(AVE)
+ w_{PVC,F} H(E_f) + w_{PVC,B} H(E_b)
+ \epsilon_{PVC}^{2} \epsilon_{AVA}^{2}\, g_{PVC,AVA}(AVA - PVC) + \epsilon_{PVC}^{2} \epsilon_{DVA}^{2}\, g_{PVC,DVA}(DVA - PVC)
+ \epsilon_{PVC}^{2}\, g_{PVC,F}(E_f - PVC) + \epsilon_{PVC}^{2}\, g_{PVC,B}(E_b - PVC) + X_{PVC}, \tag{2}
\]
\[
\tau \frac{dDVA}{dt} = -DVA + \epsilon_{PVC} w_{DVA,PVC} H(PVC) + w_{DVA,F} H(E_f)
+ \epsilon_{DVA}^{2} \epsilon_{AVB}^{2}\, g_{DVA,AVB}(AVB - DVA) + \epsilon_{DVA}^{2} \epsilon_{PVC}^{2}\, g_{DVA,PVC}(PVC - DVA)
+ \epsilon_{DVA}^{2}\, g_{DVA,F}(E_f - DVA) + X_{DVA}, \tag{3}
\]

\[
\tau \frac{dAVA}{dt} = -AVA + \epsilon_{ASH} w_{AVA,ASH} H(ASH) + \epsilon_{AVB} w_{AVA,AVB} H(AVB)
+ \epsilon_{PVC} w_{AVA,PVC} H(PVC) + \epsilon_{AVD} w_{AVA,AVD} H(AVD)
+ \epsilon_{AVE} w_{AVA,AVE} H(AVE) + \epsilon_{DVA} w_{AVA,DVA} H(DVA)
+ w_{AVA,B} H(E_b) + \epsilon_{AVA}^{2} \epsilon_{PVC}^{2}\, g_{AVA,PVC}(PVC - AVA)
+ \epsilon_{AVA}^{2}\, g_{AVA,B}(E_b - AVA) + \epsilon_{AVA}^{2}\, g_{AVA,F}(E_f - AVA) + X_{AVA}, \tag{4}
\]

\[
\tau \frac{dAVD}{dt} = -AVD + \epsilon_{AVB} w_{AVD,AVB} H(AVB) + \epsilon_{PVC} w_{AVD,PVC} H(PVC)
+ \epsilon_{ASH} w_{AVD,ASH} H(ASH) + \epsilon_{AVA} w_{AVD,AVA} H(AVA)
+ \epsilon_{AVE} w_{AVD,AVE} H(AVE) + w_{AVD,B} H(E_b) + X_{AVD}, \tag{5}
\]
\[
\tau \frac{dAVE}{dt} = -AVE + \epsilon_{AVB} w_{AVE,AVB} H(AVB) + \epsilon_{PVC} w_{AVE,PVC} H(PVC)
+ \epsilon_{DVA} w_{AVE,DVA} H(DVA) + \epsilon_{ASH} w_{AVE,ASH} H(ASH)
+ \epsilon_{AVA} w_{AVE,AVA} H(AVA) + X_{AVE}, \tag{6}
\]

where AVB, PVC, DVA, AVA, AVD, AVE are the relative activities of the corresponding interneurons with respect to their resting values. The parameters w_{ij} are synaptic strengths, and g_{ij} are gap-junction (electrical) couplings, where the subscripts i and j denote the above interneurons. The symbol \epsilon_i denotes the synaptic polarity of neuron i; it assumes the value 1 (if the neuron is excitatory), -1 (if inhibitory), or 0 (if the neuron is absent because of ablation). Note that the gap-junction couplings contain the \epsilon_i^2 factors, which are either 1 (if neuron i is present) or 0 (if it is removed from the network). The parameter X_i describes a constant-in-time input coming from the upstream neurons to interneuron i. For all interneurons except PVC this input comes from the head neurons. It is represented by X_i = x_0 + \sigma z_i, where x_0 = 2.0 mV, \sigma is a variable parameter, and z_i is either 0 (weak input) or 1 (strong input). The parameter z_i, like \epsilon_i, is unknown; we want to find both of them for each interneuron. The above pre-motor interneurons make synaptic and gap-junction connections with downstream excitatory motor neurons. Two separate groups of these motor neurons, which generate forward and backward motion and are called B and A respectively, directly innervate the locomotory muscles. The activities of the excitatory motor neurons are given by:
\[
\tau \frac{dE_f}{dt} = -E_f + \epsilon_{PVC} w_{F,PVC} H(PVC) + \epsilon_{DVA} w_{F,DVA} H(DVA)
+ \epsilon_{AVA} w_{F,AVA} H(AVA) + \epsilon_{AVB} w_{F,AVB} H(AVB)
+ \epsilon_{AVD} w_{F,AVD} H(AVD) + \epsilon_{AVE} w_{F,AVE} H(AVE)
+ \epsilon_{AVB}^{2}\, g_{F,AVB}(AVB - E_f) + \epsilon_{AVA}^{2}\, g_{F,AVA}(AVA - E_f)
+ \epsilon_{PVC}^{2}\, g_{F,PVC}(PVC - E_f) + \epsilon_{DVA}^{2}\, g_{F,DVA}(DVA - E_f), \tag{7}
\]

and

\[
\tau \frac{dE_b}{dt} = -E_b + \epsilon_{AVA} w_{B,AVA} H(AVA) + \epsilon_{AVD} w_{B,AVD} H(AVD)
+ \epsilon_{AVE} w_{B,AVE} H(AVE) + \epsilon_{AVB} w_{B,AVB} H(AVB)
+ \epsilon_{PVC} w_{B,PVC} H(PVC) + \epsilon_{DVA} w_{B,DVA} H(DVA)
+ \epsilon_{AVA}^{2}\, g_{B,AVA}(AVA - E_b) + \epsilon_{AVB}^{2}\, g_{B,AVB}(AVB - E_b) + \epsilon_{PVC}^{2}\, g_{B,PVC}(PVC - E_b), \tag{8}
\]

where E_f and E_b are the total relative activities of the forward (type B) and backward (type A) motor neurons, respectively, measured from their resting voltages.

Three-state model of C. elegans locomotion. We assume that C. elegans locomotion can be described approximately by a three-state model. These three states correspond to forward movement, backward movement,
and stopped time (no motion). The average times the worm spends in each state are denoted T_f, T_b, and T_s, respectively. With each state we associate a probability of occurrence, as explained below. The probability of forward motion P_f can be written as

\[
P_f = Z^{-1} \exp\!\left[(E_f - E_b - \Delta)/\eta_0\right], \tag{9}
\]

where \Delta is some activity threshold for locomotion initiation, the parameter \eta_0 characterizes the level of noise in the system, and Z is the normalization factor. By symmetry considerations, we can write the probability of backward motion P_b as

\[
P_b = Z^{-1} \exp\!\left[(E_b - E_f - \Delta)/\eta_0\right]. \tag{10}
\]

The choice of the exponential function in P_f and P_b is motivated by two major arguments. First, with the exponentials both P_f and P_b are increasing and positive functions of the arguments E_f - E_b and E_b - E_f over their whole range of variability (from -\infty to +\infty), which is generally not the case for other simple choices, in particular polynomials. For example, the choice P_f \propto (E_f - E_b - \Delta)^n, with n an even integer, is not satisfactory because P_f is then a non-monotonic function of its argument (it has a minimum at E_f = E_b + \Delta). Similarly, if the exponent n is an odd integer, then P_f becomes negative for E_f < E_b + \Delta, which is clearly wrong. A second argument in favor of the exponential function in the probabilities is the fact that many
phenomena occurring in nature have a similar type of dependence. The probability that the nematode does not move, i.e. that it is in the stopped state, is

\[
P_s = Z^{-1} S_s\!\left(\frac{E_f - E_b}{\Delta}\right), \tag{11}
\]

where S_s(x) is some unknown function that should have the following properties. For |x| \ll 1, i.e. when |E_f - E_b| is much smaller than the motion activation threshold, the function S_s(x) \approx 1. For |x| \gg 1, i.e. when either E_f or E_b dominates, the worm should not be motionless, which corresponds to S_s(x) \to 0. Additionally, due to symmetry one should have S_s(-x) = S_s(x). The form of the S_s(x) function is however irrelevant for the kind of computations made in this paper (see below). From the normalization condition for probabilities, P_f + P_b + P_s = 1, we can determine Z, which allows us to find explicit forms for the state probabilities. With \tilde{S}_s \equiv e^{\Delta/\eta_0} S_s, the latter read:

\[
P_f = \frac{e^{(E_f - E_b)/\eta_0}}{e^{(E_f - E_b)/\eta_0} + e^{(E_b - E_f)/\eta_0} + \tilde{S}_s}, \tag{12}
\]

\[
P_b = \frac{e^{(E_b - E_f)/\eta_0}}{e^{(E_f - E_b)/\eta_0} + e^{(E_b - E_f)/\eta_0} + \tilde{S}_s}, \tag{13}
\]
\[
P_s = \frac{\tilde{S}_s}{e^{(E_f - E_b)/\eta_0} + e^{(E_b - E_f)/\eta_0} + \tilde{S}_s}. \tag{14}
\]

In the case when the activity of the forward motor neurons dominates over the rest (E_f \gg E_b + \Delta), we have P_f \to 1, P_b \to 0, and P_s \to 0. For a balanced activation of forward and backward motor neurons, i.e. when E_f \approx E_b, we have \tilde{S}_s \gg 1, which leads to P_f \approx P_b \ll 1 and P_s \approx 1. On the other hand, the probabilities P_f, P_b, and P_s are related to the times (T_f, T_b, T_s) the worm spends in the corresponding states. The average probability that C. elegans is in the forward state is P_f = T_f/(T_f + T_b + T_s). Similarly, P_b = T_b/(T_f + T_b + T_s) and P_s = T_s/(T_f + T_b + T_s). Thus, the ratio T_f/T_b is equal to P_f/P_b. Combining the above equations, we obtain

\[
T_f/T_b = \exp\!\left[(E_f - E_b)/\eta\right], \tag{15}
\]

where \eta = \eta_0/2. Note that the ratio of times associated with forward and backward locomotion depends neither on the activation threshold nor on the form of the S_s(x) function; it depends only on the difference in activation of the complementary motor neurons and on the level of noise in the system. It is interesting to stress that the quantity of empirical interest, T_f/(T_f + T_b), is equal to [1 + \exp((E_b - E_f)/\eta)]^{-1}. The latter expression is known as a sigmoidal logistic function, and serves as a transfer
function from neural activities to behavioral output.

In order to examine the robustness of our results, we also investigated another choice for T_f/T_b as a function of E_f - E_b, different from that given by Eq. (8) in the main text. Specifically, we tried the following function:

\[
T_f/T_b = \ln\!\left(1 + e^{(E_f - E_b)/\eta}\right)/\ln 2. \tag{16}
\]

Note that for E_b \gg E_f, Eqs. (15) and (16) yield ratios T_f/T_b that are proportional to each other. For E_f \gg E_b they behave differently: the ratio T_f/T_b increases with E_f - E_b much faster in Eq. (15) (exponentially) than in Eq. (16) (linearly). We found that the winning combinations had essentially the same ED values for both choices of T_f/T_b, which implies that Eqs. (15) and (16) yield equivalent results for the best configurations. This means that our results are not sensitive to the precise form of the transfer function between neural activities and locomotory output.
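The interneuron equations (1)–(6) share one generic structure: a leak term, polarity-signed chemical synapses, gap junctions gated by the \epsilon^2 factors, and a constant input X_i. A minimal forward-Euler sketch of that structure is given below. This is illustrative only, not the authors' code: the weight matrix w, coupling matrix g, polarity vector, and the sigmoidal form assumed for H are placeholders (the fitted values are not given in this excerpt), and the couplings to the motor-neuron activities E_f and E_b in Eqs. (1)–(4) and (7)–(8) are omitted for brevity.

```python
import numpy as np

# Generic form of Eqs. (1)-(6):
#   tau dV_i/dt = -V_i + sum_j eps_j w_ij H(V_j)
#               + sum_j eps_i^2 eps_j^2 g_ij (V_j - V_i) + X_i
# All parameter values below are illustrative placeholders.

NEURONS = ["AVB", "PVC", "DVA", "AVA", "AVD", "AVE"]
N = len(NEURONS)

def H(v):
    # Hypothetical sigmoidal synaptic transfer function.
    return 1.0 / (1.0 + np.exp(-v))

def simulate(w, g, eps, X, tau=1.0, dt=0.01, steps=20000):
    """Forward-Euler integration toward the steady-state activities."""
    V = np.zeros(N)
    G = g * np.outer(eps**2, eps**2)      # gap junctions vanish for ablated cells
    for _ in range(steps):
        syn = (w * eps[None, :]) @ H(V)   # chemical synapses, signed by presynaptic polarity
        gap = G @ V - G.sum(axis=1) * V   # sum_j G_ij (V_j - V_i)
        V = V + (dt / tau) * (-V + syn + gap + X)
    return V

rng = np.random.default_rng(1)
w = rng.uniform(0.0, 0.5, (N, N))         # placeholder synaptic strengths
g = rng.uniform(0.0, 0.1, (N, N))         # placeholder electrical couplings
eps = np.array([1, 1, 1, -1, -1, -1])     # one hypothetical polarity assignment
X = np.full(N, 2.0)                       # x0 = 2.0 mV baseline input, z_i = 0

V_ss = simulate(w, g, eps, X)
print(dict(zip(NEURONS, np.round(V_ss, 3))))
```

Ablating a neuron corresponds to setting its \epsilon_i to 0, which removes both its outgoing chemical synapses and its gap junctions from the network, exactly as the \epsilon_i and \epsilon_i^2 factors in Eqs. (1)–(8) prescribe.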
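The insensitivity of T_f/T_b to the stopped-state function can be checked numerically. The sketch below (illustrative, not the authors' code) builds the three state probabilities with the stopped-state term written as e^{\Delta/\eta_0} S_s, its form after normalizing Eqs. (9)–(11), and verifies that P_f/P_b = exp[(E_f - E_b)/\eta] with \eta = \eta_0/2. The Gaussian form chosen for S_s is purely hypothetical; it merely satisfies S_s(0) = 1, S_s(-x) = S_s(x), and S_s(x) → 0 for |x| ≫ 1.

```python
import numpy as np

def state_probabilities(Ef, Eb, eta0, Delta):
    # Stopped-state function: a hypothetical choice with the required properties.
    x = (Ef - Eb) / Delta
    Ss = np.exp(-x**2)
    Ss_t = np.exp(Delta / eta0) * Ss      # stopped-state term after normalization
    denom = np.exp((Ef - Eb) / eta0) + np.exp((Eb - Ef) / eta0) + Ss_t
    Pf = np.exp((Ef - Eb) / eta0) / denom
    Pb = np.exp((Eb - Ef) / eta0) / denom
    Ps = Ss_t / denom
    return Pf, Pb, Ps

# Illustrative parameter values (not fitted quantities from the paper).
Ef, Eb, eta0, Delta = 1.2, 0.4, 0.5, 1.0
Pf, Pb, Ps = state_probabilities(Ef, Eb, eta0, Delta)
eta = eta0 / 2.0
print(Pf + Pb + Ps)                       # normalization: sums to 1
print(Pf / Pb, np.exp((Ef - Eb) / eta))   # the two agree: S_s and Delta cancel
```

Changing \Delta (and hence S_s) alters P_s but leaves the ratio P_f/P_b untouched, which is why the form of S_s(x) is irrelevant for the computations based on Eq. (15).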
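The asymptotic comparison between the two transfer functions can likewise be confirmed directly. The snippet below (illustrative; the value of \eta is arbitrary here) evaluates Eq. (15) and Eq. (16) for d = E_f - E_b in both regimes: proportional ratios for E_b ≫ E_f, exponential versus linear growth for E_f ≫ E_b.

```python
import numpy as np

eta = 0.5  # arbitrary noise scale for the example

def ratio_exp(d):
    # Eq. (15): T_f/T_b = exp(d/eta)
    return np.exp(d / eta)

def ratio_log(d):
    # Eq. (16): T_f/T_b = ln(1 + exp(d/eta)) / ln 2
    return np.log1p(np.exp(d / eta)) / np.log(2.0)

# For E_b >> E_f (d << 0): ln(1+x) ~ x for small x, so the two ratios
# are proportional, with factor 1/ln 2.
print(ratio_log(-3.0) / ratio_exp(-3.0))   # close to 1/ln 2 ~ 1.44

# For E_f >> E_b (d >> 0): Eq. (16) grows only linearly, ~ d/(eta ln 2),
# while Eq. (15) grows exponentially.
print(ratio_exp(5.0), ratio_log(5.0))
print(ratio_exp(10.0), ratio_log(10.0))
```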
More informationBesides resistors, capacitors are one of the most common electronic components that you will encounter. Sometimes capacitors are components that one
1 Besides resistors, capacitors are one of the most common electronic components that you will encounter. Sometimes capacitors are components that one would deliberately add to a circuit. Other times,
More informationA SIMPLE MODEL OF A CENTRAL PATTERN GENERATOR FOR QUADRUPED GAITS
A SIMPLE MODEL OF A CENTRAL PATTERN GENERATOR FOR QUADRUPED GAITS JEFFREY MARSH Humankind s long association with four-legged animals, wild as well as domesticated, has produced a rich vocabulary of words
More informationQuantitative Electrophysiology
ECE 795: Quantitative Electrophysiology Notes for Lecture #4 Wednesday, October 4, 2006 7. CHEMICAL SYNAPSES AND GAP JUNCTIONS We will look at: Chemical synapses in the nervous system Gap junctions in
More informationVasil Khalidov & Miles Hansard. C.M. Bishop s PRML: Chapter 5; Neural Networks
C.M. Bishop s PRML: Chapter 5; Neural Networks Introduction The aim is, as before, to find useful decompositions of the target variable; t(x) = y(x, w) + ɛ(x) (3.7) t(x n ) and x n are the observations,
More informationEncoding of Both Analog- and Digital-like Behavioral Outputs by One C. elegans Interneuron
Article Encoding of Both Analog- and Digital-like Behavioral Outputs by One C. elegans Interneuron Zhaoyu Li, Jie Liu, Maohua Zheng, and X.Z. Shawn Xu, * Life Sciences Institute and Department of Molecular
More informationIntroduction to Neural Networks U. Minn. Psy 5038 Spring, 1999 Daniel Kersten. Lecture 2a. The Neuron - overview of structure. From Anderson (1995)
Introduction to Neural Networks U. Minn. Psy 5038 Spring, 1999 Daniel Kersten Lecture 2a The Neuron - overview of structure From Anderson (1995) 2 Lect_2a_Mathematica.nb Basic Structure Information flow:
More informationArtificial Neural Networks. Part 2
Artificial Neural Netorks Part Artificial Neuron Model Folloing simplified model of real neurons is also knon as a Threshold Logic Unit x McCullouch-Pitts neuron (943) x x n n Body of neuron f out Biological
More informationC4 Phenomenological Modeling - Regression & Neural Networks : Computational Modeling and Simulation Instructor: Linwei Wang
C4 Phenomenological Modeling - Regression & Neural Networks 4040-849-03: Computational Modeling and Simulation Instructor: Linwei Wang Recall.. The simple, multiple linear regression function ŷ(x) = a
More informationProbabilistic Models in Theoretical Neuroscience
Probabilistic Models in Theoretical Neuroscience visible unit Boltzmann machine semi-restricted Boltzmann machine restricted Boltzmann machine hidden unit Neural models of probabilistic sampling: introduction
More informationShangbang Gao, Ph. D
Shangbang Gao, Ph. D Professor, College of Life Science and Technology, Huazhong University of Science & Technology, 1037 Luoyu Road, Wuhan 430074 China Phone: (027)87792025 E-mail: sgao@hust.edu.cn Lab
More informationAT2 Neuromodeling: Problem set #3 SPIKE TRAINS
AT2 Neuromodeling: Problem set #3 SPIKE TRAINS Younesse Kaddar PROBLEM 1: Poisson spike trains Link of the ipython notebook for the code Brain neuron emit spikes seemingly randomly: we will aim to model
More informationVoltage-clamp and Hodgkin-Huxley models
Voltage-clamp and Hodgkin-Huxley models Read: Hille, Chapters 2-5 (best) Koch, Chapters 6, 8, 9 See also Clay, J. Neurophysiol. 80:903-913 (1998) (for a recent version of the HH squid axon model) Rothman
More information