
Mathematical Neuroscience

Course: Dr. Conor Houghton, 2010
Typeset: Cathal Ormond
May 6, 2011

Contents

1 Introduction
  1.1 The Brain
  1.2 Pyramidal Neuron
  1.3 Signalling
  1.4 Connection Between Neurons

2 Electrodynamics
  2.1 Introduction
  2.2 Equilibrium Potential
  2.3 Nernst Equation
  2.4 Gates and Transient Channels
    2.4.1 Persistent Channels
    2.4.2 Transient Channels
  2.5 Hodgkin-Huxley Model
  2.6 Integrate-and-Fire Models
  2.7 Synapses
  2.8 Post-Synaptic Conductances

3 Coding
  3.1 Spike Trains
  3.2 Tuning Curves
  3.3 Spike-Triggered Averages
  3.4 Linear Models
  3.5 Problems with the Linear Model
  3.6 Rate Based Spiking

Chapter 1: Introduction

1.1 The Brain

The brain consists of Neurons (grey matter) and Glial Cells (white matter). Neurons participate actively in signalling and in computations. Glial cells offer structural support and have a metabolic and modulating role. We will be dealing mostly with neurons.

1.2 Pyramidal Neuron

Table 1.1: Parts of a Pyramidal Neuron

The Soma is the cell body. It is the site of metabolic processes and contains the nucleus. This is where the incoming signals are integrated.

Dendrites carry signals into the soma. They are passive, in the sense that the signals diffuse, and they are quite short (approx. 4 mm).

Axons carry signals away from the soma, by active signalling. They are quite long (approx. 40 mm).

1.3 Signalling

dendrites: passive, the signal comes in.

soma: sums up the incoming signals with a time weighting,

    τ dV/dt = −V + signals

where V is the voltage and τ is a constant. The −V term is a linear relaxation; the signals term is the voltage change due to the incoming signals.

axons: actively propagate signals. If the voltage in the soma passes some threshold, a spike (or voltage pulse) is sent down the axon.

1.4 Connection Between Neurons

An axon terminates at a Synapse. When a spike arrives at a synapse, the voltage in the dendrite changes.

Table 1.2: A Synapse

Table 1.3: When a spike arrives at a synapse

The chemical gradients involved are sodium (Na+), potassium (K+), calcium (Ca2+) and chloride (Cl-). These gradients are maintained by ion pumps: tiny machines which consume energy while transporting ions. Ion Gates are ion-selective gated (i.e. open or closed) channels. The gate is usually controlled by voltage gradients or chemical signals. There are several types:

- Passive Channels: allow specific ions to leak through.
- Pumps: pump some ions in and some ions out, e.g. potassium in, sodium out.
- Gated Channels: can open or close in response to voltage gradients, concentration gradients or chemical signals.

Note: the word gradient is used here, but it is slightly misleading, in that the voltages and concentrations vary discontinuously across the membrane.

Spikes, aka action potentials, are voltage pulses which propagate along the axon. Depolarization is where the current flowing into the cell changes the membrane potential to less negative (more positive) values. The opposite is Hyperpolarization. If a neuron is depolarized enough to raise the membrane potential above a certain threshold, the neuron generates an action potential, called a Spike, which has an amplitude of about 100 mV and lasts about 1 ms. For a few milliseconds after a spike, it may be virtually impossible to have another spike. This is called the Absolute Refractory Period. For a longer interval (around 10 ms), known as the Relative Refractory Period, it is more difficult, but not impossible, to evoke an action potential. This is important, as action potentials are the only type of membrane potential fluctuation which can propagate over large distances.

Table 1.4: Voltage of a Spike

In the synapse, the voltage transient of the action potential opens ion channels, producing an influx of Ca2+ that prompts the release of a neurotransmitter. This binds to receptors at the postsynaptic (signal-receiving) side of the synapse, causing ion-conducting channels to open.

When a spike arrives at a synapse, it changes the voltage in the dendrite. A spike is non-linear. The energy for a spike comes from the energy stored in the membrane by the gradient, so the membrane sustains the spike. Spikes propagate without dissipation. At branches, the spike continues equally down each branch. If the pump shuts off, the cell can still produce thousands of spikes.

When a spike arrives at the synapse, the vesicles migrate towards the cleft and some of them burst. This migration is due to an increase in calcium levels. Channels open and ions can pass into or out of the dendrite, causing the change in voltage.

Chapter 2: Electrodynamics

2.1 Introduction

The neuron relies on moving ions around using voltages and dissipation (i.e. Brownian motion of ions and atoms). All particles have thermal energy, and this energy is on average proportional to the temperature; in particular, at temperature T we have

    E_ion = k_B T

where E_ion is the average energy per ion and k_B is the Boltzmann constant. We will calculate the typical voltage of a neuron, so that the voltage gaps will roughly have this potential energy.

A Mole of something is a specific number of constituent particles, namely Avogadro's number L ≈ 6.022 × 10^23. The thermal energy of a mole is given by RT, where R is the gas constant, R = L k_B ≈ 8.31 J/(mol K). We need the thermal energy to be similar to the potential gap due to voltages in the neuron. If you have a potential gap of V_T, then the energy required to move a charge q (the charge of one proton) across the gap is qV_T. Similarly, the energy required to move one mole of charged ions against a potential of V_T is F V_T, where F is Faraday's constant, F = qL. Balancing these, we get

    qV_T = k_B T,  so  V_T = k_B T / q = RT/F ≈ 27 mV

The intracellular resistance to current flow can cause substantial differences in the membrane potential measured in different parts of a neuron. Long, narrow stretches of dendritic or axonal cable can cause a high resistance. Neurons that have few of these may have relatively uniform membrane potentials across their surface. These are called Electrotonically Compact neurons.

Assume that the voltage is the same everywhere in the cell. This is equivalent to saying that the time-scales of dissipation across the cell are short compared to the other time-scales involved. This is a harmless enough assumption, as we are really dealing with a small section of membrane.

If we have a voltage across a membrane, charge is stored on the membrane. The amount of charge stored, Q, depends linearly on the voltage, so

    Q = CV
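As a quick numerical check, the thermal voltage quoted above can be computed directly from physical constants; the body-temperature value T ≈ 310 K is an assumption, since the notes only quote the result of about 27 mV.

```python
# Thermal voltage V_T = k_B T / q = R T / F at body temperature.
# T = 310 K is an assumed value; the notes only quote the ~27 mV result.
k_B = 1.380649e-23      # J/K, Boltzmann constant
q = 1.602176634e-19     # C, charge of one proton
R = 8.314               # J/(mol K), gas constant
F = 96485.0             # C/mol, Faraday's constant
T = 310.0               # K

print("k_B*T/q = %.1f mV" % (1e3 * k_B * T / q))   # ~26.7 mV
print("R*T/F   = %.1f mV" % (1e3 * R * T / F))     # same value, per mole
```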

where C is the capacitance, given by C = cA, where c is the specific capacitance per unit area and A is the area of the membrane. The current through the membrane is

    I = dQ/dt = C dV/dt

Ohm's Law tells us that

    I = V/R

i.e. that the current is linearly proportional to the voltage, where G = 1/R is the conductance and R is the resistance. We also have the specific resistance and specific conductance, given respectively by

    R = r/A,  G = gA

We may combine the above equations and check dimensions:

    [C] = [Q][V]^(-1),  [R] = [V][I]^(-1) = [V][Q]^(-1)[T],  so  [RC] = [T]

We wish to have one equation for V, but before we do this we need to think about chemical gradients, i.e. the differences in ion concentrations across the membrane. If there is a concentration difference and a high conductivity, there will be a current (for example, a sodium current) even in the absence of a potential gap. Ohm's Law can be modified in the presence of concentration differences:

    V − E_i = IR

where E_i is the Reversal Potential: the potential that would be required to prevent net diffusion across the barrier. This value will change as a current changes the concentrations. We will ignore this and assume that the current is small.

2.2 Equilibrium Potential

The Equilibrium Potential is the voltage gap required to prevent a current in the presence of a chemical gradient. It is given by the Nernst equation, which we will now derive. Imagine ions of charge zq, where q is the charge of a single proton and z = 1 for Na+. These ions will need energy zqV to cross the barrier. What is the probability that an ion has that energy? The distribution of energy is given by the Boltzmann distribution,

    p(ε) = (1/Z) exp(−ε/(k_B T))

so that

    P(ε_1 < energy of ion < ε_2) = ∫_{ε_1}^{ε_2} (1/Z) exp(−ε/(k_B T)) dε

Normalization of p(ε) implies that

    1 = ∫_0^∞ (1/Z) exp(−ε/(k_B T)) dε = [−(k_B T/Z) exp(−ε/(k_B T))]_0^∞ = k_B T / Z

which gives us Z = k_B T. We also have

    P(ε > zqV) = (1/(k_B T)) ∫_{zqV}^∞ exp(−ε/(k_B T)) dε = exp(−zqV/(k_B T)) = exp(−zV/V_T)

where V_T = k_B T / q is the typical (thermal) voltage.

2.3 Nernst Equation

Consider a cell barrier, and suppose the interior sits at a potential E relative to the exterior, with zE > 0. Ions outside must climb an energy barrier zqE to enter, so only a fraction exp(−zE/V_T) of them have enough energy to diffuse into the interior; all the ions inside have enough energy to diffuse out to the exterior. Let p_i and p_e be the concentrations of ions in the interior and exterior respectively. Assume, near equilibrium, that each diffusion flow is proportional to the concentration of energetically available ions. Balancing the two flows gives

    p_i = p_e exp(−zE/V_T)

so

    E = (V_T/z) log(p_e/p_i)

The latter is the Nernst Equation. Each ion has a different equilibrium potential; Na+ has a potential of c. 70 mV. The current for sodium is then given by

    i_Na = g_Na (V − E_Na)
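A small sketch of the Nernst equation in use is given below. The ion concentrations are typical textbook values for a mammalian neuron, supplied purely for illustration; only the formula E = (V_T/z) log(p_e/p_i) comes from the notes.

```python
# Nernst potentials E = (V_T / z) * ln(p_e / p_i).
# Concentrations (mM) are illustrative textbook values, not from the notes.
import numpy as np

V_T = 26.7   # mV, thermal voltage at body temperature

ions = {             # name: (valence z, exterior conc. p_e, interior conc. p_i)
    "Na+":  (1, 145.0, 15.0),
    "K+":   (1, 5.0, 140.0),
    "Cl-":  (-1, 110.0, 10.0),
    "Ca2+": (2, 2.0, 1e-4),
}

for name, (z, p_e, p_i) in ions.items():
    E = (V_T / z) * np.log(p_e / p_i)
    print("%-4s  E = %+6.1f mV" % (name, E))
```

With these numbers the sodium potential comes out at roughly +60 mV and the potassium potential at roughly −90 mV, which is why a sodium current depolarizes the cell while a potassium current repolarizes it.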

Combining the ionic currents, the Hodgkin-Huxley equation is given by

    c_m dV/dt = −i + I_e/A

where I_e is an electrode current. This accounts for experimental situations with an injected current; in the brain it is replaced by a synaptic current. The current for each ion is given by Ohm's law, i_x = (1/r_x)(V − E_x). We will frequently make use of the conductance g_x = 1/r_x. This gives:

    i = g_l (V − E_l) + g_Na (V − E_Na) + g_K (V − E_K)

where g_l (V − E_l) is the leak current: g_l is the conductance of all permanently open channels, whereas g_Na and g_K are the conductances through the gated channels, channels that generate a particular conductance and allow only one type of ion to pass through.

Models that describe the membrane potential of a neuron by just a single variable V are called Single-Compartment Models. The basic equation for all single-compartment models is, as above,

    c_m dV/dt = −i_m + I_e/A

Table 2.1: The Equivalent Circuit of a Neuron

The structure of such a model is the same as an electrical circuit, called an Equivalent Circuit, which consists of a capacitor and a set of variable and non-variable resistors corresponding to the different membrane conductances. The membrane resistance is given by Ohm's law, V = I R_m. Note that we will often use specific resistances and capacitances, denoted by small letters: R_m = r_m/A and C_m = c_m A, where A is the surface area of the neuron.

2.4 Gates and Transient Channels

2.4.1 Persistent Channels

Voltage-dependent channels open and close as a function of the membrane potential. A channel that acts like it has a single type of gate is a Persistent Channel; opening of the gate is called activation of the conductance. We denote the probability that a channel is open by P_k.

Table 2.2: The Equivalent Circuit of a Neuron

The opening of a persistent gate may involve a number of different changes. In general, if k independent, identical events are required for a channel to open, P_k can be written as

    P_k = n^k

where n is the probability that any one of the k independent gating events has occurred. Note that for the Hodgkin-Huxley equation we have k = 4. The rate at which the open probability for a gate changes is given by

    dn/dt = α_n (1 − n) − β_n n

where α_n is the opening rate and β_n is the closing rate. Simplifying, we have

    τ_n dn/dt = n_∞ − n,  with  τ_n = 1/(α_n + β_n),  n_∞ = α_n/(α_n + β_n)

We can look at this as an inhomogeneous first-order ODE:

    τ df/dt = f_∞ − f

Assume that f_∞ and τ are constant. Then solving this we have

    f(t) = f_∞ + (f(0) − f_∞) exp(−t/τ)

2.4.2 Transient Channels

Table 2.3: The Equivalent Circuit of a Neuron

The activation gate is coupled to a voltage sensor and acts like the gate in a persistent channel. A second gate, the inactivation gate, can block the channel once it is open. Only the middle panel of the figure corresponds to an open, ion-conducting state. Since the first gate acts like the one in the persistent channel, we can say that

    P(gate 1 is open) = m^k

where m is an activation variable similar to n from before and k is an integer. The ball acts as the second gate. We have

    P(ball does not block the channel pore) = h

where h is called the Inactivation Variable. The activation and inactivation variables m and h are distinguished by having opposite voltage dependencies. For the transient channel to conduct, both gates must be open; assuming they act independently, this has probability

    P_Na = m^3 h
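The relaxation formula above is easy to check numerically. The sketch below integrates the gating equation with Euler steps and compares it with the closed-form solution; the rate constants α_n and β_n are made-up illustrative values, since the notes do not specify them.

```python
# Gating variable relaxing towards its steady state:
# tau_n dn/dt = n_inf - n, with solution n(t) = n_inf + (n(0) - n_inf) exp(-t/tau_n).
# alpha_n and beta_n are illustrative values, not taken from the notes.
import numpy as np

alpha_n, beta_n = 0.2, 0.1            # opening and closing rates (1/ms)
tau_n = 1.0 / (alpha_n + beta_n)      # time constant
n_inf = alpha_n / (alpha_n + beta_n)  # steady-state open probability

n0, dt, T = 0.0, 0.01, 30.0
t = np.arange(0.0, T, dt)

# Euler integration of dn/dt = alpha_n (1 - n) - beta_n n
n_num = np.empty_like(t)
n = n0
for i in range(len(t)):
    n_num[i] = n
    n += dt * (alpha_n * (1.0 - n) - beta_n * n)

# closed-form solution for constant rates
n_exact = n_inf + (n0 - n_inf) * np.exp(-t / tau_n)

print("n_inf = %.3f, tau_n = %.2f ms, max |numeric - exact| = %.2e"
      % (n_inf, tau_n, np.abs(n_num - n_exact).max()))
```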

As with the persistent channels, we get

    dm/dt = α_m (1 − m) − β_m m
    dh/dt = α_h (1 − h) − β_h h

Functions m_∞ and h_∞ describing the steady-state activation and inactivation levels, and voltage-dependent time constants for m and h, can be defined as for persistent channels. To turn on a conductance maximally, it may first be necessary to hyperpolarize the neuron below its resting potential and then depolarize it. Hyperpolarization raises the value of the inactivation variable h; this is also called Deinactivation. The second step, depolarization, increases the value of m, the activation variable. Only when m and h are both non-zero is the conductance turned on. Note that the conductance can be reduced in magnitude by decreasing either m or h. Decreasing h is called Inactivation and decreasing m is called Deactivation.

2.5 Hodgkin-Huxley Model

The revised Hodgkin-Huxley model is described by

    i = ḡ_l (V − E_l) + ḡ_Na m^3 h (V − E_Na) + ḡ_K n^4 (V − E_K)

where the bar indicates a constant (maximal) conductance. This is constructed by writing the membrane current as the sum of a leakage current, a delayed-rectifier K+ current and a transient Na+ current.

A positive electrode current injected into the model causes an initial rise of the membrane potential. When the potential has risen to about −50 mV, the m variable that describes the activation of the Na+ conductance suddenly jumps from nearly 0 to nearly 1. Initially, the h variable (expressing the degree of inactivation of the Na+ conductance) is about 0.6. Thus, for a brief period, both m and h are significantly different from 0. This causes a large influx of Na+ ions, producing a sharp downward spike of inward current which causes the membrane potential to rise rapidly to around +50 mV, near the Na+ equilibrium potential. The rapid increase in both V and m is due to a positive feedback effect: depolarization of the membrane causes m to increase, and the resulting activation of the Na+ conductance makes V increase. The rise in V drives h towards 0, causing the Na+ current to shut off, and also activates the K+ conductance by driving n towards 1. This increases the K+ current, which drives the membrane potential back down to negative values. After V has returned to its resting value, the gates return to their resting values, and n, m and h relax back towards them. This is not instantaneous, and so there is a refractory period.

The Connor-Stevens Model provides an alternative description of action-potential generation. The membrane current in this model is given by:

    i_m = ḡ_l (V − E_l) + ḡ_Na m^3 h (V − E_Na) + ḡ_K n^4 (V − E_K) + ḡ_A a^3 b (V − E_A)
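The following is a minimal simulation sketch of the Hodgkin-Huxley equations described above, using forward Euler steps. The α and β rate functions and the parameter values are the standard squid-axon fits found in textbooks; they are not listed in these notes, so treat them as illustrative assumptions.

```python
# Minimal Hodgkin-Huxley simulation (forward Euler). Rate functions and
# parameters are standard textbook squid-axon values, assumed here since the
# notes do not list them.
import numpy as np

g_Na, g_K, g_L = 120.0, 36.0, 0.3      # maximal conductances (mS/cm^2)
E_Na, E_K, E_L = 50.0, -77.0, -54.4    # reversal potentials (mV)
c_m = 1.0                              # specific capacitance (uF/cm^2)

def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)
def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))

def simulate(I_e=10.0, T=50.0, dt=0.01):
    """Integrate c_m dV/dt = -i_m + I_e with a constant current I_e (uA/cm^2)."""
    steps = int(T / dt)
    V, n, m, h = -65.0, 0.32, 0.05, 0.6     # approximate resting values
    trace = np.empty(steps)
    for i in range(steps):
        # i_m = g_L (V-E_L) + g_Na m^3 h (V-E_Na) + g_K n^4 (V-E_K)
        i_m = (g_L * (V - E_L) + g_Na * m**3 * h * (V - E_Na)
               + g_K * n**4 * (V - E_K))
        V += dt * (-i_m + I_e) / c_m
        # gating variables: dx/dt = alpha_x (1 - x) - beta_x x
        n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
        m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
        trace[i] = V
    return trace

V = simulate()
print("peak membrane potential: %.1f mV" % V.max())   # spikes peak around +40 mV
```

Driving the model with a suprathreshold current produces a train of action potentials whose upstroke and repolarization follow the m, h and n story told above.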

This model has an additional K+ conductance (called the A-current) which is transient. The A-current causes the firing rate to rise continuously from 0, and to increase roughly linearly with the current over the range shown; this is known as a Type I neuron. If the A-current is switched off, the firing rate is much higher and jumps discontinuously to a non-zero value (Type II). The A-current also delays the occurrence of the first action potential (the A-current lowers the internal voltage and reduces spiking).

This model can be extended by including a transient Ca2+ conductance, e.g. in thalamocortical neurons. A transient Ca2+ conductance acts, in many ways, like a slower version of the transient Na+ conductance that generates action potentials. Instead of producing an action potential, a transient Ca2+ conductance generates a slower transient depolarization, sometimes called a Ca2+ spike. This causes the neuron to fire a burst of action potentials, which are Na+ spikes riding on the slower Ca2+ spike. Neurons can fire action potentials either at a steady rate or in bursts, even without current injection or synaptic input. Periodic bursting gives rise to transient Ca2+ spikes with action potentials riding on them. The Ca2+ current during these bursts causes a dramatic increase in the intracellular Ca2+ concentration. This activates a Ca2+-dependent K+ current, which, along with the inactivation of the Ca2+ current, terminates the burst. The interburst interval is determined primarily by the time it takes for the intracellular Ca2+ concentration to return to a low value, which deactivates the Ca2+-dependent K+ current, allowing another burst to be generated.

Membrane potentials can vary considerably over the surface of the cell membrane, especially for neurons with long and narrow processes, or if we consider rapidly changing membrane potentials. The attenuation and delay within a neuron are most severe when electrical signals travel down the long, narrow, cable-like structures of dendritic or axonal branches. For this reason, the mathematical analysis of signal propagation within neurons is called Cable Theory. The voltage drop across a cable segment of length Δx, radius a and intracellular resistivity r_L is

    ΔV = V(x + Δx) − V(x) = −R_L I_L,  with  R_L = r_L Δx / (πa^2)

so that

    I_L = −(πa^2 / (r_L Δx)) ΔV → −(πa^2 / r_L) ∂V/∂x

Many axons are covered with an insulating sheath of myelin, except at certain gaps, called the Nodes of Ranvier, where there is a high density of Na+ channels. There is no spike at the myelinated sections, since the myelin acts as an insulator, and so the signal travels there as a current. The signal therefore gets weaker but travels faster, and is actively regenerated at the nodes of Ranvier. Action potential propagation is thus sped up.

2.6 Integrate-and-Fire Models

The mechanisms by which K+ and Na+ conductances produce action potentials are well understood and can be modelled quite accurately. However, neuron models can be simplified, and simulations drastically accelerated, if these biophysical mechanisms are not explicitly included in the model. Integrate-and-Fire models do this by stating that an action potential

occurs whenever the membrane potential of the model neuron reaches a threshold value V_th. After the spike, the potential is reset to a value V_r, where V_r < V_th. Integrate-and-fire models only model the subthreshold membrane potential dynamics. In the simplest model, all active membrane conductances are ignored, including synaptic inputs, and the entire membrane conductance is modelled as a single passive leakage term:

    i_m = g_l (V − E_l)

This is known as the Leaky Integrate-and-Fire model. The membrane potential in this model is determined by

    c_m dV/dt = −g_l (V − E_l) + I_e/A

If we multiply across by r_m = 1/g_l and define τ_m = r_m c_m, then

    τ_m dV/dt = E_l − V + R_m I_e

To generate action potentials in the model, we augment this with the rule that whenever V reaches the threshold value V_th, an action potential is fired and the potential is reset to V_r. When I_e = 0 we have V → E_L, and so E_L is the resting potential. To get the membrane potential, we simply integrate the above equation. The firing rate of an integrate-and-fire model in response to a constant injected current can be computed analytically:

    V(t) = E_L + R_m I_e + (V(0) − E_L − R_m I_e) exp(−t/τ_m)

This is valid only as long as V(t) < V_th. Suppose that at t = 0 an action potential has just fired, so V(0) = V_r. If t_isi is the time to the next spike, then we have

    V_th = V(t_isi) = E_L + R_m I_e + (V_r − E_L − R_m I_e) exp(−t_isi/τ_m)

    exp(−t_isi/τ_m) = (V_th − E_L − R_m I_e) / (V_r − E_L − R_m I_e)

    t_isi = τ_m log( (R_m I_e + E_L − V_r) / (R_m I_e + E_L − V_th) )

whenever R_m I_e > V_th − E_L. Otherwise, t_isi = ∞. We call t_isi the Interspike Interval for the constant current I_e. Alternatively, we can calculate the interspike-interval firing rate of the neuron, r_isi:

    r_isi = 1/t_isi = [ τ_m log( (R_m I_e + E_L − V_r) / (R_m I_e + E_L − V_th) ) ]^(−1)

whenever R_m I_e > V_th − E_L; otherwise r_isi = 0. For sufficiently large values of I_e (i.e. R_m I_e ≫ V_th − E_L), we can use the linear approximation of the logarithm, log(1 + z) ≈ z, to see that

    r_isi ≈ (R_m I_e + E_L − V_th) / (τ_m (V_th − V_r))
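A quick sketch comparing this analytic interspike-interval rate with a direct Euler simulation of the same leaky integrate-and-fire equation is given below; the parameter values are illustrative assumptions, not numbers from the notes.

```python
# Leaky integrate-and-fire: analytic firing rate vs. direct simulation.
# Parameter values are illustrative assumptions.
import numpy as np

E_L, V_th, V_r = -65.0, -50.0, -65.0   # mV
tau_m, R_m = 10.0, 10.0                # ms, MOhm (R_m * I_e in mV when I_e is in nA)

def r_isi(I_e):
    """Analytic interspike-interval rate (spikes/ms); zero below rheobase."""
    if R_m * I_e <= V_th - E_L:
        return 0.0
    return 1.0 / (tau_m * np.log((R_m * I_e + E_L - V_r) /
                                 (R_m * I_e + E_L - V_th)))

def simulated_rate(I_e, T=2000.0, dt=0.01):
    """Euler integration of tau_m dV/dt = E_L - V + R_m I_e with threshold/reset."""
    V, spikes = E_L, 0
    for _ in range(int(T / dt)):
        V += dt * (E_L - V + R_m * I_e) / tau_m
        if V >= V_th:
            V = V_r
            spikes += 1
    return spikes / T

for I_e in (1.0, 2.0, 4.0):
    print("I_e = %.1f nA: analytic %.4f, simulated %.4f spikes/ms"
          % (I_e, r_isi(I_e), simulated_rate(I_e)))
```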

which shows that the firing rate grows linearly with I_e for large I_e.

Real neurons exhibit spike-rate adaptation, in that t_isi lengthens over time when a constant current is injected into the cell, before settling to a steady-state value, i.e. stabilizing. So far, our passive integrate-and-fire model has been based on two separate approximations: a highly simplified description of the action potential, and a linear approximation for the total membrane current. We will keep the first assumption, but we can still model the membrane current in as much detail as necessary. We can model spike-rate adaptation by including an additional current in the model:

    τ_m dV/dt = E_L − V − r_m g_sra (V − E_K) + R_m I_e

where the last term is the input, E_K lies below both E_L and V_r, and g_sra is the spike-rate adaptation conductance. It is modelled as a K+ conductance, so when activated it hyperpolarizes the neuron, i.e. moves it away from firing. Each time the neuron fires, g_sra is incremented; between spikes we assume that g_sra relaxes exponentially to 0 with a time constant τ_sra, i.e.

    τ_sra dg_sra/dt = −g_sra

Clearly, a non-zero g_sra changes the equilibrium potential:

    τ_m dV/dt = E_L + r_m g_sra E_K − (1 + r_m g_sra) V + inputs

    (τ_m / (1 + r_m g_sra)) dV/dt = (E_L + r_m g_sra E_K) / (1 + r_m g_sra) − V + reduced inputs

The refractory effect is not included in the basic integrate-and-fire model. Refractoriness can be incorporated by adding a conductance similar to g_sra described above, but with a much smaller τ and a much larger Δg (conductance increment).

2.7 Synapses

Synaptic transmission begins when a spike invades the pre-synaptic terminal and activates Ca2+ channels, leading to a rise in the concentration of Ca2+. Ca2+ enters the bouton, and the vesicles migrate to the cell membrane and fuse. They then burst, releasing neurotransmitter into the cleft. The transmitter diffuses across the cleft and binds to receptors on the post-synaptic neuron, leading to the opening of ion channels that modify the conductance of the post-synaptic neuron. The neurotransmitter is then reabsorbed and the channels close again. We want to model this mathematically.

A Ligand-Gated Channel is one which opens or closes in response to the binding of a neurotransmitter to a receptor. There are two broad classes of synaptic conductances:

- Ionotropic Gates: the neurotransmitter binds directly to the gate (fast, simple).

- Metabotropic Gates: the neurotransmitter binds to receptors that are not on the gate, but the binding initiates a biochemical process that opens the gate (and has other effects).

The two major neurotransmitters found in the brain are:

- Glutamate: an excitatory transmitter. The principal ionotropic receptors are AMPA and NMDA.
- GABA (gamma-aminobutyric acid): an inhibitory transmitter.

A synapse will have either glutamate or GABA, and then a mixture of the corresponding gates. As with a voltage-dependent conductance, a synaptic conductance can be written as a product of a maximal conductance and an open-channel probability:

    g_s = ḡ_s P

where P is the probability that an individual gate is open. P can be expressed as a product of two terms that reflect processes occurring on the pre- and post-synaptic sides of the synapse:

    P = P_r P_s

where P_r is the probability that transmitter is released by the pre-synaptic terminal following the arrival of an action potential. P_r varies to take account of vesicle depletion; here, we let P_r = 1.

2.8 Post-Synaptic Conductances

In a simple model of a directly activated receptor channel, the transmitter interacts with the channel through a binding reaction in which k transmitter molecules bind to a closed receptor and open it; in the reverse reaction, the transmitter molecules unbind from the receptor and it closes. This is modelled by

    dP_s/dt = α_s (1 − P_s) − β_s P_s

where β_s is the constant closing rate and α_s is the opening rate, which depends on the concentration of transmitter available for binding, i.e. α_s depends on the chance that a neurotransmitter molecule is close enough to a receptor to bind. We will assume that α_s ≫ β_s, so we can ignore β_s in our initial calculations. When an action potential invades the pre-synaptic terminal, the transmitter concentration rises and α_s grows rapidly, causing P_s to increase. P_s rises towards 1 with a time-scale τ_α = 1/α_s. Assume that the spike arrives at t = 0 and that the transmitter remains present for t ∈ [0, T]. Then we have

    P_s(t) = 1 + (P_s(0) − 1) exp(−t/τ_α)

If P_s(0) = 0, then

    P_s(t) = 1 − exp(−t/τ_α)

so the largest change in P_s occurs in this case. Following the release of the transmitter, the transmitter concentration falls rapidly. This sets α_s = 0, and P_s then decays exponentially with timescale τ_β = 1/β_s. Typically τ_β ≫ τ_α. The open probability takes its maximum value at t = T, and for t ≥ T it decays exponentially at a rate determined by β_s:

    P_s(t) = P_s(T) exp(−β_s (t − T))

If P_s(0) = 0 (as it will be if there is no synaptic release immediately before the release at t = 0), the maximum value for P_s is

    P_max = 1 − exp(−T/τ_α)

From the expression above, P_s(T) = P_s(0) + P_max (1 − P_s(0)); i.e. if a spike arrives at time t, then

    P_s(t + T) = P_s(t) + ΔP_s,  where  ΔP_s = P_max (1 − P_s(t))

One simple model leaves out the T-scale dynamics altogether, keeping only the decay

    τ_s dP_s/dt = −P_s  (with τ_s = τ_β)

together with the instantaneous update above whenever a spike arrives.

The model discussed above, i.e.

    P_s(t) = 1 + (P_s(0) − 1) exp(−t/τ_α)   for t ∈ [0, T]
    P_s(t) = P_s(T) exp(−β_s (t − T))       for t ≥ T

can be used to describe synapses with slower rise times, but there are many other models. One way of describing both the rise and the fall of a synaptic conductance is to express P_s as the difference of two exponentials:

    P_s(t) = B P_max (exp(−t/τ_1) − exp(−t/τ_2))

where τ_1 and τ_2 are the two time-scales of the response to a spike arriving when P_s = 0, and B is a normalization constant. This model allows for a smooth rise as well as a smooth fall. Another popular synaptic response is given by the α-function

    P_s(t) = (P_max t / τ_s) exp(1 − t/τ_s)

This model starts at 0, reaches its peak value at t = τ_s and decays with time constant τ_s. Again, this is favoured for its simplicity and because it somewhat resembles the actual conductance response, albeit with too slow a rise. As with the previous model, to implement it properly it should be understood as the solution of a differential equation.
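The sketch below evaluates the difference-of-exponentials response and the α-function for a single spike arriving at t = 0 with P_s(0) = 0, normalizing the first so that its peak is P_max. The time constants are illustrative assumptions.

```python
# Post-synaptic open-probability time courses for a single spike at t = 0:
# difference of exponentials and the alpha function. Time constants are
# illustrative assumptions.
import numpy as np

P_max = 0.8
tau_1, tau_2 = 5.0, 1.0     # ms, decay and rise time-scales (tau_1 > tau_2)
tau_s = 2.0                 # ms, alpha-function time constant

t = np.linspace(0.0, 30.0, 301)

# difference of exponentials, with B chosen so the peak equals P_max
t_peak = (tau_1 * tau_2 / (tau_1 - tau_2)) * np.log(tau_1 / tau_2)
B = 1.0 / (np.exp(-t_peak / tau_1) - np.exp(-t_peak / tau_2))
P_diff = B * P_max * (np.exp(-t / tau_1) - np.exp(-t / tau_2))

# alpha function, which peaks at t = tau_s by construction
P_alpha = (P_max * t / tau_s) * np.exp(1.0 - t / tau_s)

print("difference of exponentials: peak %.3f at t = %.1f ms"
      % (P_diff.max(), t[P_diff.argmax()]))
print("alpha function:             peak %.3f at t = %.1f ms"
      % (P_alpha.max(), t[P_alpha.argmax()]))
```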

Chapter 3: Coding

3.1 Spike Trains

A Spike Train is a series of spike times, and is the result of an extracellular recording. It is believed that the spike times are the information-carrying component of spike trains. If we ignore the brief duration of an action potential, we can just count the spikes. For n spikes, denote the spike times by t_i for i = 1, ..., n. The trial is taken to start at time 0 and end at time T. We write the spike train as

    ρ(t) = Σ_{i=1}^{n} δ(t − t_i)

The spike count is given by

    n = ∫_0^T ρ(τ) dτ

We denote the spike count rate by r, which is given by

    r = n/T = (1/T) ∫_0^T ρ(τ) dτ

The next step is to discretize time and produce a histogram, i.e. divide [0, T] into subintervals of the form [nδt, (n+1)δt] and define

    r(t) = ∫_{nδt}^{(n+1)δt} ρ(τ) dτ   for t ∈ [nδt, (n+1)δt]

i.e. r(t) is the number of spikes in the corresponding interval. We repeat over numerous trials, so that the firing rate is the average

    r(t) = ⟨ ∫_{nδt}^{(n+1)δt} ρ(τ) dτ ⟩_trials

A more sophisticated point of view would be to use a moving window,

    r(t) = ∫_{t−δt/2}^{t+δt/2} ρ(τ) dτ

which gives rise to a histogram without the rigid discretisation. Again, with multiple trials, the average is

    r(t) = ⟨ (1/δt) ∫_{t−δt/2}^{t+δt/2} ρ(τ) dτ ⟩_trials

or, more carefully,

    r(t) = lim_{#trials→∞} ⟨ (1/δt) ∫_{t−δt/2}^{t+δt/2} ρ(τ) dτ ⟩_trials

Thus r(t)δt is the number of spikes in [t − δt/2, t + δt/2], and if you average over trials, r(t)δt is the average number of spikes that fall in that interval. We regard a neuron as having a firing rate

    (1/δt) ⟨ ∫_{t−δt/2}^{t+δt/2} ρ(τ) dτ ⟩_trials

which may be approximated by r(t). The firing rate for a set of repeated trials at a resolution Δt is defined as

    r(t) = (1/Δt) ∫_t^{t+Δt} ⟨ρ(τ)⟩ dτ

so r(t)Δt is the average number of spikes occurring in [t, t + Δt), i.e. the fraction of trials on which a spike occurred in [t, t + Δt). Note:

- r is the spike count rate,
- r(t) is the firing rate,
- ⟨r⟩ is the average firing rate, equal to ⟨r⟩ = ⟨n⟩/T = (1/T) ∫_0^T ⟨ρ(τ)⟩ dτ.

In practice, the firing rate is something we calculate from a finite number of trials, and what matters is the usefulness of a given prescription for calculating the firing rate, in terms of how well it can be modelled. The basic point is that since spike trains are so variable, they do not by themselves give a good way to describe the response. In this description, the firing rate is a way of producing a smoothed, averaged quantity r̂ which can easily fit into models and be compared to experiments.

A simple way of extracting an estimate of the firing rate from a spike train is to divide time into discrete bins of duration Δt, count the number of spikes within each bin and divide by Δt, so

    r_approx(t) = Σ_{i=1}^{n} w(t − t_i)

where w is the window function defined by

    w(t) = 1/Δt  if t ∈ [−Δt/2, Δt/2],  and 0 otherwise

Alternatively, we can write

    r_approx(t) = ∫ w(τ) ρ(t − τ) dτ = (w ∗ ρ)(t)

This integral is called a Linear Filter, and the window function (also called the Filter Kernel) specifies how the neural response function evaluated at time t − τ contributes to the firing rate approximated at time t.
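A small sketch of this filtering estimate is given below, using the rectangular window defined above; the spike times are made up for illustration.

```python
# Firing-rate estimate r_approx(t) = sum_i w(t - t_i) with a rectangular
# window of width dt_bin. Spike times are hypothetical.
import numpy as np

spike_times = np.array([12.0, 15.5, 16.2, 40.0, 41.5, 43.0, 80.0])  # ms
T, dt = 100.0, 0.5        # trial length and evaluation step (ms)
dt_bin = 10.0             # window width Delta t (ms)

t = np.arange(0.0, T, dt)

def w_rect(s):
    """Rectangular filter kernel: 1/dt_bin inside [-dt_bin/2, dt_bin/2], else 0."""
    return np.where(np.abs(s) <= dt_bin / 2.0, 1.0 / dt_bin, 0.0)

# r_approx(t) = sum over spikes of w(t - t_i), in spikes per ms
r_approx = sum(w_rect(t - t_i) for t_i in spike_times)

print("peak estimated rate: %.2f spikes/ms" % r_approx.max())
print("integral of estimate: %.1f (should equal the spike count, %d)"
      % (r_approx.sum() * dt, len(spike_times)))
```

Swapping w_rect for a Gaussian, as discussed next, gives a smoother estimate without changing the structure of the calculation.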

This use of a sliding window avoids the arbitrariness of the bin placement and produces a rate that might appear to have better temporal resolution. One thing that is commonly done is to replace the filter kernel with a smoother function, like a Gaussian:

    w(t) = (1/(√(2π) σ_w)) exp(−t²/(2σ_w²))

In such a filter calculation, the choice of filter forms part of the prescription. Other choices include the causal filters

    w(t) = (1/τ_w) exp(−t/τ_w)  if t > 0,  and 0 otherwise

and

    w(t) = α² t exp(−αt)  if t > 0,  and 0 otherwise

There is no experimental evidence to show that any given filter is better than another, nor is there any derivation from first principles. The choice of Δt or σ_w does matter, and is usually made by validating against the data.

3.2 Tuning Curves

Neuronal responses typically depend on many different properties of the stimulus. A simple way of characterizing the response of a neuron is to count the number of action potentials fired during the presentation of a stimulus (repeated, ideally, an infinite number of times) and average. A Tuning Curve is the graph of the average firing rate r against some experimental parameter. Response tuning curves characterize the average response of a neuron to a given stimulus. We now consider the complementary procedure of averaging the stimuli that produce a given response. The resulting quantity, called the spike-triggered average stimulus, provides a useful way of characterizing neuronal selectivity. STAs are computed using stimuli characterized by a parameter s(t) that varies over time.

3.3 Spike-Triggered Averages

This is another way of describing the relationship between stimulus and response, and it will help us understand the linear models below. The Spike-Triggered Average Stimulus, denoted C(τ), is the average value of the stimulus at a time interval τ before a spike is fired. It is given by

    C(τ) = (1/n) Σ_{i=1}^{n} s(t_i − τ)

In other words, for a spike occurring at time t_i, we determine s(t_i − τ), sum over all n spikes in a trial and divide the total by n. In addition, we average over trials, so

    C(τ) = ⟨ (1/n) Σ_{i=1}^{n} s(t_i − τ) ⟩
         ≈ (1/⟨n⟩) ⟨ Σ_{i=1}^{n} s(t_i − τ) ⟩
         = (1/⟨n⟩) ∫_0^T ⟨ρ(t)⟩ s(t − τ) dt
         = (1/⟨n⟩) ∫_0^T r(t) s(t − τ) dt

which is the stimulus-response correlation. Correlation functions are a useful way of determining how two quantities that vary over time are related to each other. The correlation function of the firing rate and the stimulus is

    Q_rs(τ) = (1/T) ∫_0^T r(t) s(t + τ) dt

From this we can see that

    C(τ) = (1/⟨r⟩) Q_rs(−τ)

Because the argument of the correlation function is −τ, the STA stimulus is often called the reverse correlation function. The STA stimulus is widely used to study and characterize neural responses. Because C(τ) is the average value of the stimulus at a time τ before a spike, larger values of τ represent times further in the past relative to the triggering spike. For this reason, we plot the STA with the time axis going backwards compared to the normal convention. This allows the average spike-triggered stimulus to be read off the plots in the usual left-to-right order.

The results obtained by spike-triggered averaging depend on the particular set of stimuli used during an experiment. There are certain advantages to using a stimulus that is uncorrelated from one time to the next, e.g. a white-noise stimulus. This condition can be expressed using the stimulus-stimulus correlation function:

    Q_ss(τ) = (1/T) ∫_0^T s(t) s(t + τ) dt

If you had a white-noise stimulus, you might expect that for negative values of τ we have C(τ) = 0.
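The sketch below computes the spike-triggered average directly from its definition for a synthetic white-noise stimulus and a toy spike generator; every signal in it is made up for illustration.

```python
# Spike-triggered average C(tau) = (1/n) sum_i s(t_i - tau) from a discretized
# stimulus and spike times. The stimulus and the toy spike generator are
# synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0                               # ms per sample
s = rng.standard_normal(10000)         # white-noise stimulus s(t)
t = np.arange(len(s)) * dt

# toy encoder: spike whenever a causally smoothed version of s(t) is large
kernel = np.exp(-np.arange(20) / 5.0)
drive = np.convolve(s, kernel, mode="full")[:len(s)]
spike_idx = np.where(drive > 2.5)[0]

max_lag = 30
taus = np.arange(max_lag)
valid = spike_idx[spike_idx >= max_lag]    # spikes with a full stimulus history
C = np.array([s[valid - tau].mean() for tau in taus])

print("%d spikes; C(tau) peaks at tau = %d ms" % (len(valid), taus[C.argmax()]))
```

Because the toy encoder only looks at the recent past of the stimulus, the resulting C(τ) is large for small τ and falls towards zero at longer lags, and it would be flat for negative τ, exactly as the white-noise argument above suggests.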

3.4 Linear Models

From before, we noted that

    C(τ) = (1/⟨r⟩) Q_rs(−τ)

We can see that C(τ) depends only on Q_rs. However, a better description would take Q_ss into account: from the formula for Q_rs alone, we cannot be sure whether a non-zero value reflects a statistical relationship between s(t) and r(t + τ) or, for example, one between s(t) and s(t + τ) together with another between s(t + τ) and r(t + τ). Thus, the problems with the STA are:

1. no accounting for Q_ss,
2. it only depends on second-order statistics,
3. no accounting for spike-spike effects.

Linear Models can solve the first problem, but not the other two. We consider the estimate

    r̂(t) = r_0 + ∫ D(τ) s(t − τ) dτ

where r_0 is a constant which accounts for any background firing when s = 0, and D(τ) is a weighting factor that determines how strongly, and with what sign, the value of s(t − τ) affects the firing rate at time t. The integral in this equation is a linear filter of the same form as those defined before. In the linear model, a neuron has a kernel associated with it, and the predicted firing rate is the convolution of the kernel and the stimulus. We can think of this equation as being the first two terms in a Volterra Expansion, the functional equivalent of the Taylor series expansion used to generate power-series approximations of functions:

    r̂(t) = r_0 + ∫ D_1(τ) s(t − τ) dτ + ∫∫ D_2(τ_1, τ_2) s(t − τ_1) s(t − τ_2) dτ_1 dτ_2
               + ∫∫∫ D_3(τ_1, τ_2, τ_3) s(t − τ_1) s(t − τ_2) s(t − τ_3) dτ_1 dτ_2 dτ_3 + ...

The question now is: what is D(τ), and how do we calculate it? The standard method is reverse correlation. Without loss of generality, we absorb r_0 into r and r̂, i.e. we set r_0 = 0 (equivalently, work with r − r_0). We wish to choose the kernel D to minimize the squared difference between the estimated response r̂ and the actual measured response r, averaged over the duration of the trial T, i.e.

    ε = (1/T) ∫_0^T (r̂(t) − r(t))² dt

This is called the Objective Function. To optimize it we would set the functional derivative δε/δD(τ) to zero, a problem in the calculus of variations; instead, we phrase the problem as a simple variation. We send D(τ) to D(τ) + δD(τ) and calculate the corresponding variation in ε. Let ε′ be the new error under this change:

    ε′ = (1/T) ∫_0^T (r² − 2r r̂′ + (r̂′)²) dt

where the prime denotes the new estimate, not a derivative. We know that r̂′ is given by

    r̂′(t) = ∫ (D(τ) + δD(τ)) s(t − τ) dτ = r̂(t) + ∫ δD(τ) s(t − τ) dτ

If we let ε′ = ε + δε, we have

    δε = (1/T) ∫_0^T (r² − 2r r̂′ + (r̂′)²) dt − (1/T) ∫_0^T (r² − 2r r̂ + r̂²) dt
       = (1/T) ∫_0^T ( −2r (r̂′ − r̂) + (r̂′)² − r̂² ) dt
       = (1/T) ∫_0^T [ −2r(t) ∫ δD(τ) s(t − τ) dτ + 2 r̂(t) ∫ δD(τ) s(t − τ) dτ ] dt + O(δD²)
       = (2/T) ∫ δD(τ) [ ∫_0^T s(t − τ) ( r̂(t) − r(t) ) dt ] dτ + O(δD²)

where we have changed the order of integration. For the optimal D(τ) we need δε = 0 for every variation δD(τ), so we need

    ∫_0^T s(t − τ) ( r̂(t) − r(t) ) dt = 0

which is an integral equation for D(τ). Now consider

    ∫_0^T s(t − τ) r̂(t) dt = ∫_0^T s(t − τ) ∫ D(σ) s(t − σ) dσ dt
                           = ∫ D(σ) [ ∫_0^T s(t − τ) s(t − σ) dt ] dσ

Letting t′ = t − σ, we have

    ∫_0^T s(t − τ) s(t − σ) dt = ∫ s(t′ + σ − τ) s(t′) dt′ = T Q_ss(σ − τ)

while, recalling the definition of Q_rs,

    ∫_0^T s(t − τ) r(t) dt = T Q_rs(−τ)

Putting these together, we conclude that

    Q_rs(−τ) = ∫ Q_ss(σ − τ) D(σ) dσ = (D ∗ Q_ss)(τ)

since (it can be shown) Q_ss is an even function of τ. This method is known as reverse correlation because the firing rate-stimulus correlation function is evaluated at −τ in this equation.

What happens if the stimulus is white noise? If knowing s(t) tells you nothing about s(t + τ) for τ ≠ 0, then it can be argued that

    Q_ss(τ) = σ² δ(τ)

where σ² is the variance of s(t) at a point. Substituting this into the above equation, we have

    Q_rs(−τ) = (D ∗ Q_ss)(τ) = ∫ Q_ss(τ − τ′) D(τ′) dτ′ = σ² ∫ δ(τ − τ′) D(τ′) dτ′ = σ² D(τ)

whence we conclude

    D(τ) = (1/σ²) Q_rs(−τ)

Previously, we saw that the spike-triggered average is

    C(τ) = (1/⟨r⟩) Q_rs(−τ)

so

    D(τ) = (⟨r⟩/σ²) C(τ)

Thus, for a white-noise stimulus, the linear kernel is proportional to the STA. We can also think of the linear kernel as encoding information about how the neuron responds to the stimulus in a way that separates the response from the structure of the stimulus. This makes it useful in situations where we need to use a highly structured stimulus to study the sort of behaviour the neuron has when performing its computational tasks.

Calculating D(τ) can be tricky. At its simplest, we consider the Fourier Transform, and recall that

    F(f ∗ g) = F(f) F(g)

So we have

    F[Q_rs(−τ)] = F[Q_ss(τ)] F[D(τ)]

giving

    D(τ) = F⁻¹( F[Q_rs(−τ)] / F[Q_ss(τ)] )

However, this will not always work, as our convolution is not quite a true convolution. Also, F[Q_ss(τ)] is sometimes quite small, so the division can give rise to large errors.
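Below is a sketch of this Fourier-domain estimate for a synthetic stimulus and firing rate, with a small constant added to the denominator so that the near-zero values of F[Q_ss] mentioned above do not blow up the division. The kernel, noise level and regularizer are all illustrative assumptions.

```python
# Estimating D by Fourier deconvolution, D = F^{-1}( F[Q_rs(-tau)] / F[Q_ss] ),
# with a small regularizer in the denominator. All signals are synthetic.
import numpy as np

rng = np.random.default_rng(1)
N = 20000
s = rng.standard_normal(N)                     # approximately white stimulus

# ground-truth kernel used to generate a synthetic firing rate r(t)
taus = np.arange(50)
D_true = np.exp(-taus / 10.0) * np.sin(taus / 5.0)
r = np.convolve(s, D_true, mode="full")[:N] + 0.1 * rng.standard_normal(N)

S = np.fft.rfft(s)
R = np.fft.rfft(r)
Q_ss_hat = np.conj(S) * S / N                  # F[Q_ss]
Q_rs_rev_hat = np.conj(S) * R / N              # F[Q_rs(-tau)]

eps = 1e-3 * Q_ss_hat.real.mean()              # guards against tiny denominators
D_est = np.fft.irfft(Q_rs_rev_hat / (Q_ss_hat + eps), n=N)[:len(D_true)]

print("max |D_est - D_true| = %.3f" % np.max(np.abs(D_est - D_true)))
```

The regularizer plays the same role as discarding the small eigenvalues of the Q_ss matrix in the discretized approach described next.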

Another approach is to rewrite the equation as a matrix equation by discretizing time. We write

    Q_rs(−τ) → Q_rs(−nδτ) = Q_rs^n  (a vector)
    D(τ) → D^n = D(nδτ)  (a vector)
    Q_ss^{nn′} = Q_ss((n − n′)δτ)  (a matrix)

and we can see that

    Q_rs^n = Σ_{n′} Q_ss^{nn′} D^{n′}

so

    D^n = Σ_{n′} (Q_ss^{−1})^{nn′} Q_rs^{n′}

It turns out that Q_ss^{nn′} is always invertible, but often its eigenvalues are small, and these then dominate the inverse matrix.

3.5 Problems with the Linear Model

The linear model has the following failures:

1. The objective function ε is not chosen from first principles; there is a subtle model dependence in what we did.
2. s(t) introduces more model dependence in real applications.
3. There are no spikes. Ideally, we would take a stimulus, estimate the spike train it evokes based on a given model, and compare that to the experimentally determined spike train. Here, we estimate the firing rate based on our model and compare it to the firing rate of the spike train determined by experiment.

We consider the following questions:

1. Are there better models that produce spikes instead of firing rates?
2. Alternatively, can we supplement a firing rate model with a model that gives spikes?
3. How does a spike model relate to the definition of ε?

3.6 Rate Based Spiking

The idea is that the probability of a spike depends only on the current value of the firing rate. This gives us a Poisson process. For small time intervals Δt, the probability of a spike in [t, t + Δt] is

    P(spike in [t, t + Δt]) ≈ r(t) Δt   as Δt → 0

We are interested in the probability of a given spike train, so, writing [a_i, b_i] for small windows around the spike times,

    P(a_1 < t_1 < b_1, ..., a_n < t_n < b_n) = ∫_{a_1}^{b_1} ... ∫_{a_n}^{b_n} P[t_1, t_2, ..., t_n] dt_1 ... dt_n

Here, P[t_1, t_2, ..., t_n] is the probability density. Note that if we have n spikes, the probability density for those spikes occurring at times (t_1, t_2, ..., t_n) is the sum of the probabilities of the spikes occurring at (t_{σ(1)}, t_{σ(2)}, ..., t_{σ(n)}), where σ is a permutation of {1, 2, ..., n}. Each spike has a constant (uniform) distribution over [0, T], so we get

    P[t_1, t_2, ..., t_n] = (n!/T^n) P[n]

where P[n] is the probability that n spikes occur.

To calculate P[n], we divide the interval into M subintervals of width Δt = T/M. We can assume that Δt is sufficiently small that we never get two spikes within any one subinterval, because at the end of the calculation we take Δt → 0. The probability of a spike occurring in one specific subinterval is rΔt, and the probability of n spikes occurring in n given subintervals is (rΔt)^n. Similarly, the probability that a spike does not occur in a subinterval is (1 − rΔt), so the probability of having the remaining M − n subintervals without spikes is (1 − rΔt)^{M−n}. Finally, the number of ways of putting n spikes into M subintervals is M!/((M − n)! n!). This gives us

    P[n] = lim_{M→∞} ( M!/((M − n)! n!) ) (rT/M)^n (1 − rT/M)^{M−n}

To take the limit, we note that as Δt → 0, M grows without bound because MΔt = T. Because n is fixed, we can write M − n ≈ M. Using this approximation and defining ε = −rΔt, we find that

    lim_{Δt→0} (1 − rΔt)^{M−n} = lim_{ε→0} ( (1 + ε)^{1/ε} )^{−rT} = e^{−rT}

Also, for large enough M, M!/(M − n)! ≈ M^n, so we have

    P[n] = ( (rT)^n / n! ) e^{−rT}

which is the Poisson Distribution. We can compute the mean and standard deviation of this distribution:

    ⟨n⟩ = Σ_{n=0}^∞ n P[n]
        = Σ_{n=1}^∞ n ( (rT)^n / n! ) e^{−rT}
        = rT e^{−rT} Σ_{n=1}^∞ (rT)^{n−1} / (n − 1)!
        = rT e^{−rT} Σ_{m=0}^∞ (rT)^m / m!
        = rT

Note that

    ⟨n²⟩ = Σ_{n=0}^∞ n² P[n]
         = Σ_{n=2}^∞ n(n − 1) P[n] + Σ_{n=0}^∞ n P[n]
         = Σ_{n=2}^∞ n(n − 1) ( (rT)^n / n! ) e^{−rT} + rT
         = (rT)² e^{−rT} Σ_{n=2}^∞ (rT)^{n−2} / (n − 2)! + rT
         = (rT)² + rT

so the variance is given by

    σ_n² = ⟨n²⟩ − ⟨n⟩² = rT

In general, the ratio of the variance to the mean is known as the Fano Factor, F = σ_n²/⟨n⟩. For a homogeneous Poisson process, F = 1. However, even with a homogeneous stimulus, the Fano factor measured for real spike trains is usually greater than 1. We have the following considerations:

- Homogeneous Poisson spiking does not describe real spike trains. More interesting evidence is provided by the distribution of the inter-spike intervals. The probability density of the time interval between adjacent spikes is called the inter-spike interval distribution, and it is a useful statistic for characterizing spiking patterns. Let t_i be the time between spikes. A similar argument to the previous one shows that

    p(t_i) = r e^{−r t_i}

- Neuronal spiking is clearly not Poisson. For a start, there is the refractory period. Even if it were Poisson, it is unlikely that it would be homogeneous.

In the inhomogeneous Poisson process, the firing rate is not constant, but the probability of getting a spike still depends only on the current value of the firing rate r(t). We need a formula for P[t_1, t_2, ..., t_n]. Consider the time between two spikes at t_i and t_{i+1}, and divide it into M subintervals of width Δt = (t_{i+1} − t_i)/M. Then

    P[no spike] = Π_{m=1}^{M} (1 − r(t_m) Δt)

where r(t_m)Δt is the probability of a spike in the m-th subinterval. The trick is to take logarithms, so

    log P[no spike] = Σ_{m=1}^{M} log(1 − r(t_m) Δt) ≈ −Σ_{m=1}^{M} r(t_m) Δt

recalling that log(1 + z) ≈ z for small enough z. Assuming that r is well behaved, in the limit Δt → 0 we have

    log P[no spike] = −∫_{t_i}^{t_{i+1}} r(t) dt

Thus we have

    P[no spike] = exp( −∫_{t_i}^{t_{i+1}} r(t) dt )

Combining this over all the gaps between spikes, and including a factor r(t_i) for each spike, we have

    P[t_1, t_2, ..., t_n] = Π_{i=1}^{n} r(t_i) exp( −∫_{t_i}^{t_{i+1}} r(t) dt )
                         = exp( −Σ_i ∫_{t_i}^{t_{i+1}} r(t) dt ) Π_{i=1}^{n} r(t_i)
                         = exp( −∫_0^T r(t) dt ) Π_{i=1}^{n} r(t_i)

where the exponents combine because the gaps together cover the whole interval [0, T].
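To close the section, here is a sketch that generates spike trains from this rate-based picture by laying down a spike in each small bin with probability r(t)Δt, and checks that the homogeneous case has a Fano factor near 1. The rates, bin size and trial count are illustrative assumptions.

```python
# Rate-based (Poisson) spike generation: a spike lands in each small bin with
# probability r(t) * dt. Rates and trial counts are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
T, dt = 1000.0, 0.1             # trial length and bin width (ms)
t = np.arange(0.0, T, dt)

def poisson_spikes(rate):
    """rate: scalar or array over t, in spikes/ms. Returns spike times (ms)."""
    p = np.broadcast_to(np.asarray(rate) * dt, t.shape)
    return t[rng.random(t.shape) < p]

# homogeneous case: spike counts over trials should have Fano factor close to 1
r0 = 0.02                       # 20 Hz
counts = np.array([len(poisson_spikes(r0)) for _ in range(500)])
print("mean count %.1f (rT = %.1f), Fano factor %.2f"
      % (counts.mean(), r0 * T, counts.var() / counts.mean()))

# inhomogeneous case: sinusoidally modulated rate r(t)
r_t = r0 * (1.0 + np.sin(2 * np.pi * t / 200.0))
print("inhomogeneous trial: %d spikes" % len(poisson_spikes(r_t)))
```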


Deconstructing Actual Neurons 1 Deconstructing Actual Neurons Richard Bertram Department of Mathematics and Programs in Neuroscience and Molecular Biophysics Florida State University Tallahassee, Florida 32306 Reference: The many ionic

More information

Biological Modeling of Neural Networks

Biological Modeling of Neural Networks Week 4 part 2: More Detail compartmental models Biological Modeling of Neural Networks Week 4 Reducing detail - Adding detail 4.2. Adding detail - apse -cable equat Wulfram Gerstner EPFL, Lausanne, Switzerland

More information

Exercises. Chapter 1. of τ approx that produces the most accurate estimate for this firing pattern.

Exercises. Chapter 1. of τ approx that produces the most accurate estimate for this firing pattern. 1 Exercises Chapter 1 1. Generate spike sequences with a constant firing rate r 0 using a Poisson spike generator. Then, add a refractory period to the model by allowing the firing rate r(t) to depend

More information

BME 5742 Biosystems Modeling and Control

BME 5742 Biosystems Modeling and Control BME 5742 Biosystems Modeling and Control Hodgkin-Huxley Model for Nerve Cell Action Potential Part 1 Dr. Zvi Roth (FAU) 1 References Hoppensteadt-Peskin Ch. 3 for all the mathematics. Cooper s The Cell

More information

37 Neurons, Synapses, and Signaling

37 Neurons, Synapses, and Signaling CAMPBELL BIOLOGY IN FOCUS Urry Cain Wasserman Minorsky Jackson Reece 37 Neurons, Synapses, and Signaling Lecture Presentations by Kathleen Fitzpatrick and Nicole Tunbridge Overview: Lines of Communication

More information

Passive Membrane Properties

Passive Membrane Properties Passive Membrane Properties Communicating through a leaky garden hose... Topics I Introduction & Electrochemical Gradients Passive Membrane Properties Action Potentials Voltage-Gated Ion Channels Topics

More information

Fundamentals of the Nervous System and Nervous Tissue

Fundamentals of the Nervous System and Nervous Tissue Chapter 11 Part B Fundamentals of the Nervous System and Nervous Tissue Annie Leibovitz/Contact Press Images PowerPoint Lecture Slides prepared by Karen Dunbar Kareiva Ivy Tech Community College 11.4 Membrane

More information

Voltage-clamp and Hodgkin-Huxley models

Voltage-clamp and Hodgkin-Huxley models Voltage-clamp and Hodgkin-Huxley models Read: Hille, Chapters 2-5 (best) Koch, Chapters 6, 8, 9 See also Clay, J. Neurophysiol. 80:903-913 (1998) (for a recent version of the HH squid axon model) Rothman

More information

Nervous System Organization

Nervous System Organization The Nervous System Chapter 44 Nervous System Organization All animals must be able to respond to environmental stimuli -Sensory receptors = Detect stimulus -Motor effectors = Respond to it -The nervous

More information

Integration of synaptic inputs in dendritic trees

Integration of synaptic inputs in dendritic trees Integration of synaptic inputs in dendritic trees Theoretical Neuroscience Fabrizio Gabbiani Division of Neuroscience Baylor College of Medicine One Baylor Plaza Houston, TX 77030 e-mail:gabbiani@bcm.tmc.edu

More information

Resting Distribution of Ions in Mammalian Neurons. Outside Inside (mm) E ion Permab. K Na Cl

Resting Distribution of Ions in Mammalian Neurons. Outside Inside (mm) E ion Permab. K Na Cl Resting Distribution of Ions in Mammalian Neurons Outside Inside (mm) E ion Permab. K + 5 100-81 1.0 150 15 +62 0.04 Cl - 100 10-62 0.045 V m = -60 mv V m approaches the Equilibrium Potential of the most

More information

Neurons, Synapses, and Signaling

Neurons, Synapses, and Signaling CAMPBELL BIOLOGY IN FOCUS URRY CAIN WASSERMAN MINORSKY REECE 37 Neurons, Synapses, and Signaling Lecture Presentations by Kathleen Fitzpatrick and Nicole Tunbridge, Simon Fraser University SECOND EDITION

More information

1 Single neuron computation and the separation of processing and signaling

1 Single neuron computation and the separation of processing and signaling Physics 178/278 - David Kleinfeld - Winter 2017 - Week 1 1 Single neuron computation and the separation of processing and signaling This overview serves to connect simple notions of statistical physics

More information

COGNITIVE SCIENCE 107A

COGNITIVE SCIENCE 107A COGNITIVE SCIENCE 107A Electrophysiology: Electrotonic Properties 2 Jaime A. Pineda, Ph.D. The Model Neuron Lab Your PC/CSB115 http://cogsci.ucsd.edu/~pineda/cogs107a/index.html Labs - Electrophysiology

More information

Ch. 5. Membrane Potentials and Action Potentials

Ch. 5. Membrane Potentials and Action Potentials Ch. 5. Membrane Potentials and Action Potentials Basic Physics of Membrane Potentials Nerve and muscle cells: Excitable Capable of generating rapidly changing electrochemical impulses at their membranes

More information

PROPERTY OF ELSEVIER SAMPLE CONTENT - NOT FINAL. The Nervous System and Muscle

PROPERTY OF ELSEVIER SAMPLE CONTENT - NOT FINAL. The Nervous System and Muscle The Nervous System and Muscle SECTION 2 2-1 Nernst Potential 2-2 Resting Membrane Potential 2-3 Axonal Action Potential 2-4 Neurons 2-5 Axonal Conduction 2-6 Morphology of Synapses 2-7 Chemical Synaptic

More information

Membrane Potentials, Action Potentials, and Synaptic Transmission. Membrane Potential

Membrane Potentials, Action Potentials, and Synaptic Transmission. Membrane Potential Cl Cl - - + K + K+ K + K Cl - 2/2/15 Membrane Potentials, Action Potentials, and Synaptic Transmission Core Curriculum II Spring 2015 Membrane Potential Example 1: K +, Cl - equally permeant no charge

More information

BIOELECTRIC PHENOMENA

BIOELECTRIC PHENOMENA Chapter 11 BIOELECTRIC PHENOMENA 11.3 NEURONS 11.3.1 Membrane Potentials Resting Potential by separation of charge due to the selective permeability of the membrane to ions From C v= Q, where v=60mv and

More information

Neurons: Cellular and Network Properties HUMAN PHYSIOLOGY POWERPOINT

Neurons: Cellular and Network Properties HUMAN PHYSIOLOGY POWERPOINT POWERPOINT LECTURE SLIDE PRESENTATION by LYNN CIALDELLA, MA, MBA, The University of Texas at Austin Additional text by J Padilla exclusively for physiology at ECC UNIT 2 8 Neurons: PART A Cellular and

More information

Lecture 11 : Simple Neuron Models. Dr Eileen Nugent

Lecture 11 : Simple Neuron Models. Dr Eileen Nugent Lecture 11 : Simple Neuron Models Dr Eileen Nugent Reading List Nelson, Biological Physics, Chapter 12 Phillips, PBoC, Chapter 17 Gerstner, Neuronal Dynamics: from single neurons to networks and models

More information

An Introductory Course in Computational Neuroscience

An Introductory Course in Computational Neuroscience An Introductory Course in Computational Neuroscience Contents Series Foreword Acknowledgments Preface 1 Preliminary Material 1.1. Introduction 1.1.1 The Cell, the Circuit, and the Brain 1.1.2 Physics of

More information

Topics in Neurophysics

Topics in Neurophysics Topics in Neurophysics Alex Loebel, Martin Stemmler and Anderas Herz Exercise 2 Solution (1) The Hodgkin Huxley Model The goal of this exercise is to simulate the action potential according to the model

More information

Nervous System Organization

Nervous System Organization The Nervous System Nervous System Organization Receptors respond to stimuli Sensory receptors detect the stimulus Motor effectors respond to stimulus Nervous system divisions Central nervous system Command

More information

Balance of Electric and Diffusion Forces

Balance of Electric and Diffusion Forces Balance of Electric and Diffusion Forces Ions flow into and out of the neuron under the forces of electricity and concentration gradients (diffusion). The net result is a electric potential difference

More information

Neural Encoding: Firing Rates and Spike Statistics

Neural Encoding: Firing Rates and Spike Statistics Neural Encoding: Firing Rates and Spike Statistics Dayan and Abbott (21) Chapter 1 Instructor: Yoonsuck Choe; CPSC 644 Cortical Networks Background: Dirac δ Function Dirac δ function has the following

More information

Action Potential (AP) NEUROEXCITABILITY II-III. Na + and K + Voltage-Gated Channels. Voltage-Gated Channels. Voltage-Gated Channels

Action Potential (AP) NEUROEXCITABILITY II-III. Na + and K + Voltage-Gated Channels. Voltage-Gated Channels. Voltage-Gated Channels NEUROEXCITABILITY IIIII Action Potential (AP) enables longdistance signaling woohoo! shows threshold activation allornone in amplitude conducted without decrement caused by increase in conductance PNS

More information

Action Potentials & Nervous System. Bio 219 Napa Valley College Dr. Adam Ross

Action Potentials & Nervous System. Bio 219 Napa Valley College Dr. Adam Ross Action Potentials & Nervous System Bio 219 Napa Valley College Dr. Adam Ross Review: Membrane potentials exist due to unequal distribution of charge across the membrane Concentration gradients drive ion

More information

The Nervous System. Nerve Impulses. Resting Membrane Potential. Overview. Nerve Impulses. Resting Membrane Potential

The Nervous System. Nerve Impulses. Resting Membrane Potential. Overview. Nerve Impulses. Resting Membrane Potential The Nervous System Overview Nerve Impulses (completed12/03/04) (completed12/03/04) How do nerve impulses start? (completed 19/03/04) (completed 19/03/04) How Fast are Nerve Impulses? Nerve Impulses Nerve

More information

Voltage-clamp and Hodgkin-Huxley models

Voltage-clamp and Hodgkin-Huxley models Voltage-clamp and Hodgkin-Huxley models Read: Hille, Chapters 2-5 (best Koch, Chapters 6, 8, 9 See also Hodgkin and Huxley, J. Physiol. 117:500-544 (1952. (the source Clay, J. Neurophysiol. 80:903-913

More information

Neuroscience 201A Exam Key, October 7, 2014

Neuroscience 201A Exam Key, October 7, 2014 Neuroscience 201A Exam Key, October 7, 2014 Question #1 7.5 pts Consider a spherical neuron with a diameter of 20 µm and a resting potential of -70 mv. If the net negativity on the inside of the cell (all

More information

Organization of the nervous system. Tortora & Grabowski Principles of Anatomy & Physiology; Page 388, Figure 12.2

Organization of the nervous system. Tortora & Grabowski Principles of Anatomy & Physiology; Page 388, Figure 12.2 Nervous system Organization of the nervous system Tortora & Grabowski Principles of Anatomy & Physiology; Page 388, Figure 12.2 Autonomic and somatic efferent pathways Reflex arc - a neural pathway that

More information

Lecture 10 : Neuronal Dynamics. Eileen Nugent

Lecture 10 : Neuronal Dynamics. Eileen Nugent Lecture 10 : Neuronal Dynamics Eileen Nugent Origin of the Cells Resting Membrane Potential: Nernst Equation, Donnan Equilbrium Action Potentials in the Nervous System Equivalent Electrical Circuits and

More information

2401 : Anatomy/Physiology

2401 : Anatomy/Physiology Dr. Chris Doumen Week 6 2401 : Anatomy/Physiology Action Potentials NeuroPhysiology TextBook Readings Pages 400 through 408 Make use of the figures in your textbook ; a picture is worth a thousand words!

More information

Bio 449 Fall Exam points total Multiple choice. As with any test, choose the best answer in each case. Each question is 3 points.

Bio 449 Fall Exam points total Multiple choice. As with any test, choose the best answer in each case. Each question is 3 points. Name: Exam 1 100 points total Multiple choice. As with any test, choose the best answer in each case. Each question is 3 points. 1. The term internal environment, as coined by Clause Bernard, is best defined

More information

BIOLOGY. Neurons, Synapses, and Signaling CAMPBELL. Reece Urry Cain Wasserman Minorsky Jackson

BIOLOGY. Neurons, Synapses, and Signaling CAMPBELL. Reece Urry Cain Wasserman Minorsky Jackson CAMPBELL BIOLOGY TENTH EDITION Reece Urry Cain Wasserman Minorsky Jackson 48 Neurons, Synapses, and Signaling Lecture Presentation by Nicole Tunbridge and Kathleen Fitzpatrick Lines of Communication The

More information

CSE/NB 528 Final Lecture: All Good Things Must. CSE/NB 528: Final Lecture

CSE/NB 528 Final Lecture: All Good Things Must. CSE/NB 528: Final Lecture CSE/NB 528 Final Lecture: All Good Things Must 1 Course Summary Where have we been? Course Highlights Where do we go from here? Challenges and Open Problems Further Reading 2 What is the neural code? What

More information

The Neuron - F. Fig. 45.3

The Neuron - F. Fig. 45.3 excite.org(anism): Electrical Signaling The Neuron - F. Fig. 45.3 Today s lecture we ll use clickers Review today 11:30-1:00 in 2242 HJ Patterson Electrical signals Dendrites: graded post-synaptic potentials

More information

Nervous System: Part II How A Neuron Works

Nervous System: Part II How A Neuron Works Nervous System: Part II How A Neuron Works Essential Knowledge Statement 3.E.2 Continued Animals have nervous systems that detect external and internal signals, transmit and integrate information, and

More information

Ch 33. The nervous system

Ch 33. The nervous system Ch 33 The nervous system AP bio schedule Tuesday Wed Thursday Friday Plant test Animal behavior lab Nervous system 25 Review Day (bring computer) 27 Review Day (bring computer) 28 Practice AP bio test

More information

MATH 3104: THE HODGKIN-HUXLEY EQUATIONS

MATH 3104: THE HODGKIN-HUXLEY EQUATIONS MATH 3104: THE HODGKIN-HUXLEY EQUATIONS Parallel conductance model A/Prof Geoffrey Goodhill, Semester 1, 2009 So far we have modelled neuronal membranes by just one resistance (conductance) variable. We

More information

PNS Chapter 7. Membrane Potential / Neural Signal Processing Spring 2017 Prof. Byron Yu

PNS Chapter 7. Membrane Potential / Neural Signal Processing Spring 2017 Prof. Byron Yu PNS Chapter 7 Membrane Potential 18-698 / 42-632 Neural Signal Processing Spring 2017 Prof. Byron Yu Roadmap Introduction to neuroscience Chapter 1 The brain and behavior Chapter 2 Nerve cells and behavior

More information

Model Neurons II: Conductances and Morphology

Model Neurons II: Conductances and Morphology Chapter 6 Model Neurons II: Conductances and Morphology 6.1 Levels of Neuron Modeling In modeling neurons, we must deal with two types of complexity; the intricate interplay of active conductances that

More information

The homogeneous Poisson process

The homogeneous Poisson process The homogeneous Poisson process during very short time interval Δt there is a fixed probability of an event (spike) occurring independent of what happened previously if r is the rate of the Poisson process,

More information

BIOL Week 5. Nervous System II. The Membrane Potential. Question : Is the Equilibrium Potential a set number or can it change?

BIOL Week 5. Nervous System II. The Membrane Potential. Question : Is the Equilibrium Potential a set number or can it change? Collin County Community College BIOL 2401 Week 5 Nervous System II 1 The Membrane Potential Question : Is the Equilibrium Potential a set number or can it change? Let s look at the Nernst Equation again.

More information

Ionic basis of the resting membrane potential. Foundations in Neuroscience I, Oct

Ionic basis of the resting membrane potential. Foundations in Neuroscience I, Oct Ionic basis of the resting membrane potential Foundations in Neuroscience I, Oct 3 2017 The next 4 lectures... - The resting membrane potential (today) - The action potential - The neural mechanisms behind

More information

Electrophysiology of the neuron

Electrophysiology of the neuron School of Mathematical Sciences G4TNS Theoretical Neuroscience Electrophysiology of the neuron Electrophysiology is the study of ionic currents and electrical activity in cells and tissues. The work of

More information

Chapter 37 Active Reading Guide Neurons, Synapses, and Signaling

Chapter 37 Active Reading Guide Neurons, Synapses, and Signaling Name: AP Biology Mr. Croft Section 1 1. What is a neuron? Chapter 37 Active Reading Guide Neurons, Synapses, and Signaling 2. Neurons can be placed into three groups, based on their location and function.

More information

Particles with opposite charges (positives and negatives) attract each other, while particles with the same charge repel each other.

Particles with opposite charges (positives and negatives) attract each other, while particles with the same charge repel each other. III. NEUROPHYSIOLOGY A) REVIEW - 3 basic ideas that the student must remember from chemistry and physics: (i) CONCENTRATION measure of relative amounts of solutes in a solution. * Measured in units called

More information

80% of all excitatory synapses - at the dendritic spines.

80% of all excitatory synapses - at the dendritic spines. Dendritic Modelling Dendrites (from Greek dendron, tree ) are the branched projections of a neuron that act to conduct the electrical stimulation received from other cells to and from the cell body, or

More information

/639 Final Solutions, Part a) Equating the electrochemical potentials of H + and X on outside and inside: = RT ln H in

/639 Final Solutions, Part a) Equating the electrochemical potentials of H + and X on outside and inside: = RT ln H in 580.439/639 Final Solutions, 2014 Question 1 Part a) Equating the electrochemical potentials of H + and X on outside and inside: RT ln H out + zf 0 + RT ln X out = RT ln H in F 60 + RT ln X in 60 mv =

More information

ACTION POTENTIAL. Dr. Ayisha Qureshi Professor MBBS, MPhil

ACTION POTENTIAL. Dr. Ayisha Qureshi Professor MBBS, MPhil ACTION POTENTIAL Dr. Ayisha Qureshi Professor MBBS, MPhil DEFINITIONS: Stimulus: A stimulus is an external force or event which when applied to an excitable tissue produces a characteristic response. Subthreshold

More information

Curtis et al. Il nuovo Invito alla biologia.blu BIOLOGY HIGHLIGHTS KEYS

Curtis et al. Il nuovo Invito alla biologia.blu BIOLOGY HIGHLIGHTS KEYS BIOLOGY HIGHLIGHTS KEYS Watch the videos and download the transcripts of this section at: online.scuola.zanichelli.it/curtisnuovoinvitoblu/clil > THE HUMAN NERVOUS SYSTEM 2. WARM UP a) The structures that

More information

Computational Explorations in Cognitive Neuroscience Chapter 2

Computational Explorations in Cognitive Neuroscience Chapter 2 Computational Explorations in Cognitive Neuroscience Chapter 2 2.4 The Electrophysiology of the Neuron Some basic principles of electricity are useful for understanding the function of neurons. This is

More information

Propagation& Integration: Passive electrical properties

Propagation& Integration: Passive electrical properties Fundamentals of Neuroscience (NSCS 730, Spring 2010) Instructor: Art Riegel; email: Riegel@musc.edu; Room EL 113; time: 9 11 am Office: 416C BSB (792.5444) Propagation& Integration: Passive electrical

More information