
Analysis and simulation of a model of intrinsic processing in the Basal Ganglia
K. Gurney, P. Redgrave, A. Prescott
Dept. of Psychology, University of Sheffield, Sheffield S10 2TP

Abstract

We have recently suggested (Redgrave et al., 1997, 1998) that the basal ganglia (BG) play a crucial role in solving the action selection problem in vertebrates. We now present a quantitative model of intrinsic BG processing which shows how this might occur. We assume that several command systems with different levels of salience are requesting their actions to be enabled by the BG. Under dopaminergic control the striatum extracts the salience, and selects a subset of actions for subsequent processing by running a series of winner-take-all competitions mediated by short-range recurrent connections. Each such winning command system defines a channel of information flow through individual BG nuclei, with the exception of the subthalamic nucleus (STN), which integrates information across channels. The model has been subject to an analysis of its equilibrium states and full dynamic computer simulation. There are three main results: (i) Diffuse excitation from STN and focused inhibition from striatum combine to provide input to a feedforward, off-centre on-surround network whose output layer is the globus pallidus internal segment (GPi) and substantia nigra pars reticulata (SNr). This network operates in conjunction with local striatal competitions so that the striatum-STN-GPi/SNr complex forms the primary selection pathway within the BG. (ii) The GPe forms the output of a separate control pathway (whose input is also provided by STN and striatum) which supplies control signals to the switching network. For example, GPe dynamically limits excitation of STN when the latter is driven by a variable number of competing actions. (iii) Differential modulation of each pathway by increased dopamine D1/D2 receptor activation works synergistically to induce less exclusive action selection.
We conclude that the functionality of the BG anatomy may be more accurately described in terms of selection and control pathways rather than the more usual direct and indirect pathways, and that more promiscuous action selection is fostered by the increased presence of dopamine.

Contents

1 Introduction
  1.1 Basal ganglia: anatomy and physiology
  1.2 Basal ganglia: the conventional model
2 From anatomy to functional architecture
  2.1 Basal ganglia operation for action selection
  2.2 Signal selection
  2.3 Reinterpreting the functional anatomy
    Feedforward networks for selection
    Intra-striatal mechanisms
  2.4 Results
3 Quantitative Model: description
  3.1 Artificial neuron model
  3.2 Channels
  3.3 Striatum
  3.4 STN
    The discrete model
    Homogeneous model
  3.5 GPe
  3.6 GPi/SNr
4 Quantitative model: results
  4.1 Activity of sub-nuclei: GPe; STN; GPi/SNr
  4.2 Basic selection mechanisms: inter-channel signal relationships; signal changes with respect to the tonic level; simulation results
  4.3 The effect of dopamine: change in absolute GPi/SNr activation; change in inter-channel activations; signal selection
  4.4 Comparison with neurophysiological data
5 Discussion: main findings; specific features; model limitations; comparison with other models; outstanding issues; general conclusion

A Selection function metrics
B Striatal recurrent networks
C Operation of selection mechanism
  C.1 X + h and X + d compared
  C.2 Change of GPi activation under change in salience
    C.2.1 Homogeneous model
    C.2.2 Discrete model
  C.3 The effect of dopamine
    C.3.1 Dopamine in the selection pathway
    C.3.2 Dopamine in the control pathway
  C.4 Comparing channel activities
    C.4.1 GPe
    C.4.2 GPi

1 Introduction

This paper deals with a model of intrinsic processing within the basal ganglia which is predicated on the hypothesis that the basal ganglia work to enable action selection within vertebrates. The justification of this viewpoint has been articulated in full elsewhere (Redgrave et al., 1997; Redgrave et al., 1998) and will be taken as our point of departure. The primary goal of the work presented here was to build a model, constrained by the general patterns of connectivity and functionality within the basal ganglia, which fulfils the role of a selection mechanism. In determining a general functional architecture we were also driven by the constraint that the basal ganglia can exploit only those selection mechanisms available to neural network circuits. Within this framework, we sought to identify the specific contributions of individual components to the overall selection function. Analysis and simulation of the resulting fully quantitative model provides evidence for our hypothesis of action selection, identifies novel properties of the intrinsic basal ganglia connectivity, and describes the effects of dopaminergic modulation.

1.1 Basal ganglia: anatomy and physiology

There are several excellent recent reviews of basal ganglia anatomy and physiology (Mink, 1996; Gerfen and Wilson, 1996; Wickens, 1997). Here we will describe only those features which are essential to an understanding of the proposed model.
The principal components of the primate basal ganglia are the striatum, the globus pallidus (GP), and the subthalamic nucleus (STN) in the forebrain, and the substantia nigra (SN) in the midbrain. These nuclei and their interconnection pathways are shown schematically in figure 1. The striatum comprises the caudate, putamen and nucleus accumbens. In the current model we refer generically to the striatum without attempting to differentiate its subdivisions. In primates the globus pallidus is divided into the internal and external segments (GPi and GPe). The substantia nigra is also subdivided into the ventral pars reticulata (SNr) and the

[Figure 1: basal ganglia anatomy. Key: excitatory, inhibitory and dopaminergic connections; input areas are cortex/thalamus/limbic; GPe, globus pallidus (external segment); GPi, globus pallidus (internal segment); STN, subthalamic nucleus; SNc, substantia nigra pars compacta; SNr, substantia nigra pars reticulata; GPi/SNr provide the BG output.]

dorsal pars compacta (SNc). Homologous structures (often with different names) are found in the nervous systems of other vertebrates. The input structures to the basal ganglia are the striatum and the STN, which receive input from many areas of cerebral cortex and thalamus. The GPi and SNr constitute the output nuclei and project to ventral thalamus, superior colliculus and other mid-brain and brain-stem areas. Striatum and STN both project to GPi and GPe but have different patterns of connections. The STN provides a diffuse set of efferents to the pallidal nuclei so that each cell in GPe and GPi/SNr receives input from large areas of STN. The striatum, on the other hand, supplies a more focussed projection in which pallidal cells receive input from localised receptive fields in striatum. It will be shown that this differential pattern of innervation is crucial to the selection function. Finally, GPe projects to GPi/SNr and STN, while SNc provides an important input, mainly to the striatum, which uses dopamine as a neurotransmitter. Within striatum, several features may be singled out for attention in connection with selection mechanisms (Mink, 1996; Wickens, 1993; Kawaguchi, 1997). First, an overwhelming proportion (> 90%) of neurons in striatum are medium spiny cells which project to GPe, GPi and SNr. They receive many thousands of excitatory glutamatergic inputs from cortex, thalamus and limbic areas. Secondly, in their default state, medium spiny neurons are largely silent when not subject to input. Afferents must display substantially coordinated activity before a medium spiny striatal neuron produces significant output.
This dichotomous behaviour is captured by using the terms `down-state' and `up-state' to describe these two modes of operation. Thirdly, the ease with which medium spiny neurons are elevated into their up-state is modifiable by the presence of dopamine released at synaptic junctions associated with input from SNc. In particular, cells may be divided into two classes depending on the kind of dopamine receptors they use. Those with D1-type receptors have their transition into the up-state facilitated by dopamine and innervate GPi/SNr; cells with D2-type receptors find it harder to make the transition into the up-state under dopaminergic influence and innervate GPe. Finally, striatum is associated with areas of local inhibition, although it is not clear

whether these are mediated by direct lateral inter-spiny connections or via a separate class of interneurons.

1.2 Basal ganglia: the conventional model

One popular interpretation of the functional architecture of the basal ganglia (Albin et al., 1989) is shown in figure 2. [Figure 2: Conventional basal ganglia architecture; direct and indirect pathways.] The paths which are believed to play a principal role are shown as solid lines, whereas dotted lines indicate paths which are supposedly of a subsidiary nature. The main mechanism for selection in this scheme is a series of local lateral-inhibitory nets within striatum. This results in a group of striatal outputs becoming more active and, since these are inhibitory, they reduce activity in their GPi/SNr targets, thereby releasing inhibition on the selected outputs. The so-called indirect pathway (through GPe and STN) has an overall excitatory effect on GPi/SNr and is usually reported as performing some kind of modulatory function which remains ill-specified. While the direct/indirect pathway model has served a useful role as a working hypothesis to help direct research, it has several shortcomings. First, the roles of the more subsidiary pathways are not adequately specified. Second, it remains to be developed as a full computational model and claims for the functionality of the direct and indirect pathways must, therefore, remain unsubstantiated. Thirdly, it has recently been noted that the model is unable to accommodate some of the clinical and animal functional data (Chesselet and Delfs, 1996; Parent and Cicchetti, 1998). Finally, it has recently been critically assessed by its own authors (Albin et al., 1995), who look forward to `the destruction of the model and its resurrection in a more realistic form'. Given this state of affairs, we believe that the functional connectivity of the basal ganglia is ready for review.
The next section will develop, in outline, the key ideas of our model, which are then detailed quantitatively in section 3.

2 From anatomy to functional architecture

2.1 Basal ganglia operation for action selection

Within the general framework of the action selection hypothesis it is necessary to prescribe some broad principles that allow selection to take place. These have been described in full elsewhere (Redgrave et al., 1997; Redgrave et al., 1998) but we summarise them here for convenience and extend them where necessary. Candidate actions are represented in networks distributed widely throughout the central nervous system. The overall activity level of the neural representation of a given action determines its salience, or propensity to be selected for execution at the current time. More precisely, the mean activity level of the striatal population associated with each candidate action is supposed to define the salience for that action. Actions are then viewed as competing with one another in such a way that those with high salience values get selected. As well as a population in striatum, there will be co-active populations for each action in all basal ganglia nuclei. We will refer collectively to the ensemble of cells in all such populations for a given action as the channel for that action. In this scheme, therefore, one purpose of the medium spiny cells is to extract the saliences of any channels with which they are associated. This may take place within the extensive dendritic contacts of these cells, which have been implicated in processing patterns of widely-distributed input activity. This is evident, for example, in the convergent pattern of corticostriatal connections involved in motor processing (Flaherty and Graybiel, 1991; Graybiel et al., 1994). Salience, then, is supposed to be the `common currency' with which action selection is mediated within the basal ganglia.
Following Chevalier and Deniau (Chevalier et al., 1985; Deniau and Chevalier, 1985; Chevalier and Deniau, 1990) we propose that salience signals are used selectively to gate the access of the original action representation to its associated movement generators. In its default state, the output nuclei (the GPi and SNr) supply widespread tonic inhibitory control over access to shared motor resources. Thus, the role of the basal ganglia as a whole is to remove selectively the inhibition on certain outputs, thereby enabling the structures receiving these efferent signals to become active and initiate their corresponding behavioural actions. The outputs which get disinhibited are, of course, those with the largest salience; the remaining actions are associated with increased inhibition on their GPi/SNr outputs (Anderson and Horak, 1985; Mink and Thach, 1991b). It is important to notice that, although the basal ganglia may be concerned with action selection, it does not `know' about behavioural acts but simply extracts salience signals and uses these to drive a subset of its output signals to levels suitable for positive gating. Thus, in isolation, the basal ganglia may be viewed as a `signal selector' and it is our aim to show how this process takes place. It therefore does not concern us here what is meant behaviourally by the term `action'. For our purposes an action is defined as the behavioural expression determined by a specific population of cells within the basal ganglia (a channel) and may be associated with anything from an elemental motor act through to an extensive behavioural strategy; it remains a subject of further work to determine the repertoire of actions the basal ganglia mediates. Nevertheless, having redefined the problem as one of signal selection, it is important that we define exactly what this means.

2.2 Signal selection

One definition of `signal selection' would be to say that it consists of a dichotomisation of a set of inputs $X$ into sets $S$, $\bar{S}$ of selected and non-selected signals respectively. The criterion for membership of $S$ might be that the transformed signal be greater than some threshold, with all transformed signals less than this threshold belonging to $\bar{S}$. In this case, it is possible that signals of each type have values which are arbitrarily close to the threshold. However, effective signal processing usually requires that the two signal sets be separated by some interval, otherwise noise is likely to interfere with their interpretation. This is certainly the case in digital electronic circuits, where voltages close to 0 and the supply voltage are guaranteed to define the logic values 0 and 1 respectively, but intermediate voltages are subject to either interpretation (Horowitz and Hill, 1989). Thus we need to refine our definition and partition the transformed signal interval into three sub-intervals, determined by two selection thresholds $\theta_1, \theta_2$, allowing transformed signals to be in $S$, $\bar{S}$, or an indeterminate set $Y_0$. In addition we need to allow for the possibility that selected signals are deemed to be those in either the upper or lower interval, so that we refer to a large-signal encoding if $S$ is contained in the upper interval and a small-signal encoding if $S$ is contained in the lower interval. We will now specialise to the case in which the input interval and the output (transformed) interval are the same and denote this by $I$. An example of large-signal encoding is shown in figure 3, in which the ideal situation has occurred where $Y_0$ is empty. [Figure 3: Selection function.] In general, however, imperfect mechanisms will not produce this state of affairs for all $X$. As well as performing the signal set transformations, we also wish to preserve the identity of the individual signals.
Thus, if $x_i \in X$ then $x_i$ is transformed into $y_i$, where this transformation depends, of course, on the other signals in $X$. To accommodate this in a functional setting, label the members of $X$ such that $i < j \Rightarrow x_i \leq x_j$. The resulting vector $\mathbf{x}$ is a member of the space $I^{|X|}$ and selection is implemented by functions $G : I^{|X|} \to I^{|X|}$ where $G(\mathbf{x}) = (y_1, y_2, \ldots, y_n)$. Further, selection usually implies that ordering relations are preserved so that, for large-signal selection, $x_i < x_j \Rightarrow y_i < y_j$, and for small-signal selection, $x_i < x_j \Rightarrow y_i > y_j$. Thus selection functions are those that preserve or reverse the ordering of the input elements. The combination of a selection function $G$ and two thresholds $\theta_1, \theta_2$ will be said to define a selection mechanism $M_G(\theta_1, \theta_2)$.

To determine how well a mechanism performs we need to define some metrics. First we define the decisiveness $D(\mathbf{x})$ of the mechanism applied to $X$ by

$$D(\mathbf{x}) = 1 - \frac{|Y_0|}{|X|} \qquad (1)$$

where $0 \leq D(\mathbf{x}) \leq 1$. $D(\mathbf{x}) = 1$ signifies an extremely decisive or `clean' selection with respect to $X$, while $D(\mathbf{x}) = 0$ means that no selection has taken place and all signals are in the indeterminate set $Y_0$. Next we specify what fraction $\phi(\mathbf{x}) = |S|/|X|$ of the signals are selected and refer to this as the selection promiscuity. Alternatively we may talk of the selectivity $1 - \phi(\mathbf{x})$. Having defined metrics we can now compare them with some requirements. Clearly we normally ask that $D(\mathbf{x})$ be as close to unity as possible for all $\mathbf{x}$. Next we wish to specify the required promiscuity $\pi_s(\mathbf{x})$ for, if we select too many or too few signals then, even though $Y_0$ may be empty, the mechanism may not be functioning as desired. Informally, of course, we wish $\phi$ to be close to $\pi_s$ for all $X$; formalisation of this is not subsequently required but is dealt with in appendix A. Our model will treat the basal ganglia as a selection mechanism whose inputs are the set of saliences and whose outputs are those of the GPi/SNr. Thus, while saliences are supposed to be the outcome of internal processing within medium spiny cells, we assume this to have been done and supply the values parametrically. A realistic description of salience extraction would require models of the afferent input systems to the basal ganglia together with a detailed description of medium spiny dendrites. It is not our intention to do this here since we believe much can be learnt by focussing on processing intrinsic to the basal ganglia in toto.

2.3 Reinterpreting the functional anatomy

Our methodology is based on the observation that, since the basal ganglia are composed of neural circuits, the switching or selection mechanisms they utilise must belong to the repertoire available to such systems.
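As a concrete illustration, both metrics can be computed directly from a vector of transformed signals and the two thresholds. The following sketch assumes a small-signal encoding by default (selected signals lie below $\theta_1$); the function name and toy values are our own, not part of the model:

```python
def selection_metrics(y, theta1, theta2, encoding="small"):
    """Partition transformed signals into selected (S), non-selected (S-bar)
    and indeterminate (Y0) sets, then compute the metrics of section 2.2."""
    assert theta1 < theta2
    lower = [v for v in y if v < theta1]
    upper = [v for v in y if v > theta2]
    y0 = [v for v in y if theta1 <= v <= theta2]
    selected = lower if encoding == "small" else upper
    decisiveness = 1.0 - len(y0) / len(y)   # eq. (1)
    promiscuity = len(selected) / len(y)    # phi(x) = |S|/|X|
    return decisiveness, promiscuity

# a toy output vector from some selection function
y = [0.05, 0.9, 0.85, 0.5]
D, phi = selection_metrics(y, theta1=0.2, theta2=0.7, encoding="small")
```

For this example one signal (0.5) is indeterminate and one (0.05) is selected, giving $D = 0.75$ and $\phi = 0.25$; note that $\phi = 1/|X|$ here, matching the requirement for a winner-take-all style selection discussed later.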
Conversely, an examination of this set of mechanisms will help direct an interpretation of the possible functional configurations of the basal ganglia anatomy. In current models of the basal ganglia (see section 1.2), some of the pathways are either ignored or considered to be of minor importance. For example, the direct cortical/thalamic input to STN is often overlooked. However, many areas of the cortex and thalamus which project to the striatum also project to the STN. In our model the role of the STN is elevated from a relatively minor component in the indirect pathway into a major basal ganglia input nucleus. We will also emphasise the diffuse nature of STN projections to GPi/SNr and GPe, as opposed to the more focussed projections from striatum, and show how these may work together to provide a selective role within the basal ganglia. These features of STN connectivity have been highlighted previously by Mink and Thach (1993). However, we will reframe these ideas in the language of neural networks and signal selection, and explore their consequences quantitatively for the construction of a selection function in the sense given in section 2.2. Thus, we propose that the STN plays the role of the excitatory input to an off-centre, on-surround feedforward network whose output `layer' is provided by the GPi/SNr

and whose inhibitory (off-centre) connections are supplied by direct striatal projections to GPi/SNr. We now proceed to examine the function of such networks.

Feedforward networks for selection

Figure 4(a) shows the architecture of an off-centre on-surround feedforward network; inhibitory and excitatory connections are shown in grey and black respectively. [Figure 4: Off-centre on-surround feedforward net; (a) architecture, (b) simulation result (see text for details).] Off-centre, on-surround feedforward networks are less widely known than their on-centre off-surround counterparts, but their operation is similar and the on-centre variant has been described by Grossberg (1976). The network in figure 4 forces weak inputs to drive their corresponding (off-centre) activations high, while strong inputs send their activations below zero. This may be demonstrated by instantiating a simple additive, piecewise linear neural model in which the activity $a$ is the weighted sum of inputs and the output $y$ equals $a$ unless $a < 0$, in which case $y = 0$. Setting all positive (excitatory) weights to 2.0 and the inhibitory weight to $-0.5$ gives the result shown in figure 4(b) for the set of inputs shown in the top panel.

Small-signal encoding and inhibitory control. In terms of the discussion in section 2.2, the network implements a small-signal encoding mechanism. In the context of the neural circuits to which this net is afferent, a small-signal encoding implies selection by release of inhibition.

The need for tonic output. As it stands, the network described above suffers from a major drawback. Input vectors with equal-valued components get sent to the zero output vector; this includes the quiescent (no input) state of the net. Thus, no matter what our choice of

selection criterion (any setting of $\theta_1 > 0$), the quiescent state enforces all signals to be selected. This may be overcome by mapping the quiescent state into some non-zero or tonic value $t$, which may be implemented via an intrinsic membrane property or by tonically exciting all neurons to the same degree. The resulting net can then implement a mechanism with $\theta_1 < t$ and $\theta_2 > t$.

Need for dual input subsystems. In a biological system cells cannot normally provide simultaneous excitation and inhibition, and so we must provide two separate input groups, one for each signal sign. In addition, it is unrealistic to suppose that the excitation be essentially diffuse but still contain a localised central lacuna in its projective field (at the `off-centre'). The resulting architecture is shown in figure 5; on the left is shown the network and on the right the system-level description. [Figure 5: Biologically plausible off-centre on-surround feedforward network: a common input drives focussed inhibitory cells and diffuse excitatory cells, which converge on an (inhibitory) output population.] To map this architecture onto the basal ganglia we identify the inhibitory and excitatory cells with striatum and STN respectively, and the output with GPi/SNr. While STN inputs may not be identical to those in striatum, we suppose that, within each channel, the levels of excitation of striatum and STN are comparable and may be approximated by a common input. Not only does the anatomy of the basal ganglia correspond to a biological implementation of the feedforward net, but the release of inhibition by GPi/SNr and their tonic output levels are accommodated quite naturally in our signal selection scheme.

Intra-striatal mechanisms

Within striatum, we assume, in line with other models of the basal ganglia (Wickens et al., 1991; Alexander and Wickens, 1993; Houk et al., 1995; Mink and Thach, 1993), that local recurrent inhibitory nets are available to perform selection between competing inputs.
However, the present model does not explicitly deal with the details of intra-striatal connections. Instead, we assume a single sub-population within each local network yields non-zero output and use a simple expression for this as the basis for striatal efferent strength; details are given in section 3.3. Finally, we explicitly model the up/down-state behaviour of medium spiny neurons and emphasise the role of this mechanism as a filter to select only those inputs large enough to force an up-state.

2.4 Results

The basis of our model architecture consists of a natural integration of the mechanisms discussed above. The result is shown in figure 6(a). [Figure 6: The selection pathway. (a) Generic architecture: cortex/thalamus input drives striatum (up-state/down-state filtering, local recurrent circuits) and STN, whose inter-nuclei connections supply focussed inhibition and diffuse excitation respectively to an output nucleus; (b) the instantiation with D1 striatum, STN and GPi/SNr supplying the BG output; (c) the instantiation with D2 striatum, STN and GPe.] There are three types of neural selection mechanism involved in direct combination: those based on thresholds (the up/down states of medium spiny neurons), recurrent inhibition (local, intra-striatal interactions), and feedforward off-centre on-surround networks (projections from STN and striatum to the output nuclei). The key observation is that there are, in fact, two distinct instantiations of this structure in the basal ganglia. One of these is shown in figure 6(b), in which the output nucleus corresponds to GPi/SNr and the striatal component consists of those cells which use D1-type dopamine receptors. Since GPi/SNr provides the basal ganglia output it is reasonable to suppose that this subsystem performs the selection mechanism proper and we therefore dub it the selection pathway. Notice that no intervening nuclei take part and the structure is consistent with what we would expect if the essential selection mechanism of the basal ganglia is to be constructed from the basic neural network selection `toolkit' outlined above.

Other possible neural mechanisms, such as on-centre off-surround feedforward nets, are not used because they would be inconsistent with small-signal encoding. The other instantiation of the architecture in figure 6(a) is shown in part (c) of the same figure. This subsystem makes use of those striatal cells which use D2-type dopamine receptors, and the output nucleus is supplied by GPe. However, it is not immediately clear in what sense the GPe is an `output' nucleus, since its efferents are confined to other basal ganglia nuclei. This problem is resolved by supposing that this subsystem forms a control pathway whose function is to modulate the properties of the main selection pathway via signals supplied by GPe. The view that GPe supplies control signals to the rest of the basal ganglia has been alluded to by Parent, who came to this conclusion on physiological grounds. Combining the two subsystems gives the functional architecture of the basal ganglia shown in figure 7. [Figure 7: New model architecture of the basal ganglia.] In our proposed scheme, the basal ganglia consist of two subsystems or `pathways'. One consists of the triad of D1-type striatal cells, STN and GPi/SNr, which together perform the selection function per se. The other consists of D2-type striatal cells, STN and GPe, and serves to modulate the function of the selection pathway. This hypothesis will be substantiated by analysis and simulation showing that the dopaminergic influence in both pathways works synergistically to control the selection process, and that the GPe $\to$ STN pathway performs an auto-scaling of STN activity which allows for effective selection irrespective of the number of basal ganglia channels.
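The core computation of the selection pathway (diffuse STN-like excitation, focused striatal-like inhibition, plus a tonic term, as in figure 5) can be sketched at equilibrium in a few lines. The weights and tonic value below are toy choices of ours, made only to make the small-signal encoding visible; they are not the parameters derived for the quantitative model in section 3:

```python
import numpy as np

def selection_pathway(x, w_exc=0.3, w_inh=1.0, tonic=0.2):
    """Equilibrium output of a figure-5-style network (illustrative sketch):
    each output channel receives diffuse excitation from every input channel,
    focused inhibition from its own channel, and a tonic drive."""
    x = np.asarray(x, dtype=float)
    a = tonic + w_exc * x.sum() - w_inh * x   # per-channel activation
    return np.maximum(a, 0.0)                 # piecewise-linear output

y = selection_pathway([0.1, 0.4, 0.7])
# small-signal encoding: the most salient channel is driven lowest
# (here to zero output), i.e. selected by release of inhibition
```

Note that the quiescent state maps to the tonic value rather than zero, so thresholds $\theta_1 < t$ and $\theta_2 > t$ remain usable, as required in section 2.3.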

3 Quantitative Model: description

3.1 Artificial neuron model

All the models described here use artificial neurons of the leaky-integrator type (Arbib, 1995). This is the simplest neural model which incorporates the idea of a dynamic membrane potential while obviating the need to model an abundance of ionic channels (Yamada et al., 1989). As such, it is a common choice for biologically plausible models of neural circuits. The leaky integrator is defined by the rate of change of its activation $a$, which might be interpreted as the membrane potential near the axon hillock. Let $u$ be the total post-synaptic potential generated by the afferent input, $k$ a rate constant which depends on the cell membrane capacitance and resistance, and $\tilde{a}$ the equilibrium activation. Then

$$\dot{a} = -k(a - u), \qquad \tilde{a} = u \qquad (2)$$

where $\dot{a} \equiv da/dt$. The output $y$ of the neuron is supposed to correspond to a mean firing rate and is a monotonic increasing function of $a$. It is bounded below by 0 and above by some maximum value $y_{\max}$ which may be normalised to 1. This means that, in terms of the selection function framework developed above, $I = [0, 1]$. We adopt a piecewise linear output function given by

$$y = \begin{cases} 0 & : a < \epsilon \\ m(a - \epsilon) & : \epsilon \leq a \leq 1/m + \epsilon \\ 1 & : a > 1/m + \epsilon \end{cases} \qquad (3)$$

where the slope of the strictly increasing portion is $m$ and $\epsilon$ is the threshold below which the output is zero. The function allows for non-zero tonic firing rates by setting $\epsilon < 0$. Under certain circumstances, it may be possible to ensure that $y$ never reaches 1. It is then possible to write the output relation as

$$y = m(a - \epsilon)H(a - \epsilon) \qquad (4)$$

where $H(\cdot)$ is the Heaviside step function. The assumption $y \neq 1$ is made for the purposes of the analytic solution of the model and is borne out in simulation.

3.2 Channels

Recall that a channel for an arbitrary action $b_i$ is defined by the subpopulation $p_i$ of neurons within the basal ganglia which are active in its execution. Within each nucleus $N$, the relevant component of $p_i$ will be denoted by $p^N_i$.
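Equations (2)-(4) translate directly into a forward-Euler update plus a clipped linear output. A minimal sketch follows; the rate constant, step size and threshold are illustrative values of ours, not the model's fitted parameters:

```python
def step(a, u, k=25.0, dt=0.001):
    """One forward-Euler step of eq. (2): da/dt = -k(a - u)."""
    return a + dt * (-k * (a - u))

def output(a, m=1.0, eps=-0.1):
    """Piecewise-linear output, eqs. (3)/(4); eps < 0 yields a tonic rate."""
    return min(1.0, max(0.0, m * (a - eps)))

# relax towards a constant input u; a(t) approaches u exponentially
a, u = 0.0, 0.6
for _ in range(1000):      # 1 s of simulated time at dt = 1 ms
    a = step(a, u)
# with eps = -0.1, a quiescent cell (a = 0) fires at the tonic rate 0.1
```

After one simulated second the activation has converged to $\tilde{a} = u$ to well within numerical tolerance, and the output is $m(u - \epsilon)$.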
Notice that two actions, $b_1, b_2$, may use substantially the same populations within a given nucleus $N$, so that $p^N_1$ overlaps significantly with $p^N_2$. However, given the possible division of the basal ganglia into disjoint, parallel processing streams (Alexander et al., 1986; Alexander and Crutcher, 1990; Hoover and Strick, 1993) and, where appropriate, a somatotopic organisation (Flaherty and Graybiel, 1991; Flaherty and Graybiel, 1993; DeLong et al., 1983), we work under the assumption that all channels are physically distinct. This means that the activities of the $p^N_i$ within each nucleus are not directly dependent on each other (although processing across channels is, of course, possible via inter-nucleus pathways). Clearly this may break down under certain circumstances but

we contend that it is a good approximation for many choices of channel set. One possible exception to this concerns the STN. While this displays somatotopy (DeLong et al., 1985), it is the smallest basal ganglia nucleus, is therefore the most likely to suffer from significant channel overlap, and may require a different modelling approach. This has been addressed by developing two versions of the model: one with distinct channels and one with a uniform STN cell population. Within the model, the interpretation of a channel may be dynamic; thus, if two actions $b_1, b_2$ make sequential requests for execution which are not temporally overlapping, it is not necessary to invoke two model channels to deal with them; we can use the same model `hardware' but simply assign it notionally to $b_1$ when this is active and then to $b_2$. Further, all neurons within each population $p^N_i$ are assumed to be identical, so that we work with a single representative in each case. This approach is a common one and has been adopted, for example, in modelling cortex (Douglas et al., 1989). Each $p^N_i$ is therefore modelled by a leaky integrator of the type described in section 3.1. Finally, we assume that (4) holds; that is, the activation does not become so large as to drive the output into the saturation region where $y = 1$. This is accomplished for the input structures (striatum and STN), which receive excitatory input, by limiting the action salience. For the output structures (segments of GP) it is of less concern since these will have their inputs decreased by highly salient signals.

3.3 Striatum

There are two striatal sub-systems, projecting separately to GPe and GPi/SNr, distinguished by their neurochemistry and their response to dopamine (see section 1.1). Across the entire striatum, we model the medium spiny cells' up/down-state behaviour by using a positive threshold in the neuron's piecewise linear output law (4). Thus, no output occurs unless the activation exceeds $\epsilon$.
Each striatal sub-system is supposed to consist of local, possibly overlapping recurrent nets and, assuming a somatotopic organisation for striatum, each such network therefore selects between action requests associated with the same somatic area. In this way, striatum can resolve resource conflicts so that only one motor programme gains control of a motor resource at any time. Now, while there is evidence for local lateral connections between spiny neurons (Wilson and Groves, 1980), it remains to be established that the inhibition is mediated by these connections (Wickens, 1997). In spite of this, we assume these local networks behave in a similar way to local recurrent, inhibitory nets with direct lateral connectivity, as shown at the top of figure 8. The result of a simulation using leaky integrators of the type described in section 3.1 is shown in the lower part of the figure; this result is always obtained with uniform weights of sufficiently large magnitude. Each local recurrent network implements a large-signal-encoding selection mechanism with θ_1 = 0. Further, there is a perfect match with the requirement that π_s(x) = 1/|X| for sets which contain at least one x_i greater than the output threshold ε, and π_s(x) = 0 otherwise. The decisiveness of the mechanism depends on the setting of θ_2 but can never be 1 for all input vectors. In contrast, a `winner-take-all' (WTA) net which makes use of central excitation may be assigned D(x) = 1 ∀x. Although this may appear more desirable, the preservation of a graded response in striatum is crucial to its effective operation

Figure 8: Recurrent networks in striatum (top: efferents from a single node in a local recurrent network; bottom: simulated selection by leaky integrators).

in selecting actions with stronger salience. We now proceed to formalise the description of this network and show how its behaviour may be analytically described, thus obviating the need to simulate it explicitly.

Consider the ith neural population in the mth local recurrent network in one of the striatal sub-systems. This population codes for some action b_{j(m,i)}. Let the equilibrium activation be ã_{mi}, the net post-synaptic potential resulting from the (excitatory) input be J_{mi}, and the output be x_{mi}. Let the absolute value of the lateral connection weights be w, the slope of the strictly increasing portion of the neural output relation m, and the output threshold ε. Then, assuming that all x_{mi} < 1,

ã_{mi} = J_{mi} − w Σ_{j≠i} x_{mj},  x_{mi} = m(ã_{mi} − ε) H(ã_{mi} − ε)   (5)

Let J_{m,k(m)} = max_i(J_{mi}). If wm ≥ 1, then one solution to the set of coupled equations in (5) is

x_{mi} = m(J_{mi} − ε) H(J_{mi} − ε) δ_{i,k(m)}   (6)

where δ_{ij} is the Kronecker δ. The solution in (6) may be easily verified by direct substitution (see appendix B) and we assume that all local recurrent nets obtain a solution of this form. Note that two overlapping networks m, m′ may have a single selected signal x_{m,k(m)} in common, if that signal is derived from neurons common to both. While the microstructure of individual nets in the two striatal sub-systems may vary, we now assume that the set of actions for which δ_{i,k(m)} = 1 is the same for both. That is, the

set of actions b_{j(m,k)} which are associated with potentially active channels is the same in each sub-system. This is the set of channels of interest and, using a single index into this set, the output x_i of the ith such channel is just

x_i = m(J_i − ε) H(J_i − ε)   (7)

A channel will become active in its sub-system if, in addition, H(J_i − ε) = 1. Notice that we now have an analytic form for the set of non-zero signals emerging from striatum which makes no explicit reference to the underlying recurrent networks which gave rise to them.

It remains to detail the nature of the net input J_i to each cell population. To do this, suppose the action on the ith potentially active channel has salience c_i; then put J_i = w_s c_i, where w_s is a measure of the overall synaptic efficiency of the medium spiny neurons in integrating inputs associated with b_i. Then

x_i = m(w_s c_i − ε) H(c_i − ε/w_s)   (8)

where the step function now depends on c_i and indicates that the threshold on c_i for forcing an up-state is ε/w_s.

We now introduce the dopamine modulation of the efficacy of medium spiny excitation. Irrespective of the details of this mechanism, it is possible to model it phenomenologically by introducing a multiplicative factor in the synaptic strength w_s. Thus, for the GPe sub-system, which makes use of D2 receptors, the effect of dopamine is inhibitory and the synaptic strength may be written w_s(1 − λ_{ei}), where λ_{ei} parametrises the dopamine modulation and 0 ≤ λ_{ei} < 1. In principle, it would be possible to allow dopamine levels to vary dynamically and to consider the short-term dopamine signals associated with movement-inducing stimuli (Schultz and Romo, 1990). However, the model analysis deals with equilibrium conditions and the dopamine parameters were constant during all simulations.
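A minimal numerical sketch of the D2-modulated channel output of equation (8), with the dopamine factor folded into the synaptic strength and the gradient m set to one. All parameter values here are illustrative, not taken from the simulations:

```python
def striatal_output(c, lam_e, w_s=1.0, eps=0.2, m=1.0):
    """Equilibrium output of a GPe-projecting (D2) striatal channel:
    x = m[w_s(1 - lam_e)c - eps] above the up-state threshold, else 0.
    Dopamine (lam_e) scales down the effective synaptic strength."""
    a = w_s * (1.0 - lam_e) * c
    return m * (a - eps) if a > eps else 0.0

# The up-state threshold on salience is eps / (w_s(1 - lam_e)): raising
# tonic dopamine raises the salience needed to drive this channel up.
lo_da = striatal_output(0.3, lam_e=0.0)   # above threshold, graded output
hi_da = striatal_output(0.3, lam_e=0.5)   # pushed below threshold: zero
```

The same salience that activates the channel at low dopamine fails to do so at high dopamine, which is the D2 effect the text describes.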
Therefore, we are concerned here with tonic or `background' levels of dopamine that might alter as a result of pathologies of SNc, laboratory manipulations in animals or intoxicated states in humans (this is described further in the Discussion). Further, we assume that such tonic dopaminergic input to striatum is diffuse enough that it is essentially constant over all channels. Thus we write λ_{ei} = λ_e ∀i. The step function becomes H[c_i − ε/(w_s(1 − λ_e))], which reflects the fact that GPe efferents will transfer to the up-state less readily under increased dopamine. To emphasise this, and to ease the notation, we write

H[c_i − ε/(w_s(1 − λ_e))] = H^↑_i(λ_e)   (9)

where the ↑ suggests the idea of the up-state criterion. The equilibrium relations may now be written

ã^e_i = w_s(1 − λ_e)c_i,  x^e_i = m[w_s(1 − λ_e)c_i − ε] H^↑_i(λ_e)   (10)

GPi/SNr efferents may be modelled in a similar way to that for GPe, except that the dopamine modulation is excitatory. Thus, the unmodulated synaptic strength w_s (assumed to be the same as it is for efferents to GPe) is modified to w_s(1 + λ_g), 0 < λ_g < 1. Then if

x^g_i is the equilibrium output on the ith potentially active channel,

ã^g_i = w_s(1 + λ_g)c_i,  x^g_i = m[w_s(1 + λ_g)c_i − ε] H^↑_i(−λ_g)   (11)

where H^↑_i(−λ_g) = H[c_i − ε/(w_s(1 + λ_g))]. In this case the step-function threshold is reduced by dopamine to ε/(w_s(1 + λ_g)) and the up-state is reached more easily (as suggested by the negative argument in the function H^↑).

Let P*_e(t), P*_g(t) be the sets of GPe-active and GPi/SNr-active channels respectively; that is, channels whose output signals in their respective striatal sub-systems are currently non-zero. Now, H^↑_i(−λ_g) ≥ H^↑_i(λ_e) implies that P*_e ⊆ P*_g, with equality in both relations holding only if λ_e = λ_g = 0. Therefore we talk of the set of active channels P* = P*_g, where P* is the set of channels associated with striatal output of any kind. Notice that, if B* is the corresponding set of actions and B the set of actions actually gated, then B ⊆ B*, since being active is a necessary but not sufficient condition for a channel's GPi/SNr inhibition to be released to the point which allows gating of the action.

3.4 STN

As noted in section 3.2, there are grounds for supposing that the discrete channel assumption may first break down in the STN. Given this possibility we model the two extreme cases: p^STN_i = p^STN_j ∀i, j, and p^STN_i ≠ p^STN_j ∀i ≠ j, referring to them as the homogeneous and discrete STN models respectively. In both models we assume that the STN efferents to GPe and GPi/SNr convey the same signal strength, which is consistent with the evidence for branched STN collaterals to these areas (Deniau et al., 1978; Kita et al., 1983).

3.4.1 The discrete model

Unlike the striatum, STN activity will also include contributions from non-active channels. Thus, all channels which fail to get selected by local recurrent nets in any part of striatum are supposed to contribute a mean excitation of ε_0 to the STN population.
For an active channel, however, this is augmented by a contribution w_t c_i, where w_t is a measure of the synaptic efficiency at the input to STN. Thus, the contribution to STN activation at equilibrium due to inputs extrinsic to the basal ganglia may be written w_t c_i + ε_0, as long as we ensure that, for non-active channels, we assign c_i = 0. Thus, by design,

c_i = 0 for i ∉ P*,  c_i ≥ ε/(w_s(1 + λ_g)) for i ∈ P*   (12)

Notice that, even though there is a specific STN cell population designated to process the ith channel in P*, this population cannot differentiate i from the action requests that competed with it in striatum; these are subsumed in the constant ε_0. Thus, while assuming that the somatotopy of STN is sufficiently well developed to differentiate members of P*(t) at any particular time, we allow for the possibility that, at different times, the same population may be responsible for processing a different action b_j. Finally, the ith STN channel also receives input from GPe so that, if w_g is the synaptic efficiency of GPe with STN and y^e_i is the GPe output at equilibrium,

ã^+_i = w_t c_i + ε_0 − w_g y^e_i   (13)

If the output of the ith channel in STN is x^+_i and m^+ the gradient on the strictly increasing part of the output function, then

x^+_i = m^+ ã^+_i H^{+↑}_i   (14)

where H^{+↑}_i = H(w_t c_i + ε_0 − w_g y^e_i) and indicates activity in STN for channel i. Notice that this is equivalent in form to the case where non-active channel requests contribute nothing to STN activation but the membrane properties of STN yield a non-zero output ε_0. This confounding of the two possibilities is of no concern here since the essential point is that we include a non-zero tonic level of STN output (DeLong et al., 1985; Wichmann et al., 1994).

It is useful to define the total STN output X^+_d for the discrete model, where X^+_d = Σ_{i=1}^n x^+_i and n is the total number of channels in the model. n must certainly be large enough to include all active channels and should, in principle, be large enough to reflect the activity of STN due to non-active input. However, it cannot be made indefinitely large since we are assuming channel independence. Thus, if S is the total number of neurons in STN and ⟨S_i⟩ the mean size of the populations p^STN_i, then a sensible upper bound on n is S/⟨S_i⟩. This may seem to be a problem for the model since we do not know either S or S/⟨S_i⟩. More fundamentally, it might be a problem for the basal ganglia, for X^+_d grows with n and, if the synaptic efficiencies of STN with GPe and GPi/SNr are not correctly limited, then excitation from STN may dominate. In fact we will show that GPe inhibition is able to auto-scale X^+_d with n so that its value is not critical.

3.4.2 The homogeneous model

Here there is a single population with output X^+_h, where

X^+_h = m^+ Σ_{i=1}^n (w_t c_i + ε_0 − w_g y^e_i)   (15)

There is no need to include a step function because X^+_h > 0.
This follows by reductio ad absurdum: suppose X^+_h = 0; then there is no excitation to GPe and, given that GPe tonic rates are maintained by STN (see below), there can be no inhibition of STN; but this implies X^+_h > 0 since ε_0 > 0. In the homogeneous model, the number of discrete channels will be limited by the dendritic resources allocated to each one.

3.5 GPe

We assume a discrete, channel-based model, and the activation of the ith GPe channel will be denoted by a^e_i. As with STN, we assume that striatal afferents all synapse with the same strength, this time denoted by w^−. To enable comparison of the relative synaptic strengths w^−, w^+, put w^+ = δw^−, so that, at equilibrium,

ã^e_i = w^−(δX^+ − x^e_i) + ε_e   (16)

where we have used the symbol X^+ to denote model-non-specific STN input and ε_e is the quiescent GPe activation due to its intrinsic membrane properties. Using the expression for

x^e_i given in (10),

ã^e_i = w^−{δX^+ − m[(1 − λ_e)w_s c_i − ε] H^↑_i(λ_e)} + ε_e   (17)

The output relation for GPe is just

y^e_i = m_e ã^e_i H(ã^e_i)   (18)

3.6 GPi/SNr

We assume that the afferent strengths of STN and striatum to GPi/SNr are similar to their respective strengths to GPe, so that they may also be represented by w^+ and w^−. The equation for activation is similar to that for GPe apart from the inhibition from that structure. Thus, the equilibrium activation ã^g_i is given by

ã^g_i = w^−(δX^+ − x^g_i) − w_e y^e_i + ε_g   (19)

where w_e represents the synaptic strength of the GPe → GPi/SNr pathway and ε_g is the quiescent GPi/SNr activation due to intrinsic membrane properties. Substituting for x^g_i from (11),

ã^g_i = w^−(δX^+ − m[w_s(1 + λ_g)c_i − ε] H^↑_i(−λ_g)) − w_e y^e_i + ε_g   (20)

If y^g_i is the GPi/SNr output at equilibrium, then

y^g_i = m_g ã^g_i H(ã^g_i)   (21)

4 Quantitative model: results

There are several ways to determine a model's behaviour. In this paper we obtain some general properties by an analysis of the equilibrium states defined above, together with more detailed results obtained in simulation. In what follows, it is convenient to normalise all output function gradients m to one; this simply implies a possible rescaling of the weights and threshold offsets.

4.1 Activity of sub-nuclei

We solve for each of GPe, STN and GPi/SNr in turn.

4.1.1 GPe

Let the set of non-GPe-active channels be Q_0 and define the subsets Q = {i ∈ P*_e : y^e_i > 0}, Q* = {i ∈ P*_e : y^e_i = 0}, and their cardinalities Q_0 = |Q_0|, Q = |Q|, Q* = |Q*|. Then

i ∈ Q_0 : y^e_i = w^−δX^+ + ε_e
i ∈ Q : y^e_i = w^−{δX^+ − [(1 − λ_e)w_s c_i − ε]} + ε_e
i ∈ Q* : y^e_i = 0   (22)
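The classification into Q_0, Q and Q* can be illustrated numerically. With gradients normalised to one, a fixed STN level X^+, and illustrative parameter values (none taken from the paper's simulations), three channels of increasing salience land in the three classes:

```python
def gpe_output(c, X_plus, lam_e=0.2, w_minus=1.0, delta=0.8,
               w_s=1.0, eps=0.2, eps_e=0.05):
    """GPe equilibrium output for one channel, following eqs. (17)-(18)
    with all gradients normalised to one. Striatal inhibition x_e is
    focused (per channel); STN excitation is the diffuse term delta*X_plus."""
    x_e = max(w_s * (1.0 - lam_e) * c - eps, 0.0)   # striatal D2 output, eq. (10)
    a_e = w_minus * (delta * X_plus - x_e) + eps_e  # eq. (17)
    return max(a_e, 0.0)                            # eq. (18), slope m_e = 1

# Three channels: quiet, weakly salient, strongly salient. Rising
# salience means rising focused inhibition of GPe, so the strongly
# salient channel is silenced entirely (it lands in Q*).
y_e = [gpe_output(c, X_plus=0.5) for c in (0.0, 0.4, 0.9)]
```

With these numbers the quiet channel keeps the full tonic-plus-STN level, the weakly salient channel is partially suppressed, and the strongly salient one is driven to zero, matching the three cases of (22).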

4.1.2 STN

Homogeneous model. Let ⟨c⟩* be the mean salience of active channels and δ* = |P*|/n. Then

Σ_i c_i = Σ_{i∈P*} c_i = nδ*⟨c⟩*   (23)

Using this, the gradient normalisation and the definitions in (15),

X^+_h = n(w_t δ*⟨c⟩* + ε_0) − w_g [Σ_{i∈Q*} y^e_i + Σ_{i∈Q_0} y^e_i + Σ_{i∈Q} y^e_i]   (24)

The first sum in the brackets is zero and, defining ⟨c⟩_q = (1/Q)Σ_{i∈Q} c_i,

X^+_h = n(w_t δ*⟨c⟩* + ε_0) − w_g w^−{δX^+_h(Q_0 + Q) − Q[(1 − λ_e)w_s⟨c⟩_q − ε]} − w_g(Q_0 + Q)ε_e   (25)

Now put M = Q_0 + Q, M = nδ_m and Q = nδ_q; then

X^+_h = n{w_t δ*⟨c⟩* + ε_0 + δ_q w_g w^−[(1 − λ_e)w_s⟨c⟩_q − ε] − w_g δ_m ε_e} / (1 + δ w_g w^− nδ_m)   (26)

While X^+_h depends on n, it does not grow indefinitely and has a limiting value. Thus, assuming δ w_g w^− nδ_m ≫ 1, then

X^+_h ≈ {w_t δ*⟨c⟩* + ε_0 + δ_q w_g w^−[(1 − λ_e)w_s⟨c⟩_q − ε] − w_g δ_m ε_e} / (δ w_g w^− δ_m)   (27)

To see the conditions under which δ w_g w^− nδ_m ≫ 1, notice that, under gradient normalisation, the activations must be constrained to be O(1) (of the same order of magnitude as 1). This means that the excitatory and inhibitory contributions to each node in the model are also O(1). Since there are only single inhibitory inputs to each node, this implies w^−, w_g = O(1). Weights which obey this criterion will be said to be unit-scaled. Under these conditions the requirement becomes δnδ_m ≫ 1 and, if δ = O(1), then this in turn necessitates nδ_m ≫ 1. Now, in section 4.2.2 we show that the number of channels in Q* is limited (under the phenomenon of selection limiting), so that δ_m = O(1). Thus, (27) is true for large n and the right-hand side is bounded above.

Discrete model. If S* is the set of channels for which H^{+↑}_i = 1 then, under gradient normalisation,

X^+_d = Σ_{i∈S*} (w_t c_i + ε_0 − w_g y^e_i)   (28)

Notice that S* can never be empty for, if it were, then, since there is no excitation to GPe, all the y^e_i would be zero and so H^{+↑}_i = 1 for all i.
Thus, by reductio ad absurdum, we have S* ≠ ∅. To proceed further, define δ_s = |S*|/n, δ_{*,s} = |S* ∩ P*|/n, δ_{q,s} = |S* ∩ Q|/n, δ_{m,s} = |S* ∩ (Q ∪ Q_0)|/n, together with the associated mean saliences

⟨c⟩^s_* = (1/(nδ_{*,s})) Σ_{i∈S*∩P*} c_i,  ⟨c⟩^s_q = (1/(nδ_{q,s})) Σ_{i∈S*∩Q} c_i   (29)

Then

X^+_d = n{w_t δ_{*,s}⟨c⟩^s_* + δ_s ε_0 + δ_{q,s} w_g w^−[(1 − λ_e)w_s⟨c⟩^s_q − ε] − w_g δ_s ε_e} / (1 + δ w_g w^− nδ_{m,s})   (30)

In order to show that this is bounded above, we might proceed as we did for the homogeneous model. However, we cannot rely on the fraction δ_{m,s} being O(1), for there may be only a single member of S*, so that δ_{m,s} = 1/n. That the expression is bounded above is due to the corresponding decline in the fractions δ_{*,s}, δ_{q,s}. To formalise this, put δ_{x,s} = δ_s δ^s_x for each of x = m, *, q, so that each δ^s_x represents a fraction with respect to |S*|; for example, δ^s_* = |S* ∩ P*|/|S*|. Then, putting n* = |S*|,

X^+_d = n*{w_t δ^s_*⟨c⟩^s_* + ε_0 + δ^s_q w_g w^−[(1 − λ_e)w_s⟨c⟩^s_q − ε] − w_g ε_e} / (1 + δ w_g w^− n*δ^s_m)   (31)

If n* is small then X^+_d is clearly bounded above. If, however, n* is large then, by selection limiting (section 4.2.2), |Q*| ≪ n* and so δ^s_m = O(1), and we have a similar situation to the homogeneous case. Thus, for large n*,

X^+_d ≈ {w_t δ^s_*⟨c⟩^s_* + ε_0 + δ^s_q w_g w^−[(1 − λ_e)w_s⟨c⟩^s_q − ε] − w_g ε_e} / (δ w_g w^− δ^s_m)   (32)

For the model used in the simulation, large n* occurs with a large number of active channels and with no active channels. A single active channel i is sufficient to drive GPe inhibition high enough to force the STN output on all but channel i to zero.

Tonic values. These can be evaluated by putting c_i = 0 ∀i. Denoting the tonic level of a quantity v by v̂,

X̂^+_h = n(ε_0 − w_g ε_e) / (1 + δ w_g w^− n)   (33)

For the discrete model all channel levels are the same so, putting x̂^+_k = x̂^+, ŷ^e_k = ŷ^e ∀k,

x̂^+ = ε_0 − w_g ŷ^e   (34)

where ŷ^e = w^−δX̂^+_d + ε_e and X̂^+_d = nx̂^+. This gives

x̂^+ = (ε_0 − w_g ε_e) / (1 + δ w_g w^− n)   (35)

and so X̂^+_d = X̂^+_h ≡ X̂^+.
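The tonic relations (33)–(35) can be checked for self-consistency in a few lines. Parameter values are illustrative, and the weights are taken as unit-scaled in the sense defined above:

```python
def tonic_levels(n, eps0=0.1, eps_e=0.05, w_g=1.0, w_minus=1.0, delta=0.8):
    """Tonic (c_i = 0) levels: x_hat follows eq. (35), and the tonic GPe
    level is y_e = w_minus*delta*X_d + eps_e, with X_d = n * x_hat."""
    x_hat = (eps0 - w_g * eps_e) / (1.0 + delta * w_g * w_minus * n)
    X_d = n * x_hat
    y_e = w_minus * delta * X_d + eps_e
    return x_hat, y_e

x_hat, y_e = tonic_levels(n=6)
# Self-consistency with eq. (34): x_hat should equal eps0 - w_g * y_e.
residual = x_hat - (0.1 - 1.0 * y_e)
```

The residual vanishes (up to rounding), confirming that (35) is the fixed point of the STN–GPe loop defined by (34).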
The denominator on the right-hand side of (33) is greater than or equal to that in (26) (since δ_m ≤ 1) and any active channels introduce positive terms into the numerator of (26), so that

X^+_h > X̂^+ if P* ≠ ∅   (36)

The presence of n* in the expression for X^+_d confounds this simple proof, and it requires simulation to discover whether X^+_d > X̂^+.
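The comparison in (36) can also be checked numerically under simplifying assumptions: every inactive channel is taken to lie in Q_0 and every active channel in Q with positive GPe output (so δ_m = 1), and ⟨c⟩_q coincides with ⟨c⟩*. All parameter values are illustrative:

```python
def X_plus_h(n, c_active, lam_e=0.2, w_t=1.0, eps0=0.1, eps_e=0.05,
             w_g=1.0, w_minus=1.0, delta=0.8, w_s=1.0, eps=0.2):
    """Homogeneous-STN total output, eq. (26), under the simplifying
    assumptions delta_m = 1 (no GPe channel driven to zero) and
    <c>_q = <c>* (active channels are exactly the Q channels)."""
    k = len(c_active)
    d_star = d_q = k / n          # delta*, delta_q
    d_m = 1.0                     # delta_m: all n channels in Q_0 or Q
    c_mean = sum(c_active) / k if k else 0.0
    num = n * (w_t * d_star * c_mean + eps0
               + d_q * w_g * w_minus * ((1.0 - lam_e) * w_s * c_mean - eps)
               - w_g * d_m * eps_e)
    return num / (1.0 + delta * w_g * w_minus * n * d_m)

n = 6
tonic = X_plus_h(n, [])           # no active channels: reduces to eq. (33)
driven = X_plus_h(n, [0.5, 0.7])  # two active channels
# eq. (36): any active channels push X_h above its tonic value.
```

With no active channels the expression collapses term by term onto (33), and adding active channels only adds positive terms to the numerator, which is the content of (36).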


EE249 - Fall 2012 Lecture 18: Overview of Concrete Contract Theories. Alberto Sangiovanni-Vincentelli Pierluigi Nuzzo

EE249 - Fall 2012 Lecture 18: Overview of Concrete Contract Theories. Alberto Sangiovanni-Vincentelli Pierluigi Nuzzo EE249 - Fall 2012 Lecture 18: Overview of Concrete Contract Theories 1 Alberto Sangiovanni-Vincentelli Pierluigi Nuzzo Outline: Contracts and compositional methods for system design Where and why using

More information

Neurons, Synapses, and Signaling

Neurons, Synapses, and Signaling Chapter 48 Neurons, Synapses, and Signaling PowerPoint Lecture Presentations for Biology Eighth Edition Neil Campbell and Jane Reece Lectures by Chris Romero, updated by Erin Barley with contributions

More information

CISC 3250 Systems Neuroscience

CISC 3250 Systems Neuroscience CISC 3250 Systems Neuroscience Systems Neuroscience How the nervous system performs computations How groups of neurons work together to achieve intelligence Professor Daniel Leeds dleeds@fordham.edu JMH

More information

Data Mining Part 5. Prediction

Data Mining Part 5. Prediction Data Mining Part 5. Prediction 5.5. Spring 2010 Instructor: Dr. Masoud Yaghini Outline How the Brain Works Artificial Neural Networks Simple Computing Elements Feed-Forward Networks Perceptrons (Single-layer,

More information

Control and Integration. Nervous System Organization: Bilateral Symmetric Animals. Nervous System Organization: Radial Symmetric Animals

Control and Integration. Nervous System Organization: Bilateral Symmetric Animals. Nervous System Organization: Radial Symmetric Animals Control and Integration Neurophysiology Chapters 10-12 Nervous system composed of nervous tissue cells designed to conduct electrical impulses rapid communication to specific cells or groups of cells Endocrine

More information

Artificial Neural Networks. Part 2

Artificial Neural Networks. Part 2 Artificial Neural Netorks Part Artificial Neuron Model Folloing simplified model of real neurons is also knon as a Threshold Logic Unit x McCullouch-Pitts neuron (943) x x n n Body of neuron f out Biological

More information

Neurons, Synapses, and Signaling

Neurons, Synapses, and Signaling Chapter 48 Neurons, Synapses, and Signaling PowerPoint Lectures for Biology, Eighth Edition Lectures by Chris Romero, updated by Erin Barley with contributions from Joan Sharp and Janette Lewis Copyright

More information

Feedforward Neural Nets and Backpropagation

Feedforward Neural Nets and Backpropagation Feedforward Neural Nets and Backpropagation Julie Nutini University of British Columbia MLRG September 28 th, 2016 1 / 23 Supervised Learning Roadmap Supervised Learning: Assume that we are given the features

More information

Neurophysiology. Danil Hammoudi.MD

Neurophysiology. Danil Hammoudi.MD Neurophysiology Danil Hammoudi.MD ACTION POTENTIAL An action potential is a wave of electrical discharge that travels along the membrane of a cell. Action potentials are an essential feature of animal

More information

Nervous System Organization

Nervous System Organization The Nervous System Nervous System Organization Receptors respond to stimuli Sensory receptors detect the stimulus Motor effectors respond to stimulus Nervous system divisions Central nervous system Command

More information

A Mass Model of Interconnected Thalamic Populations including both Tonic and Burst Firing Mechanisms

A Mass Model of Interconnected Thalamic Populations including both Tonic and Burst Firing Mechanisms International Journal of Bioelectromagnetism Vol. 12, No. 1, pp. 26-31, 2010 www.ijbem.org A Mass Model of Interconnected Thalamic Populations including both Tonic and Burst Firing Mechanisms Pirini M.

More information

Synaptic Input. Linear Model of Synaptic Transmission. Professor David Heeger. September 5, 2000

Synaptic Input. Linear Model of Synaptic Transmission. Professor David Heeger. September 5, 2000 Synaptic Input Professor David Heeger September 5, 2000 The purpose of this handout is to go a bit beyond the discussion in Ch. 6 of The Book of Genesis on synaptic input, and give some examples of how

More information

Chapter 37 Active Reading Guide Neurons, Synapses, and Signaling

Chapter 37 Active Reading Guide Neurons, Synapses, and Signaling Name: AP Biology Mr. Croft Section 1 1. What is a neuron? Chapter 37 Active Reading Guide Neurons, Synapses, and Signaling 2. Neurons can be placed into three groups, based on their location and function.

More information

ARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD

ARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD ARTIFICIAL NEURAL NETWORK PART I HANIEH BORHANAZAD WHAT IS A NEURAL NETWORK? The simplest definition of a neural network, more properly referred to as an 'artificial' neural network (ANN), is provided

More information

arxiv:physics/ v1 [physics.bio-ph] 19 Feb 1999

arxiv:physics/ v1 [physics.bio-ph] 19 Feb 1999 Odor recognition and segmentation by coupled olfactory bulb and cortical networks arxiv:physics/9902052v1 [physics.bioph] 19 Feb 1999 Abstract Zhaoping Li a,1 John Hertz b a CBCL, MIT, Cambridge MA 02139

More information

Hopfield Neural Network and Associative Memory. Typical Myelinated Vertebrate Motoneuron (Wikipedia) Topic 3 Polymers and Neurons Lecture 5

Hopfield Neural Network and Associative Memory. Typical Myelinated Vertebrate Motoneuron (Wikipedia) Topic 3 Polymers and Neurons Lecture 5 Hopfield Neural Network and Associative Memory Typical Myelinated Vertebrate Motoneuron (Wikipedia) PHY 411-506 Computational Physics 2 1 Wednesday, March 5 1906 Nobel Prize in Physiology or Medicine.

More information

The Simplex Method: An Example

The Simplex Method: An Example The Simplex Method: An Example Our first step is to introduce one more new variable, which we denote by z. The variable z is define to be equal to 4x 1 +3x 2. Doing this will allow us to have a unified

More information

How to do backpropagation in a brain

How to do backpropagation in a brain How to do backpropagation in a brain Geoffrey Hinton Canadian Institute for Advanced Research & University of Toronto & Google Inc. Prelude I will start with three slides explaining a popular type of deep

More information

Ângelo Cardoso 27 May, Symbolic and Sub-Symbolic Learning Course Instituto Superior Técnico

Ângelo Cardoso 27 May, Symbolic and Sub-Symbolic Learning Course Instituto Superior Técnico BIOLOGICALLY INSPIRED COMPUTER MODELS FOR VISUAL RECOGNITION Ângelo Cardoso 27 May, 2010 Symbolic and Sub-Symbolic Learning Course Instituto Superior Técnico Index Human Vision Retinal Ganglion Cells Simple

More information

Neurons, Synapses, and Signaling

Neurons, Synapses, and Signaling CAMPBELL BIOLOGY IN FOCUS URRY CAIN WASSERMAN MINORSKY REECE 37 Neurons, Synapses, and Signaling Lecture Presentations by Kathleen Fitzpatrick and Nicole Tunbridge, Simon Fraser University SECOND EDITION

More information

BIOLOGY. 1. Overview of Neurons 11/3/2014. Neurons, Synapses, and Signaling. Communication in Neurons

BIOLOGY. 1. Overview of Neurons 11/3/2014. Neurons, Synapses, and Signaling. Communication in Neurons CAMPBELL BIOLOGY TENTH EDITION 48 Reece Urry Cain Wasserman Minorsky Jackson Neurons, Synapses, and Signaling Lecture Presentation by Nicole Tunbridge and Kathleen Fitzpatrick 1. Overview of Neurons Communication

More information

9 Generation of Action Potential Hodgkin-Huxley Model

9 Generation of Action Potential Hodgkin-Huxley Model 9 Generation of Action Potential Hodgkin-Huxley Model (based on chapter 12, W.W. Lytton, Hodgkin-Huxley Model) 9.1 Passive and active membrane models In the previous lecture we have considered a passive

More information

Principles of Artificial Intelligence Fall 2005 Handout #7 Perceptrons

Principles of Artificial Intelligence Fall 2005 Handout #7 Perceptrons Principles of Artificial Intelligence Fall 2005 Handout #7 Perceptrons Vasant Honavar Artificial Intelligence Research Laboratory Department of Computer Science 226 Atanasoff Hall Iowa State University

More information

37 Neurons, Synapses, and Signaling

37 Neurons, Synapses, and Signaling CAMPBELL BIOLOGY IN FOCUS Urry Cain Wasserman Minorsky Jackson Reece 37 Neurons, Synapses, and Signaling Lecture Presentations by Kathleen Fitzpatrick and Nicole Tunbridge Overview: Lines of Communication

More information

Hierarchy. Will Penny. 24th March Hierarchy. Will Penny. Linear Models. Convergence. Nonlinear Models. References

Hierarchy. Will Penny. 24th March Hierarchy. Will Penny. Linear Models. Convergence. Nonlinear Models. References 24th March 2011 Update Hierarchical Model Rao and Ballard (1999) presented a hierarchical model of visual cortex to show how classical and extra-classical Receptive Field (RF) effects could be explained

More information

لجنة الطب البشري رؤية تنير دروب تميزكم

لجنة الطب البشري رؤية تنير دروب تميزكم 1) Hyperpolarization phase of the action potential: a. is due to the opening of voltage-gated Cl channels. b. is due to prolonged opening of voltage-gated K + channels. c. is due to closure of the Na +

More information

Dynamic Modeling of Brain Activity

Dynamic Modeling of Brain Activity 0a Dynamic Modeling of Brain Activity EIN IIN PC Thomas R. Knösche, Leipzig Generative Models for M/EEG 4a Generative Models for M/EEG states x (e.g. dipole strengths) parameters parameters (source positions,

More information

Kantian metaphysics to mind-brain. The approach follows Bacon s investigative method

Kantian metaphysics to mind-brain. The approach follows Bacon s investigative method 82 Basic Tools and Techniques As discussed, the project is based on mental physics which in turn is the application of Kantian metaphysics to mind-brain. The approach follows Bacon s investigative method

More information

An analysis of how coupling parameters influence nonlinear oscillator synchronization

An analysis of how coupling parameters influence nonlinear oscillator synchronization An analysis of how coupling parameters influence nonlinear oscillator synchronization Morris Huang, 1 Ben McInroe, 2 Mark Kingsbury, 2 and Will Wagstaff 3 1) School of Mechanical Engineering, Georgia Institute

More information

Effects of Interactive Function Forms in a Self-Organized Critical Model Based on Neural Networks

Effects of Interactive Function Forms in a Self-Organized Critical Model Based on Neural Networks Commun. Theor. Phys. (Beijing, China) 40 (2003) pp. 607 613 c International Academic Publishers Vol. 40, No. 5, November 15, 2003 Effects of Interactive Function Forms in a Self-Organized Critical Model

More information

1 Balanced networks: Trading speed for noise

1 Balanced networks: Trading speed for noise Physics 178/278 - David leinfeld - Winter 2017 (Corrected yet incomplete notes) 1 Balanced networks: Trading speed for noise 1.1 Scaling of neuronal inputs An interesting observation is that the subthresold

More information

BIOLOGY 11/10/2016. Neurons, Synapses, and Signaling. Concept 48.1: Neuron organization and structure reflect function in information transfer

BIOLOGY 11/10/2016. Neurons, Synapses, and Signaling. Concept 48.1: Neuron organization and structure reflect function in information transfer 48 Neurons, Synapses, and Signaling CAMPBELL BIOLOGY TENTH EDITION Reece Urry Cain Wasserman Minorsky Jackson Lecture Presentation by Nicole Tunbridge and Kathleen Fitzpatrick Concept 48.1: Neuron organization

More information

Model of a Biological Neuron as a Temporal Neural Network

Model of a Biological Neuron as a Temporal Neural Network Model of a Biological Neuron as a Temporal Neural Network Sean D. Murphy and Edward W. Kairiss Interdepartmental Neuroscience Program, Department of Psychology, and The Center for Theoretical and Applied

More information

UNIT I INTRODUCTION TO ARTIFICIAL NEURAL NETWORK IT 0469 NEURAL NETWORKS

UNIT I INTRODUCTION TO ARTIFICIAL NEURAL NETWORK IT 0469 NEURAL NETWORKS UNIT I INTRODUCTION TO ARTIFICIAL NEURAL NETWORK IT 0469 NEURAL NETWORKS Elementary Neuro Physiology Neuron: A neuron nerve cell is an electricallyexcitable cell that processes and transmits information

More information

Clicker Question. Discussion Question

Clicker Question. Discussion Question Connectomics Networks and Graph Theory A network is a set of nodes connected by edges The field of graph theory has developed a number of measures to analyze networks Mean shortest path length: the average

More information

Lecture 7 Artificial neural networks: Supervised learning

Lecture 7 Artificial neural networks: Supervised learning Lecture 7 Artificial neural networks: Supervised learning Introduction, or how the brain works The neuron as a simple computing element The perceptron Multilayer neural networks Accelerated learning in

More information

Figure 1.1: Schematic symbols of an N-transistor and P-transistor

Figure 1.1: Schematic symbols of an N-transistor and P-transistor Chapter 1 The digital abstraction The term a digital circuit refers to a device that works in a binary world. In the binary world, the only values are zeros and ones. Hence, the inputs of a digital circuit

More information

Consider the following spike trains from two different neurons N1 and N2:

Consider the following spike trains from two different neurons N1 and N2: About synchrony and oscillations So far, our discussions have assumed that we are either observing a single neuron at a, or that neurons fire independent of each other. This assumption may be correct in

More information

Causality and communities in neural networks

Causality and communities in neural networks Causality and communities in neural networks Leonardo Angelini, Daniele Marinazzo, Mario Pellicoro, Sebastiano Stramaglia TIRES-Center for Signal Detection and Processing - Università di Bari, Bari, Italy

More information

DEVS Simulation of Spiking Neural Networks

DEVS Simulation of Spiking Neural Networks DEVS Simulation of Spiking Neural Networks Rene Mayrhofer, Michael Affenzeller, Herbert Prähofer, Gerhard Höfer, Alexander Fried Institute of Systems Science Systems Theory and Information Technology Johannes

More information

Action Potentials and Synaptic Transmission Physics 171/271

Action Potentials and Synaptic Transmission Physics 171/271 Action Potentials and Synaptic Transmission Physics 171/271 Flavio Fröhlich (flavio@salk.edu) September 27, 2006 In this section, we consider two important aspects concerning the communication between

More information

2002NSC Human Physiology Semester Summary

2002NSC Human Physiology Semester Summary 2002NSC Human Physiology Semester Summary Griffith University, Nathan Campus Semester 1, 2014 Topics include: - Diffusion, Membranes & Action Potentials - Fundamentals of the Nervous System - Neuroanatomy

More information

Intro and Homeostasis

Intro and Homeostasis Intro and Homeostasis Physiology - how the body works. Homeostasis - staying the same. Functional Types of Neurons Sensory (afferent - coming in) neurons: Detects the changes in the body. Informations

More information

Biomedical Instrumentation

Biomedical Instrumentation ELEC ENG 4BD4: Biomedical Instrumentation Lecture 5 Bioelectricity 1. INTRODUCTION TO BIOELECTRICITY AND EXCITABLE CELLS Historical perspective: Bioelectricity first discovered by Luigi Galvani in 1780s

More information

Addressing Challenges in Neuromorphic Computing with Memristive Synapses

Addressing Challenges in Neuromorphic Computing with Memristive Synapses Addressing Challenges in Neuromorphic Computing with Memristive Synapses Vishal Saxena 1, Xinyu Wu 1 and Maria Mitkova 2 1 Analog Mixed-Signal and Photonic IC (AMPIC) Lab 2 Nanoionic Materials and Devices

More information

Housekeeping, 26 January 2009

Housekeeping, 26 January 2009 5 th & 6 th Lectures Mon 26 & Wed 28 Jan 2009 Vertebrate Physiology ECOL 437 (MCB/VetSci 437) Univ. of Arizona, spring 2009 Neurons Chapter 11 Kevin Bonine & Kevin Oh 1. Finish Solutes + Water 2. Neurons

More information

Neurons. 5 th & 6 th Lectures Mon 26 & Wed 28 Jan Finish Solutes + Water. 2. Neurons. Chapter 11

Neurons. 5 th & 6 th Lectures Mon 26 & Wed 28 Jan Finish Solutes + Water. 2. Neurons. Chapter 11 5 th & 6 th Lectures Mon 26 & Wed 28 Jan 2009 Vertebrate Physiology ECOL 437 (MCB/VetSci 437) Univ. of Arizona, spring 2009 Neurons Chapter 11 Kevin Bonine & Kevin Oh 1. Finish Solutes + Water 2. Neurons

More information

Fuzzy System Composed of Analogue Fuzzy Cells

Fuzzy System Composed of Analogue Fuzzy Cells Fuzzy System Composed of Analogue Fuzzy Cells László Ormos *, István Ajtonyi ** * College of Nyíregyháza, Technical and Agricultural Faculty, Department Electrotechnics and Automation, Nyíregyháta, POB.166,

More information

DR.RUPNATHJI( DR.RUPAK NATH )

DR.RUPNATHJI( DR.RUPAK NATH ) Contents 1 Sets 1 2 The Real Numbers 9 3 Sequences 29 4 Series 59 5 Functions 81 6 Power Series 105 7 The elementary functions 111 Chapter 1 Sets It is very convenient to introduce some notation and terminology

More information

NEURONS, SENSE ORGANS, AND NERVOUS SYSTEMS CHAPTER 34

NEURONS, SENSE ORGANS, AND NERVOUS SYSTEMS CHAPTER 34 NEURONS, SENSE ORGANS, AND NERVOUS SYSTEMS CHAPTER 34 KEY CONCEPTS 34.1 Nervous Systems Are Composed of Neurons and Glial Cells 34.2 Neurons Generate Electric Signals by Controlling Ion Distributions 34.3

More information

Introduction and Perceptron Learning

Introduction and Perceptron Learning Artificial Neural Networks Introduction and Perceptron Learning CPSC 565 Winter 2003 Christian Jacob Department of Computer Science University of Calgary Canada CPSC 565 - Winter 2003 - Emergent Computing

More information

Online Interval Coloring and Variants

Online Interval Coloring and Variants Online Interval Coloring and Variants Leah Epstein 1, and Meital Levy 1 Department of Mathematics, University of Haifa, 31905 Haifa, Israel. Email: lea@math.haifa.ac.il School of Computer Science, Tel-Aviv

More information

CELL BIOLOGY - CLUTCH CH. 9 - TRANSPORT ACROSS MEMBRANES.

CELL BIOLOGY - CLUTCH CH. 9 - TRANSPORT ACROSS MEMBRANES. !! www.clutchprep.com K + K + K + K + CELL BIOLOGY - CLUTCH CONCEPT: PRINCIPLES OF TRANSMEMBRANE TRANSPORT Membranes and Gradients Cells must be able to communicate across their membrane barriers to materials

More information

Artificial Neural Networks

Artificial Neural Networks Artificial Neural Networks CPSC 533 Winter 2 Christian Jacob Neural Networks in the Context of AI Systems Neural Networks as Mediators between Symbolic AI and Statistical Methods 2 5.-NeuralNets-2.nb Neural

More information

CN2 1: Introduction. Paul Gribble. Sep 10,

CN2 1: Introduction. Paul Gribble. Sep 10, CN2 1: Introduction Paul Gribble http://gribblelab.org Sep 10, 2012 Administrivia Class meets Mondays, 2:00pm - 3:30pm and Thursdays, 11:30am - 1:00pm, in NSC 245A Contact me with any questions or to set

More information

NEURONS Excitable cells Therefore, have a RMP Synapse = chemical communication site between neurons, from pre-synaptic release to postsynaptic

NEURONS Excitable cells Therefore, have a RMP Synapse = chemical communication site between neurons, from pre-synaptic release to postsynaptic NEUROPHYSIOLOGY NOTES L1 WHAT IS NEUROPHYSIOLOGY? NEURONS Excitable cells Therefore, have a RMP Synapse = chemical communication site between neurons, from pre-synaptic release to postsynaptic receptor

More information