Synchronized Action of Synaptically Coupled Chaotic Neurons: I. Simulations Using Model Neurons


Henry D. I. Abarbanel 1,2, R. Huerta 2, M. I. Rabinovich 2, N. F. Rulkov 2, P. F. Rowat 3, and A. I. Selverston 3

1 Department of Physics and Marine Physical Laboratory, Scripps Institution of Oceanography, University of California, San Diego, La Jolla, CA
2 Institute for Nonlinear Science, University of California, San Diego, La Jolla, CA
3 Department of Biology, University of California, San Diego, La Jolla, CA

Abstract

Experimental observations of the intracellularly recorded electrical activity in individual neurons show that the temporal behavior is often chaotic. We discuss both our own observations on a cell from the stomatogastric central pattern generator of lobster and earlier observations in other cells. Our analysis of this chaotic behavior in our own measurements uses new tools for the analysis of time series from nonlinear sources, and we are able to characterize the precise nature of the chaotic motions better than has previously been possible. Using this observation about the temporal behavior of individual neurons, we study how chaotic neurons can communicate and synchronize as the beginning of our analysis of how central pattern generating neurons, individually and in ensembles, can act in a collective fashion to achieve biological functionality. In this paper we work with models of these chaotic neurons, building on models by Hindmarsh and Rose for bursting, spiking activity in neurons. The key feature of these simplified models of neurons is the presence of coupled slow and fast subsystems, and our model neurons possess this feature. We analyze the model neurons using the same tools employed in the analysis of our experimental data. We then couple two model neurons both electrotonically and electrochemically, in inhibitory and excitatory fashions.
In each of these cases we demonstrate that the model neurons can synchronize in phase and out of phase depending on the strength of the coupling. For normal synaptic coupling there is a time delay between the action of one neuron and the response of the other, and we analyze how the synchronization depends on this delay. A rich spectrum of synchronized behaviors is possible for electrically coupled neurons and for inhibitory coupling between neurons. In synchronized neurons one typically sees chaotic motion of the coupled neurons. Excitatory coupling produces essentially periodic voltage trajectories which are also synchronized. We display and discuss these synchronized behaviors using two `distance' measures of the synchronization. We also evaluate the mutual information between the coupled neuron voltages during synchronized and unsynchronized behavior as a quantitative measure of the information transmission between neurons. The key question we address in this paper is the transmission of information between two chaotic neurons in order to synchronize them for directed collective behavior. The role of information, as defined by Shannon [1], is widely recognized in the communications literature; viewing the synaptic connections among neurons as communications channels makes a useful connection between the dynamics of these neurons as nonlinear oscillators and ideas in contemporary dynamical systems [2]. Our studies suggest a number of interesting laboratory experiments using coupled neurons as well as a series of investigations on the behaviors of collections of coupled chaotic neurons.

Contents

1 Introduction
2 Individual Neurons
  2.1 Observed Low Dimensional (Chaotic) Behavior of Neurons
      Familiar Fourier Analysis Tools
      Nonlinear Analysis of Observed Data
      Chaotic Neurons
  2.2 Individual Rose Hindmarsh (RH) Model Neurons
3 Synaptic Coupling of Two RH Neurons
  3.1 Electrical Coupling
      `Distances' Between the Coupled Neurons
      Information Transport Between Two Coupled Neurons
  3.2 Inhibitory Coupling
  3.3 Excitatory Coupling
4 Discussion and Conclusions
  4.1 Chaos and Mutual Information in Neuronal Activity
Appendix

1 Introduction

Individual neurons coupled synaptically to form small functional networks or Central Pattern Generators (CPGs) have cooperative properties related to the function they are called on to perform. The cooperative behavior of these coupled cells can be much richer and much more organized than the activity of the individual neurons forming the CPG. The isolated neural cells often exhibit chaotic motions, as observed in the characteristics of intracellular voltage measurements, while some coordination or synchronization of these chaotic activities must be arranged for the directed cooperative behavior of the CPGs to manifest itself. Our goal in this paper is to demonstrate, by examining the synaptic coupling of realistic model neurons, the interesting broad range of cooperative behaviors which arise when chaotic neurons are connected.
Further, we want to understand how it is possible that the potentially very complex behavior which might transpire when chaotic neurons are coupled in fact reduces, in a clean, dynamical way, to rather simpler, often well organized motion. In our analysis of the way in which coupled chaotic neurons are able to correlate their actions, each of which can remain quite complex, we will use notions from nonlinear dynamics associated with the transmission of information among nonlinear oscillators, such as our neurons. We use the ideas of information as expressed in the communications literature [1], originally by Shannon five decades ago. This information theoretic connection is attuned to the nonlinear aspects of the neuronal behavior and constitutes an appropriate generalization of familiar linear correlation statistics. The utility of information as a measure of nonlinear connection among coupled oscillators, as well as the analysis of the chaotic motions of the individual neurons, will be a central part of our discussion. Starting from the classical Hodgkin-Huxley [3] formulation of the dynamics of ionic channels, numerous simpler `reduced' models have been derived and extensively discussed [4]. To capture both

the bursting behavior and the spiking behavior observed in intracellular voltage measurements, one requires a combination of slow and fast subsystems acting in coordination. The fast subsystem is associated with the membrane voltage and rapid conduction channels, typically those due to Na+ and K+. The slow subsystem is critical for the bursting behavior on top of which the spikes occur. The differences among the many models with fast and slow subsystems lie in the details of how individual neurons, abstracted to such simplified descriptions, produce spikes and bursts, but each contains the main qualitative behaviors seen in the laboratory. Here we are primarily concerned with the manner in which these neurons act in a cooperative fashion as a result of their synaptic coupling to each other, and in this work we have focused on the rich variety of features which can arise through this coupling. An important goal of our research effort is to determine the dynamical variables of such models which are predictive and have sound physiological grounding. To this end we will begin with the analysis of data from an isolated neuron from the lobster stomatogastric CPG, then turn to the analysis of our model neurons, and from our analyses subsequently turn to the study of further laboratory experiments. This paper concentrates on the first two of these facets of our research. To this end we have adopted as our description for individual neurons the rather simple three degree of freedom model discussed by Rose and Hindmarsh (RH) [5, 6]. This model has as dynamical variables the membrane potential, an auxiliary variable corresponding to the various ionic channels, and a slow variable associated with other ion dynamics. The model shows bursting, spiking phenomena, and chaotic bursting and spiking. All these neural actions are seen in measurements on neurons in the laboratory [7, 8].
In this paper we concentrate on the synchronized behavior of chaotic RH neurons when they are coupled by the various types of chemical and electrical synaptic coupling observed in nature. Both excitatory and inhibitory forms of chemical synapses will be considered. There are two main issues in beginning this analysis: (1) the chaotic property of the individual neuron dynamics, and (2) the interpretation of synchronized behavior when two such chaotic neurons are coupled. We will consider chains and networks of coupled chaotic RH neurons in a subsequent paper.

2 Individual Neurons

Rose and Hindmarsh (RH) [5, 6] study a variety of model neurons which are cast into a few ordinary differential equations describing the membrane potential and various aspects of ion conduction important for various functions. Some of their models are two dimensional, but we know from the Poincaré-Bendixson theorem that such models can never exhibit the chaotic bursting and spiking seen in real neurons. In this regard, they may be seen as oversimplifications of actual neurons, which contain numerous types and numbers of channels and may have tens of dynamical degrees of freedom. RH also study three degree of freedom models which, happily, exhibit the broader range of features, including chaotic bursting, seen in laboratory experiments on isolated neurons [7]. Even these three dimensional models, however, ignore many ion channels which would, perhaps, be required to extract detailed behavior of any real neuron. In preparation for adopting one of the three dimensional models of RH for our study of coupling between chaotic neurons, we begin with an analysis of some experimental observations on real neurons.

Figure 1: (a) Time trace of intracellular voltage in mV taken from a single LP neuron of the stomatogastric (STG) central pattern generator (CPG) of the lobster. The sampling time was τ_s = 1 ms. No current was injected into the cell during the measurements. (b) The Fourier power spectrum from the single neuron in the lobster STG CPG. The apparent temporal irregularity translates into a continuous broadband spectrum consistent with the nonperiodic observations of the intracellular voltage. (c) State space reconstruction in dimension three, with axes V(t), V(t+9), and V(t+18) in mV, for the time series shown in Fig. 1a.

2.1 Observed Low Dimensional (Chaotic) Behavior of Neurons

We will be emphasizing the synchronization and associated communication among model neurons which individually exhibit chaotic behavior. To give a rationale for this emphasis we first discuss the observed chaotic behavior of an isolated cell from the lobster Stomatogastric (STG) Central Pattern Generator (CPG) as observed in our laboratories, that is, an LP neuron from a pyloric CPG. Neurons were isolated from their physiological connections with other neurons in the stomatogastric systems either by placing dissociated neurons in culture or by pharmacological blockage of synaptic connections. The data consist of intracellular voltage taken at a data rate of 1 kHz; the sampling time is τ_s = 1 ms. A typical time series is shown in Figure 1a. A total of 45,000 data points were available for our analysis, which is about forty-five seconds of real time data.

Familiar Fourier Analysis Tools

The Fourier power spectrum associated with the time series is shown in Figure 1b. This is a remarkably featureless spectrum and, as is often the case with chaotic data, is not revealing about the dynamics underlying the intracellular voltage signal. It does tell us, in a somewhat more quantitative fashion than inspection of the time series alone, that the data are nonperiodic. These two quantities (time series and Fourier power spectrum) are informative about the dynamics without revealing anything about the number of degrees of freedom or independent autonomous differential equations governing the time sequences observed. From the Fourier spectrum we can see that all the apparent energy is associated with the broadband background, and this is characteristic of chaotic motions [9, 2] as they are nonperiodic. The time series would also seem to fall in the category of `irregular' behavior.
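The broadband-spectrum diagnostic described above is straightforward to reproduce numerically. The following sketch (in Python with NumPy, which the paper itself does not use; the synthetic test signal and the spectral floor are our own illustrative choices) computes the one-sided log power spectrum of a sampled voltage trace:

```python
import numpy as np

def power_spectrum(v, fs):
    """One-sided log power spectrum of a trace sampled at fs Hz."""
    v = np.asarray(v, dtype=float)
    v = v - v.mean()                       # remove the DC offset
    power = np.abs(np.fft.rfft(v)) ** 2    # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(v), d=1.0 / fs)
    return freqs, np.log10(power + 1e-12)  # small floor avoids log(0)

# A periodic signal concentrates its power in sharp spectral lines;
# chaotic data instead shows a continuous broadband background.
fs = 1000.0                                # 1 kHz, as in the recordings
t = np.arange(4096) / fs
v = np.sin(2 * np.pi * 10.0 * t)           # stand-in for a measured trace
freqs, logp = power_spectrum(v, fs)
```

For the sine wave the spectrum peaks sharply near 10 Hz; applied to the LP-neuron voltage, the same computation would show the featureless broadband profile of Figure 1b.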
We need sharper tools to analyze the time series to uncover whether we are observing a few degrees of freedom operating in an essentially linear fashion but contaminated by substantial noise, or whether we might be seeing a characteristic manifestation of data from a nonlinear source which is intrinsically and deterministically spectrally broad and temporally nonperiodic.

Nonlinear Analysis of Observed Data

To establish which of these alternatives to choose, we must create from the observed voltage measurements v(n) = v(t_0 + nτ_s) a coordinate system in which the degrees of freedom are unfolded from their projection on the observation axis of intracellular voltage. If the system is low dimensional, a few independent coordinates made from v(n) and its time lags v(n + kT) = v(t_0 + (n + kT)τ_s) will suffice to unfold the geometric structure typical of a nonlinear dynamical system [9, 2]. If the observed time sequence is chaotic, this geometric structure, the attractor of the neuronal system, must be embedded in a coordinate system with dimension three or greater. This means we need to construct vectors in some d-dimensional space within which the neuron moves in time. The structure is parametrically labeled by time, which is discrete because of the realities of the measurement process. The vectors

y(n) = [v(n), v(n + T), ..., v(n + (d − 1)T)]   (1)

provide a workable coordinate system in which to unfold the projection of the multidimensional dynamics onto the voltage axis. The theory behind this is found in the review article by Abarbanel, Brown, Sidorowich, and Tsimring [9] and in more detail in the forthcoming book by Abarbanel [2]. An initial question which must be answered has to do with the independence of the components of the d-dimensional data vectors y(n), and a nonlinear answer to this question is contained in the average mutual information [2, 10] between components of the vector.
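In code, the delay vectors of Equation (1), together with a histogram estimate of the average mutual information used to choose the lag T, might look like the following sketch (the bin count and the synthetic signal are arbitrary illustrative choices, not taken from the paper):

```python
import numpy as np

def delay_embed(v, d, T):
    """Delay vectors y(n) = [v(n), v(n+T), ..., v(n+(d-1)T)] of Eq. (1)."""
    N = len(v) - (d - 1) * T
    return np.column_stack([v[k * T : k * T + N] for k in range(d)])

def avg_mutual_information(v, T, bins=32):
    """Histogram estimate of I(T) in bits between v(n) and v(n+T)."""
    joint, _, _ = np.histogram2d(v[:-T], v[T:], bins=bins)
    p_ab = joint / joint.sum()          # normalized joint distribution
    p_a = p_ab.sum(axis=1)              # marginal of v(n)
    p_b = p_ab.sum(axis=0)              # marginal of v(n+T)
    nz = p_ab > 0                       # 0 log 0 terms contribute nothing
    return float(np.sum(p_ab[nz] * np.log2(p_ab[nz] / np.outer(p_a, p_b)[nz])))

v = np.sin(0.05 * np.arange(5000))      # stand-in for a measured trace
Y = delay_embed(v, d=3, T=9)            # 4982 three-component vectors
```

One would scan T, plot I(T), and take its first minimum as the embedding lag, as described in the text.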

Considering the measurements v(n) and v(n + T) as two sets of observations A = {a_i} = {v(n)} and B = {b_j} = {v(n + T)}, the average mutual information I(T) in bits between these two sets as a function of the time lag T is defined by [10, 1]

I(T) = Σ_{a_i, b_j} P_AB(a_i, b_j) log_2 [ P_AB(a_i, b_j) / (P_A(a_i) P_B(b_j)) ],   (2)

where P_A(a_i) is the normalized distribution of measurements from set A, the v(n); P_B(b_j) is the normalized distribution of measurements from set B, the v(n + T); and P_AB(a_i, b_j) is the normalized joint distribution of the two sets of measurements. I(T) is a generalized, nonlinear statistic over the observations which acts as a nonlinear correlation function for the observations at hand. For the data from the isolated neuron taken from the STG CPG, whose time trace was seen in Figure 1a, we show in Figure 2a the average mutual information I(T). It has become a useful and workable prescription to select the time lag for the construction of the data vectors y(n) to be the first minimum of I(T). From the figure we see that the first minimum is at T = 9 ms.

To determine the number of coordinates required to unfold the geometry of the neuronal attractor, we need to ask in what dimension d_E used in the data vectors y(n) we no longer have overlaps of strands of the orbits because of projection from a higher dimension. In other words, we wish to know when a data point y(n) has neighbors associated with the dynamics rather than with an incorrect, too small choice of dimension in which to view the data. The method of false nearest neighbors [11, 9, 2] directly answers this question by inquiring whether the nearest neighbor of each data point y(i) in the full data set remains a nearby point when the dimension of the data vector is increased. When the number of true neighbors is 100%, we have unfolded the geometry of the dynamics, and additional dimensions for the data space are not required.
This criterion is implemented by asking about the percentage of false nearest neighbors, and in Figure 2b we see this statistic as evaluated for the time series of intracellular voltage observed in the isolated STG neuron. We see clearly that at dimension four or five the number of false neighbors is nearly zero, and it remains nearly zero as the dimension is increased. A close examination of the residual percentage of false nearest neighbors shows it levels off near 0.75%, which is characteristic of slightly noisy measurements. This residual noise level is consistent with the environmental status of the experiment. The analysis has thus established that the dynamics of the isolated neuron can be captured in four dimensions. This is a global statement. It may be that the dynamics actually lies in a lower dimensional space, but because of the twisting and folding of the trajectory due to the nonlinear processes, the global space required to undo this twisting and folding is larger than that of the local dynamics. To establish whether this is the case, we utilize the test of local false nearest neighbors [12, 2], which asks what is happening locally on the attractor, and does so in a space larger than or equal to the dimension established by the global false nearest neighbors test. Here this means looking in global dimension five or greater to see what local dimension of the dynamics captures the variation of the data. We have examined the STG isolated neuron data in global dimension d_E = 12. This is done to give us a long region of dimensions in which the local false neighbors can show independence of dimension and of number of neighbors. We know that for any global dimension greater than five or six the number of global false neighbors is zero, so any working dimension for local false neighbors greater than this will do.
In practice it is useful to choose the working dimension somewhat larger than the known unfolding dimension. Then we ask, as a function of local dimension d_L and number of

Figure 2: (a) The average mutual information as defined by Equation 2 for the observations of intracellular voltage in the single neuron from the lobster STG CPG. The first minimum is shallow and is located at T = 9 ms. (b) Global false nearest neighbors for the intracellular voltage observations on the single neuron from the lobster STG CPG. The percentage of false nearest neighbors essentially reaches zero at global embedding dimension d_E = 4. There is some residual in this statistic for dimensions greater than this, reflecting high dimensional contamination, i.e., noise. (c) Local false nearest neighbors for the intracellular voltage observations on the single neuron from the lobster STG CPG. The percentage of bad predictions P_K becomes independent of local dimension d_L and of the number of neighbors N_B used to make local predictions at d_L = 3. (d) Lyapunov exponents for the single LP neuron from the lobster STG CPG. We worked in d_E = d_L = 3 and used the time lag T = 9 suggested by average mutual information. The system exhibits one positive Lyapunov exponent, one zero exponent indicating that differential equations govern the data, and one negative exponent.

neighbors N_B, how well a local polynomial predictor built from the neighborhood around data point y(i) predicts the observed points in the neighborhood around y(i+1). When the number of false predictions P_K, as shown in Figure 2c, becomes independent of d_L and N_B, we have determined a local dimension for the dynamics. In Figure 2c we see that d_L = 3 is selected by the data. This implies that we can make models of the behavior of this isolated neuron with five degrees of freedom. More details on local dynamical dimension can be found in references [2, 9]. In Figure 1c one can see the limit set reconstructed from the LP time series plotted in Figure 1a.

A final question we will address concerning the possibility of chaos in the observed behavior of the isolated STG neuron regards the global Lyapunov exponents [9, 2, 13], which characterize the stability of orbits when slight perturbations are made to them. Chaotic behavior is established by the presence of one or more positive global Lyapunov exponents. In Figure 2d we show the Lyapunov exponents for the isolated STG neuron as a function of the number of time steps. From Figure 2d we can see that there is one positive global Lyapunov exponent for this neuron, one zero exponent, which tells us that ordinary differential equations govern the behavior of this system, and one negative exponent. A fractal dimension [9, 2] can be established from these global exponents, and it is 2.75. With these analyses in place, we have clearly established that the dynamics of this neuron is chaotic. The key element in this was our evaluation of the global Lyapunov exponents of the dynamics represented by the measured intracellular voltage trace in the reconstructed phase space using time delays of the measurements.
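The fractal dimension quoted above can be obtained from an ordered Lyapunov spectrum via the Kaplan-Yorke (Lyapunov dimension) formula, D = j + (λ_1 + ... + λ_j)/|λ_{j+1}|, where j is the largest index for which the partial sum of exponents is nonnegative. A minimal sketch follows; the numerical exponent values are illustrative stand-ins chosen so that λ_1/|λ_3| = 0.75, not the measured spectrum:

```python
def kaplan_yorke_dimension(exponents):
    """Lyapunov (Kaplan-Yorke) dimension from a set of global exponents."""
    lams = sorted(exponents, reverse=True)  # order largest to smallest
    partial = 0.0
    for j, lam in enumerate(lams):
        if partial + lam < 0.0:             # the sum would turn negative here
            return j + partial / abs(lam)
        partial += lam
    return float(len(lams))                 # all partial sums nonnegative

# One positive, one zero, and one negative exponent, as in Figure 2d.
print(kaplan_yorke_dimension([0.3, 0.0, -0.4]))   # -> 2.75
```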
A positive global Lyapunov exponent is the hallmark of chaotic behavior in a dynamical system [2].

Chaotic Neurons

The evidence we have presented for chaotic behavior in observed neuron activity may be slightly more detailed than earlier analyses, but we are certainly not the first to see this phenomenon. Hayashi and Ishizuka [7] describe in detail a series of experiments on a molluscan pacemaker neuron which show that when certain levels of current are injected into the neuron, chaotic appearing behavior is seen. The analysis of that experiment used tools such as Poincaré sections and phase space portraits, which are convincing but not quantitative. The observations of Hayashi and Ishizuka were on isolated neurons, as are the observations we report here. Mpitsos et al. [14] also present evidence for chaotic activity in Pleurobranchaea californica using phase space portraits and a computation of the correlation dimension from their observations. The neurons in the study of Mpitsos et al. were not isolated but located within a larger grouping. The qualitative aspects of their conclusions are nonetheless connected with our observations. These earlier observations of chaotic behavior, taken with our own, lead us to agree with Hayashi and Ishizuka [7] in their conclusion that chaotic oscillation in neurons is expected to be quite a normal state of neural activity. Indeed, when one considers the genericity of chaos [15] in dynamical systems described by three or more coupled ordinary differential equations, together with the numerous ion channels operating in neural behavior, one should anticipate chaotic oscillations as the normal state of activity. We adopt in this paper the point of view that a simple model of neural oscillations which contains both a slow and a fast subsystem leading to bursting, spiking behavior will operate in chaotic motion as its normal state.
Our analysis will describe this chaotic behavior of a single neuron, but our main discussion will then take off from that and concentrate on the oscillations seen, in the model for the moment, in synaptic coupling between two of these model neurons. Our observations set the stage for experiments to be carried out to verify the appearance of the rich variety of these coupled oscillations.

2.2 Individual Rose Hindmarsh (RH) Model Neurons

The model neuron we adopt here is familiar from the papers of Rose and Hindmarsh [5, 6] and the many subsequent analyses of their models. We will use a three variable model which describes, in dimensionless units, the membrane potential x, an auxiliary variable y representing a set of fast ion channels connected with aspects of potassium and sodium transport, and finally a `slow' variable z which captures the slower dynamics of yet other ion channels. It is not our intention to repeat in any detail either the analysis of Rose and Hindmarsh and others in establishing the basis of this model, or the analyses of others on the features of this model as a description of neuron activity. This model generates a time series that looks like the time series of an isolated LP neuron (Fig. 1a), and its strange attractor has the same topology as the one depicted in Figure 1c. Instead, our starting point is that this model contains the appropriate mix of slow and fast dynamics to adequately describe the bursting and spiking behavior of observations in neural systems, and we shall discuss at some length in subsequent sections the interesting and fascinating phenomena which transpire when these model neurons are coupled together. First we establish some general aspects of the RH model, which satisfies the differential equations

dx/dt = y + φ(x) − z + I,
dy/dt = ψ(x) − y,
dz/dt = −rz + rS(x − c_x),   (3)

where the two functions φ(x) and ψ(x) are determined from empirical observations on voltage-current relations. The choice suggested by Rose and Hindmarsh varies with the system they are describing in detail, but the following are conventional:

φ(x) = ax^2 − x^3 = 3.0x^2 − x^3,
ψ(x) = 1.0 − bx^2 = 1.0 − 5.0x^2.   (4)

Similarly, the linear appearance of the membrane voltage in the governing equation for the slow variable z is discussed at length by Hindmarsh and Rose.
For some situations they choose an exponential dependence on membrane voltage for this term, but we have settled on the simpler linear description. It is our experience with this system that, while details do depend on the functional choices, the basic features operating in the communication between and synchronization of neurons depend on little beyond the appearance of slow and fast subsystems as featured in the model we adopt. The other parameters in the equations are chosen in our calculations as: the injected current I = 3.281, the voltage c_x = −1.6, the scale of the influence of the membrane voltage on the slow dynamics S = 4.0, and the time scale for the slow adaptation current r = 0.0021. Little, if anything, in our analysis of synchronized behavior of these model neurons depends on the specific values of these parameters. This choice does result in chaotic behavior. To see this we display in Figures 3a and 3b two samples of the same time series from the solution of these equations. The equations were solved with a very fine time step using a fourth order Runge-Kutta scheme. This solution oversampled the waveforms but assured us that all interesting variation in the dynamics would be represented. The output from the solution was desampled to a dimensionless time step t = 1.0 to produce the displays in Figures 3a and 3b. The analysis of this chaotic output from the

Figure 3: (a) Time series from our model neuron, Equation (3), with parameters I = 3.281, S = 4.0, c_x = −1.6, and r = 0.0021. (b) Blowup of part of the previous time series from the RH neuron.

RH model neuron can, of course, be analyzed using the same tools as we applied to the experimental data in an earlier section. We use only the data from the membrane voltage x coming from our integration of the model equations; the equations themselves are not utilized. The global false nearest neighbors statistic is shown in Figure 4a and blown up in Figure 4b, where d_E = 4 is indicated. That this number is greater than the obvious three degrees of freedom of the differential equations (3) has to do with the fact that we are using a coordinate system created out of x and its time lags. This coordinate system is equivalent in the sense that dynamical properties of the attractor in (x, y, z) space are the same as when evaluated in (x, x(t + T), x(t + 2T), ...) space, but there is no reason that the dimension of the reconstructed space must be equal to the dynamical dimension 3. Since the equations are dissipative, there will be an attractor of dimension d_A < 3 on which orbits reside. The reconstruction theorem [2, 16, 17] tells us that d_E > 2d_A, where d_E is an integer but d_A need not be, is sufficient to unfold the attractor of dimension d_A using time delay or any other smooth set of coordinates. The dynamical dimension d_L, or local dimension of the dynamics, must, of course, be d_L = 3, since that is what we put into the signal x. In Figure 4c we have the local false nearest neighbors statistic, which converges in a sharp and unambiguous fashion at d_L = 3, which we already know is correct for this model neuron.
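A fourth order Runge-Kutta integration of the model that generates such a membrane-voltage time series can be sketched as follows (the step size, initial condition, and run length here are our own illustrative choices, not those of the paper; φ and ψ are taken from Equation (4)):

```python
import numpy as np

# RH model, Eq. (3), with phi(x) = 3x^2 - x^3 and psi(x) = 1 - 5x^2
I_EXT, S, C_X, R = 3.281, 4.0, -1.6, 0.0021

def rh_field(state):
    x, y, z = state
    dx = y + 3.0 * x**2 - x**3 - z + I_EXT
    dy = 1.0 - 5.0 * x**2 - y
    dz = -R * z + R * S * (x - C_X)
    return np.array([dx, dy, dz])

def rk4_step(state, dt):
    k1 = rh_field(state)
    k2 = rh_field(state + 0.5 * dt * k1)
    k3 = rh_field(state + 0.5 * dt * k2)
    k4 = rh_field(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

state = np.array([0.1, 0.0, 0.0])   # illustrative initial condition
dt = 0.01                           # fine step; later desampled for display
xs = []
for _ in range(50000):
    state = rk4_step(state, dt)
    xs.append(state[0])             # membrane potential trace
```

The resulting x(t) trajectory stays bounded and shows spiking activity; desampling it produces displays like those of Figure 3.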
That d_L < d_E is quite natural, as global coordinate systems may have to unfold considerable twisting and folding of an attractor when the coordinates are constructed from time delays or other quantities which are not directly the original dynamical variables [9, 2]. Many key features of this model are captured in the simple properties of the vector field (the right hand side) of Equation (3) in the case when z is ignored. We then have three equilibrium points and a stable limit cycle on the (x, y) phase plane. These are shown in Figure 5. The stable separatrix of the equilibrium saddle point,

Figure 4: (a) Percentage of false nearest neighbors for data from the model neuron of RH. We use a time delay of T = 15 for this analysis. The percentage of false nearest neighbors goes to zero at d_E = 3 using this set of coordinates y(n) = [v(n), v(n + 15), v(n + 30)]. (b) The percentage of false nearest neighbors for T = 40. We see the effect of chaotic decoupling of the coordinates [v(n), v(n + 40), v(n + 80)]. The change in T is a change of coordinate systems, and the change in the required d_E is apparent. (c) The local false nearest neighbors statistic for the time series from the model RH neuron. The selection of the correct d_L = 3 of the model is clear, as the percentage of bad predictions becomes independent of the number of neighbors N_B and the local dimension d_L at d_L = 3.

Figure 5: The nullclines dx/dt = 0 and dy/dt = 0 of the RH model, showing the stable and unstable equilibrium points, the stable limit cycle, and the separatrix.

labeled RS for resting state in Figure 5, is the basin boundary for the limit cycle. When one changes the initial condition for this reduced model, or equivalently adds a short impulse through the synaptic current I, it is possible to bump the system behavior from one attractor, namely the stable equilibrium point, to the other attractor, namely the stable limit cycle. If we hold the slow variable z at zero, then when the model neuron is properly triggered it can enter an indefinitely long period of repetitive firing; this is just the limit cycle behavior. When the slow dynamics is turned on, however, the repetitive firing can cease and the neuron will return to its resting state. This resting state corresponds to the saddle node RS. After a relaxation time of order 1/r the system may move back to z ≈ 0 and the repetitive firing will resume. As it resumes, depending on the system parameters, one will see regular firing, namely periodic motions, or chaotic time traces.

3 Synaptic Coupling of Two RH Neurons

3.1 Electrical Coupling

We now turn our attention to the behavior of two of our model neurons when they are coupled. Three kinds of coupling are of interest here. The first is a simple electrical coupling which treats the channel between the neurons as a wire with no structural properties. We model this electrotonic coupling as

dx_1/dt = y_1 + φ(x_1) − z_1 + I − (ε + ξ)(x_1 − x_2),

dy_1/dt = ψ(x_1) - y_1,
dz_1/dt = -r z_1 + r S (x_1 - c_x),
dx_2/dt = y_2 + φ(x_2) - z_2 + I - (ε + ξ)(x_2 - x_1),
dy_2/dt = ψ(x_2) - y_2,
dz_2/dt = -r z_2 + r S (x_2 - c_x). (5)

These are two identical model neurons of the form discussed above, with φ and ψ the nonlinear functions of the single-neuron model, coupled with a parameter ε which is a conductance for the `wire' connecting them. The quantity ξ is Gaussian white noise with zero mean and very small RMS amplitude. We add this noise term, which we restrict to have very small amplitude, for two reasons. First, all laboratory measurements are noisy, and placing these nondeterministic or high dimensional fluctuations in the coupling mechanism among the chaotic neurons provides a simple source for this continual perturbation of our coupled systems. Second, when one couples together model neurons with three degrees of freedom each, the total possible phase space can become both quite large and rich in fine structure and details of basin boundaries which, as all measurements are really noisy, we have no chance of observing in the laboratory. To smooth out these complex details, we attribute some noise to the coupling among neurons. One could have attributed the noise to the individual neurons themselves and accomplished essentially the same goal.

This electrical coupling is one which has been much studied in the physical sciences literature [18, 20, 19], and it is known that it often possesses a submanifold of the full six dimensional phase space where

x_1 = x_2, y_1 = y_2, and z_1 = z_2. (6)

On this submanifold we clearly have identical chaotic oscillations of the two neuron oscillators, for on this manifold the coupling term is precisely zero and the individual dynamics is just the same as if the neurons were not coupled at all. When the submanifold is stable, that is, when it is an attractor, the neurons are synchronized yet chaotic. We can show that the synchronization, and chaotic motion, will occur with certainty if the coupling is strong enough.
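The strong-coupling synchronization claim can be checked numerically with a sketch like the following. The noise term ξ is dropped for simplicity, the standard Hindmarsh-Rose nonlinearities are assumed for φ and ψ, and ε = 2.0 is simply a value comfortably above the numerical synchronization thresholds discussed below; none of these choices is taken from the paper's own runs.

```python
import numpy as np

I_EXT, R, S, CX = 3.25, 0.005, 4.0, -1.6   # assumed standard HR parameters
EPS = 2.0                                  # coupling strength; noise omitted

def coupled_rhs(v):
    """Two identical HR neurons with diffusive (electrotonic) x-coupling."""
    x1, y1, z1, x2, y2, z2 = v
    return np.array([
        y1 + 3*x1**2 - x1**3 - z1 + I_EXT - EPS*(x1 - x2),
        1 - 5*x1**2 - y1,
        -R*z1 + R*S*(x1 - CX),
        y2 + 3*x2**2 - x2**3 - z2 + I_EXT - EPS*(x2 - x1),
        1 - 5*x2**2 - y2,
        -R*z2 + R*S*(x2 - CX)])

def run(v, dt=0.02, steps=60000):
    """Integrate with RK4 and record |x1 - x2| at every step."""
    diffs = np.empty(steps)
    for k in range(steps):
        k1 = coupled_rhs(v); k2 = coupled_rhs(v + 0.5*dt*k1)
        k3 = coupled_rhs(v + 0.5*dt*k2); k4 = coupled_rhs(v + dt*k3)
        v = v + (dt/6)*(k1 + 2*k2 + 2*k3 + k4)
        diffs[k] = abs(v[0] - v[3])
    return diffs

diffs = run(np.array([-1.0, 0.0, 2.0, -0.5, 0.2, 2.3]))
```

Starting from distinct initial conditions, the membrane-potential difference collapses toward the synchronization submanifold x_1 = x_2.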
To see this, introduce the difference variables for the coupled system

v = x_1 - x_2, u = y_1 - y_2, w = z_1 - z_2, (7)

representing the distance from the submanifold. If we ignore the noise, ξ = 0, the equations governing u, v, and w are

dv/dt = u + [a(x_1 + x_2) - (x_1^2 + x_1 x_2 + x_2^2) - 2ε]v - w,

du/dt = -b(x_1 + x_2)v - u,
dw/dt = -r w + r S v, (8)

with a = 3.0 and b = 5.0. If we now construct the Lyapunov function

L = v^2/2 + 2a u^2/b^2 + w^2/(2rS), (9)

then from the equations of motion we find

dL/dt = -Φ(ε, t) v^2 - [8u - b^2 v + 4b v(x_1 + x_2)]^2 / (16 a b^2) - w^2/S, (10)

with

Φ(ε, t) = 2ε + 3 x_1 x_2 + (b/2 - a)(x_1 + x_2) - b^2/16. (11)

When Φ(ε, t) > 0, L will be a Lyapunov function for the coupled system of neurons. In this case both the fixed point (v, u, w) = (0, 0, 0) and the submanifold above will be stable for any initial condition. This translates to the statement that synchronized motions are assured when

2ε > max{ -3 x_1 x_2 - (b/2 - a)(x_1 + x_2) + b^2/16 }. (12)

In the Appendix we show that all trajectories in the full six dimensional space of the coupled neurons move in a bounded domain of the full space and cannot depart from it. This means that for large enough ε, the synchronization condition is certainly satisfied.

Now we look at some numerical evidence for synchronized behavior of these model neurons with electrotonic coupling. We set the RMS level of the white noise at 0.005. To exhibit the synchronization we introduce two useful measures of the `distance' between the neuron activity. In these measures of distance we concentrate on the connection between the membrane potentials x_1 and x_2 in the neurons. Since the neurons are identical, it is sufficient to examine this connection.

3.1.1 `Distances' Between the Coupled Neurons

The first distance measure is essentially the RMS difference between the x_1 motion and the x_2 motion, with the difference that we recognize from the outset that a shift in the timing of the chaotic motions of the two neurons may occur due to the coupling. To allow for this we define the average distance D(τ, ε) between the two neural behaviors as

D^2(τ, ε) = (1/N_s) Σ_{k=1}^{N_s} (x_1(k) - x_2(k + τ))^2, (13)

where τ is the possible time shift needed to `align' the two neurons and N_s is the total number of samples.
To distinguish between the bursting and spiking motions and examine the synchronization in the bursting alone, we delimit the values of x_i by replacing any value greater than -1 with -1. This

Figure 6: (a) The distance statistic, Equation (13), for electrically coupled RH neurons. The RMS noise level introduced into the coupling is 0.005. Near ε ≈ 0.2 and for ε ≥ 0.5 the model neurons are synchronized. (b) The bursting distance statistic for electrically coupled RH neurons. The RMS noise level introduced into the coupling is 0.005. Near ε ≈ 0.2 and for ε ≥ 0.5 the model neurons are synchronized. (c) The time τ_min(ε) at which the distance statistic, Equation (13), is a minimum, as a function of ε. The synchronization near ε ≈ 0.2 apparent in Figure 12 requires a large time delay and is antiphase synchronization. The synchronization for ε ≥ 0.5 is in phase.

eliminates the spikes and retains the bursts in the membrane potential. A low pass filtering of the data would achieve the same end. When this delimiting is employed, we call the average distance the bursting average distance DB(τ, ε).

In characterizing the synchronization we have examined D(τ, ε) and DB(τ, ε) as a function of the coupling ε at that value of τ for which each distance measure is a minimum. This is nothing but tracking along the valley of minimum values of D(τ, ε) or DB(τ, ε) and labeling the location along that valley by ε. In Figure 6a we show D(τ_min, ε), which is the RMS distance as defined above for the time shift τ_min which at fixed coupling ε yields the minimum value of D(τ, ε). It is clear that for ε just a bit bigger than 0.5 and in a region about ε ≈ 0.2 the neurons appear synchronized. In Figure 6b we have DB(τ_min, ε), which exhibits more or less the same characteristics as D(τ_min, ε) but also suggests that some synchronization among the slow or bursting motions of the neurons may be appearing near ε ≈ 0.04. A third characteristic is achieved by looking at the actual minimum time shifts τ_min which minimize DB(τ, ε) at each ε. This function τ_min(ε) is shown in Figure 6c. This reveals an interesting new possibility for the synchronized neurons. For couplings in the neighborhood of ε ≈ 0.2 we see that synchronization of the bursts is reached with a nonzero τ_min which is nearly the same for a range of ε. The synchronization for other values of ε is reached with τ_min ≈ 0. For ε ≈ 0.2 this suggests that the neurons are synchronized but out of phase with each other. This is quite different from the synchronization on the submanifold discussed above, which we know is reached for large ε. The same phenomena for "limit-cycle" neuron models have been observed and discussed in the last few years [21, 22]. Here we have a more general case: regularization of chaos and antiphase locking for coupled chaotic oscillators.
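The distance statistic of Equation (13) and its burst-delimited variant DB can be implemented directly as below; the test signals and the shift range are illustrative, not the paper's data.

```python
import numpy as np

def distance_stat(x1, x2, max_shift=200):
    """D(tau): RMS difference between x1(k) and x2(k + tau), Eq. (13)."""
    taus = np.arange(-max_shift, max_shift + 1)
    D = np.empty(len(taus))
    for i, tau in enumerate(taus):
        if tau >= 0:
            a, b = x1[:len(x1) - tau], x2[tau:]
        else:
            a, b = x1[-tau:], x2[:len(x2) + tau]
        D[i] = np.sqrt(np.mean((a - b) ** 2))
    return taus, D

def burst_distance_stat(x1, x2, max_shift=200):
    """DB(tau): same statistic after clipping all values above -1 down to -1,
    which removes the spikes and keeps the burst envelope."""
    return distance_stat(np.minimum(x1, -1.0), np.minimum(x2, -1.0), max_shift)
```

Tracking the valley of minimum distance then amounts to taking, at each coupling, the shift tau at which D (or DB) is smallest, which is how the curves of Figure 6 are built.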
Looking now at the time traces of x_1 and x_2 for ε = 0.2, 0.4, and 0.8, we can see in Figures 7a, 7b, and 7c visual evidence of the various kinds of phase synchronization. Figure 7a shows the out of phase synchronization which occurs when ε ≈ 0.2. In Figure 7b we have ε = 0.4 and see evidence of a partial synchronization of the two coupled HR neurons. When we raise ε to 0.8, as in Figure 7c, we see full synchronization of the two model neurons.

3.1.2 Information Transport Between Two Coupled Neurons

The key question we are addressing in this paper is the transmission of information between two chaotic neurons in order to synchronize them for directed collective behavior. To examine synchronization from this point of view we now evaluate the average mutual information between measurements of the membrane potential in the first neuron x_1(t) and the membrane potential in the second neuron x_2(t + τ) at some time lag τ between the measurements. To evaluate the average mutual information from the measurements x_1 and x_2 we bin the data into M bins by defining discrete variables s_1(n) and s_2(n) through

x_j(n) < -2  =>  s_j(n) = h_0,
-2 + 4(k-1)/(M-1) <= x_j(n) < -2 + 4k/(M-1)  =>  s_j(n) = h_k,  k = 1, 2, ..., M-2,
x_j(n) >= 2  =>  s_j(n) = h_{M-1}, (14)

where h_0, h_1, ..., h_{M-1} are any set of M `letters' designating bins for the values of the x_j(n). Using this binning of the data for the membrane potential of each of the two coupled neurons, we now identify in Equation (2) a_i = number of points with s_1(n) = h_i and b_j = number of points with s_2(n + τ) = h_j, and evaluate

I(τ, ε) = Σ_{i,j=0}^{M-1} P_AB(a_i, b_j) log_2 [ P_AB(a_i, b_j) / (P_A(a_i) P_B(b_j)) ]. (15)
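The binning of Equation (14) and the mutual-information sum of Equation (15) can be sketched as follows; the number of letters M is an illustrative choice, not a value from the paper.

```python
import numpy as np

def average_mutual_information(x1, x2, tau, M=16):
    """I(tau) between x1(n) and x2(n + tau), Eq. (15), with M-letter binning.

    Binning follows Eq. (14): values below -2 go to the first letter, values
    at or above 2 to the last, and [-2, 2) is divided evenly in between.
    """
    n = len(x1) - abs(tau)
    a = x1[:n] if tau >= 0 else x1[-tau:]
    b = x2[tau:tau + n] if tau >= 0 else x2[:n]
    edges = np.linspace(-2.0, 2.0, M - 1)       # M-2 interior bins + 2 overflow bins
    s1 = np.digitize(a, edges)                  # letters 0..M-1
    s2 = np.digitize(b, edges)
    joint = np.zeros((M, M))
    np.add.at(joint, (s1, s2), 1.0)             # joint histogram P_AB
    joint /= joint.sum()
    p1 = joint.sum(axis=1)                      # marginal P_A
    p2 = joint.sum(axis=0)                      # marginal P_B
    outer = p1[:, None] * p2[None, :]
    mask = joint > 0
    return np.sum(joint[mask] * np.log2(joint[mask] / outer[mask]))

def letter_entropy(x, M=16):
    """Entropy of the letter distribution, Eq. (16)."""
    edges = np.linspace(-2.0, 2.0, M - 1)
    p = np.bincount(np.digitize(x, edges), minlength=M).astype(float)
    p /= p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))
```

For a signal compared with itself at zero lag, the mutual information equals the letter entropy, which is the normalization used in Equation (17); for independent signals it is close to zero.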

Figure 7: (a) Membrane potentials x_1 and x_2 from electrically coupled RH neurons for ε = 0.2. The synchronization is antiphase. (b) Membrane potentials x_1 and x_2 from electrically coupled RH neurons for ε = 0.4. The synchronization is not complete, but nearly in phase. (c) Membrane potentials x_1 and x_2 from electrically coupled RH neurons for ε = 0.8. The synchronization is complete and in phase.

We evaluate I(τ, ε) at each value of the neuron coupling ε as a function of τ and select that τ for which it is a maximum. This gives us the time lag τ_max between the measurements of the two membrane potentials at which the most information is learned about one by observing the other. Neuron two will receive the maximum information about the actions of neuron one at any given value of the coupling by sensing membrane potential one at this time lag. When the two neurons are synchronized, the communications channel between them, namely the synapse, which is here an electrotonic connection only, should transmit the most information. This maximum is the entropy of either neuron considered as a source of information. The entropy is defined by

S_1(ε) = - Σ_{j=0}^{M-1} P_A(a_j) log_2 [P_A(a_j)], (16)

with a similar definition for S_2(ε). With identical neurons, as we have here, the distributions of the observations among the `letters' will be identical. If we normalize the average mutual information by the entropy, the ratio

I(τ_max, ε) / S_i(ε) (17)

should be unity when the systems are synchronized. In Figure 8a we display the average mutual information at τ_max as a function of ε. The synchronization for ε ≈ 0.4 and greater is clear. Figure 8b shows τ_max itself as a function of ε, showing both the region of antiphase synchronization near ε ≈ 0.2 and the in phase synchronization and near synchronization for other values of ε.

The final quantity we evaluate for these electrically coupled neurons is the spectrum of Lyapunov exponents. We determine these exponents directly from the differential equations (3). For this we retain the low noise level (RMS 0.005) and utilize a long record of data points for the calculation. We use a numerical method which starts the calculation with a random orthonormal basis for the Jacobian matrices of our flow and renormalizes the growth at each point. Figure 8c displays the largest four of the Lyapunov exponents for the coupled neurons as a function of ε.
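The exponent calculation just described (random orthonormal starting basis, repeated renormalization of the growth) can be sketched as below. For brevity this version treats a single model neuron rather than the six-dimensional coupled system, assumes the standard Hindmarsh-Rose form and parameters, and advances the tangent basis with a simple Euler step; it is a sketch of the method, not the paper's implementation.

```python
import numpy as np

I_EXT, R, S, CX = 3.25, 0.005, 4.0, -1.6   # assumed standard HR parameters

def rhs(v):
    x, y, z = v
    return np.array([y + 3*x**2 - x**3 - z + I_EXT,
                     1 - 5*x**2 - y,
                     -R*z + R*S*(x - CX)])

def jacobian(v):
    x = v[0]
    return np.array([[6*x - 3*x**2, 1.0, -1.0],
                     [-10*x, -1.0, 0.0],
                     [R*S, 0.0, -R]])

def lyapunov_spectrum(v, dt=0.01, steps=60000, renorm=10, transient=10000):
    """Tangent-space method with QR renormalization; returns sorted exponents."""
    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # random orthonormal basis
    logs = np.zeros(3)
    t_accum = 0.0
    for k in range(transient + steps):
        # RK4 for the state, Euler for the tangent basis (adequate for a sketch)
        k1 = rhs(v); k2 = rhs(v + 0.5*dt*k1)
        k3 = rhs(v + 0.5*dt*k2); k4 = rhs(v + dt*k3)
        Q = Q + dt * (jacobian(v) @ Q)
        v = v + (dt/6)*(k1 + 2*k2 + 2*k3 + k4)
        if (k + 1) % renorm == 0:
            Q, Rm = np.linalg.qr(Q)                    # renormalize the growth
            if k >= transient:
                logs += np.log(np.abs(np.diag(Rm)))
                t_accum += renorm * dt
    return np.sort(logs / t_accum)[::-1]

lams = lyapunov_spectrum(np.array([-1.0, 0.0, 2.0]))
```

For a dissipative flow the exponents must sum to a negative value, with one exponent near zero along the flow direction and a strongly negative exponent reflecting the fast contraction.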
The remaining two are large in magnitude and quite negative compared to these four. We see that the antiphase synchronization which occurs near ε ≈ 0.2 is not chaotic, as all exponents are zero or less. For the other synchronization we have chaotic bursting for the individual neurons and chaotic behavior of the coupled system as well.

3.2 Inhibitory Coupling

We have also simulated realistic synaptic connections among the neurons as well as direct electrical connections. These connections allow for transmission of information between and among neurons which differs from direct electrotonic couplings. The diffusion of neurotransmitters associated with this type of coupling leads to a time delay in the action of one neuron on another. We choose to represent this as a response in the membrane potential of one neuron which is delayed and subject to a threshold over which the other potential must rise, along with that delay. In the equation for the membrane potential of neuron one, x_1, for example, we will add a response associated with the behavior of neuron two's potential x_2 of the form

-(ε + ξ)(x_1 + V_c) Θ(x_2(t - τ_c) - X), (18)

where, as above, ε is the strength of the coupling and ξ is a very small zero mean noise term of small RMS magnitude. The new ingredients in this coupling term are the thresholding associated with

Figure 8: (a) The average mutual information at the time delay where it is a maximum, as a function of the coupling ε for electrically coupled RH neurons. The average mutual information is normalized by the entropy of the individual neurons. When this ratio is unity, the maximum amount of information possible is being transported from one neuron to the other. The information transmission is maximal for the in phase synchronization which occurs for ε ≥ 0.5. (b) The time τ_max(ε) at which the average mutual information between electrically coupled RH neurons is a maximum. The antiphase synchronization for ε ≈ 0.2 and the in phase synchronization for ε ≥ 0.5 are revealed here. (c) The four largest Lyapunov exponents for the electrically coupled RH neurons as a function of the coupling strength ε. These are evaluated directly from the equations of motion. When we have antiphase synchronization near ε ≈ 0.2 the system is not chaotic, since all λ ≤ 0, while for ε ≥ 0.5 the coupled system is chaotic, as one of the exponents is positive.

the Heaviside function Θ(w), which is unity for w > 0 and vanishes for w < 0. In addition we have a reversal potential V_c which tells us the magnitude of the response to the threshold, and we have a threshold X over which the other neuron's membrane potential must have risen at a time delayed by τ_c. The response to potential x_1 on the part of x_2 will simply have 1 <-> 2 in the coupling term. Inhibitory coupling of the neurons is associated with a reversal potential V_c = 1.25, while we achieve excitatory coupling by choosing V_c = 0.0. The threshold potential is X = 0.85 in each case.

This section considers the result of inhibitory coupling, which leads to the six differential equations among the membrane potentials and the fast and slow auxiliary variables

dx_1/dt = y_1 + φ(x_1) - z_1 + I - (ε + ξ)(x_1 + V_c) Θ(x_2(t - τ_c) - X),
dy_1/dt = ψ(x_1) - y_1,
dz_1/dt = -r z_1 + r S (x_1 - c_x),
dx_2/dt = y_2 + φ(x_2) - z_2 + I - (ε + ξ)(x_2 + V_c) Θ(x_1(t - τ_c) - X),
dy_2/dt = ψ(x_2) - y_2,
dz_2/dt = -r z_2 + r S (x_2 - c_x), (19)

and we use the parameters X = 0.85 and c_x = 1.6 in our work here, with the noise RMS level again 0.005; V_c = 1.4 for the inhibitory coupling. The HR neurons we have coupled in this time delayed fashion are identical, as embodied in our totally symmetric couplings. We shall not consider asymmetric couplings or the coupling of differing neurons in this paper, but we plan to return to these cases. Although there are only six ordinary differential equations representing the coupled behavior of two HR neurons, because of the time delay involved in the equations there is, in the usual description of degrees of freedom in differential equations, an infinite number of degrees of freedom. The time delay leads, via Taylor series of expressions such as x_1(t - τ_c), to an infinite number of derivatives of x_1 appearing in the equations.
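The delayed system (19) can be integrated with a simple history buffer holding the past membrane potentials, as sketched below. The Hindmarsh-Rose nonlinearities and parameters, and the particular values of ε and τ_c, are assumptions for illustration; the threshold X and reversal potential V_c follow the values quoted in the text, and the noise term is omitted.

```python
import numpy as np

I_EXT, R, S, CX = 3.25, 0.005, 4.0, -1.6       # assumed standard HR parameters
EPS, VC, X_THR, TAU_C = 0.5, 1.4, 0.85, 10.0   # coupling; eps, tau_c illustrative

def step(v, g, dt):
    """One RK4 step of a single neuron with synaptic drive g held fixed."""
    def f(u):
        x, y, z = u
        return np.array([y + 3*x**2 - x**3 - z + I_EXT + g,
                         1 - 5*x**2 - y,
                         -R*z + R*S*(x - CX)])
    k1 = f(v); k2 = f(v + 0.5*dt*k1); k3 = f(v + 0.5*dt*k2); k4 = f(v + dt*k3)
    return v + (dt/6)*(k1 + 2*k2 + 2*k3 + k4)

def simulate(dt=0.05, steps=40000):
    d = int(TAU_C / dt)                        # delay measured in steps
    x1h = np.full(steps + d + 1, -1.0)         # history buffers; index k holds
    x2h = np.full(steps + d + 1, -0.8)         # x at time (k - d) * dt
    v1 = np.array([-1.0, 0.0, 2.0])
    v2 = np.array([-0.8, 0.1, 2.2])
    for k in range(steps):
        # delayed, thresholded inhibitory drive, Eq. (18) without noise
        g1 = -EPS * (v1[0] + VC) * float(x2h[k] > X_THR)
        g2 = -EPS * (v2[0] + VC) * float(x1h[k] > X_THR)
        v1, v2 = step(v1, g1, dt), step(v2, g2, dt)
        x1h[k + d + 1] = v1[0]
        x2h[k + d + 1] = v2[0]
    return x1h[d:], x2h[d:]
```

Treating the delayed drive as constant over each step is the usual first-order shortcut for delay equations; the trajectories remain bounded and the threshold is crossed during spikes, so the coupling actually engages.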
This means that the phase space, or state space, of these coupled systems could be very large indeed; our demonstration below shows that the space occupied by the solutions to these coupled neurons is, in fact, quite small. We have examined the solutions to these coupled equations using the parameters just mentioned as well as the parameters utilized above in our discussion of electrical coupling. We are unable to give the same argument via a Lyapunov function that for large enough coupling we will have synchronization between our identical HR neurons, but synchronization does occur, and we have uncovered it by numerical work. The qualitative aspects of this are displayed in Figure 9a, which is a kind of phase diagram for this system. We indicate the different behaviors seen as we vary both the coupling strength ε and the time delay τ_c. To further illustrate the typical behaviors one finds in the various regions of (τ_c, ε) space, we show in Figure 9b time traces of x_1 and x_2. The nearly synchronized and nearly in phase behavior of the two neurons varies little with ε within the region denoted I in Figure 9a. The behavior in Region I' is shown in Figure 9c. Here the two neurons are completely synchronized and fully in phase. Region II is typified by the time traces in Figure 9d. In this region the neurons are completely out of phase but clearly synchronized. Finally Region II', as seen in Figure 9e, shows a slight variation on Region

2 1. Introduction. Neuronal networks often exhibit a rich variety of oscillatory behavior. The dynamics of even a single cell may be quite complicated

2 1. Introduction. Neuronal networks often exhibit a rich variety of oscillatory behavior. The dynamics of even a single cell may be quite complicated GEOMETRIC ANALYSIS OF POPULATION RHYTHMS IN SYNAPTICALLY COUPLED NEURONAL NETWORKS J. Rubin and D. Terman Dept. of Mathematics; Ohio State University; Columbus, Ohio 43210 Abstract We develop geometric

More information

From neuronal oscillations to complexity

From neuronal oscillations to complexity 1/39 The Fourth International Workshop on Advanced Computation for Engineering Applications (ACEA 2008) MACIS 2 Al-Balqa Applied University, Salt, Jordan Corson Nathalie, Aziz Alaoui M.A. University of

More information

Math 1270 Honors ODE I Fall, 2008 Class notes # 14. x 0 = F (x; y) y 0 = G (x; y) u 0 = au + bv = cu + dv

Math 1270 Honors ODE I Fall, 2008 Class notes # 14. x 0 = F (x; y) y 0 = G (x; y) u 0 = au + bv = cu + dv Math 1270 Honors ODE I Fall, 2008 Class notes # 1 We have learned how to study nonlinear systems x 0 = F (x; y) y 0 = G (x; y) (1) by linearizing around equilibrium points. If (x 0 ; y 0 ) is an equilibrium

More information

Phase-Space Reconstruction. Gerrit Ansmann

Phase-Space Reconstruction. Gerrit Ansmann Phase-Space Reconstruction Gerrit Ansmann Reprise: The Need for Non-Linear Methods. Lorenz oscillator x = 1(y x), y = x(28 z) y, z = xy 8z 3 Autoregressive process measured with non-linearity: y t =.8y

More information

K. Pyragas* Semiconductor Physics Institute, LT-2600 Vilnius, Lithuania Received 19 March 1998

K. Pyragas* Semiconductor Physics Institute, LT-2600 Vilnius, Lithuania Received 19 March 1998 PHYSICAL REVIEW E VOLUME 58, NUMBER 3 SEPTEMBER 998 Synchronization of coupled time-delay systems: Analytical estimations K. Pyragas* Semiconductor Physics Institute, LT-26 Vilnius, Lithuania Received

More information

Activity Driven Adaptive Stochastic. Resonance. Gregor Wenning and Klaus Obermayer. Technical University of Berlin.

Activity Driven Adaptive Stochastic. Resonance. Gregor Wenning and Klaus Obermayer. Technical University of Berlin. Activity Driven Adaptive Stochastic Resonance Gregor Wenning and Klaus Obermayer Department of Electrical Engineering and Computer Science Technical University of Berlin Franklinstr. 8/9, 187 Berlin fgrewe,obyg@cs.tu-berlin.de

More information

Chimera states in networks of biological neurons and coupled damped pendulums

Chimera states in networks of biological neurons and coupled damped pendulums in neural models in networks of pendulum-like elements in networks of biological neurons and coupled damped pendulums J. Hizanidis 1, V. Kanas 2, A. Bezerianos 3, and T. Bountis 4 1 National Center for

More information

Chaos and Liapunov exponents

Chaos and Liapunov exponents PHYS347 INTRODUCTION TO NONLINEAR PHYSICS - 2/22 Chaos and Liapunov exponents Definition of chaos In the lectures we followed Strogatz and defined chaos as aperiodic long-term behaviour in a deterministic

More information

THE CONTROL OF CHAOS: THEORY AND APPLICATIONS

THE CONTROL OF CHAOS: THEORY AND APPLICATIONS S. Boccaletti et al. / Physics Reports 329 (2000) 103}197 103 THE CONTROL OF CHAOS: THEORY AND APPLICATIONS S. BOCCALETTI, C. GREBOGI, Y.-C. LAI, H. MANCINI, D. MAZA Department of Physics and Applied Mathematics,

More information

Bursting Oscillations of Neurons and Synchronization

Bursting Oscillations of Neurons and Synchronization Bursting Oscillations of Neurons and Synchronization Milan Stork Applied Electronics and Telecommunications, Faculty of Electrical Engineering/RICE University of West Bohemia, CZ Univerzitni 8, 3064 Plzen

More information

Consider the following spike trains from two different neurons N1 and N2:

Consider the following spike trains from two different neurons N1 and N2: About synchrony and oscillations So far, our discussions have assumed that we are either observing a single neuron at a, or that neurons fire independent of each other. This assumption may be correct in

More information

MAT 22B - Lecture Notes

MAT 22B - Lecture Notes MAT 22B - Lecture Notes 4 September 205 Solving Systems of ODE Last time we talked a bit about how systems of ODE arise and why they are nice for visualization. Now we'll talk about the basics of how to

More information

Modeling chaotic motions of a string from experimental data. Kevin Judd and Alistair Mees. The University of Western Australia.

Modeling chaotic motions of a string from experimental data. Kevin Judd and Alistair Mees. The University of Western Australia. Modeling chaotic motions of a string from experimental data Kevin Judd and Alistair Mees Department of Mathematics The University of Western Australia March 3, 995 Abstract Experimental measurements of

More information

One dimensional Maps

One dimensional Maps Chapter 4 One dimensional Maps The ordinary differential equation studied in chapters 1-3 provide a close link to actual physical systems it is easy to believe these equations provide at least an approximate

More information

Stochastic Oscillator Death in Globally Coupled Neural Systems

Stochastic Oscillator Death in Globally Coupled Neural Systems Journal of the Korean Physical Society, Vol. 52, No. 6, June 2008, pp. 19131917 Stochastic Oscillator Death in Globally Coupled Neural Systems Woochang Lim and Sang-Yoon Kim y Department of Physics, Kangwon

More information

A Novel Three Dimension Autonomous Chaotic System with a Quadratic Exponential Nonlinear Term

A Novel Three Dimension Autonomous Chaotic System with a Quadratic Exponential Nonlinear Term ETASR - Engineering, Technology & Applied Science Research Vol., o.,, 9-5 9 A Novel Three Dimension Autonomous Chaotic System with a Quadratic Exponential Nonlinear Term Fei Yu College of Information Science

More information

Dynamical Systems in Neuroscience: Elementary Bifurcations

Dynamical Systems in Neuroscience: Elementary Bifurcations Dynamical Systems in Neuroscience: Elementary Bifurcations Foris Kuang May 2017 1 Contents 1 Introduction 3 2 Definitions 3 3 Hodgkin-Huxley Model 3 4 Morris-Lecar Model 4 5 Stability 5 5.1 Linear ODE..............................................

More information

The Phase Response Curve of Reciprocally Inhibitory Model Neurons Exhibiting Anti-Phase Rhythms

The Phase Response Curve of Reciprocally Inhibitory Model Neurons Exhibiting Anti-Phase Rhythms The Phase Response Curve of Reciprocally Inhibitory Model Neurons Exhibiting Anti-Phase Rhythms Jiawei Zhang Timothy J. Lewis Department of Mathematics, University of California, Davis Davis, CA 9566,

More information

Dynamical Systems and Chaos Part I: Theoretical Techniques. Lecture 4: Discrete systems + Chaos. Ilya Potapov Mathematics Department, TUT Room TD325

Dynamical Systems and Chaos Part I: Theoretical Techniques. Lecture 4: Discrete systems + Chaos. Ilya Potapov Mathematics Department, TUT Room TD325 Dynamical Systems and Chaos Part I: Theoretical Techniques Lecture 4: Discrete systems + Chaos Ilya Potapov Mathematics Department, TUT Room TD325 Discrete maps x n+1 = f(x n ) Discrete time steps. x 0

More information

NONLINEAR TIME SERIES ANALYSIS, WITH APPLICATIONS TO MEDICINE

NONLINEAR TIME SERIES ANALYSIS, WITH APPLICATIONS TO MEDICINE NONLINEAR TIME SERIES ANALYSIS, WITH APPLICATIONS TO MEDICINE José María Amigó Centro de Investigación Operativa, Universidad Miguel Hernández, Elche (Spain) J.M. Amigó (CIO) Nonlinear time series analysis

More information

Electrophysiology of the neuron

Electrophysiology of the neuron School of Mathematical Sciences G4TNS Theoretical Neuroscience Electrophysiology of the neuron Electrophysiology is the study of ionic currents and electrical activity in cells and tissues. The work of

More information

Title. Author(s)Fujii, Hiroshi; Tsuda, Ichiro. CitationNeurocomputing, 58-60: Issue Date Doc URL. Type.

Title. Author(s)Fujii, Hiroshi; Tsuda, Ichiro. CitationNeurocomputing, 58-60: Issue Date Doc URL. Type. Title Neocortical gap junction-coupled interneuron systems exhibiting transient synchrony Author(s)Fujii, Hiroshi; Tsuda, Ichiro CitationNeurocomputing, 58-60: 151-157 Issue Date 2004-06 Doc URL http://hdl.handle.net/2115/8488

More information

An Introductory Course in Computational Neuroscience

An Introductory Course in Computational Neuroscience An Introductory Course in Computational Neuroscience Contents Series Foreword Acknowledgments Preface 1 Preliminary Material 1.1. Introduction 1.1.1 The Cell, the Circuit, and the Brain 1.1.2 Physics of

More information

Fundamentals of Dynamical Systems / Discrete-Time Models. Dr. Dylan McNamara people.uncw.edu/ mcnamarad

Fundamentals of Dynamical Systems / Discrete-Time Models. Dr. Dylan McNamara people.uncw.edu/ mcnamarad Fundamentals of Dynamical Systems / Discrete-Time Models Dr. Dylan McNamara people.uncw.edu/ mcnamarad Dynamical systems theory Considers how systems autonomously change along time Ranges from Newtonian

More information

Timing regulation in a network reduced from voltage-gated equations to a one-dimensional map

Timing regulation in a network reduced from voltage-gated equations to a one-dimensional map J. Math. Biol. (1999) 38: 479}533 Timing regulation in a network reduced from voltage-gated equations to a one-dimensional map Thomas LoFaro, Nancy Kopell Department of Pure and Applied Mathematics, Washington

More information

Delay Coordinate Embedding

Delay Coordinate Embedding Chapter 7 Delay Coordinate Embedding Up to this point, we have known our state space explicitly. But what if we do not know it? How can we then study the dynamics is phase space? A typical case is when

More information

LECTURE 8: DYNAMICAL SYSTEMS 7

LECTURE 8: DYNAMICAL SYSTEMS 7 15-382 COLLECTIVE INTELLIGENCE S18 LECTURE 8: DYNAMICAL SYSTEMS 7 INSTRUCTOR: GIANNI A. DI CARO GEOMETRIES IN THE PHASE SPACE Damped pendulum One cp in the region between two separatrix Separatrix Basin

More information

INTRODUCTION TO CHAOS THEORY T.R.RAMAMOHAN C-MMACS BANGALORE

INTRODUCTION TO CHAOS THEORY T.R.RAMAMOHAN C-MMACS BANGALORE INTRODUCTION TO CHAOS THEORY BY T.R.RAMAMOHAN C-MMACS BANGALORE -560037 SOME INTERESTING QUOTATIONS * PERHAPS THE NEXT GREAT ERA OF UNDERSTANDING WILL BE DETERMINING THE QUALITATIVE CONTENT OF EQUATIONS;

More information

... it may happen that small differences in the initial conditions produce very great ones in the final phenomena. Henri Poincaré

... it may happen that small differences in the initial conditions produce very great ones in the final phenomena. Henri Poincaré Chapter 2 Dynamical Systems... it may happen that small differences in the initial conditions produce very great ones in the final phenomena. Henri Poincaré One of the exciting new fields to arise out

More information

COMMENTARY THE ROLE OF CHAOS IN NEURAL SYSTEMS

COMMENTARY THE ROLE OF CHAOS IN NEURAL SYSTEMS Pergamon Neuroscience Vol. 87, No. 1, pp. 5 14, 1998 Copyright 1998 IBRO. Published by Elsevier Science Ltd Printed in Great Britain. All rights reserved PII: S0306-4522(98)00091-8 0306 4522/98 $19.00+0.00

More information

Experimental Characterization of Nonlinear Dynamics from Chua s Circuit

Experimental Characterization of Nonlinear Dynamics from Chua s Circuit Experimental Characterization of Nonlinear Dynamics from Chua s Circuit John Parker*, 1 Majid Sodagar, 1 Patrick Chang, 1 and Edward Coyle 1 School of Physics, Georgia Institute of Technology, Atlanta,

More information

Stabilization of Hyperbolic Chaos by the Pyragas Method

Stabilization of Hyperbolic Chaos by the Pyragas Method Journal of Mathematics and System Science 4 (014) 755-76 D DAVID PUBLISHING Stabilization of Hyperbolic Chaos by the Pyragas Method Sergey Belyakin, Arsen Dzanoev, Sergey Kuznetsov Physics Faculty, Moscow

More information

Dynamical Embodiments of Computation in Cognitive Processes James P. Crutcheld Physics Department, University of California, Berkeley, CA a

Dynamical Embodiments of Computation in Cognitive Processes James P. Crutcheld Physics Department, University of California, Berkeley, CA a Dynamical Embodiments of Computation in Cognitive Processes James P. Crutcheld Physics Department, University of California, Berkeley, CA 94720-7300 and Santa Fe Institute, 1399 Hyde Park Road, Santa Fe,

More information

Linear Regression and Its Applications

Linear Regression and Its Applications Linear Regression and Its Applications Predrag Radivojac October 13, 2014 Given a data set D = {(x i, y i )} n the objective is to learn the relationship between features and the target. We usually start

More information

A Chaotic Phenomenon in the Power Swing Equation Umesh G. Vaidya R. N. Banavar y N. M. Singh March 22, 2000 Abstract Existence of chaotic dynamics in

A Chaotic Phenomenon in the Power Swing Equation Umesh G. Vaidya R. N. Banavar y N. M. Singh March 22, 2000 Abstract Existence of chaotic dynamics in A Chaotic Phenomenon in the Power Swing Equation Umesh G. Vaidya R. N. Banavar y N. M. Singh March, Abstract Existence of chaotic dynamics in the classical swing equations of a power system of three interconnected

More information

Dynamical systems in neuroscience. Pacific Northwest Computational Neuroscience Connection October 1-2, 2010

Dynamical systems in neuroscience. Pacific Northwest Computational Neuroscience Connection October 1-2, 2010 Dynamical systems in neuroscience Pacific Northwest Computational Neuroscience Connection October 1-2, 2010 What do I mean by a dynamical system? Set of state variables Law that governs evolution of state

More information

MODEL NEURONS: FROM HODGKIN-HUXLEY TO HOPFIELD. L.F. Abbott and Thomas B. Kepler. Physics and Biology Departments. and. Center for Complex Systems

MODEL NEURONS: FROM HODGKIN-HUXLEY TO HOPFIELD. L.F. Abbott and Thomas B. Kepler. Physics and Biology Departments. and. Center for Complex Systems MODEL NEURONS: FROM HODGKIN-HUXLEY TO HOPFIELD L.F. Abbott and Thomas B. Kepler Physics and Biology Departments and Center for Complex Systems Brandeis University Waltham, MA 02254 Abstract A new technique

More information

Slow Manifold of a Neuronal Bursting Model

Slow Manifold of a Neuronal Bursting Model Slow Manifold of a Neuronal Bursting Model Jean-Marc Ginoux 1 and Bruno Rossetto 2 1 PROTEE Laboratory, Université du Sud, B.P. 2132, 83957, La Garde Cedex, France, ginoux@univ-tln.fr 2 PROTEE Laboratory,

More information

Chapter 1. Introduction

Chapter 1. Introduction Chapter 1 Introduction 1.1 What is Phase-Locked Loop? The phase-locked loop (PLL) is an electronic system which has numerous important applications. It consists of three elements forming a feedback loop:

More information

Tracking the State of the Hindmarsh-Rose Neuron by Using the Coullet Chaotic System Based on a Single Input

ISSN 1746-7659, England, UK. Journal of Information and Computing Science, Vol. 11, No. 2, 2016, pp. 083-092.

Reliable circuits from irregular neurons: A dynamical approach to understanding central pattern generators

J. Physiol. (Paris) 94 (2000) 357-374. 2000 Elsevier Science Ltd. Published by Éditions scientifiques et médicales Elsevier SAS. All rights reserved. PII: S0928-4257(00)01101-3/REV

Bifurcation of Switched Nonlinear Dynamical Systems

IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, Vol. 46, No. 7, July 1999, p. 878. Takuji Kousaka, Member, IEEE, Tetsushi

Noise Induced Escape from Different Types of Chaotic Attractor

Igor A. Khovanov, Vadim S. Anishchenko, Dmitri G. Luchinsky, and Peter V. E. McClintock. Department of Physics, Saratov State University, Astrakhanskaya str. 83, Saratov, Russia

Nonlinear Observer Design and Synchronization Analysis for Classical Models of Neural Oscillators

Ranjeetha Bharath and Jean-Jacques Slotine, Massachusetts Institute of Technology. Abstract: This work explores

Characterizing chaotic time series

Jianbo Gao, PMB InTelliGence, LLC, West Lafayette, IN 47906; Mechanical and Materials Engineering, Wright State University. jbgao.pmb@gmail.com, http://www.gao.ece.ufl.edu/

Modeling and Predicting Chaotic Time Series

Chapter 14. To understand the behavior of a dynamical system in terms of some meaningful parameters, we seek the appropriate mathematical model that captures the

Introduction to Neural Networks, U. Minn. Psy 5038, Spring 1999, Daniel Kersten

Lecture 2a: The Neuron - overview of structure. From Anderson (1995). Basic Structure. Information flow:

More Details

A fixed point of a mapping is a point that maps into itself, i.e., x_{n+1} = x_n. If there are points which, after many iterations of the map, approach the fixed point, the fixed point is called an attractor. If λ
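The fixed-point and attractor definitions in the entry above are easy to check numerically. Below is a minimal sketch, not taken from any of the listed documents; the choice of the logistic map, the parameter r = 2.5, and the starting point are illustrative assumptions:

```python
# Iterate the logistic map x_{n+1} = r * x_n * (1 - x_n) and watch it settle
# onto its attracting fixed point. For r = 2.5 the nonzero fixed point is
# x* = 1 - 1/r = 0.6, and |dF/dx| there is 0.5 < 1, so nearby orbits converge.

def logistic(x, r=2.5):
    return r * x * (1.0 - x)

x = 0.2  # arbitrary starting point in (0, 1)
for _ in range(100):
    x = logistic(x)

# After many iterations the orbit sits on the fixed point: x_{n+1} = x_n.
print(abs(x - 0.6) < 1e-9)
print(abs(logistic(x) - x) < 1e-12)
```

Pushing r past 3 makes this fixed point repelling, one step on the route toward the chaotic behavior the surrounding entries discuss.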

THREE DIMENSIONAL SYSTEMS. Lecture 6: The Lorenz Equations

6. The Lorenz (1963) Equations. The Lorenz equations were originally derived by Saltzman (1962) as a minimalist model of thermal convection in a

Neural Excitability in a Subcritical Hopf Oscillator with a Nonlinear Feedback

Gautam C. Sethia and Abhijit Sen, Institute for Plasma Research, Bhat, Gandhinagar 382 428, India. Motivation: Neural Excitability

Elec4621 Advanced Digital Signal Processing, Chapter 11: Time-Frequency Analysis

Dr. D. S. Taubman, May 3, 2011. In this last chapter of your notes, we are interested in the problem of finding the instantaneous

Chaos, Complexity, and Inference (36-462)

Lecture 4, Cosma Shalizi, 22 January 2009. Reconstruction: inferring the attractor from a time series; powerful in a weird way. Using the reconstructed attractor to

Analysis of Electroencephalogram Data Using Time-Delay Embeddings to Reconstruct Phase Space

Dynamics at the Horsetooth, Volume 1, 2009. Department of Mathematics, Colorado State University. Report submitted

Experimental Characterization of Chua's Circuit

Majid Sodagar, Patrick Chang, Edward Coyler, and John Parke, School of Physics, Georgia Institute of Technology, Atlanta, Georgia 30332, USA (Dated:

NON-LINEAR DYNAMICS TOOLS FOR THE MOTION ANALYSIS AND CONDITION MONITORING OF ROBOT JOINTS

Mechanical Systems and Signal Processing (2001) 15(6), 1141-1164. doi:10.1006/mssp.2000.1394, available online at http://www.idealibrary.com

Chapter 3: Least Squares Solution of y = Ax

Lectures on Dynamic Systems and Control, Mohammed Dahleh, Munther A. Dahleh, George Verghese, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology. 3.1 Introduction: We turn to a problem that is dual to the overconstrained estimation problems considered

Separation of a Signal of Interest from a Seasonal Effect in Geophysical Data: I. El Niño/La Niña Phenomenon

International Journal of Geosciences, 2011, 2, **-**. Published Online November 2011 (http://www.scirp.org/journal/ijg)

The Hopf bifurcation with bounded noise

Ryan T. Botts, Ale Jan Homburg, and Todd R. Young. Department of Mathematical, Information & Computer Sciences, Point Loma Nazarene University, 3900 Lomaland

Mathematical Foundations of Neuroscience - Lecture 7. Bifurcations II.

Filip Piękniewski, Faculty of Mathematics and Computer Science, Nicolaus Copernicus University, Toruń, Poland. Winter 2009/2010

7 Planar systems of linear ODE

Here I restrict my attention to a very special class of autonomous ODE: linear ODE with constant coefficients. This is arguably the only class of ODE for which an explicit solution

Gravitational potential energy

OpenStax-CNX module: m15090. Sunil Kumar Singh. This work is produced by OpenStax-CNX and licensed under the Creative Commons Attribution License 2.0. The concept of potential

Chaotic Balanced State in a Model of Cortical Circuits

C. van Vreeswijk and H. Sompolinsky, Racah Institute of Physics and Center for Neural Computation, Hebrew University, Jerusalem, 91904 Israel. 10 March

The Big Picture

Discuss examples of unpredictability: Odds, Stanisław Lem, The New Yorker (1974); Chaos, Scientific American (1986). Lecture 2: Natural Computation & Self-Organization, Physics 256A (Winter

Handout 2: Invariant Sets and Stability

Engineering Tripos Part IIB, Nonlinear Systems and Control, Module 4F2. 1 Invariant Sets. Consider again the autonomous dynamical system ẋ = f(x), x(0) = x_0 (1), with state

Phase Desynchronization as a Mechanism for Transitions to High-Dimensional Chaos

Commun. Theor. Phys. (Beijing, China) 35 (2001) pp. 682-688. International Academic Publishers, Vol. 35, No. 6, June 15, 2001

Basic Tools and Techniques

As discussed, the project is based on mental physics, which in turn is the application of Kantian metaphysics to mind-brain. The approach follows Bacon's investigative method

Introduction to Dynamical Systems: Basic Concepts of Dynamics

A dynamical system has a notion of state, which contains all the information upon which the dynamical system acts. A simple set of deterministic

Single neuron models

L. Pezard, Aix-Marseille University. Biophysics: biological neuron, ionic currents, passive properties, active properties. Typology of models: compartmental models, differential

Modeling Low-Dimensional Submanifolds In Dynamical Systems

Clarkson University. A Dissertation by Chen Yao, Department of Mathematics and Computer Science. Submitted in partial fulfillment of the requirements

CHALMERS, GÖTEBORGS UNIVERSITET. EXAM for DYNAMICAL SYSTEMS. COURSE CODES: TIF 155, FIM770GU, PhD

Time: August 22, 2018, at 08:30-12:30. Place: Johanneberg. Teachers: Jan Meibohm,

6.2 Brief review of fundamental concepts about chaotic systems

Lorenz (1963) introduced a 3-variable model that is a prototypical example of chaos theory. These equations were derived as a simplification
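The 3-variable Lorenz model mentioned in the entry above can be integrated in a few lines of code. The following sketch uses the classical parameters σ = 10, ρ = 28, β = 8/3; the forward-Euler scheme, step size, and initial condition are illustrative assumptions, not taken from the listed document:

```python
# Forward-Euler integration of the Lorenz (1963) system:
#   dx/dt = sigma * (y - x)
#   dy/dt = x * (rho - z) - y
#   dz/dt = x * y - beta * z

def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

state = (1.0, 1.0, 1.0)
trajectory = [state]
for _ in range(50000):  # 50 time units at dt = 0.001
    state = lorenz_step(state)
    trajectory.append(state)

# The motion is chaotic (sensitive to initial conditions) yet bounded:
# the trajectory never leaves a finite region around the attractor.
print(len(trajectory))
print(max(abs(c) for point in trajectory for c in point) < 100.0)
```

A serious computation would use an adaptive integrator (e.g. a Runge-Kutta scheme) rather than fixed-step Euler; the point here is only that three coupled ODEs suffice for chaos.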

The Mixed States of Associative Memories Realize Unimodal Distribution of Dominance Durations in Multistable Perception

Takashi Kanamaru, Department of Mechanical Science and Engineering, School of Advanced

Phase-Space Learning

Fu-Sheng Tsung, Chung Tai Ch'an Temple, 56, Yuon-fon Road, Yi-hsin Li, Pu-li, Nan-tou County, Taiwan 545, Republic of China; Garrison W. Cottrell, Institute for Neural Computation, Computer

Schiestel's Derivation of the Epsilon Equation and Two Equation Modeling of Rotating Turbulence

NASA/CR-21-2116, ICASE Report No. 21-24. Robert Rubinstein, NASA Langley Research Center, Hampton, Virginia

DESYNCHRONIZATION TRANSITIONS IN RINGS OF COUPLED CHAOTIC OSCILLATORS

Letters, International Journal of Bifurcation and Chaos, Vol. 8, No. 8 (1998) 1733-1738. World Scientific Publishing Company. I. P.

DETERMINATION OF MODEL VALID PREDICTION PERIOD USING THE BACKWARD FOKKER-PLANCK EQUATION

Peter C. Chu, Leonid M. Ivanov, and C.W. Fan, Department of Oceanography, Naval Postgraduate School, Monterey, California.

Attractors of a randomly forced electronic oscillator

Peter Ashwin, Department of Mathematical and Computing Sciences, University of Surrey, Guildford GU2 5XH, UK. September 23, 1998. Abstract: This paper

Problem Set Number 02, 18.385j/2.036j, MIT (Fall 2018)

Rodolfo R. Rosales (MIT, Math. Dept., room 2-337, Cambridge, MA 02139). September 26, 2018. Due October 4, 2018. Turn it in (by 3 PM) at the Math. Problem Set

1 Introduction and neurophysiology

Dynamics of Continuous, Discrete and Impulsive Systems, Series B: Algorithms and Applications 16 (2009) 535-549. Copyright 2009 Watam Press, http://www.watam.org. ASYMPTOTIC DYNAMICS OF THE SLOW-FAST HINDMARSH-ROSE

1 The pendulum equation

Math 270 Honors ODE I, Fall 2008, Class notes #5. A longer than usual homework assignment is at the end. We now come to a particularly important example, the equation for an oscillating

Chapter 6: Nonlinear Systems and Phenomena

6.1 Stability and the Phase Plane. We now move to nonlinear systems. Begin with the first-order system for x(t): dx/dt = f(x,t), x(0) = x_0. In particular, consider

Frequency Adaptation and Bursting

BioE332A Lab 3, January 5, 2010. In the last lab, we explored spiking due to sodium channels. In this lab, we explore adaptation and bursting due to potassium

Evaluating performance of neural codes in model neural communication networks

Chris G. Antonopoulos, Ezequiel Bianco-Martinez and Murilo S. Baptista. October 18, 2018. arXiv:1709.08591v3 [q-bio.NC]

G25.2651: Statistical Mechanics, Notes for Lecture 3

I. MICROCANONICAL ENSEMBLE: CONDITIONS FOR THERMAL EQUILIBRIUM. Consider bringing two systems into thermal contact. By thermal contact, we mean that the

Neuron: Detector Model

1. The detector model. 2. Biological properties of the neuron. 3. The computational unit. Each neuron is detecting some set of conditions (e.g., smoke detector). Representation is what

1 Random walks and data

Inference, Models and Simulation for Complex Systems, CSCI 7-1, Lecture 7, 15 September 11, Prof. Aaron Clauset. Suppose you have some time-series data x_1, x_2, x_3, ..., x_T and you want

High-conductance states in a mean-field cortical network model

Neurocomputing 58-60 (2004) 935-940, www.elsevier.com/locate/neucom. Alexander Lerchner, Mandana Ahmadi, John Hertz. Oersted-DTU, Technical

Statistical mechanics of classical systems

States and ensembles. A microstate of a statistical system is specified by the complete information about the states of all microscopic degrees of freedom of the

Numerical Simulations in Jerk Circuit and Its Application in a Secure Communication System

A. Sambas, M. Sanjaya WS, M. Mamat, N. V. Karadimas, O. Tacha. Bolabot Techno Robotic School, Sanjaya Star Group

PHY411 Lecture notes Part 5

Alice Quillen, January 27, 2016. Contents: 0.1 Introduction; 1 Symbolic Dynamics; 1.1 The Shift map; 1.2

Neural variability and Poisson statistics

January 15, 2014. 1 Introduction. We are in the process of deriving the Hodgkin-Huxley model. That model describes how an action potential is generated by ion-specific

6.3.4 Action potential

Figure 6.8: Electrical circuit model of the cell membrane, with the capacitive current C_m dφ/dt in parallel with the ionic current I_ion. Normally, cells are net negative inside, which results in a non-zero resting membrane potential. The membrane potential
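The circuit picture in the entry above (a membrane capacitance in parallel with ionic currents) reduces, for a passive membrane, to C_m dV/dt = -(V - V_rest)/R_m. The sketch below integrates that equation; all parameter values are illustrative assumptions, not taken from the listed document:

```python
# Passive membrane relaxing toward its resting potential with time constant
# tau = R_m * C_m. This is the capacitor-plus-leak piece of the circuit model;
# Hodgkin-Huxley adds voltage-dependent ionic currents on top of it.

C_M = 1.0       # membrane capacitance (arbitrary units)
R_M = 10.0      # membrane (leak) resistance
V_REST = -65.0  # resting potential, mV

def step(v, dt=0.01):
    dv = -(v - V_REST) / (R_M * C_M)
    return v + dt * dv

v = 0.0  # depolarized starting voltage
for _ in range(20000):  # 200 time units = 20 membrane time constants
    v = step(v)

# After many time constants the membrane has relaxed to V_rest.
print(abs(v - V_REST) < 1e-4)
```

With a constant injected current added to dv, the same loop gives the charging curve toward V_rest + I * R_m, the standard RC-membrane result.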

Linear Algebra (part 1): Vector Spaces (by Evan Dummit, 2017, v. 1.07)

Contents: 1 Vector Spaces; 1.1 The Formal Definition of a Vector Space; 1.2 Subspaces

3 Detector vs. Computer

1 Neurons. 1. The detector model. Also keep in mind this material gets elaborated w/the simulations, and the earliest material is often hardest for those w/primarily psych background. 2. Biological properties

Voltage-clamp and Hodgkin-Huxley models

Read: Hille, Chapters 2-5 (best); Koch, Chapters 6, 8, 9. See also Clay, J. Neurophysiol. 80:903-913 (1998) (for a recent version of the HH squid axon model). Rothman

Attractor of a Shallow Water Equations Model

Thai Journal of Mathematics, Volume 5 (2007), Number 2: 299-307, www.math.science.cmu.ac.th/thaijournal. S. Sornsanam and D. Sukawat. Abstract: In this research,

Structured reservoir computing with spatiotemporal chaotic attractors

Carlos Lourenço, Faculty of Sciences of the University of Lisbon - Informatics Department, Campo Grande, 1749-016 Lisboa, Portugal

3 Action Potentials - Brutal Approximations

Physics 172/278, David Kleinfeld, Fall 2004; Revised Winter 2015. The Hodgkin-Huxley equations for the behavior of the action potential in squid, and similar

Why is Deep Learning so effective?

Ma191b Winter 2017, Geometry of Neuroscience. The unreasonable effectiveness of deep learning. This lecture is based entirely on the paper: Henry W. Lin and Max Tegmark, Why does deep and cheap

Coupling in Networks of Neuronal Oscillators

Carter Johnson, June 15, 2015. 1 Introduction. Oscillators are ubiquitous in nature. From the pacemaker cells that keep our hearts beating to the predator-prey