Estimation of Propagating Phase Transients in EEG Data - Application of Dynamic Logic Neural Modeling Approach
Proceedings of International Joint Conference on Neural Networks, Orlando, Florida, USA, August 12-17, 2007

Robert Kozma, Ross W. Deming, and Leonid I. Perlovsky

Abstract - The Dynamic Logic (DL) approach establishes a unified framework for the statistical description of mixtures using model-based neural networks. In the present work, we extend previous results to dynamic processes in which the mixture parameters, including the partial and total energy of the components, are time-dependent. Equations are derived and solved for the estimation of parameters which vary in time. The results provide an optimal approximation for a broad class of pattern recognition and process identification problems with variable and noisy data. The introduced methodology is demonstrated on the example of identifying propagating phase gradients generated by intermittent fluctuations in nonequilibrium neural media.

I. INTRODUCTION

Experimental studies indicate that intermittent synchronization across large cortical areas provides the window for the emergence of meaningful cognitive activity in animals and humans (Freeman et al., 2003; Freeman, 2004; Stam, 2005). In neural tissues, populations of neurons send electric currents to each other and produce activation potentials observed in EEG experiments. The synchrony among neural units can be evaluated by comparing their activation levels as a function of time. While single-unit activations have large variability and do not seem synchronous, the activations of neural groups often exhibit apparent synchrony. What distinguishes brain dynamics from other kinds of dynamical behavior observed in nature is the filamentous texture of neural tissue called neuropil, which is unlike any other substance in the known universe (Freeman, 1991).
Neural populations stem ontogenetically in embryos from aggregates of neurons that grow axons and dendrites and form synaptic connections of steadily increasing density. At some threshold the density allows neurons to transmit more pulses than they receive, so that an aggregate undergoes a state transition from a zero-point attractor to a non-zero point attractor, thereby becoming a population. Experimental studies of brain waves at the level of neural populations using EEG techniques gave rise to new theories (Freeman, 1991; Freeman, Kozma, 2000). Multiple-electrode recordings in the olfactory bulb indicated that odors are encoded as complex spatial and temporal patterns in the bulb. Based on these observations, a chaos theory of sensory perception has been proposed (Skarda and Freeman, 1987; Freeman, 1991). In this approach, the state variables of the brain in general, and the olfactory bulb in particular, traverse complex chaotic trajectories which constitute a strange attractor with multiple wings. The spatio-temporal behavior of the trajectories and their behavioral significance is studied, e.g., in (Kozma, Freeman, 2002). External stimuli constrain the trajectories to one of the attractor wings, which are identified as stimulus-specific patterns. Once the stimulus disappears, the dynamics returns to the unconstrained state until the next stimulus arrives. Identification of spatio-temporal spontaneous and input-induced synchronization-desynchronization events in cortical populations poses a difficult problem due to the noisy and transient character of the processes involved.

(Affiliations: Robert Kozma is with the Dept. of Computer Science, 373 Dunn Hall, The University of Memphis, Memphis, TN 38152, rkozma@memphis.edu; Leonid Perlovsky and Ross Deming are with the Air Force Research Laboratory, Sensory Directorate, Hanscom AFB, MA 01371, leonid.perlovsky@hanscom.af.mil. The work is partially supported by the National Research Council through an award to Robert Kozma.)
We propose to use DL-based neural networks for solving this identification task. Concepts and instincts are basic building blocks in this theory. Concepts are formed as models of sensory observations of the external world, while instincts correspond to interoceptive percepts. DL neural networks provide the machinery operating through the knowledge instinct to achieve iterative convergence toward an emotionally balanced final state. Dynamic logic has been used very successfully in solving difficult pattern recognition problems (Perlovsky, 2001, 2006). In this paper we introduce the DL-based learning method for analyzing spatially distributed EEG patterns. First we derive the corresponding equations applicable to the case of time-varying EEG transients. Next we describe the algorithm to solve these equations and estimate the model parameters. We demonstrate the effectiveness of the proposed methodology using simulated data. Our results indicate the feasibility of the proposed methodology and the successful estimation of model parameters in problems with high noise levels. Finally, the results are summarized and directions for future work are outlined, toward the analysis of actual multi-channel EEG data.

II. OVERVIEW OF PHASE TRANSITIONS OBSERVED IN EEG DATA

EEG measurements confirm the presence of self-sustaining, randomized, steady-state background activity in brains. This background activity is the source from which ordered states of macroscopic neural activity emerge, like the patterns of waves at the surfaces of deep bodies of water. Neural tissues, however, are not passive media through which effects propagate like waves in water (Freeman and Kozma, 2000). The brain medium has an intimate relationship with the dynamics through a generally weak, subthreshold interaction of neurons.
Fig. 1. Phase differences across space and time measured by the Hilbert method. There are long periods of small phase differences, interrupted by short transients with high phase dispersion. The EEG shows that neocortex processes information in frames, like a cinema, at approximately the theta rate (7-12 Hz). The perceptual content is found in the phase plateaus of rabbit EEG; the phase jumps show the shutter (Freeman, 2004).

Fig. 2. Illustration of the onset of phase cones in the rabbit olfactory bulb, using an array of 8x8 intracranial electrodes. The left two panels schematically illustrate the observed amplitude modulation patterns. The right panel shows the locations of phase cones detected during the experiments (Barrie et al., 1996).

Brain activity exhibits a high level of synchrony across large cortical regions. The synchrony is interrupted by episodes of desynchronization, when the propagation of phase gradients in the activation of local populations can be identified. The spatially ordered phase relationship between cortical signals is called a phase cone. Experiments show that phase gradients propagate at velocities up to 2 m/s and can cover cortical areas several cm^2 in size (Barrie et al., 1996; Freeman et al., 2003). A typical example of EEG measured at the rabbit cortex is shown in Fig. 1. Coordinated analytic phase differences (CAPD) are calculated in the gamma band (20-50 Hz) over an 8x8 electrode array (5.6x5.6 mm) digitized at 2 ms intervals. The EEG shows that neocortex processes information in frames, like a cinema. The perceptual content is found in the phase plateaus of EEG; similar content is predicted to be found in the plateaus of human scalp EEG. The phase jumps show the shutter (Freeman, 2004). Estimation of the parameters of these phase relationships is a very difficult problem: phase differences are typically masked by high noise levels due to the background activity of the cortex.
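The analytic phase shown in Fig. 1 comes from the Hilbert method. The paper does not give an implementation, but a minimal Python/NumPy sketch of the FFT-based Hilbert transform (function name and setup are our own illustrative assumptions) looks like this:

```python
import numpy as np

def analytic_phase(x):
    """Unwrapped analytic phase of a real signal via the FFT-based Hilbert transform."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0                       # keep DC
    if n % 2 == 0:
        h[n // 2] = 1.0              # keep Nyquist
        h[1:n // 2] = 2.0            # double positive frequencies
    else:
        h[1:(n + 1) // 2] = 2.0
    z = np.fft.ifft(X * h)           # analytic signal (negative frequencies removed)
    return np.unwrap(np.angle(z))
```

Phase differences between electrode channels, as in Fig. 1, are then simply the differences of these unwrapped phases; plateaus correspond to small differences and phase jumps to the transients.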
Phase cones can appear with positive and negative phase lags, corresponding to explosive and implosive transitions in the cortical spatio-temporal dynamics. Figure 2 illustrates the distribution of the amplitude modulation (AM) patterns in the gamma band, which carry information on the subject's cognitive activity. These experiments have been performed on the rabbit's olfactory cortex using an intracranial array of 8x8 electrodes. The experiments show that several phase cones may coexist simultaneously in a given spatio-temporal measurement window. The published results were obtained by a properly tuned multi-variable regression technique, which selects the dominant phase cone in a given time interval and conducts parameter estimation for the dominant cone (Barrie et al., 1996). The fitting technique applied in the above reference has been used in various experiments and produced satisfactory results. It is clear, however, that this estimation is a very tedious procedure; moreover, it obtains the estimated parameters for the dominant cone only. Clearly, a more advanced method of parameter estimation would be very desirable, especially in the case of overlapping cones, which persist only for a relatively short time period of 100 ms or less. In this paper we extend the dynamic logic methodology to transient mixture processes, which makes it suitable for estimating the parameters of EEG phase cones.

III. DYNAMIC LOGIC EQUATIONS

A. Maximum Likelihood Estimation Framework

The statistical method for maximum likelihood estimation presented here is based on the Dynamic Logic formulation; for details, see (Perlovsky, 2001). Let $\Theta$ denote the set of parameters of a model, where $s$ is the total number of parameters:

$$\Theta = [\Theta_1, \Theta_2, \ldots, \Theta_s]. \qquad (1)$$

The set of available data is given by $X_n$, where $n$ indexes the data in space and time. Let us consider a system having $N$ spatial points; for each point of space we measure $T$ time instances, so $n = 1, \ldots, N \cdot T$.
In the context of EEG frame identification, this means we have T frames of phase patterns, each of size N. The goodness of fit between the model and the data is often described by the log-likelihood (LL) function (Duda and Hart, 1973). Various forms of the LL can be defined. Here we introduce the Einsteinian LL function LL_E, following Perlovsky (2001):

$$LL_E = \sum_{n \in [N,T]} p_0(X_n) \ln \sum_{k=1}^{K} r_k(t)\, p(X_n \mid \Theta_k). \qquad (2)$$
The notation is as follows: the model is assumed to have K mixture components, each characterized by its own probability density function p(X_n | Θ_k), where Θ_k is the parameter set of the k-th component of the mixture model. Here p_0(X_n) is the distribution of the data points to be modeled, and r_k denotes the relative weight of component k. The following normalization condition is satisfied:

$$\sum_{k=1}^{K} \sum_{t=1}^{T} r_k(t) = E, \qquad (3)$$

where E is the total power of all components across all times during the experiment of length T. The parameters of the mixtures are estimated by maximizing the log-likelihood in Eq. (2). Next we derive the conditions for each parameter by calculating the partial derivative of LL_E with respect to the corresponding variables. We use the standard method of Lagrange multipliers. Introducing the unknown coefficient λ, the expression F to be optimized is:

$$F = LL_E + \lambda \left( E - \sum_{t=1}^{T} \sum_{k=1}^{K} r_k(t) \right). \qquad (4)$$

The time dependence of r_k(t) can obey various forms, e.g., a linear regression:

$$r_k(t) = r_{k0} + r_{k1} t, \qquad (5)$$

a higher-order polynomial, or an exponential relationship of the following form:

$$r_k(t) = r_{k0} \exp(r_{k1} t). \qquad (6)$$

For the time being, we derive a general condition for any expression of the time dependence of r_k(t); later, when considering a given problem, we will specify the form of r_k(t). Let us take the derivative of F with respect to the parameters r_{ki}:

$$0 = \frac{\partial F}{\partial r_{ki}} = \frac{\partial LL_E}{\partial r_{ki}} - \lambda \sum_{t=1}^{T} \frac{\partial r_k(t)}{\partial r_{ki}}. \qquad (7)$$

After some algebra, we arrive at the following expression for the unknown parameter λ:

$$\frac{1}{\lambda} = \sum_{t=1}^{T} \frac{\partial r_k(t)}{\partial r_{ki}} \Bigg/ \sum_{n \in [N,T]} p_0(X_n)\, P(k|n)\, \frac{\partial r_k(t)/\partial r_{ki}}{r_k(t)}. \qquad (8)$$

Here we defined the association probability P(k|n) as the probability that a given sample belongs to class k:

$$P(k|n) = \frac{r_k(t)\, p(X_n \mid \Theta_k)}{\sum_{k'=1}^{K} r_{k'}(t)\, p(X_n \mid \Theta_{k'})}. \qquad (9)$$

At this point, the actual functional form of r_k(t) is substituted. We consider the case of exponential growth or decay of the component, see Eq. (6).
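Before substituting the exponential form, it helps to see Eqs. (2) and (9) concretely. A minimal Python/NumPy sketch follows; the array layout (components along the first axis, the N·T samples along the second) and the function names are our own assumptions, not the authors' code:

```python
import numpy as np

def association_probs(p_xn_given_k, r_kt):
    """Association probabilities P(k|n) of Eq. (9).

    p_xn_given_k : (K, N*T) array of p(X_n | Theta_k)
    r_kt         : (K, N*T) array of r_k(t) evaluated at each sample's time
    """
    weighted = r_kt * p_xn_given_k           # r_k(t) p(X_n | Theta_k)
    return weighted / weighted.sum(axis=0)   # normalize over components k

def einsteinian_ll(p0, p_xn_given_k, r_kt):
    """Einsteinian log-likelihood LL_E of Eq. (2)."""
    mixture = (r_kt * p_xn_given_k).sum(axis=0)  # sum over k of r_k(t) p(X_n | Theta_k)
    return float(np.sum(p0 * np.log(mixture)))
```

By construction, the association probabilities sum to one over the components for every sample, and LL_E is the p_0-weighted log of the mixture density.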
After a series of transformations, we arrive at the conditions:

$$\frac{T r_{k1} e^{r_{k1} T} - e^{r_{k1} T} + 1}{r_{k1}\,(e^{r_{k1} T} - 1)} = \frac{\langle t \rangle}{\langle 1 \rangle}, \qquad (10)$$

$$r_{k0} = \frac{r_{k1}}{e^{r_{k1} T} - 1}\, \langle 1 \rangle. \qquad (11)$$

The bracket notation ⟨·⟩ has been introduced as a shorthand for the following expression:

$$\langle \cdot \rangle = \sum_{n \in [N,T]} p_0(X_n)\, P(k|n)\, (\cdot). \qquad (12)$$

Eq. (10) is a transcendental equation whose solution can be determined numerically. It is straightforward to show that Eq. (10) has a unique solution in the parameter range of interest to our problem. For convenience, it is easy to generate a look-up table with various values of the coefficient, so the solution of this equation is not computationally intensive. We mention this fact here because at a later stage we need to solve another set of equations iteratively; in that case we cannot use such a shortcut as in Eq. (10). Once r_{k1} is determined as the solution of the transcendental equation, its value is substituted into Eq. (11) to obtain r_{k0}. In the next step, the remaining model parameters are determined. We assume that the probability density function of each component obeys a normal distribution over the 2-dimensional array of X = [x, y] coordinates:

$$p(X \mid \Theta_k) = \frac{1}{2\pi\, |C_k(t)|^{1/2}}\, \exp\!\left(-\frac{1}{2}\,(X - M_k)^T C_k^{-1}(t)\,(X - M_k)\right). \qquad (13)$$

Here M_k is the mean of the Gaussian distribution of the k-th component, while C_k(t) is the time-dependent covariance matrix. The assumption of normal distributions simplifies the treatment ahead, but other distributions can be used as well, as required by the nature of the process under investigation. According to the modeled process, we assume time dependence of the covariance only, while the mean values are fixed. This is consistent with the problem at hand: in EEG experiments, once the apex is determined as the point of initiation of the phase cone, its position does not change. Only the extent of the phase cone varies as it evolves.
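Returning to Eqs. (10) and (11): since the left-hand side of Eq. (10) depends only on r_{k1} and is monotone in the range of interest, a simple bisection works in place of the look-up table mentioned above. The following Python sketch is illustrative only; the function names, brackets, and bisection range are our own assumptions:

```python
import math

def lhs(r, T):
    """Left-hand side of the transcendental equation (10)."""
    if abs(r) < 1e-9:                 # limit as r -> 0 is T/2
        return T / 2.0
    if r > 0:                         # divide through by e^{rT} to avoid overflow
        em = math.exp(-r * T)
        return (T * r - 1.0 + em) / (r * (1.0 - em))
    e = math.exp(r * T)               # r < 0: e^{rT} < 1, safe to evaluate directly
    return (T * r * e - e + 1.0) / (r * (e - 1.0))

def solve_rk1(mean_t, T, lo=-50.0, hi=50.0):
    """Solve Eq. (10) for r_k1 by bisection; lhs is increasing in r, with range (0, T)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if lhs(mid, T) < mean_t:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def solve_rk0(rk1, T, bracket_one):
    """Eq. (11): r_k0 = r_k1 / (exp(r_k1 T) - 1) * <1>."""
    if abs(rk1) < 1e-9:
        return bracket_one / T
    return rk1 / math.expm1(rk1 * T) * bracket_one
```

Here `mean_t` stands for the ratio ⟨t⟩/⟨1⟩ and `bracket_one` for ⟨1⟩ of Eq. (12); negative solutions r_{k1} < 0 correspond to decaying component energy.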
It is possible to have an explosion of phase cones, i.e., an increasing cone size starting from an initial point. We also observe negative cones, i.e., implosions, when the phase cone converges to the apex from a broad initial state. Both explosions and implosions can be described by our model, by selecting the sign of the exponential evolution accordingly. The equation for M_k is obtained by partial differentiation of the log-likelihood in Eq. (2). After simple algebra, we arrive at the expression:

$$M_k = \frac{c_{0k} \langle X_n \rangle + c_{1k} \langle t X_n \rangle}{c_{0k} \langle 1 \rangle + c_{1k} \langle t \rangle}. \qquad (14)$$

This is an explicit expression for the class mean values, using known c_{0k} and c_{1k}, k = 1, ..., K. These constants
describe the temporal variation of the covariance matrix:

$$C_k(t) = \frac{1}{c_{0k} + c_{1k} t}\, I, \qquad (15)$$

where I is the 2x2 identity matrix. We introduce the notation c_{Rk} = c_{1k}/c_{0k}. Note that the covariances are monotonically decreasing for positive values of c_{0k} and c_{1k} and forward time, while they are increasing for a range of negative c_{1k} values. Therefore, the model can describe both explosive and implosive evolution types. There is no closed-form solution for the covariance model coefficients; rather, the following pair of transcendental equations must be solved simultaneously:

$$\frac{\langle 1/(1 + c_{Rk} t) \rangle}{\langle t/(1 + c_{Rk} t) \rangle} = \frac{\langle (X - M_k)^T (X - M_k) \rangle}{\langle (X - M_k)^T (X - M_k)\, t \rangle}, \qquad (16)$$

$$c_{0k} = \frac{\langle 1/(1 + c_{Rk} t) \rangle}{\langle (X - M_k)^T (X - M_k) \rangle}. \qquad (17)$$

An iterative learning procedure is used to determine the model parameters based on the above equations. Namely, we start with some initial parameter values and use the above adaptive equations to update the parameters r_{k1}, r_{k0}, M_k, c_{1k}, and c_{0k}, k = 1, ..., K. The iteration is terminated after a given number of steps (typically 100), or when a preset convergence criterion is satisfied. These model parameters can be considered weights, and the DL-based iterative method describes a model-based neural network learning process (Perlovsky et al., 1997; Perlovsky, 2001). In the next section, results of parameter estimation are presented based on the procedure given above.

IV. ESTIMATION OF PARAMETERS OF PHASE CONES USING DL

The dynamic logic-based estimation procedure has been implemented in the Matlab computing environment and tested in various scenarios involving simulated phase cones. First, the case of a single phase cone with a low level of noise was completed successfully. Next we produced more complicated simulated scenarios with multiple, overlapping phase cones and a high level of noise. In these experiments the following parameters have been estimated:
- Location of the apex of each of the phase cones.
The localization is conducted both in space and time: we estimate the x_k and y_k coordinates of the apex in the 2-dimensional plane, and the onset time instant t_k, where k = 1, ..., K.
- Dynamic parameters for the growth/collapse of the variance of each phase cone, c_{0k} and c_{1k}, k = 1, ..., K.
- Energy evolution of each cone, described by the parameters r_{k0} and r_{k1}, k = 1, ..., K.

Fig. 3. Final plots of estimating the parameters of 4 phase cones; note that only the final frame is shown.

Our results show that all these parameters can be estimated successfully within relatively few iterative learning steps; typically the results converged to the known values in fewer than 100 steps. The locations of the cones were the easiest to determine. This is not surprising, as DL is known to perform well for stationary or moving targets of fixed shape, even in environments with a high level of noise and clutter (Perlovsky et al., 1997; Deming et al., 2006). The really challenging task was the estimation of the dynamic parameters, in particular the time dependence of the covariances. Our method worked very well on this task, producing parameter values quickly and with good accuracy; typically we obtained parameter values with an error of ±10% in fewer than 100 iteration steps. The most difficult part was estimating the gradient of the variance growth/decrease: the system can obtain relatively good estimates of all other parameters, such as location and onset time, and thereby produce a reasonable estimate of the scenario by assuming a certain effective variance for the observation period, without time dependence. This corresponds to a local minimum, and special attention should be paid to ensure the system does not get trapped in one. The general methodology proposed by the DL approach is to start with an initially vague model with large variance, and then iteratively improve the model while continuously decreasing the variance between model and observations.
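The adaptive update step at the heart of this iteration can be made concrete. As an illustrative sketch (not the authors' Matlab code), the class-mean update of Eq. (14), built on the bracket operator of Eq. (12), might look like this in Python/NumPy; the array shapes and names are our own assumptions:

```python
import numpy as np

def bracket(vals, p0, P_kn):
    """Bracket operator <.> of Eq. (12) for one component k.

    vals : per-sample quantity, shape (NT,) or (NT, 2)
    p0   : data distribution p_0(X_n), shape (NT,)
    P_kn : association probabilities P(k|n) for this k, shape (NT,)
    """
    w = p0 * P_kn                        # per-sample weights p_0(X_n) P(k|n)
    return np.tensordot(w, vals, axes=(0, 0))

def update_mean(X, t, p0, P_kn, c0k, c1k):
    """Class-mean update of Eq. (14) for one component k.

    X : (NT, 2) sample coordinates; t : (NT,) sample times.
    """
    num = c0k * bracket(X, p0, P_kn) + c1k * bracket(t[:, None] * X, p0, P_kn)
    den = c0k * bracket(np.ones_like(t), p0, P_kn) + c1k * bracket(t, p0, P_kn)
    return num / den
```

With c_{1k} = 0 this reduces to the ordinary responsibility-weighted mean; the c_{1k} terms give later time frames more (or less) influence according to the covariance evolution, which is what lets the vague-to-crisp schedule sharpen the model over iterations.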
By selecting an appropriate variance adjustment scheme, the iterations converge to model parameters close to the actual values and avoid getting trapped in local minima. An example of the estimation of the evolution of the phase cones is shown in Fig. 3. We simulated 100 frames, each 128x128 pixels in size. Four different phase cones were generated during the 100-frame period. The top two panels show the target image (top left) and the estimated image (top right). The bottom left frame represents the error of the model estimation. Note that the frames shown in Fig. 3 correspond to the final state after all 100 frames have been estimated; in this printed paper we cannot show the movie with the time-dependent evolution. The lower right plot in Fig. 3 shows the convergence of the gradient of the variance, c_{1k}. We observe that within about 20 iterations the
model converges to a gradient value close to the true value, which is 1 in this case. These results indicate that the introduced DL-based model produces parameter estimates very quickly, within a reasonable accuracy of less than 10%, in the case of simulated phase cone scenarios.

V. CONCLUSIONS

In this work a novel dynamic logic (DL)-based identification method is described to model transient processes. The method has been used successfully to estimate the parameters of evolving phase cones in simulated data. The introduced model produces parameter estimates very quickly, within a reasonable accuracy of less than 10%, in the case of simulated phase cone scenarios. As the next step we will apply the methodology to actual EEG sequences to identify the parameters of phase transitions; work in this field is in progress. DL is a cognitively motivated model, describing the hierarchy of increasingly complex reflections of the environment in the internal world of the subject (Perlovsky, 2006). In the present work we used DL as an efficient pattern recognition tool to solve the problem of determining model parameters; we did not consider its cognitive aspects. In future studies it will be very interesting to apply this work to model the evolution of the perception-action cycle, which, as we know, progresses through sequences of phase transitions signified by the phase cones modeled in this work. Therefore, the present results concerning the identification of phase transitions in EEG can be used as building blocks in constructing models of the action-perception cycle in high-level cognitive models of brains.

VI. REFERENCES

Barrie, J.M., Freeman, W.J., Lenhart, M. (1996) Modulation by discriminative training of spatial patterns of gamma EEG amplitude and phase in neocortex of rabbits. J. Neurophysiol., 76.
Deming, R., Perlovsky, L.I. (2006) Robust detection and spectrum estimation of multiple sources from rotating-prism spectrometer images. Proc. SPIE.
Duda, R.O., Hart, P.E. (1973) Pattern Classification, Wiley and Sons, New York.
Freeman, W.J. (1991) The physiology of perception. Scientific American, 264(2).
Freeman, W.J., Kozma, R. (2000) Local-global interactions and the role of mesoscopic (intermediate-range) elements in brain dynamics. Behavioral and Brain Sciences, 23, 401.
Freeman, W.J., Burke, B.C., Holmes, M.D. (2003) Aperiodic phase re-setting in scalp EEG of beta-gamma oscillations by state transitions at alpha-theta rates. Hum. Brain Mapping, 19.
Freeman, W.J. (2004) Origin, structure, and role of background EEG activity. Part 2. Analytic phase. Clin. Neurophysiol., 115.
Kozma, R., Freeman, W.J. (2002) Classification of EEG patterns using nonlinear neurodynamics and chaos. Neurocomputing, 44-46.
Perlovsky, L.I., Schoendorf, W.H., Burdick, B.J., Tye, D.M. (1997) Model-based neural network for target detection in SAR images. IEEE Transactions on Image Processing, 6(1).
Perlovsky, L.I. (2001) Neural Networks and Intellect: Using Model-Based Concepts, Oxford University Press, Oxford, U.K.
Perlovsky, L.I. (2006) Toward physics of the mind: Concepts, emotions, consciousness, and symbols. Physics of Life Reviews, 3(1).
Skarda, C.A., Freeman, W.J. (1987) How brains make chaos in order to make sense of the world. Behav. Brain Sci., 10.
Stam, C.J. (2005) Nonlinear dynamical analysis of EEG and MEG: review of an emerging field. Clin. Neurophysiol., 116(10).
A MULTIVARIATE TIME-FREQUENCY BASED PHASE SYNCHRONY MEASURE AND APPLICATIONS TO DYNAMIC BRAIN NETWORK ANALYSIS By Ali Yener Mutlu A DISSERTATION Submitted to Michigan State University in partial fulfillment
More informationBrief Introduction of Machine Learning Techniques for Content Analysis
1 Brief Introduction of Machine Learning Techniques for Content Analysis Wei-Ta Chu 2008/11/20 Outline 2 Overview Gaussian Mixture Model (GMM) Hidden Markov Model (HMM) Support Vector Machine (SVM) Overview
More informationSTA 414/2104: Lecture 8
STA 414/2104: Lecture 8 6-7 March 2017: Continuous Latent Variable Models, Neural networks With thanks to Russ Salakhutdinov, Jimmy Ba and others Outline Continuous latent variable models Background PCA
More informationSignal Modeling Techniques in Speech Recognition. Hassan A. Kingravi
Signal Modeling Techniques in Speech Recognition Hassan A. Kingravi Outline Introduction Spectral Shaping Spectral Analysis Parameter Transforms Statistical Modeling Discussion Conclusions 1: Introduction
More informationOn the use of Long-Short Term Memory neural networks for time series prediction
On the use of Long-Short Term Memory neural networks for time series prediction Pilar Gómez-Gil National Institute of Astrophysics, Optics and Electronics ccc.inaoep.mx/~pgomez In collaboration with: J.
More informationp(d θ ) l(θ ) 1.2 x x x
p(d θ ).2 x 0-7 0.8 x 0-7 0.4 x 0-7 l(θ ) -20-40 -60-80 -00 2 3 4 5 6 7 θ ˆ 2 3 4 5 6 7 θ ˆ 2 3 4 5 6 7 θ θ x FIGURE 3.. The top graph shows several training points in one dimension, known or assumed to
More informationarxiv: v3 [q-bio.nc] 17 Oct 2018
Evaluating performance of neural codes in model neural communication networks Chris G. Antonopoulos 1, Ezequiel Bianco-Martinez 2 and Murilo S. Baptista 3 October 18, 2018 arxiv:1709.08591v3 [q-bio.nc]
More informationUniversity of California
University of California Peer Reviewed Title: Nonlinear brain dynamics as macroscopic manifestation of underlying many-body field dynamics Author: Freeman, Walter J III, University of California, Berkeley
More informationReading Group on Deep Learning Session 1
Reading Group on Deep Learning Session 1 Stephane Lathuiliere & Pablo Mesejo 2 June 2016 1/31 Contents Introduction to Artificial Neural Networks to understand, and to be able to efficiently use, the popular
More informationFinding a Basis for the Neural State
Finding a Basis for the Neural State Chris Cueva ccueva@stanford.edu I. INTRODUCTION How is information represented in the brain? For example, consider arm movement. Neurons in dorsal premotor cortex (PMd)
More informationIntroduction Neural Networks - Architecture Network Training Small Example - ZIP Codes Summary. Neural Networks - I. Henrik I Christensen
Neural Networks - I Henrik I Christensen Robotics & Intelligent Machines @ GT Georgia Institute of Technology, Atlanta, GA 30332-0280 hic@cc.gatech.edu Henrik I Christensen (RIM@GT) Neural Networks 1 /
More informationCS 179: LECTURE 16 MODEL COMPLEXITY, REGULARIZATION, AND CONVOLUTIONAL NETS
CS 179: LECTURE 16 MODEL COMPLEXITY, REGULARIZATION, AND CONVOLUTIONAL NETS LAST TIME Intro to cudnn Deep neural nets using cublas and cudnn TODAY Building a better model for image classification Overfitting
More informationLinear Models for Classification
Linear Models for Classification Oliver Schulte - CMPT 726 Bishop PRML Ch. 4 Classification: Hand-written Digit Recognition CHINE INTELLIGENCE, VOL. 24, NO. 24, APRIL 2002 x i = t i = (0, 0, 0, 1, 0, 0,
More informationNeural Networks 1 Synchronization in Spiking Neural Networks
CS 790R Seminar Modeling & Simulation Neural Networks 1 Synchronization in Spiking Neural Networks René Doursat Department of Computer Science & Engineering University of Nevada, Reno Spring 2006 Synchronization
More informationHow to do backpropagation in a brain
How to do backpropagation in a brain Geoffrey Hinton Canadian Institute for Advanced Research & University of Toronto & Google Inc. Prelude I will start with three slides explaining a popular type of deep
More informationarxiv: v1 [q-bio.nc] 6 Jun 2012
CHAPTER 1 arxiv:1206.1108v1 [q-bio.nc] 6 Jun 2012 THERMODYNAMIC MODEL OF CRITICALITY IN THE CORTEX BASED ON EEG/ECOG DATA Robert Kozma 1, Marko Puljic 2,1 and Walter J. Freeman 3 1 Center for Large-Scale
More informationCOM336: Neural Computing
COM336: Neural Computing http://www.dcs.shef.ac.uk/ sjr/com336/ Lecture 2: Density Estimation Steve Renals Department of Computer Science University of Sheffield Sheffield S1 4DP UK email: s.renals@dcs.shef.ac.uk
More informationSearching for Nested Oscillations in Frequency and Sensor Space. Will Penny. Wellcome Trust Centre for Neuroimaging. University College London.
in Frequency and Sensor Space Oscillation Wellcome Trust Centre for Neuroimaging. University College London. Non- Workshop on Non-Invasive Imaging of Nonlinear Interactions. 20th Annual Computational Neuroscience
More informationSimultaneous Multi-frame MAP Super-Resolution Video Enhancement using Spatio-temporal Priors
Simultaneous Multi-frame MAP Super-Resolution Video Enhancement using Spatio-temporal Priors Sean Borman and Robert L. Stevenson Department of Electrical Engineering, University of Notre Dame Notre Dame,
More informationLinear Regression, Neural Networks, etc.
Linear Regression, Neural Networks, etc. Gradient Descent Many machine learning problems can be cast as optimization problems Define a function that corresponds to learning error. (More on this later)
More informationPattern Classification
Pattern Classification All materials in these slides were taen from Pattern Classification (2nd ed) by R. O. Duda,, P. E. Hart and D. G. Stor, John Wiley & Sons, 2000 with the permission of the authors
More informationRecursive Generalized Eigendecomposition for Independent Component Analysis
Recursive Generalized Eigendecomposition for Independent Component Analysis Umut Ozertem 1, Deniz Erdogmus 1,, ian Lan 1 CSEE Department, OGI, Oregon Health & Science University, Portland, OR, USA. {ozertemu,deniz}@csee.ogi.edu
More informationReconnaissance d objetsd et vision artificielle
Reconnaissance d objetsd et vision artificielle http://www.di.ens.fr/willow/teaching/recvis09 Lecture 6 Face recognition Face detection Neural nets Attention! Troisième exercice de programmation du le
More informationNonlinear reverse-correlation with synthesized naturalistic noise
Cognitive Science Online, Vol1, pp1 7, 2003 http://cogsci-onlineucsdedu Nonlinear reverse-correlation with synthesized naturalistic noise Hsin-Hao Yu Department of Cognitive Science University of California
More informationLearning Gaussian Process Models from Uncertain Data
Learning Gaussian Process Models from Uncertain Data Patrick Dallaire, Camille Besse, and Brahim Chaib-draa DAMAS Laboratory, Computer Science & Software Engineering Department, Laval University, Canada
More informationBayesian probability theory and generative models
Bayesian probability theory and generative models Bruno A. Olshausen November 8, 2006 Abstract Bayesian probability theory provides a mathematical framework for peforming inference, or reasoning, using
More informationIterative face image feature extraction with Generalized Hebbian Algorithm and a Sanger-like BCM rule
Iterative face image feature extraction with Generalized Hebbian Algorithm and a Sanger-like BCM rule Clayton Aldern (Clayton_Aldern@brown.edu) Tyler Benster (Tyler_Benster@brown.edu) Carl Olsson (Carl_Olsson@brown.edu)
More informationDynamical Constraints on Computing with Spike Timing in the Cortex
Appears in Advances in Neural Information Processing Systems, 15 (NIPS 00) Dynamical Constraints on Computing with Spike Timing in the Cortex Arunava Banerjee and Alexandre Pouget Department of Brain and
More informationarxiv:physics/ v1 [physics.bio-ph] 19 Feb 1999
Odor recognition and segmentation by coupled olfactory bulb and cortical networks arxiv:physics/9902052v1 [physics.bioph] 19 Feb 1999 Abstract Zhaoping Li a,1 John Hertz b a CBCL, MIT, Cambridge MA 02139
More informationLecture 9. Time series prediction
Lecture 9 Time series prediction Prediction is about function fitting To predict we need to model There are a bewildering number of models for data we look at some of the major approaches in this lecture
More informationDynamic Modeling of Brain Activity
0a Dynamic Modeling of Brain Activity EIN IIN PC Thomas R. Knösche, Leipzig Generative Models for M/EEG 4a Generative Models for M/EEG states x (e.g. dipole strengths) parameters parameters (source positions,
More informationComparing integrate-and-fire models estimated using intracellular and extracellular data 1
Comparing integrate-and-fire models estimated using intracellular and extracellular data 1 Liam Paninski a,b,2 Jonathan Pillow b Eero Simoncelli b a Gatsby Computational Neuroscience Unit, University College
More informationMIXED EFFECTS MODELS FOR TIME SERIES
Outline MIXED EFFECTS MODELS FOR TIME SERIES Cristina Gorrostieta Hakmook Kang Hernando Ombao Brown University Biostatistics Section February 16, 2011 Outline OUTLINE OF TALK 1 SCIENTIFIC MOTIVATION 2
More informationStatistical and Learning Techniques in Computer Vision Lecture 2: Maximum Likelihood and Bayesian Estimation Jens Rittscher and Chuck Stewart
Statistical and Learning Techniques in Computer Vision Lecture 2: Maximum Likelihood and Bayesian Estimation Jens Rittscher and Chuck Stewart 1 Motivation and Problem In Lecture 1 we briefly saw how histograms
More informationSTATISTICAL MECHANICS OF NEOCORTICAL INTERACTIONS: EEG EIGENFUNCTIONS OF SHORT-TERM MEMORY
STATISTICAL MECHANICS OF NEOCORTICAL INTERACTIONS: EEG EIGENFUNCTIONS OF SHORT-TERM MEMORY Lester Ingber Lester Ingber Research PO Box 06440 Wacker Dr PO Sears Tower Chicago, IL 60606 and DRW Inv estments
More informationCh 4. Linear Models for Classification
Ch 4. Linear Models for Classification Pattern Recognition and Machine Learning, C. M. Bishop, 2006. Department of Computer Science and Engineering Pohang University of Science and echnology 77 Cheongam-ro,
More informationEEL 851: Biometrics. An Overview of Statistical Pattern Recognition EEL 851 1
EEL 851: Biometrics An Overview of Statistical Pattern Recognition EEL 851 1 Outline Introduction Pattern Feature Noise Example Problem Analysis Segmentation Feature Extraction Classification Design Cycle
More informationLinear & nonlinear classifiers
Linear & nonlinear classifiers Machine Learning Hamid Beigy Sharif University of Technology Fall 1394 Hamid Beigy (Sharif University of Technology) Linear & nonlinear classifiers Fall 1394 1 / 34 Table
More informationForecasting Wind Ramps
Forecasting Wind Ramps Erin Summers and Anand Subramanian Jan 5, 20 Introduction The recent increase in the number of wind power producers has necessitated changes in the methods power system operators
More informationOptimization Methods for Machine Learning (OMML)
Optimization Methods for Machine Learning (OMML) 2nd lecture (2 slots) Prof. L. Palagi 16/10/2014 1 What is (not) Data Mining? By Namwar Rizvi - Ad Hoc Query: ad Hoc queries just examines the current data
More informationSections 18.6 and 18.7 Artificial Neural Networks
Sections 18.6 and 18.7 Artificial Neural Networks CS4811 - Artificial Intelligence Nilufer Onder Department of Computer Science Michigan Technological University Outline The brain vs. artifical neural
More informationSTA 4273H: Statistical Machine Learning
STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Statistics! rsalakhu@utstat.toronto.edu! http://www.utstat.utoronto.ca/~rsalakhu/ Sidney Smith Hall, Room 6002 Lecture 11 Project
More informationThe Bayesian Brain. Robert Jacobs Department of Brain & Cognitive Sciences University of Rochester. May 11, 2017
The Bayesian Brain Robert Jacobs Department of Brain & Cognitive Sciences University of Rochester May 11, 2017 Bayesian Brain How do neurons represent the states of the world? How do neurons represent
More informationNeural networks. Chapter 20. Chapter 20 1
Neural networks Chapter 20 Chapter 20 1 Outline Brains Neural networks Perceptrons Multilayer networks Applications of neural networks Chapter 20 2 Brains 10 11 neurons of > 20 types, 10 14 synapses, 1ms
More informationA Three-dimensional Physiologically Realistic Model of the Retina
A Three-dimensional Physiologically Realistic Model of the Retina Michael Tadross, Cameron Whitehouse, Melissa Hornstein, Vicky Eng and Evangelia Micheli-Tzanakou Department of Biomedical Engineering 617
More informationη 5 ξ [k+2] η 4 η 3 η 1 ξ [k]
On-line and Hierarchical Design Methods of Dynamics Based Information Processing System Masafumi OKADA, Daisuke NAKAMURA and Yoshihiko NAKAMURA Dept. of Mechano-Informatics, University of okyo 7-- Hongo
More informationEfficient coding of natural images with a population of noisy Linear-Nonlinear neurons
Efficient coding of natural images with a population of noisy Linear-Nonlinear neurons Yan Karklin and Eero P. Simoncelli NYU Overview Efficient coding is a well-known objective for the evaluation and
More informationSupporting Online Material for
www.sciencemag.org/cgi/content/full/319/5869/1543/dc1 Supporting Online Material for Synaptic Theory of Working Memory Gianluigi Mongillo, Omri Barak, Misha Tsodyks* *To whom correspondence should be addressed.
More informationAn Introductory Course in Computational Neuroscience
An Introductory Course in Computational Neuroscience Contents Series Foreword Acknowledgments Preface 1 Preliminary Material 1.1. Introduction 1.1.1 The Cell, the Circuit, and the Brain 1.1.2 Physics of
More informationUnsupervised Learning with Permuted Data
Unsupervised Learning with Permuted Data Sergey Kirshner skirshne@ics.uci.edu Sridevi Parise sparise@ics.uci.edu Padhraic Smyth smyth@ics.uci.edu School of Information and Computer Science, University
More informationIntroduction to Biomedical Engineering
Introduction to Biomedical Engineering Biosignal processing Kung-Bin Sung 6/11/2007 1 Outline Chapter 10: Biosignal processing Characteristics of biosignals Frequency domain representation and analysis
More informationArtificial Neural Networks
Artificial Neural Networks Stephan Dreiseitl University of Applied Sciences Upper Austria at Hagenberg Harvard-MIT Division of Health Sciences and Technology HST.951J: Medical Decision Support Knowledge
More informationNeutron inverse kinetics via Gaussian Processes
Neutron inverse kinetics via Gaussian Processes P. Picca Politecnico di Torino, Torino, Italy R. Furfaro University of Arizona, Tucson, Arizona Outline Introduction Review of inverse kinetics techniques
More informationEncoding or decoding
Encoding or decoding Decoding How well can we learn what the stimulus is by looking at the neural responses? We will discuss two approaches: devise and evaluate explicit algorithms for extracting a stimulus
More informationRecognition Performance from SAR Imagery Subject to System Resource Constraints
Recognition Performance from SAR Imagery Subject to System Resource Constraints Michael D. DeVore Advisor: Joseph A. O SullivanO Washington University in St. Louis Electronic Systems and Signals Research
More informationLinear Regression and Its Applications
Linear Regression and Its Applications Predrag Radivojac October 13, 2014 Given a data set D = {(x i, y i )} n the objective is to learn the relationship between features and the target. We usually start
More informationNeural Networks and the Back-propagation Algorithm
Neural Networks and the Back-propagation Algorithm Francisco S. Melo In these notes, we provide a brief overview of the main concepts concerning neural networks and the back-propagation algorithm. We closely
More informationJ. Marro Institute Carlos I, University of Granada. Net-Works 08, Pamplona, 9-11 June 2008
J. Marro http://ergodic.ugr.es/jmarro Institute Carlos I, University of Granada Net-Works 08, Pamplona, 9-11 June 2008 Networks? mathematical objects Königsberg bridges + Euler graphs (s. XVIII),... models
More informationEM-algorithm for Training of State-space Models with Application to Time Series Prediction
EM-algorithm for Training of State-space Models with Application to Time Series Prediction Elia Liitiäinen, Nima Reyhani and Amaury Lendasse Helsinki University of Technology - Neural Networks Research
More informationState-Space Methods for Inferring Spike Trains from Calcium Imaging
State-Space Methods for Inferring Spike Trains from Calcium Imaging Joshua Vogelstein Johns Hopkins April 23, 2009 Joshua Vogelstein (Johns Hopkins) State-Space Calcium Imaging April 23, 2009 1 / 78 Outline
More informationTitle. Author(s)Fujii, Hiroshi; Tsuda, Ichiro. CitationNeurocomputing, 58-60: Issue Date Doc URL. Type.
Title Neocortical gap junction-coupled interneuron systems exhibiting transient synchrony Author(s)Fujii, Hiroshi; Tsuda, Ichiro CitationNeurocomputing, 58-60: 151-157 Issue Date 2004-06 Doc URL http://hdl.handle.net/2115/8488
More information