Final Report for Undergraduate Research Opportunities Project
Project Name: Biomedical Signal Processing in EEG
Zhang Chuoyao 1 and Xu Jianxin 2
ABSTRACT
Department of Electrical and Computer Engineering, National University of Singapore

In this paper, we present the application of independent component analysis (ICA) to the problem of source identification, and especially artifact identification, in multi-channel EEG. We first introduce the principles of EEG and analyze its characteristic spectral shapes. Then we state the reasons for adopting ICA, rather than traditional linear methods such as PCA and SVD, to process EEG signals. After that, we explain the mathematical model of ICA and propose estimation criteria and algorithms. Finally, we consider artifact identification (separation) as an essential application of ICA to EEG signals.

EEG INTRODUCTION AND SPECTRAL ANALYSIS

EEG (electroencephalography) is the neurophysiologic measurement of the electrical activity of the brain, recorded from electrodes placed on the scalp. EEG is used extensively for monitoring electrical activity within the human brain, for both research and clinical purposes; it is in fact one of the most widespread brain-mapping techniques to date. EEG is used both for the measurement of spontaneous activity and for the study of evoked potentials, i.e., activity triggered by a particular stimulus, which may for example be auditory. Typical clinical EEG systems use around 20 electrodes, evenly distributed over the head; state-of-the-art systems may consist of a couple hundred sensors. The signal-to-noise ratio is typically quite low: the background potential distribution is on the order of 100 microvolts, whereas the evoked potentials may be two orders of magnitude weaker.

The EEG waveforms can be characterized in the frequency domain, and the different frequency components of the EEG denote different brain activities:

Alpha wave. This component has a frequency range of 8 to 12 Hz.
It typically indicates that the subject under test is relaxed while awake. A typical waveform is shown in Figure 1.

1 Student  2 Supervisor
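The band characterization above can be illustrated numerically. The following sketch, assuming a synthetic single-channel recording sampled at 256 Hz (both the sampling rate and the signal are illustrative, not from the experiments in this report), estimates the fraction of spectral power falling in the 8-12 Hz alpha band:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Fraction of total spectral power in the [f_lo, f_hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2      # periodogram (unnormalized)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[band].sum() / psd.sum()

# Synthetic "relaxed" EEG: a 10 Hz alpha rhythm plus broadband noise.
fs = 256.0
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.2 * rng.standard_normal(t.size)

alpha_fraction = band_power(eeg, fs, 8.0, 12.0)
```

For this synthetic signal the alpha band dominates the spectrum; a real recording would show a broader, noisier distribution of power across bands.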
Figure 1. Typical waveform of an alpha wave

Beta wave. This component has a frequency range of 12 Hz to 30 Hz and is characterized by low amplitude. It arises with active mental states such as anxiety. A typical waveform is shown in Figure 2.

Figure 2. Typical waveform of a beta wave

Gamma wave. This component corresponds to frequencies above roughly 30 Hz. Gamma can be observed only when the subject is engaged in active thought. A typical waveform is shown in Figure 3.

Figure 3. Typical waveform of a gamma wave

TRADITIONAL LINEAR SIGNAL PROCESSING METHODS IN EEG AND THEIR DRAWBACKS

Principal component analysis (PCA) and singular value decomposition (SVD) are two linear processing methods built on linear algebra. In brief, PCA decorrelates the output by computing the eigenvectors of the data covariance matrix and projecting the original signal vector onto these directions. The components extracted by PCA/SVD are only uncorrelated, not necessarily statistically independent (unless all components are Gaussian). As a result, the components produced by PCA/SVD may carry no clear physiological significance, which limits their use in scenarios where we need to extract independent components, ones that bear brain-functional and physiological meaning.

ICA AND ITS MATHEMATICAL MODELS
Figure 4. Linear mixing (A) of the latent source signals s(t); demixing by whitening (V) and an orthogonal transform (W) of the observed data x(t)

Independent component analysis was originally developed to deal with problems closely related to the cocktail-party problem [1]. Since the recent increase of interest in ICA, it has become clear that this principle has many other interesting applications as well, such as electrical recordings of brain activity as given by an electroencephalogram (EEG). The EEG data consist of recordings of electrical potentials at many different locations on the scalp. These potentials are presumably generated by mixing underlying components of brain and muscle activity. The situation is quite similar to the cocktail-party problem: we would like to find the original components of brain activity, but we can observe only mixtures of the components. ICA can reveal interesting information on brain activity by giving access to its independent components.

As illustrated in Figure 4, the ICA problem [2], [3] can be formulated as follows: given the observed data X(t), estimate both the mixing matrix A and the corresponding sources S(t). In general the observations may also contain an additive noise term:

X(t) = A S(t) + N(t) = Sum_i s_i(t) a_i + N(t)   (5)

If no assumptions can be made about the noise, a practical strategy is to absorb it into the sources as a supplementary term in the sum of Eq. (5). The ICA model then becomes noise-free:

X(t) = A S(t) = Sum_i s_i(t) a_i   (6)

ICA consists of updating a demixing matrix B(t) of size M x N, without resorting to any information about the spatial mixing matrix A, so that the output vector y(t) = B(t) x(t) becomes an estimate of the original independent source signals s(t). Since ICA deals with higher-order statistics, it is justified to first normalize, in some sense, the first- and second-order moments.
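The normalization of the first two moments (centering and whitening) can be sketched as follows. This is a minimal illustration on synthetic two-channel data; the mixing matrix and Laplacian "sources" are assumptions made for the example, not part of the report's experiments:

```python
import numpy as np

def whiten(X):
    """Whiten X (n_samples x n_channels): zero mean, identity covariance.
    Uses the symmetric whitening matrix V = E D^{-1/2} E^T built from the
    eigendecomposition of the data covariance. Returns (Z, V), Z = Xc V^T."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, E = np.linalg.eigh(cov)
    V = E @ np.diag(1.0 / np.sqrt(eigvals)) @ E.T
    return Xc @ V.T, V

rng = np.random.default_rng(1)
S = rng.laplace(size=(5000, 2))           # surrogate independent sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])    # illustrative mixing matrix
X = S @ A.T                               # observed mixtures

Z, V = whiten(X)
cov_Z = np.cov(Z, rowvar=False)           # should be (numerically) identity
```

After whitening, only an orthogonal rotation W remains to be estimated, which is what reduces the ICA search space to rotations.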
The effect is that the separating matrix B(t) is divided into two parts: the first, the whitening matrix V(t), deals with the dependencies in the first two moments; the second, the orthogonal separating matrix W(t), handles the dependencies in the higher-order statistics:

B(t) = W(t) V(t) -> W V   (7)

Standard PCA, as described earlier in this paper, is often used for whitening: after PCA the covariance matrix of the resulting vector is diagonal, indicating that the components are uncorrelated, and rescaling each component to unit variance then makes the covariance the identity. Having discussed the PCA (whitening) operation V, we now turn to finding the orthogonal transformation W.

Nongaussianity-Maximization Criteria & Gradient Algorithm
Criteria: maximization of kurtosis. Kurtosis is defined in the zero-mean case by

kurt(x) = E{x^4} - 3 [E{x^2}]^2   (8)

For a whitened variable x, E{x^2} = 1, so

kurt(x) = E{x^4} - 3   (9)

In fact, E{x^4} = 3 is the fourth-order moment of a standard Gaussian random variable, so for a Gaussian variable kurt(x) = 3 - 3 = 0. Thus the absolute value of kurtosis measures how far a zero-mean, whitened, symmetric random variable is from the standard Gaussian: a variable with a large absolute kurtosis has large nongaussianity.

In the context of ICA, given the whitened vector z, we want an orthogonal transform W such that the resulting vector has maximal nongaussianity. In practice, to maximize the absolute value of kurtosis, we start from some vector w, compute the direction in which the absolute value of the kurtosis of y = w^T z grows most strongly, based on the available sample z(1), ..., z(T) of the mixture vector z, and then move w in that direction. This idea is implemented in gradient methods and their extensions. The gradient of the absolute value of the kurtosis of w^T z can be computed as

d|kurt(w^T z)|/dw = 4 sign(kurt(w^T z)) [E{z (w^T z)^3} - 3 w ||w||^2]   (10)

since for whitened data E{(w^T z)^2} = ||w||^2. Because we are optimizing this function on the unit sphere ||w||^2 = 1, the gradient method must be complemented by projecting w onto the unit sphere after every step, which is done simply by dividing w by its norm. In many cases one knows in advance the nature of the distributions of the independent components, i.e., whether they are subgaussian or supergaussian; then one can simply plug the correct sign of the kurtosis into the algorithm and avoid estimating it.

Minimization of Mutual Information: Criteria and Algorithm

Criteria: We first define several information-theoretic quantities for later use.
Differential entropy: H(y) = -Integral p_y(eta) log p_y(eta) d eta   (16)

Negentropy: J(y) = H(y_gauss) - H(y), where y_gauss is a Gaussian random vector with the same covariance matrix as y   (17)

Mutual information: I(y_1, ..., y_n) = Sum_i H(y_i) - H(y)   (18)

Suppose we have an invertible linear transformation (demixing matrix B)

y = B x   (19)

where x is the observation vector. Using the information-theoretic relation between the input x and the output y of the linear transformation B, we have

I(y_1, ..., y_n) = Sum_i H(y_i) - H(x) - log |det B|   (20)

Now consider what happens if we constrain the y_i to be uncorrelated and of unit variance. This means E{y y^T} = B E{x x^T} B^T = I, which implies det B det E{x x^T} det B^T = 1, and hence that |det B| must be constant, since det E{x x^T} does not depend on B. Moreover, for y_i of unit variance, entropy and negentropy differ only by a constant and the sign, as can be seen in Eq. (17). Thus we obtain

I(y_1, ..., y_n) = C - Sum_i J(y_i)   (21)

where the constant C does not depend on B. This shows the fundamental relation between negentropy and mutual information. We see in Eq. (21) that finding an invertible linear transformation B that minimizes the mutual information is roughly equivalent to finding directions in which the negentropy is maximized. We have seen previously that negentropy is a measure of nongaussianity; thus Eq. (21) shows that ICA estimation by minimization of mutual information is equivalent to maximizing the sum of the nongaussianities of the estimates of the independent components, when the estimates are constrained to be uncorrelated. The formulation of ICA as minimization of mutual information therefore gives another rigorous justification of our more heuristically introduced idea of finding maximally nongaussian directions. In practice, however, there are also some important differences between these two criteria:

1. Negentropy, and other measures of nongaussianity, enable the deflationary (one-by-one) estimation of the independent components, since we can look for the maxima
of the nongaussianity of a single projection b^T x. This is not possible with mutual information or most other criteria, such as the likelihood.

2. A smaller difference is that in using nongaussianity we force the estimates of the independent components to be uncorrelated. This is not necessary when using mutual information, because we could use the form of Eq. (20) directly; thus the optimization space is slightly reduced.

Algorithm: To use mutual information in practice, we need some method of estimating or approximating it from real data. Two methods for approximating mutual information are the one introduced in [4] and one based on a sublinear approximation technique [5]. Using mutual information leads essentially to the same algorithms as maximization of nongaussianity. The corresponding algorithms are those that use symmetric orthogonalization, since we are maximizing the sum of nongaussianities, so that no order exists between the components. We therefore present no new algorithms here; the reader is referred to the algorithms for maximization of nongaussianity.

ARTIFACT IDENTIFICATION

As a first application of ICA to EEG signals, we consider the separation of artifacts. Artifacts are signals generated not by brain activity but by some external interference, such as muscle activity. A typical example is ocular artifacts, generated by eye muscle activity. A review of artifact identification can be found in [6]. The simplest and most widely used method consists of discarding the portions of the recordings containing attributes (e.g., amplitude peaks, frequency content, variance, and slope) that are typical of artifacts and exceed a predetermined threshold. This may lead to a significant loss of data, and to a complete inability to study interesting brain activity occurring near or during strong eye activity, such as in visual tracking experiments.
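The threshold-based rejection just described can be sketched as a few lines of code. The epoch values and the amplitude threshold below are illustrative assumptions (a peak-to-peak criterion with a nominal 100-microvolt limit), not parameters taken from any particular study:

```python
def reject_epochs(epochs, amp_threshold):
    """Split epoch indices into kept/rejected by peak-to-peak amplitude."""
    kept, rejected = [], []
    for i, epoch in enumerate(epochs):
        if max(epoch) - min(epoch) > amp_threshold:
            rejected.append(i)   # artifact-like: amplitude swing too large
        else:
            kept.append(i)
    return kept, rejected

# Toy epochs in microvolts: two clean segments and one blink-like spike.
epochs = [
    [3, -2, 4, 1],
    [80, -70, 10, 5],
    [2, 0, -3, 1],
]
kept, rejected = reject_epochs(epochs, amp_threshold=100)
```

The weakness noted in the text is visible even here: the entire second epoch is discarded, including whatever brain activity it contained.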
Other methods include subtracting a regression on one or more additional inputs (e.g., from electrooculograms, electrocardiograms, or electromyograms) from the measured signals. This technique is the more likely to be used in EEG recordings. ICA gives a method for artifact removal in which we do not need an accurate model of the process that generated the artifacts; this is the blind aspect of the method. Neither do we need specified observation intervals containing mainly the artifact, nor additional inputs; this is the unsupervised aspect of the method. Thus ICA is a promising method for artifact identification and removal. It was shown in [8] that artifacts can indeed be estimated by ICA alone. It turns out that the artifacts are quite independent of the rest of the signal, and thus even this requirement of the model is reasonably well fulfilled. In the experiments on EEG artifact removal, the signals were recorded in a shielded room with a 122-channel whole-scalp magnetometer. The test person
was asked to blink and make horizontal saccades in order to produce typical ocular (eye) artifacts. Moreover, to produce muscular artifacts, the subject was asked to bite his teeth for as long as 20 seconds. Yet another artifact was created by placing a digital watch one meter away from the helmet inside the shielded room. Figure 5 presents a subset of 12 artifact-contaminated EEG signals, out of the 122 used in the experiment. Several artifact structures, such as eye and muscle activity, are evident in this figure. The results of artifact extraction using ICA are shown in Figure 6. Components IC1 and IC2 are clearly the activations of two different muscle sets, whereas IC3 and IC5 are, respectively, horizontal eye movements and blinks. Furthermore, other disturbances with weaker signal-to-noise ratios, such as the heartbeat and the digital watch, are extracted as well (IC4 and IC8, respectively). IC9 is probably a faulty sensor, and IC6 and IC7 may be breathing artifacts.

Figure 5. Part of the EEG channel recordings
Figure 6. Extracted artifact components (IC1-IC9)
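The full pipeline used above (whiten, then rotate to maximize nongaussianity, one component at a time) can be sketched on synthetic data. This is a minimal deflationary sketch using the kurtosis-based fixed-point update; the two-channel mixing matrix and the sub-/super-gaussian surrogate sources are assumptions made for the example, not the 122-channel recordings:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000
# Two independent unit-variance surrogate sources:
# a sub-gaussian (uniform) and a super-gaussian (Laplacian) signal.
S = np.vstack([rng.uniform(-np.sqrt(3), np.sqrt(3), n),
               rng.laplace(0.0, 1.0 / np.sqrt(2), n)])
A = np.array([[0.8, 0.3], [0.2, 0.9]])   # illustrative mixing matrix
X = A @ S                                 # observed mixtures

# Whitening: V = E D^{-1/2} E^T from the covariance of the centered data.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
V = E @ np.diag(d ** -0.5) @ E.T
Z = V @ Xc

# Kurtosis-based fixed-point update w <- E{z (w^T z)^3} - 3 w, renormalized
# each step; the second unit is kept orthogonal to the first (deflation).
W = []
for _ in range(2):
    w = rng.standard_normal(2)
    for _ in range(100):
        y = w @ Z
        w = (Z * y ** 3).mean(axis=1) - 3 * w
        for v in W:                       # Gram-Schmidt deflation
            w -= (w @ v) * v
        w /= np.linalg.norm(w)
    W.append(w)

Y = np.vstack(W) @ Z                      # estimates, up to sign and order
# Absolute correlations between each estimate and each true source.
match = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
```

Each row of `match` should contain one entry near 1, reflecting the sign and permutation indeterminacies inherent to ICA.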
CONCLUSION AND FURTHER RESEARCH

We have introduced ICA, an alternative biomedical signal processing method applied to EEG, and explained its advantages over more traditional linear methods such as PCA and SVD, which operate only on second-order statistics. We have also covered the mathematical models of ICA and stated its rationale, criteria, and algorithms. Finally, we have covered one important ICA application to EEG: artifact identification. We have seen that, using ICA, we successfully recovered artifacts deliberately introduced in the experiment.

Throughout this paper we have assumed a linear ICA model. This assumption is valid when the source mixing process is linear in nature or can be linearized without much distortion. However, there are scenarios where EEG signals cannot simply be modeled as a linear superposition of independent components. In these cases, we need to consider nonlinear ICA methods along with the corresponding criteria and algorithms.

REFERENCES
1. S.-I. Amari. Neural learning in structured parameter spaces: natural Riemannian gradient. In Advances in Neural Information Processing Systems 9. MIT Press.
2. K. Abed-Meraim and P. Loubaton. A subspace algorithm for certain blind identification problems. IEEE Transactions on Information Theory, 43(2).
3. L. Almeida. Linear and nonlinear ICA based on mutual information. In Proc. IEEE 2000 Adaptive Systems for Signal Processing, Communications, and Control Symposium (AS-SPCC), Lake Louise, Canada, October 2000.
4. T. Vilmansen. On an approximation of mutual information. IEEE Transactions on Computers, 23(12). IEEE Computer Society.
5. S. Guha, A. McGregor, and S. Venkatasubramanian. Streaming and sublinear approximation of entropy and information distances. In Proc. 17th ACM-SIAM Symposium on Discrete Algorithms.
6. C. H. M. Brunia, J. Möcks, and M. Van den Berg-Lennsen. Correcting ocular artifacts: a comparison of several methods. Journal of Psychophysiology, 3:1-50.
7. R. Vigário. Extraction of ocular artifacts from EEG using independent component analysis. Electroencephalography and Clinical Neurophysiology, 103(3).
8. T.-P. Jung, C. Humphries, T.-W. Lee, S. Makeig, M. J. McKeown, V. Iragui, and T. Sejnowski. Extended ICA removes artifacts from electroencephalographic recordings. In Advances in Neural Information Processing Systems 10. MIT Press, 1998.
More informationSpatial Source Filtering. Outline EEG/ERP. ERPs) Event-related Potentials (ERPs( EEG
Integration of /MEG/fMRI Vince D. Calhoun, Ph.D. Director, Image Analysis & MR Research The MIND Institute Outline fmri/ data Three Approaches to integration/fusion Prediction Constraints 2 nd Level Fusion
More informationwhere A 2 IR m n is the mixing matrix, s(t) is the n-dimensional source vector (n» m), and v(t) is additive white noise that is statistically independ
BLIND SEPARATION OF NONSTATIONARY AND TEMPORALLY CORRELATED SOURCES FROM NOISY MIXTURES Seungjin CHOI x and Andrzej CICHOCKI y x Department of Electrical Engineering Chungbuk National University, KOREA
More informationCS281 Section 4: Factor Analysis and PCA
CS81 Section 4: Factor Analysis and PCA Scott Linderman At this point we have seen a variety of machine learning models, with a particular emphasis on models for supervised learning. In particular, we
More informationARTEFACT DETECTION IN ASTROPHYSICAL IMAGE DATA USING INDEPENDENT COMPONENT ANALYSIS. Maria Funaro, Erkki Oja, and Harri Valpola
ARTEFACT DETECTION IN ASTROPHYSICAL IMAGE DATA USING INDEPENDENT COMPONENT ANALYSIS Maria Funaro, Erkki Oja, and Harri Valpola Neural Networks Research Centre, Helsinki University of Technology P.O.Box
More informationArtifact Extraction from EEG Data Using Independent Component Analysis
The University of Kansas Technical Report Artifact Extraction from EEG Data Using Independent Component Analysis Shadab Mozaffar David W. Petr ITTC-FY2003-TR-03050-02 December 2002 Copyright 2002: The
More informationICA [6] ICA) [7, 8] ICA ICA ICA [9, 10] J-F. Cardoso. [13] Matlab ICA. Comon[3], Amari & Cardoso[4] ICA ICA
16 1 (Independent Component Analysis: ICA) 198 9 ICA ICA ICA 1 ICA 198 Jutten Herault Comon[3], Amari & Cardoso[4] ICA Comon (PCA) projection persuit projection persuit ICA ICA ICA 1 [1] [] ICA ICA EEG
More informationMultidimensional scaling (MDS)
Multidimensional scaling (MDS) Just like SOM and principal curves or surfaces, MDS aims to map data points in R p to a lower-dimensional coordinate system. However, MSD approaches the problem somewhat
More informationRecipes for the Linear Analysis of EEG and applications
Recipes for the Linear Analysis of EEG and applications Paul Sajda Department of Biomedical Engineering Columbia University Can we read the brain non-invasively and in real-time? decoder 1001110 if YES
More informationRecursive Generalized Eigendecomposition for Independent Component Analysis
Recursive Generalized Eigendecomposition for Independent Component Analysis Umut Ozertem 1, Deniz Erdogmus 1,, ian Lan 1 CSEE Department, OGI, Oregon Health & Science University, Portland, OR, USA. {ozertemu,deniz}@csee.ogi.edu
More informationArtificial Intelligence Module 2. Feature Selection. Andrea Torsello
Artificial Intelligence Module 2 Feature Selection Andrea Torsello We have seen that high dimensional data is hard to classify (curse of dimensionality) Often however, the data does not fill all the space
More informationA Constrained EM Algorithm for Independent Component Analysis
LETTER Communicated by Hagai Attias A Constrained EM Algorithm for Independent Component Analysis Max Welling Markus Weber California Institute of Technology, Pasadena, CA 91125, U.S.A. We introduce a
More informationON-LINE BLIND SEPARATION OF NON-STATIONARY SIGNALS
Yugoslav Journal of Operations Research 5 (25), Number, 79-95 ON-LINE BLIND SEPARATION OF NON-STATIONARY SIGNALS Slavica TODOROVIĆ-ZARKULA EI Professional Electronics, Niš, bssmtod@eunet.yu Branimir TODOROVIĆ,
More informationIndependent Component Analysis
Chapter 5 Independent Component Analysis Part I: Introduction and applications Motivation Skillikorn chapter 7 2 Cocktail party problem Did you see that Have you heard So, yesterday this guy I said, darling
More informationNatural Image Statistics
Natural Image Statistics A probabilistic approach to modelling early visual processing in the cortex Dept of Computer Science Early visual processing LGN V1 retina From the eye to the primary visual cortex
More informationUndercomplete Independent Component. Analysis for Signal Separation and. Dimension Reduction. Category: Algorithms and Architectures.
Undercomplete Independent Component Analysis for Signal Separation and Dimension Reduction John Porrill and James V Stone Psychology Department, Sheeld University, Sheeld, S10 2UR, England. Tel: 0114 222
More informationDimension Reduction (PCA, ICA, CCA, FLD,
Dimension Reduction (PCA, ICA, CCA, FLD, Topic Models) Yi Zhang 10-701, Machine Learning, Spring 2011 April 6 th, 2011 Parts of the PCA slides are from previous 10-701 lectures 1 Outline Dimension reduction
More informationPrincipal component analysis
Principal component analysis Angela Montanari 1 Introduction Principal component analysis (PCA) is one of the most popular multivariate statistical methods. It was first introduced by Pearson (1901) and
More informationReal and Complex Independent Subspace Analysis by Generalized Variance
Real and Complex Independent Subspace Analysis by Generalized Variance Neural Information Processing Group, Department of Information Systems, Eötvös Loránd University, Budapest, Hungary ICA Research Network
More informationICA. Independent Component Analysis. Zakariás Mátyás
ICA Independent Component Analysis Zakariás Mátyás Contents Definitions Introduction History Algorithms Code Uses of ICA Definitions ICA Miture Separation Signals typical signals Multivariate statistics
More informationIndependent component analysis applied to biophysical time series and EEG. Arnaud Delorme CERCO, CNRS, France & SCCN, UCSD, La Jolla, USA
Independent component analysis applied to biophysical time series and EEG Arnad Delorme CERCO, CNRS, France & SCCN, UCSD, La Jolla, USA Independent component analysis Cocktail Party Mixtre of Brain sorce
More informationIndependent Component Analysis Using an Extended Infomax Algorithm for Mixed Subgaussian and Supergaussian Sources
LETTER Communicated by Jean-François Cardoso Independent Component Analysis Using an Extended Infomax Algorithm for Mixed Subgaussian and Supergaussian Sources Te-Won Lee Howard Hughes Medical Institute,
More informationPROPERTIES OF THE EMPIRICAL CHARACTERISTIC FUNCTION AND ITS APPLICATION TO TESTING FOR INDEPENDENCE. Noboru Murata
' / PROPERTIES OF THE EMPIRICAL CHARACTERISTIC FUNCTION AND ITS APPLICATION TO TESTING FOR INDEPENDENCE Noboru Murata Waseda University Department of Electrical Electronics and Computer Engineering 3--
More informationA Canonical Genetic Algorithm for Blind Inversion of Linear Channels
A Canonical Genetic Algorithm for Blind Inversion of Linear Channels Fernando Rojas, Jordi Solé-Casals, Enric Monte-Moreno 3, Carlos G. Puntonet and Alberto Prieto Computer Architecture and Technology
More informationBeamforming Techniques Applied in EEG Source Analysis
Beamforming Techniques Applied in EEG Source Analysis G. Van Hoey 1,, R. Van de Walle 1, B. Vanrumste 1,, M. D Havé,I.Lemahieu 1 and P. Boon 1 Department of Electronics and Information Systems, University
More informationNonlinear reverse-correlation with synthesized naturalistic noise
Cognitive Science Online, Vol1, pp1 7, 2003 http://cogsci-onlineucsdedu Nonlinear reverse-correlation with synthesized naturalistic noise Hsin-Hao Yu Department of Cognitive Science University of California
More informationIndependent Component Analysis on the Basis of Helmholtz Machine
Independent Component Analysis on the Basis of Helmholtz Machine Masashi OHATA *1 ohatama@bmc.riken.go.jp Toshiharu MUKAI *1 tosh@bmc.riken.go.jp Kiyotoshi MATSUOKA *2 matsuoka@brain.kyutech.ac.jp *1 Biologically
More informationTHEORETICAL CONCEPTS & APPLICATIONS OF INDEPENDENT COMPONENT ANALYSIS
THEORETICAL CONCEPTS & APPLICATIONS OF INDEPENDENT COMPONENT ANALYSIS SONALI MISHRA 1, NITISH BHARDWAJ 2, DR. RITA JAIN 3 1,2 Student (B.E.- EC), LNCT, Bhopal, M.P. India. 3 HOD (EC) LNCT, Bhopal, M.P.
More informationPOLYNOMIAL SINGULAR VALUES FOR NUMBER OF WIDEBAND SOURCES ESTIMATION AND PRINCIPAL COMPONENT ANALYSIS
POLYNOMIAL SINGULAR VALUES FOR NUMBER OF WIDEBAND SOURCES ESTIMATION AND PRINCIPAL COMPONENT ANALYSIS Russell H. Lambert RF and Advanced Mixed Signal Unit Broadcom Pasadena, CA USA russ@broadcom.com Marcel
More informationPCA, Kernel PCA, ICA
PCA, Kernel PCA, ICA Learning Representations. Dimensionality Reduction. Maria-Florina Balcan 04/08/2015 Big & High-Dimensional Data High-Dimensions = Lot of Features Document classification Features per
More informationHST.582J / 6.555J / J Biomedical Signal and Image Processing Spring 2007
MIT OpenCourseWare http://ocw.mit.edu HST.582J / 6.555J / 16.456J Biomedical Signal and Image Processing Spring 2007 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
More informationPRINCIPAL COMPONENTS ANALYSIS
121 CHAPTER 11 PRINCIPAL COMPONENTS ANALYSIS We now have the tools necessary to discuss one of the most important concepts in mathematical statistics: Principal Components Analysis (PCA). PCA involves
More informationComparative Analysis of ICA Based Features
International Journal of Emerging Engineering Research and Technology Volume 2, Issue 7, October 2014, PP 267-273 ISSN 2349-4395 (Print) & ISSN 2349-4409 (Online) Comparative Analysis of ICA Based Features
More informationSPARSE REPRESENTATION AND BLIND DECONVOLUTION OF DYNAMICAL SYSTEMS. Liqing Zhang and Andrzej Cichocki
SPARSE REPRESENTATON AND BLND DECONVOLUTON OF DYNAMCAL SYSTEMS Liqing Zhang and Andrzej Cichocki Lab for Advanced Brain Signal Processing RKEN Brain Science nstitute Wako shi, Saitama, 351-198, apan zha,cia
More informationDimensionality Reduction
Lecture 5 1 Outline 1. Overview a) What is? b) Why? 2. Principal Component Analysis (PCA) a) Objectives b) Explaining variability c) SVD 3. Related approaches a) ICA b) Autoencoders 2 Example 1: Sportsball
More informationFigure 1: The original signals
Independent Component Analysis: A Tutorial Aapo Hyv rinen and Erkki Oja Helsinki University of Technology Laboratory of Computer and Information Science P.O. Box 5400, FIN-02015 Espoo, Finland aapo.hyvarinen@hut.fi,
More informationAS a non-invasive brain imaging modality, electroencephalography
Covariance-Domain Dictionary Learning for Overcomplete EEG Source Identification Ozgur Balkan*, Student Member, IEEE, Kenneth Kreutz-Delgado, Fellow, IEEE, and Scott Makeig arxiv:5.0056v [cs.it] Dec 05
More informationBlind Machine Separation Te-Won Lee
Blind Machine Separation Te-Won Lee University of California, San Diego Institute for Neural Computation Blind Machine Separation Problem we want to solve: Single microphone blind source separation & deconvolution
More informationRobust extraction of specific signals with temporal structure
Robust extraction of specific signals with temporal structure Zhi-Lin Zhang, Zhang Yi Computational Intelligence Laboratory, School of Computer Science and Engineering, University of Electronic Science
More information