An Improved Cumulant Based Method for Independent Component Analysis
Tobias Blaschke and Laurenz Wiskott
Institute for Theoretical Biology, Humboldt University Berlin, Invalidenstraße 43, D-10115 Berlin, Germany
{t.blaschke,l.wiskott}@biologie.hu-berlin.de

Abstract. An improved method for independent component analysis based on the diagonalization of cumulant tensors is proposed. It is based on Comon's algorithm [1], but it takes third- and fourth-order cumulant tensors into account simultaneously. The underlying contrast function is also mathematically much simpler and has a more intuitive interpretation. It is therefore easier to optimize and to approximate. A comparison with Comon's algorithm, JADE [2], and FastICA [3] on different data sets demonstrates its performance.

1 Introduction

Vectorial input signals x(t) = [x_1(t), ..., x_N(t)]^T to be analyzed are often a mixture of some underlying signals s_i(t) coming from different sources. In a simple model of this data-generation process it is assumed that there are as many sources as input signal components, that the source signal components s_i are statistically independent, and that the mixing is linear and noise free, yielding the relation

    x = A s                                                        (1)

with a fixed mixing matrix A and the source signal s(t) = [s_1(t), ..., s_N(t)]^T. In the following we drop the reference to time and simply assume some sets of zero-mean sources and input data related by (1). The input components are usually statistically dependent due to the mixing process, while the sources are not. If one succeeds in finding a matrix R that yields statistically independent output components u_k, given by

    u = R x = R A s ,                                              (2)

one can recover the original sources s_i up to a permutation and a constant scaling of the sources. R is called the unmixing matrix, and finding this matrix is referred to as independent component analysis (ICA) or blind source separation. There is a variety of methods for performing ICA and a large body of literature; see [4,5] for an overview.
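The model (1) and the unmixing step (2) can be sketched numerically. The sketch below is not from the paper; the source distributions and the mixing matrix are arbitrary illustrative choices, and the inverse of A is used in place of a blindly estimated R:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10_000

# Two zero-mean, statistically independent sources s_i(t): a uniform
# (sub-Gaussian) and a Laplacian (super-Gaussian) signal.
s = np.vstack([rng.uniform(-1.0, 1.0, T), rng.laplace(0.0, 1.0, T)])
s -= s.mean(axis=1, keepdims=True)

# A fixed mixing matrix A yields the observed input data x = A s, cf. (1).
A = np.array([[2.0, 1.0], [1.0, 3.0]])
x = A @ s

# ICA looks for an unmixing matrix R such that u = R x = R A s, cf. (2),
# has independent components.  Here we cheat and use R = A^{-1}, which
# recovers the sources exactly; a blind algorithm sees only x and can
# recover them only up to permutation and scaling.
R = np.linalg.inv(A)
u = R @ x
assert np.allclose(u, s)
```

A blind algorithm must estimate R from x alone; the contrast function developed below provides the criterion for doing so.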
We extend here the cumulant-based methods [6,7] and present an improved algorithm that takes third- and fourth-order cumulants into account simultaneously and at the same time is simpler and faster than Comon's algorithm [1], on which our algorithm is based.

* Supported by a grant from the Volkswagen Foundation.
2 Improved ICA Algorithm

2.1 Cumulants and Independence

Statistical properties of the output data set u can be described by its moments or, more conveniently, by its cumulants C^u_{...}. Cumulants of a given order form a tensor; e.g., the third-order cumulants of zero-mean data are defined by C^u_ijk := ⟨u_i u_j u_k⟩, with ⟨·⟩ indicating the mean over all data points. The diagonal elements characterize the distribution of single components. For example, C^u_i, C^u_ii, C^u_iii, and C^u_iiii are the mean, variance, skewness, and kurtosis of u_i, respectively. The off-diagonal elements (all cumulants whose indices are not all equal) characterize the statistical dependencies between components. If and only if all components u_i are statistically independent do the off-diagonal elements vanish, so that the cumulant tensors of all orders are diagonal (assuming an infinite amount of data). Thus ICA is equivalent to finding an unmixing matrix that diagonalizes the cumulant tensors C^u_{...} of the output data, at least approximately.

The first-order cumulant tensor is a vector and does not have off-diagonal elements. The second-order cumulant tensor can be diagonalized easily by whitening the input data x with an appropriate matrix W, yielding y = W x. It can be shown that this exact diagonalization of the second-order cumulant tensor of the whitened data y is preserved if and only if the final matrix Q generating the output data u = Q y is orthogonal, i.e. a pure rotation, possibly plus reflections [8]. In general there does not exist an orthogonal matrix that would also diagonalize the third- or fourth-order cumulant tensor; thus the diagonalization of these tensors can only be approximate, and we need to define an optimization criterion for this approximate diagonalization, which is done in the next section.

2.2 Contrast Function

Since the goal is to diagonalize the cumulant tensors, a suitable contrast function (or cost function, to be minimized) would be the square sum over all third- and fourth-order off-diagonal elements.
However, because the square sum over all elements of a cumulant tensor is preserved under any orthogonal transformation Q of the underlying data y [9], one can equally well maximize the sum over the diagonal elements

    Ψ_34(u) = (1/3!) Σ_α (C^u_ααα)² + (1/4!) Σ_α (C^u_αααα)² .    (3)

The factors 1/3! and 1/4! arise from an expansion of the Kullback-Leibler divergence of u and y, which provides a principled derivation of this contrast function [7,10]. Due to the multilinearity of the cumulants C^u_{...} in C^y_{...}, (3) can be rewritten as

    Ψ_34(Q) = (1/3!) Σ_α ( Σ_βγδ Q_αβ Q_αγ Q_αδ C^y_βγδ )²
            + (1/4!) Σ_α ( Σ_βγδε Q_αβ Q_αγ Q_αδ Q_αε C^y_βγδε )² ,    (4)

the inner sums being C^u_ααα and C^u_αααα, respectively,
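Contrast (3) can be illustrated numerically. The following sketch (toy sources chosen for illustration, not code from the paper) shows that for independent unit-variance sources the diagonal third- and fourth-order cumulants are maximal, so that any further rotation lowers Ψ_34:

```python
import numpy as np

def psi34(u):
    """Contrast (3): weighted square sum of the diagonal third- and
    fourth-order sample cumulants of zero-mean data u (components in rows)."""
    var = (u ** 2).mean(axis=1)
    skew = (u ** 3).mean(axis=1)                    # C^u_aaa
    kurt = (u ** 4).mean(axis=1) - 3.0 * var ** 2   # C^u_aaaa
    return (skew ** 2).sum() / 6.0 + (kurt ** 2).sum() / 24.0

rng = np.random.default_rng(0)
T = 50_000
# Two independent unit-variance sources: sub-Gaussian (uniform) and
# super-Gaussian (Laplacian).
s = np.vstack([rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), T),
               rng.laplace(0.0, 1.0 / np.sqrt(2.0), T)])
s -= s.mean(axis=1, keepdims=True)

# An orthogonal mixture of the sources is still white, but its diagonal
# higher-order cumulants shrink, so the contrast drops.
phi = np.pi / 4
Q = np.array([[np.cos(phi), np.sin(phi)], [-np.sin(phi), np.cos(phi)]])
assert psi34(s) > psi34(Q @ s)
```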
which is now subject to an optimization procedure to find the orthogonal matrix Q that maximizes it.

2.3 Givens Rotations

Any orthogonal N×N matrix such as Q can be written as a product of N(N−1)/2 or more Givens rotation matrices Q^r for the rotation part and a diagonal matrix with diagonal elements ±1 for the reflection part. Since reflections do not matter in our case, we consider only the Givens rotations. A Givens rotation is a plane rotation around the origin within the subspace spanned by two selected components. For simplicity and without loss of generality we consider only N = 2, so that the Givens rotation matrix reads

    Q^r = [  cos φ   sin φ ]
          [ −sin φ   cos φ ] .                                     (5)

With (5) the contrast function (4) can be rewritten as Ψ_34(φ) = Ψ_3(φ) + Ψ_4(φ) with

    Ψ_n(φ) := (1/n!) Σ_{i=0..n} d_ni [ (cos φ)^{2n−i} (sin φ)^i + (cos φ)^i (sin φ)^{2n−i} ]    (6)

with some constants d_ni that depend only on the cumulants before rotation. The next step is to find the angle of rotation φ that maximizes Ψ_34. For N > 2 one has to perform several Givens rotations for each pair of components in order to find the global maximum.

To simplify this expression, Comon [1] defined the auxiliary variables θ := tan φ and ξ := θ − 1/θ and rewrote (6) as a rational function Ψ_3(θ) with polynomial coefficients a_i (7) and a rational function Ψ_4(ξ) with polynomial coefficients b_i (8), the constants a_i and b_i again depending only on the cumulants before rotation. To maximize (7) or (8) one has to take the derivative and find the root giving the largest value of Ψ_34. With this formulation, however, only the third-order or only the fourth-order diagonal cumulants can be maximized, but not both simultaneously.

In a more direct approach, and after some quite involved calculations using various trigonometric addition theorems, we were able to derive a contrast function that combines third- and fourth-order cumulants, is mathematically much simpler, has a more intuitive interpretation, and is therefore easier to optimize and approximate. We found

    Ψ_34(φ) = A_0 + A_4 cos(4φ + φ_4) + A_8 cos(8φ + φ_8)          (9)
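Equation (9) restricts the angular dependence of the contrast to a constant plus the 4th and 8th harmonics, and this structure holds exactly for sample cumulants because it follows from multilinearity alone. The sketch below (toy data, not from the paper) samples Ψ_34 over a grid of Givens angles, inspects its discrete Fourier spectrum, and reads off the maximizing angle on the grid:

```python
import numpy as np

def psi34(u):
    # Contrast (3) from the sample cumulants of zero-mean data u.
    var = (u ** 2).mean(axis=1)
    skew = (u ** 3).mean(axis=1)
    kurt = (u ** 4).mean(axis=1) - 3.0 * var ** 2
    return (skew ** 2).sum() / 6.0 + (kurt ** 2).sum() / 24.0

def givens(phi):
    # Givens rotation matrix (5).
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, s], [-s, c]])

rng = np.random.default_rng(1)
T = 10_000
y = np.vstack([rng.exponential(1.0, T),     # skewed source
               rng.laplace(0.0, 1.0, T)])   # kurtotic source
y -= y.mean(axis=1, keepdims=True)

# Sample the contrast on a grid of rotation angles ...
M = 32
phis = 2.0 * np.pi * np.arange(M) / M
vals = np.array([psi34(givens(p) @ y) for p in phis])

# ... and check its harmonic content: by (9), only the constant term and
# the 4th and 8th harmonics (and their negative-frequency mirrors) occur.
spec = np.abs(np.fft.fft(vals)) / M
allowed = {0, 4, 8, M - 4, M - 8}
assert all(spec[k] < 1e-10 * spec[0] for k in range(M) if k not in allowed)

# The angle maximizing the sampled contrast (to grid precision):
phi_opt = phis[np.argmax(vals)]
```

With only three harmonics present, a coarse grid plus the closed form (9) already pins down the maximizing angle, which is what makes the approximation discussed next so cheap.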
with some constants A_0, A_4, A_8 and φ_4, φ_8 that depend only on the cumulants before rotation (see the appendix). The third term comes from the fourth-order cumulants only, while the first two terms incorporate information from the third- and the fourth-order cumulants. We found empirically that A_8 is usually small compared to A_4, by a factor of about 1/8. In the simulations we will therefore also consider the approximate contrast function Ψ̃_34(φ) := A_0 + A_4 cos(4φ + φ_4), which is trivial to maximize. Note that Ψ̃_34 still takes third- and fourth-order cumulants into account. Contrast functions for third- or fourth-order cumulants only, i.e. Ψ_3, Ψ_4, Ψ̃_3, or Ψ̃_4, can easily be obtained by setting all fourth- or third-order cumulants to zero, respectively.

3 Simulations

We compared four variants of the improved algorithm, namely those based on Ψ_34(φ), Ψ̃_34(φ), Ψ_4(φ), and Ψ̃_4(φ), with Comon's original algorithm based on Ψ_4(ξ) [1], with the JADE algorithm [2], which diagonalizes fourth-order cumulant matrices, and with the FastICA package using a fixed-point algorithm with different nonlinearities [3]. We assembled three different data sets of length 4428 each, mixed by a randomly chosen mixing matrix. To quantify the performance we used the error measure E proposed by Amari et al. [11], which indicates good unmixing by low values and vanishes for perfect unmixing. Table 1 shows the performance of the different algorithms in Matlab implementation.² For underlying symmetric distributions all algorithms perform similarly well. If the source distributions are skewed, the additional third-order information is crucial, and the new method and the FastICA algorithm using a corresponding nonlinearity¹ clearly unmix better. In the case where the sources are both symmetric and asymmetric, only the new algorithm can discriminate between the different distributions. Furthermore, the algorithms with the approximate contrast work as well as those with the exact contrast.
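The error measure E of Amari et al. [11] can be sketched as follows; this uses one common normalization of the index (the paper itself gives no code), applied to the overall system matrix P = R A:

```python
import numpy as np

def amari_index(P):
    """Performance index of Amari et al. for the system matrix P = R A
    (one common normalization); it is zero if and only if P is a scaled
    permutation matrix, i.e. for perfect unmixing."""
    P = np.abs(np.asarray(P, dtype=float))
    # Row term: each row should have a single dominant entry.
    rows = (P / P.max(axis=1, keepdims=True)).sum(axis=1) - 1.0
    # Column term: each column should have a single dominant entry.
    cols = (P / P.max(axis=0, keepdims=True)).sum(axis=0) - 1.0
    return rows.sum() + cols.sum()

# A perfect unmixing matrix inverts A up to permutation and scaling,
# so the index vanishes; a poor one gives a clearly positive value.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
R_perfect = np.linalg.inv(A)[::-1] * np.array([[0.5], [-2.0]])  # permuted, rescaled inverse
assert np.isclose(amari_index(R_perfect @ A), 0.0)
assert amari_index(A) > 0.1          # "no unmixing at all" scores badly
```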
4 Conclusion

We have proposed an improved cumulant-based method for independent component analysis. In contrast to Comon's method [1], it diagonalizes third- and fourth-order cumulants simultaneously and is thus able to handle linear mixtures of symmetrically and skew distributed source signals. Due to its mathematically simple formulation, an approximate algorithm can be derived which shows equal unmixing performance. It can also handle sources with symmetric and asymmetric components.

¹ We have always chosen the nonlinearity yielding the best performance.
² Code is available from
Table 1. Unmixing performance for different algorithms and data sets: (i) 5 real acoustic sources [12] and a normally distributed source; (ii) 5 skew-normal distributed sources and a normally distributed source; (iii) 3 music sources [13], skew-normal distributed sources, and a normally distributed source. Low values of E indicate good performance. Simulations have been carried out on a 1.4 GHz AMD Athlon PC using a Matlab implementation.

contrast function / method              unmixing performance E       elapsed time/s
                                        (i)    (ii)   (iii)         (i)    (ii)   (iii)
Ψ_34(φ)    (3rd+4th order)
Ψ̃_34(φ)    (3rd+4th order, approx.)
Ψ_4(φ)     (4th order)
Ψ̃_4(φ)     (4th order, approx.)
Ψ_4(ξ)     (Comon [1])
JADE [2]
FastICA [3]

Appendix

From (6) one can derive

    Ψ_n(φ) = a_n0 + s_n4 sin 4φ + c_n4 cos 4φ + s_n8 sin 8φ + c_n8 cos 8φ

with

    a_30 := (1/3!) (1/8) [ 5 ((C^y_111)² + (C^y_222)²) + 9 ((C^y_112)² + (C^y_122)²)
                           + 6 (C^y_111 C^y_122 + C^y_112 C^y_222) ]

    a_40 := (1/4!) (1/64) [ 35 ((C^y_1111)² + (C^y_2222)²) + 80 ((C^y_1112)² + (C^y_1222)²)
                            + 108 (C^y_1122)² + 60 (C^y_1111 + C^y_2222) C^y_1122
                            + 6 C^y_1111 C^y_2222 + 96 C^y_1112 C^y_1222 ]

    s_34 := (1/3!) (6/4) (C^y_111 C^y_112 − C^y_122 C^y_222)

    c_34 := (1/3!) (1/8) [ 3 ((C^y_111)² + (C^y_222)²) − 9 ((C^y_112)² + (C^y_122)²)
                           − 6 (C^y_111 C^y_122 + C^y_112 C^y_222) ]

    s_44 := (1/4!) (1/4) [ 7 (C^y_1111 C^y_1112 − C^y_1222 C^y_2222)
                           + (C^y_1111 C^y_1222 − C^y_1112 C^y_2222)
                           + 6 C^y_1122 (C^y_1112 − C^y_1222) ]

    c_44 := (1/4!) (1/16) [ 7 ((C^y_1111)² + (C^y_2222)²) − 16 ((C^y_1112)² + (C^y_1222)²)
                            − 36 (C^y_1122)² − 12 (C^y_1111 + C^y_2222) C^y_1122
                            − 2 C^y_1111 C^y_2222 − 32 C^y_1112 C^y_1222 ]
    s_48 := (1/4!) (1/8) [ (C^y_1111 C^y_1112 − C^y_1222 C^y_2222)
                           − (C^y_1111 C^y_1222 − C^y_1112 C^y_2222)
                           − 6 C^y_1122 (C^y_1112 − C^y_1222) ]

    c_48 := (1/4!) (1/64) [ ((C^y_1111)² + (C^y_2222)²) − 16 ((C^y_1112)² + (C^y_1222)²)
                            + 36 (C^y_1122)² − 12 (C^y_1111 + C^y_2222) C^y_1122
                            + 2 C^y_1111 C^y_2222 + 32 C^y_1112 C^y_1222 ]

and s_38 := c_38 := 0. With this it is trivial to determine the constants for Ψ_34(φ) = Ψ_3(φ) + Ψ_4(φ) in the form given in (9). We find:

    A_0 := a_30 + a_40 ,
    A_4 := sqrt( (c_34 + c_44)² + (s_34 + s_44)² ) ,
    A_8 := sqrt( c_48² + s_48² ) ,
    tan φ_4 := −(s_34 + s_44) / (c_34 + c_44) ,
    tan φ_8 := −s_48 / c_48 .

The coefficients A_0, A_4, A_8 and φ_4, φ_8 are functions of the cumulants of 3rd and 4th order of the centered and whitened signals y.

References

1. Comon, P.: Tensor diagonalization, a useful tool in signal processing. In Blanke, M., Soderstrom, T., eds.: IFAC-SYSID, 10th IFAC Symposium on System Identification. Volume 1, Copenhagen, Denmark (1994)
2. Cardoso, J.F., Souloumiac, A.: Blind beamforming for non Gaussian signals. IEE Proceedings-F 140 (1993) 362-370
3. Hyvärinen, A.: Fast and robust fixed-point algorithms for independent component analysis. IEEE Transactions on Neural Networks 10 (1999) 626-634
4. Hyvärinen, A., Karhunen, J., Oja, E.: Independent Component Analysis. Wiley Series on Adaptive and Learning Systems for Signal Processing, Communications and Control. John Wiley & Sons (2001)
5. Lee, T.W., Girolami, M., Bell, A., Sejnowski, T.: A unifying framework for independent component analysis. International Journal of Computers and Mathematics with Applications (2000)
6. Comon, P.: Independent component analysis. In Lacoume, J., ed.: Proc. Int. Sig. Proc. Workshop on Higher-Order Statistics, Chamrousse, France (1991)
7. Cardoso, J.F.: High-order contrasts for independent component analysis. Neural Computation 11 (1999) 157-192
8. Hyvärinen, A.: Survey on independent component analysis. Neural Computing Surveys 2 (1999) 94-128
9. Deco, G., Obradovic, D.: An Information-Theoretic Approach to Neural Computing. Springer Series in Perspectives in Neural Computing. Springer (1996)
10. McCullagh, P.: Tensor Methods in Statistics. Monographs on Statistics and Applied Probability. Chapman and Hall (1987)
11. Amari, S., Cichocki, A., Yang, H.: Recurrent neural networks for blind separation of sources. In: Proc. of the Int. Symposium on Nonlinear Theory and its Applications (NOLTA-95), Las Vegas, USA (1995)
12. John F. Kennedy Library, Boston: Sound excerpts from the speeches of President John F. Kennedy
13. Pearlmutter, B.: 6 clips sampled from audio CDs. ~bap
More informationFourier PCA. Navin Goyal (MSR India), Santosh Vempala (Georgia Tech) and Ying Xiao (Georgia Tech)
Fourier PCA Navin Goyal (MSR India), Santosh Vempala (Georgia Tech) and Ying Xiao (Georgia Tech) Introduction 1. Describe a learning problem. 2. Develop an efficient tensor decomposition. Independent component
More informationCONTROL SYSTEMS ANALYSIS VIA BLIND SOURCE DECONVOLUTION. Kenji Sugimoto and Yoshito Kikkawa
CONTROL SYSTEMS ANALYSIS VIA LIND SOURCE DECONVOLUTION Kenji Sugimoto and Yoshito Kikkawa Nara Institute of Science and Technology Graduate School of Information Science 896-5 Takayama-cho, Ikoma-city,
More informationPrincipal Component Analysis
Principal Component Analysis Introduction Consider a zero mean random vector R n with autocorrelation matri R = E( T ). R has eigenvectors q(1),,q(n) and associated eigenvalues λ(1) λ(n). Let Q = [ q(1)
More informationIndependent Component Analysis for Blind Source Separation
Independent Component Analysis for Blind Source Separation Ilmenau University of Technology and Fraunhofer Institute for Digital Media Technology (IDMT) June 26, 2017 1 Introduction ICA 2 ICA 3 Conclusions
More informationMISEP Linear and Nonlinear ICA Based on Mutual Information
Journal of Machine Learning Research 4 (23) 297-38 Submitted /2; Published 2/3 MISEP Linear and Nonlinear ICA Based on Mutual Information Luís B. Almeida INESC-ID, R. Alves Redol, 9, -29 Lisboa, Portugal
More informationSpeed and Accuracy Enhancement of Linear ICA Techniques Using Rational Nonlinear Functions
Speed and Accuracy Enhancement of Linear ICA Techniques Using Rational Nonlinear Functions Petr Tichavský 1, Zbyněk Koldovský 1,2, and Erkki Oja 3 1 Institute of Information Theory and Automation, Pod
More informationA Convex Cauchy-Schwarz Divergence Measure for Blind Source Separation
INTERNATIONAL JOURNAL OF CIRCUITS, SYSTEMS AND SIGNAL PROCESSING Volume, 8 A Convex Cauchy-Schwarz Divergence Measure for Blind Source Separation Zaid Albataineh and Fathi M. Salem Abstract We propose
More informationIndependent Components Analysis by Direct Entropy Minimization
Independent Components Analysis by Direct Entropy Minimization Erik G. Miller egmil@eecs.berkeley.edu Computer Science Division University of California Berkeley, CA 94720, USA John W. Fisher III fisher@ai.mit.edu
More informationFuncICA for time series pattern discovery
FuncICA for time series pattern discovery Nishant Mehta and Alexander Gray Georgia Institute of Technology The problem Given a set of inherently continuous time series (e.g. EEG) Find a set of patterns
More informationComparison of Fast ICA and Gradient Algorithms of Independent Component Analysis for Separation of Speech Signals
K. Mohanaprasad et.al / International Journal of Engineering and echnolog (IJE) Comparison of Fast ICA and Gradient Algorithms of Independent Component Analsis for Separation of Speech Signals K. Mohanaprasad
More informationOne-unit Learning Rules for Independent Component Analysis
One-unit Learning Rules for Independent Component Analysis Aapo Hyvarinen and Erkki Oja Helsinki University of Technology Laboratory of Computer and Information Science Rakentajanaukio 2 C, FIN-02150 Espoo,
More informationAdvanced Introduction to Machine Learning CMU-10715
Advanced Introduction to Machine Learning CMU-10715 Independent Component Analysis Barnabás Póczos Independent Component Analysis 2 Independent Component Analysis Model original signals Observations (Mixtures)
More informationBLIND source separation (BSS) aims at recovering a vector
1030 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 53, NO. 3, MARCH 2007 Mixing and Non-Mixing Local Minima of the Entropy Contrast for Blind Source Separation Frédéric Vrins, Student Member, IEEE, Dinh-Tuan
More informationBLIND SEPARATION OF INSTANTANEOUS MIXTURES OF NON STATIONARY SOURCES
BLIND SEPARATION OF INSTANTANEOUS MIXTURES OF NON STATIONARY SOURCES Dinh-Tuan Pham Laboratoire de Modélisation et Calcul URA 397, CNRS/UJF/INPG BP 53X, 38041 Grenoble cédex, France Dinh-Tuan.Pham@imag.fr
More informationA Pseudo-Euclidean Iteration for Optimal Recovery in Noisy ICA
A Pseudo-Euclidean Iteration for Optimal Recovery in Noisy ICA James Voss The Ohio State University vossj@cse.ohio-state.edu Mikhail Belkin The Ohio State University mbelkin@cse.ohio-state.edu Luis Rademacher
More informationBlind Signal Separation: Statistical Principles
Blind Signal Separation: Statistical Principles JEAN-FRANÇOIS CARDOSO, MEMBER, IEEE Invited Paper Blind signal separation (BSS) and independent component analysis (ICA) are emerging techniques of array
More informationLecture 10: Dimension Reduction Techniques
Lecture 10: Dimension Reduction Techniques Radu Balan Department of Mathematics, AMSC, CSCAMM and NWC University of Maryland, College Park, MD April 17, 2018 Input Data It is assumed that there is a set
More informationLecture'12:' SSMs;'Independent'Component'Analysis;' Canonical'Correla;on'Analysis'
Lecture'12:' SSMs;'Independent'Component'Analysis;' Canonical'Correla;on'Analysis' Lester'Mackey' May'7,'2014' ' Stats'306B:'Unsupervised'Learning' Beyond'linearity'in'state'space'modeling' Credit:'Alex'Simma'
More informationNonlinear reverse-correlation with synthesized naturalistic noise
Cognitive Science Online, Vol1, pp1 7, 2003 http://cogsci-onlineucsdedu Nonlinear reverse-correlation with synthesized naturalistic noise Hsin-Hao Yu Department of Cognitive Science University of California
More informationPrincipal Component Analysis vs. Independent Component Analysis for Damage Detection
6th European Workshop on Structural Health Monitoring - Fr..D.4 Principal Component Analysis vs. Independent Component Analysis for Damage Detection D. A. TIBADUIZA, L. E. MUJICA, M. ANAYA, J. RODELLAR
More informationCONVOLUTIVE NON-NEGATIVE MATRIX FACTORISATION WITH SPARSENESS CONSTRAINT
CONOLUTIE NON-NEGATIE MATRIX FACTORISATION WITH SPARSENESS CONSTRAINT Paul D. O Grady Barak A. Pearlmutter Hamilton Institute National University of Ireland, Maynooth Co. Kildare, Ireland. ABSTRACT Discovering
More informationBy choosing to view this document, you agree to all provisions of the copyright laws protecting it.
This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of Helsinki University of Technology's products or services. Internal
More informationSemi-Blind approaches to source separation: introduction to the special session
Semi-Blind approaches to source separation: introduction to the special session Massoud BABAIE-ZADEH 1 Christian JUTTEN 2 1- Sharif University of Technology, Tehran, IRAN 2- Laboratory of Images and Signals
More informationSeparation of Different Voices in Speech using Fast Ica Algorithm
Volume-6, Issue-6, November-December 2016 International Journal of Engineering and Management Research Page Number: 364-368 Separation of Different Voices in Speech using Fast Ica Algorithm Dr. T.V.P Sundararajan
More informationShort-Time ICA for Blind Separation of Noisy Speech
Short-Time ICA for Blind Separation of Noisy Speech Jing Zhang, P.C. Ching Department of Electronic Engineering The Chinese University of Hong Kong, Hong Kong jzhang@ee.cuhk.edu.hk, pcching@ee.cuhk.edu.hk
More informationRobust Independent Component Analysis via Minimum -Divergence Estimation
614 IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, VOL. 7, NO. 4, AUGUST 2013 Robust Independent Component Analysis via Minimum -Divergence Estimation Pengwen Chen, Hung Hung, Osamu Komori, Su-Yun
More informationADAPTIVE LATERAL INHIBITION FOR NON-NEGATIVE ICA. Mark Plumbley
Submitteed to the International Conference on Independent Component Analysis and Blind Signal Separation (ICA2) ADAPTIVE LATERAL INHIBITION FOR NON-NEGATIVE ICA Mark Plumbley Audio & Music Lab Department
More information