Independent Component Analysis (ICA)
Independent Component Analysis (ICA)
Université catholique de Louvain (Belgium), Machine Learning Group, ucl.ac.be/mlg/

Overview
- Uncorrelation vs independence
- Blind source separation & the cocktail-party problem
- Equations, indeterminations & assumptions
- Pre-whitening step
- Some examples
- The Gaussian case
- Objective functions: how to recover independent components?
- Non-Gaussianity approach & the central limit theorem
- Minimum-dependence approach
- Real-world examples
- Extensions
What is ICA?
- PCA: finding a transformation that uncorrelates the variables
- ICA: finding a transformation that makes the variables as independent as possible
- In this lecture, the transformation is constrained to be linear and instantaneous

Independence is stronger than uncorrelation
- Uncorrelation between x and y: E[xy] = E[x]E[y]
- Independence between x and y: E[f(x)g(y)] = E[f(x)]E[g(y)] for any non-linear functions f and g (x does not carry any information about y)
- In other words: correlation measures the existence of a linear relation between variables, while dependence measures the existence of any relation between variables
Uncorrelation vs independence: example
- Let u be uniformly distributed on [-√3, √3], so that E[u] = 0 and E[u²] = 1 (density f_u = 1/(2√3))
- Let v = u². Then E[uv] = E[u³] = 0 = E[u]E[v]: u and v are uncorrelated (no linear relation between u and v)
- Let f(u) = u² and g(v) = v. Then E[f(u)g(v)] = E[u⁴] = 9/5, but E[f(u)]E[g(v)] = E[u²]E[u²] = 1: u and v are dependent (a link exists between u and v)

PCA vs ICA?
- PCA (whitening): maximum-variance projections, but NO independence! (remark: whiteness is preserved by any rotation)
- ICA: minimum-dependence directions (one cannot say anything about y knowing x); remark: independence is preserved only by rotations of kπ/2 (k in Z), i.e. up to permutation and sign!
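This example is easy to verify by simulation; a minimal sketch (sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
# u is uniform on [-sqrt(3), sqrt(3)], so that E[u] = 0 and E[u^2] = 1
u = rng.uniform(-np.sqrt(3), np.sqrt(3), size=1_000_000)
v = u ** 2

# Uncorrelated: E[uv] - E[u]E[v] = E[u^3] = 0
cov_uv = np.mean(u * v) - np.mean(u) * np.mean(v)

# Dependent: with f(u) = u^2 and g(v) = v,
# E[f(u)g(v)] = E[u^4] = 9/5, while E[f(u)]E[g(v)] = E[u^2]E[u^2] = 1
lhs = np.mean((u ** 2) * v)
rhs = np.mean(u ** 2) * np.mean(v)

print(abs(cov_uv) < 0.01)            # True: no linear relation between u and v
print(round(lhs, 1), round(rhs, 1))  # 1.8 1.0: a non-linear relation exists
```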
PCA vs ICA?
[Figure: scatter plots of the sources s, the mixtures x = As, and the outputs found by PCA and by ICA]

Independent Component Analysis (ICA)
- the «source separation» or «cocktail-party» problem
- aims: to separate signals, and to use an independence criterion instead of variance maximisation (PCA)
Blind source separation
Sources S → (mixing matrix A) → Mixtures X → (unmixing matrix W) → Outputs Y
S and A are UNKNOWN, X is KNOWN, W is TO ESTIMATE

Method?
Under several assumptions: Y = Estim(S) = W_ICA X
Cocktail-party hypotheses: linear and additive mixing, no phase delay, signals rather than data

Why ICA rather than PCA?
- Uncorrelation ≠ independence
- If W is a whitening matrix, then UW, for any U such that UU^T = I, is also a whitening matrix: the whitening matrix is highly non-unique (up to any rotation matrix!)
- W_ICA is unique, up to indeterminations
The problem in equations
Notations:
- s(t) = [s₁(t), …, s_n(t)]^T : independent signals (unknown)
- x(t) = [x₁(t), …, x_n(t)]^T : measured signals
- linear mixing: x(t) = A s(t)
The problem: to estimate W = A⁻¹ … but A is unknown! Then y = Wx = WAs will be an estimate of the sources: y = ŝ

Independence hypothesis
A is unknown, so we cannot compute W = A⁻¹. This lack of information is compensated by the independence hypothesis.

Solution and indeterminations
- We measure the independence of the signals y_i(t); the solution is reached when this independence is maximum
- Indeterminations:
  - order of the signals (independence is symmetric): permutation matrix P (low importance in applications)
  - multiplying factor on each signal: x_i(t) = Σ_{j=1}^n a_ij s_j(t) = Σ_{j=1}^n (a_ij/α_j)(α_j s_j(t)) for any non-zero constants α_j: diagonal matrix D with non-zero coefficients (a calibration could be necessary)
- Solution: W = P D A⁻¹
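The indetermination W = PDA⁻¹ can be illustrated numerically: any permuted, rescaled inverse of A unmixes the data equally well. A sketch with an arbitrary 2×2 mixing matrix (all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
s = rng.uniform(-1.0, 1.0, size=(2, 1000))   # two independent sources
A = np.array([[1.0, 0.5],
              [0.3, 1.0]])                   # illustrative mixing matrix
x = A @ s                                    # x(t) = A s(t)

P = np.array([[0.0, 1.0],
              [1.0, 0.0]])                   # permutation matrix
D = np.diag([2.0, -0.5])                     # non-zero scaling factors
W = P @ D @ np.linalg.inv(A)                 # W = P D A^(-1)

y = W @ x                                    # y = P D s
print(np.allclose(y[0], -0.5 * s[1]))        # True: reordered and rescaled
print(np.allclose(y[1], 2.0 * s[0]))         # True
```

The outputs are still the sources, just permuted and rescaled, which is exactly the residual indetermination of ICA.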
Solution: assumptions
- The source signals are mutually independent
- Since the magnitude of the s_i cannot be known, it is fixed such that E[s_i²] = 1; hence it is supposed that E[ss^T] = I
- The mixing matrix is supposed to be constant in time

Whitening: a preprocessing to ICA?
sources s → centered observations x = As → whitened signals z = Vx = VAs → output signals y = Wz
Whitening: a preprocessing to ICA?
Why unmix z (the whitened signals) instead of x?
- If z is white, then VA is orthogonal: E[zz^T] = (VA) E[ss^T] (VA)^T = (VA)(VA)^T = I, using E[ss^T] = I
- If VA is orthogonal, W reduces to an orthogonal matrix: E[yy^T] = W E[zz^T] W^T = WW^T = I
- Hence only n(n-1)/2 parameters, instead of n², have to be estimated

Separation of uniform sources
[Figure: scatter plots of two uniform sources and of two estimations: one with a permutation of s₁ and s₂ and a double sign inversion, another with a sign inversion of s₁ and no permutation]
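The whitening step can be sketched as follows: after computing V from the covariance of x, the product VA is numerically orthogonal, so only a rotation remains to be estimated (the mixing matrix here is illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
# Unit-variance independent sources, so E[ss^T] ~ I
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 500_000))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
x = A @ s

# Symmetric whitening matrix V from the eigendecomposition of cov(x)
d, E = np.linalg.eigh(np.cov(x))
V = E @ np.diag(d ** -0.5) @ E.T
z = V @ x

print(np.allclose(np.cov(z), np.eye(2), atol=1e-2))  # True: z is white
M = V @ A
print(np.allclose(M @ M.T, np.eye(2), atol=1e-2))    # True: VA is orthogonal
```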
Uncorrelation and independence [1]
[Figure: sources, mixtures, whitening, and ICA (FastICA) results on images]

Uncorrelation and independence [2]
[Figure: sources, mixtures, whitening, and ICA (SWICA) results on images; F. Vrins]
Warning: the mean and variance of the original images are important!
The Gaussian case
If the sources have a Gaussian distribution:
- the temporal structure looks (but isn't!) similar to that of a uniform random signal, while the scatter plot is very different
- φ(x) = 1/(√(2π)σ) · exp(-(x-µ)²/(2σ²)) (fully described by mean and variance)
[Figure: Gaussian sources, temporal structure and scatter plot]

The Gaussian case
If the sources have a Gaussian distribution, independence is equivalent to uncorrelation!
[Figure: mixtures, maximum-variance directions, scaling (whitening)]
A rotation after whitening does not change anything!
The Gaussian case
[Figure: superimposition of the scatter plots of the Gaussian sources and the whitened mixtures]
- How can we find the rotation corresponding to the original sources (up to permutation/scale)? We cannot:
other information (temporal structure, etc.) is needed to separate Gaussian sources.

Main tool for ICA: independence
- Discrete case: P(A, B) = P(A)P(B), equivalently P(A|B) = P(A)
- Continuous case: f_x(x) = Π_{i=1}^n f_{x_i}(x_i)
- The problem: to measure the dependence between signals, and to minimize this dependence
ICA objective functions (Y = WX = WAS)
- Non-Gaussianity approach: by the central limit theorem, the PDF of a sum of n independent random variables converges to a Gaussian. Measure of non-Gaussianity: find W such that the PDFs of the outputs are as different as possible from the Gaussian function (one output signal at a time)
- Independence approach: find independence measures between signals, through an estimation of the PDFs or of an independence criterion (all output signals together)

Gaussianity and CLT
[Figure: central limit theorem illustrated with sums of n uniform variables, for increasing n]
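The CLT illustration can be reproduced numerically: the kurtosis of a normalized sum of n independent uniform variables shrinks towards the Gaussian value 0 as n grows (a sketch, not the original figure):

```python
import numpy as np

def kurtosis(x):
    """Fourth cumulant of a standardized sample: E[x^4] - 3."""
    x = (x - x.mean()) / x.std()
    return np.mean(x ** 4) - 3.0

rng = np.random.default_rng(3)
for n in (1, 2, 4, 16):
    # normalized sum of n independent uniform variables
    x = rng.uniform(-1.0, 1.0, size=(n, 500_000)).sum(axis=0) / np.sqrt(n)
    print(n, round(kurtosis(x), 2))
# A single uniform variable has kurtosis -1.2; since cumulants of
# independent variables add, the sum's kurtosis shrinks like -1.2/n
```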
Non-Gaussianity approach
- Minimum differential entropy: Gaussians have maximum differential entropy
- Maximum negentropy: equivalent to differential entropy
- Maximum positive transform of the kurtosis: Gaussians have zero kurtosis
- Gram-Charlier expansion: measures the difference between the output PDF and the Gaussian PDF

Entropy [1/2]
Discrete case: H(x) = -Σ_{i=1}^K p_i log(p_i)
- H(x) = 0 (minimum) if p_i = 1 and p_j = 0 for j ≠ i
- H(x) = log(K) (maximum) if p_i = 1/K
Continuous case: differential entropy h(x) = -∫ f(u) log f(u) du
Entropy [2/2]
Continuous case (continued):
- maximum differential entropy for a given variance σ²: the Gaussian, with h_G(x) = ½ log(2πeσ²)
- differential entropy is invariant to orthogonal transforms
- minimizing h(x) pushes the PDF of x far from the Gaussian: find W such that the output entropies are low (with unit-variance x_i)

Negentropy
- negentropy: difference w.r.t. the entropy of a Gaussian, J(x) = h(x_G) - h(x)
- multi-dimensional case: J(x) = ∫ f_x(u) log( f_x(u) / f_{x_G}(u) ) du
- find W such that J(x) is maximum (with unit-variance x_i)
Kurtosis: intuitive considerations
- Definition of the kurtosis: κ₄(x) = E[x⁴] - 3(E[x²])²
- Interesting properties: for a Gaussian PDF, κ₄(x_G) = 0; for most non-Gaussian PDFs, κ₄(x) ≠ 0
- Find W such that Σ_{i=1}^m |κ₄(x_i)| is maximum (with unit-variance x_i)

Kurtosis: illustration
[Figure: PDFs with negative, zero (Gaussian) and positive kurtosis]
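That mixtures are "more Gaussian" than the sources, which is what the kurtosis contrast exploits, can be checked directly: an orthogonal mixture of two uniform sources has a smaller absolute kurtosis than the sources themselves (angle and seed are illustrative):

```python
import numpy as np

def kurt(x):
    x = (x - x.mean()) / x.std()
    return np.mean(x ** 4) - 3.0

rng = np.random.default_rng(4)
# Two unit-variance uniform sources, each with kurtosis -1.2
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 500_000))

# Mix them with a rotation (the sources are already white)
theta = np.pi / 6
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
y = U @ s

print(round(kurt(s[0]), 1))   # -1.2: the source is far from Gaussian
print(round(kurt(y[0]), 2))   # near -0.75: the mixture is closer to 0
```

Here κ₄(y₁) = (cos⁴θ + sin⁴θ)·(-1.2) = -0.75, strictly smaller in magnitude than the source kurtosis, so maximizing |κ₄| drives W back towards the sources.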
Gram-Charlier expansion
- A Taylor expansion approximates a function f around f(x₀); the Gram-Charlier expansion approximates a PDF p_x around the Gaussian function φ
- Truncated at fourth order: p_x(ξ) ≈ φ(ξ) [1 + κ₃(x) H₃(ξ)/3! + κ₄(x) H₄(ξ)/4!], where the terms beyond 1 form the «non-Gaussian part» of p_x

Minimum-dependence approach
- Minimum mutual information
- Minimum sum of marginal entropies
- Minimum positive transform of the cross-cumulants
Mutual information and marginal entropies
- Mutual information (MI): I(x) = ∫ f_x(u) log( f_x(u) / Π_{i=1}^n f_{x_i}(u_i) ) du
- I(x) = 0 iff all the x_i are independent
- MI and the sum of the outputs' marginal entropies, for x = Wz with WW^T = I:
  I(x) = Σ_{i=1}^m h(x_i) - h(x) = Σ_{i=1}^m h(x_i) - h(Wz) = Σ_{i=1}^m h(x_i) - h(z) - log|det(W)|

Mutual information and marginal entropies
- Mutual information: difficult to estimate (it requires the joint PDF of x) and computationally costly; find W such that I(x) is minimum
- Sum of the outputs' marginal entropies: better than MI because no estimation of the joint PDF is needed; find W such that WW^T = I while minimizing Σ_{i=1}^m h(x_i)
Moments and cumulants
- The probability density function carries the properties of the distribution (mean, variance, …)
- Moments: order-r moment µ'_r(x) = E[x^r]; centered order-r moment µ_r(x) = E[(x - E[x])^r]

Independence, PCA and ICA
- Whitening: diagonalization of the covariance matrix (+ scaling); only moments of order ≤ 2 are taken into account
- ICA: diagonalization of a higher-order cumulant tensor (a hyper-matrix with four indices i, j, k, l), like a higher-order covariance, in order to go further than (linear) decorrelation!
- Independence: one should know all the cross-cumulants and then make them equal to zero!
The Gaussian case (cont'd)
- A Gaussian distribution is perfectly defined by the mean and variance of the variable: all its cumulants of order > 2 are strictly zero!
- PCA: the data are described by covariance matrices; only moments of order ≤ 2 are taken into account (set to zero)
- ICA = make the higher-order statistics zero; but this is already the case for Gaussian PDFs (see the Gram-Charlier expansion)!
- Decorrelation = independence for Gaussian variables! A decorrelation transform is determined only up to a rotation: too many indeterminations for the BSS problem; other information is needed (temporal structure, frequency, …)

Theory and practice
- In theory, independence measures require the knowledge of the PDFs (to compute mutual information, entropies, …); in practice, those PDFs are unknown
- Two possibilities: density estimation (a difficult task), or direct estimation of the independence measures
- Example of independence approximation: independence = all higher-order cross-cumulants must be zero; approximation = the cross-covariances and cross-kurtoses must be zero
Dependence minimization
- How to maximize independence or non-Gaussianity? For example, through cumulants or negentropy
- Often a pre-whitening step is useful: the ICA problem reduces to finding a rotation matrix
  - ++ : the number of elements to estimate drops from n² to n(n-1)/2
  - -- : computational problems and propagated errors (if PCA failed)
- Objective functions (OF) to minimize: estimations or approximations of non-Gaussianity and independence measures; beware of local minima
- Algorithms to minimize the OF: neural (gradient-based, …) and algebraic methods, plus specific algorithms for specific problems

Local vs global criterion
[Figure: source scatter plot and the criterion I(y) as a function of the rotation angle θ, for y = Ws with W the rotation matrix [[cos θ, -sin θ], [sin θ, cos θ]]; a «local criterion» has minima repeating every π/2, a «global criterion» is shown over [0, 2π]]
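After pre-whitening, the rotation search sketched on this slide can be written as a scan over θ, keeping the angle that maximizes the summed absolute kurtosis of the outputs (a toy contrast maximization; as on the slide, the separating angle is only identified modulo π/2):

```python
import numpy as np

def abs_kurt_sum(y):
    """Sum of |kurtosis| over the rows of y (the contrast used here)."""
    y = (y - y.mean(axis=1, keepdims=True)) / y.std(axis=1, keepdims=True)
    return float(np.sum(np.abs(np.mean(y ** 4, axis=1) - 3.0)))

def rot(t):
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

rng = np.random.default_rng(5)
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 200_000))  # white sources
phi = 0.9
z = rot(phi) @ s    # whitened mixtures: mixing by a pure rotation

# Scan the rotation angle and keep the most non-Gaussian outputs
thetas = np.linspace(0.0, np.pi / 2, 91)
scores = [abs_kurt_sum(rot(-t) @ z) for t in thetas]
best = thetas[int(np.argmax(scores))]
print(abs(best - phi) < 0.1)  # True: the separating angle is recovered
```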
Image preprocessing
[Figure]

Signal separation
[Figure: two examples of separation; the estimated outputs recover the sources up to sign, e.g. y₁ ≈ -s₁]
Biomedical application: FECG extraction
[Figure: signals recorded on a pregnant woman's abdomen; after whitening/ICA, the extracted source signals show the maternal ECG and the fetal ECG; F. Vrins]

Mixture of digital sources (PCA vs ICA)
[Figure: PCA and ICA outputs]
Independent subimages
A.J. Bell, T.J. Sejnowski, "Edges are the Independent Components of natural scenes", NIPS 96, MIT Press.

Handsfree phone in car
N. Charkani El Hassani, "Séparation auto-adaptative de sources pour des mélanges convolutifs. Application à la téléphonie mains-libres dans les voitures" (Self-adaptive source separation for convolutive mixtures, applied to hands-free car telephony), doctoral thesis, INP Grenoble.
Multiple RF tags
Y. Deville, J. Damour, N. Charkani, "Improved multi-tag radio-frequency identification systems based on new source separation neural networks", Proc. of ICA 99, Aussois (France), January 1999.

Financial time series
[Figure: ICA reconstruction (4 ICs) of stock-return series]
A.D. Back, A.S. Weigend, "A First Application of Independent Component Analysis to Extracting Structure from Stock Returns", International Journal of Neural Systems, Vol. 8 (October 1997).
Cocktail party
[Figure: speech-music separation and speech-speech separation, observations and estimations]
T.-W. Lee, Institute for Neural Computation, University of California (San Diego)

Problem extensions [1/3]
Basic model: x_i(t) = Σ_{j=1}^m a_ij s_j(t), i = 1, …, n (n measures, m sources, n ≥ m)
Extensions:
- n > m: first m is estimated, then a PCA stage reduces the dimension from n to m, then source separation (ICA) on the m signals
- n < m: the n most powerful sources are estimated, and the result is corrupted by the m - n other sources; other techniques use sparsity, …
Problem extensions [2/3]
Extensions (cont'd):
- noisy observations: x_i(t) = Σ_{j=1}^m a_ij s_j(t) + n_i(t), i = 1, …, n, i.e. x(t) = A s(t) + n(t); even if W is estimated perfectly, y(t) = Wx(t) = WAs(t) + Wn(t)
- if n >> m: specific algorithms (projection onto the signal subspace)
- ill-conditioned mixings (the rows of A are similar): specific algorithms

Problem extensions [3/3]
Extensions (cont'd): more complex mixtures (filtering), handled by specific algorithms
- convolutive mixtures: x_i(t) = Σ_{j=1}^m Σ_{k=0}^{p-1} a_ij(k) s_j(t-k), i = 1, …, n, i.e. x(t) = A(t) * s(t)
- post non-linear mixtures: x_i(t) = f_i( Σ_{j=1}^m a_ij s_j(t) ), i = 1, …, n
- non-linear mixtures: x_i(t) = f_i( s_1(t), …, s_m(t) ), i = 1, …, n
Sources and references
Some ideas and figures contained in these slides come from:
- C. Jutten & J. Hérault, "Blind separation of sources, Part I", Signal Processing, vol. 24, pp. 1-10, 1991.
- F. Vrins, J.A. Lee, V. Vigneron and C. Jutten, "Improving Independent Component Analysis Performances by Variable Selection", IEEE NNSP'03, September 17-19, 2003, Toulouse (France).
- A. Paraschiv-Ionescu et al., "High performance magnetic field smart sensor arrays with source separation", Proc. 1st Int. Conf. on Modeling and Simulation of Microsystems (MSM98), Santa Clara (USA), April 6-8, 1998.
- T.M. Cover and J.A. Thomas, "Elements of Information Theory", Wiley and Sons, New York, 1991.
- A. Hyvärinen, J. Karhunen and E. Oja, "Independent Component Analysis", Wiley series on adaptive and learning systems for signal processing, communications and control, S. Haykin ed., 2001.
Thanks to Frédéric Vrins for many slides!
Recursive Generalized Eigendecomposition for Independent Component Analysis Umut Ozertem 1, Deniz Erdogmus 1,, ian Lan 1 CSEE Department, OGI, Oregon Health & Science University, Portland, OR, USA. {ozertemu,deniz}@csee.ogi.edu
More informationIndependent Component Analysis of Incomplete Data
Independent Component Analysis of Incomplete Data Max Welling Markus Weber California Institute of Technology 136-93 Pasadena, CA 91125 fwelling,rmwg@vision.caltech.edu Keywords: EM, Missing Data, ICA
More informationMachine Learning for Signal Processing. Analysis. Class Nov Instructor: Bhiksha Raj. 8 Nov /18797
11-755 Machine Learning for Signal Processing Independent Component Analysis Class 20. 8 Nov 2012 Instructor: Bhiksha Raj 8 Nov 2012 11755/18797 1 A brief review of basic probability Uncorrelated: Two
More informationTHEORETICAL CONCEPTS & APPLICATIONS OF INDEPENDENT COMPONENT ANALYSIS
THEORETICAL CONCEPTS & APPLICATIONS OF INDEPENDENT COMPONENT ANALYSIS SONALI MISHRA 1, NITISH BHARDWAJ 2, DR. RITA JAIN 3 1,2 Student (B.E.- EC), LNCT, Bhopal, M.P. India. 3 HOD (EC) LNCT, Bhopal, M.P.
More informationICA [6] ICA) [7, 8] ICA ICA ICA [9, 10] J-F. Cardoso. [13] Matlab ICA. Comon[3], Amari & Cardoso[4] ICA ICA
16 1 (Independent Component Analysis: ICA) 198 9 ICA ICA ICA 1 ICA 198 Jutten Herault Comon[3], Amari & Cardoso[4] ICA Comon (PCA) projection persuit projection persuit ICA ICA ICA 1 [1] [] ICA ICA EEG
More informationIndependent Component Analysis
Chapter 5 Independent Component Analysis Part I: Introduction and applications Motivation Skillikorn chapter 7 2 Cocktail party problem Did you see that Have you heard So, yesterday this guy I said, darling
More informationBy choosing to view this document, you agree to all provisions of the copyright laws protecting it.
This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of Helsinki University of Technology's products or services. Internal
More informationFeature Extraction with Weighted Samples Based on Independent Component Analysis
Feature Extraction with Weighted Samples Based on Independent Component Analysis Nojun Kwak Samsung Electronics, Suwon P.O. Box 105, Suwon-Si, Gyeonggi-Do, KOREA 442-742, nojunk@ieee.org, WWW home page:
More informationMULTICHANNEL SIGNAL PROCESSING USING SPATIAL RANK COVARIANCE MATRICES
MULTICHANNEL SIGNAL PROCESSING USING SPATIAL RANK COVARIANCE MATRICES S. Visuri 1 H. Oja V. Koivunen 1 1 Signal Processing Lab. Dept. of Statistics Tampere Univ. of Technology University of Jyväskylä P.O.
More informationZero-Entropy Minimization for Blind Extraction of Bounded Sources (BEBS)
Zero-Entropy Minimization for Blind Extraction of Bounded Sources (BEBS) Frédéric Vrins 1, Deniz Erdogmus 2, Christian Jutten 3, and Michel Verleysen 1 1 Machine Learning Group Université catholique de
More informationAcoustic Source Separation with Microphone Arrays CCNY
Acoustic Source Separation with Microphone Arrays Lucas C. Parra Biomedical Engineering Department City College of New York CCNY Craig Fancourt Clay Spence Chris Alvino Montreal Workshop, Nov 6, 2004 Blind
More informationStatistical signal processing
Statistical signal processing Short overview of the fundamentals Outline Random variables Random processes Stationarity Ergodicity Spectral analysis Random variable and processes Intuition: A random variable
More informationFEATURE EXTRACTION USING SUPERVISED INDEPENDENT COMPONENT ANALYSIS BY MAXIMIZING CLASS DISTANCE
FEATURE EXTRACTION USING SUPERVISED INDEPENDENT COMPONENT ANALYSIS BY MAXIMIZING CLASS DISTANCE Yoshinori Sakaguchi*, Seiichi Ozawa*, and Manabu Kotani** *Graduate School of Science and Technology, Kobe
More informationc Springer, Reprinted with permission.
Zhijian Yuan and Erkki Oja. A FastICA Algorithm for Non-negative Independent Component Analysis. In Puntonet, Carlos G.; Prieto, Alberto (Eds.), Proceedings of the Fifth International Symposium on Independent
More informationNonlinear reverse-correlation with synthesized naturalistic noise
Cognitive Science Online, Vol1, pp1 7, 2003 http://cogsci-onlineucsdedu Nonlinear reverse-correlation with synthesized naturalistic noise Hsin-Hao Yu Department of Cognitive Science University of California
More informationLecture 10: Dimension Reduction Techniques
Lecture 10: Dimension Reduction Techniques Radu Balan Department of Mathematics, AMSC, CSCAMM and NWC University of Maryland, College Park, MD April 17, 2018 Input Data It is assumed that there is a set
More informationCPSC 340: Machine Learning and Data Mining. More PCA Fall 2017
CPSC 340: Machine Learning and Data Mining More PCA Fall 2017 Admin Assignment 4: Due Friday of next week. No class Monday due to holiday. There will be tutorials next week on MAP/PCA (except Monday).
More informationCPSC 340: Machine Learning and Data Mining. Sparse Matrix Factorization Fall 2018
CPSC 340: Machine Learning and Data Mining Sparse Matrix Factorization Fall 2018 Last Time: PCA with Orthogonal/Sequential Basis When k = 1, PCA has a scaling problem. When k > 1, have scaling, rotation,
More informationMINIMIZATION-PROJECTION (MP) APPROACH FOR BLIND SOURCE SEPARATION IN DIFFERENT MIXING MODELS
MINIMIZATION-PROJECTION (MP) APPROACH FOR BLIND SOURCE SEPARATION IN DIFFERENT MIXING MODELS Massoud Babaie-Zadeh ;2, Christian Jutten, Kambiz Nayebi 2 Institut National Polytechnique de Grenoble (INPG),
More informationShort-Time ICA for Blind Separation of Noisy Speech
Short-Time ICA for Blind Separation of Noisy Speech Jing Zhang, P.C. Ching Department of Electronic Engineering The Chinese University of Hong Kong, Hong Kong jzhang@ee.cuhk.edu.hk, pcching@ee.cuhk.edu.hk
More informationON-LINE BLIND SEPARATION OF NON-STATIONARY SIGNALS
Yugoslav Journal of Operations Research 5 (25), Number, 79-95 ON-LINE BLIND SEPARATION OF NON-STATIONARY SIGNALS Slavica TODOROVIĆ-ZARKULA EI Professional Electronics, Niš, bssmtod@eunet.yu Branimir TODOROVIĆ,
More informationSTATS 306B: Unsupervised Learning Spring Lecture 12 May 7
STATS 306B: Unsupervised Learning Spring 2014 Lecture 12 May 7 Lecturer: Lester Mackey Scribe: Lan Huong, Snigdha Panigrahi 12.1 Beyond Linear State Space Modeling Last lecture we completed our discussion
More informationBlind separation of sources that have spatiotemporal variance dependencies
Blind separation of sources that have spatiotemporal variance dependencies Aapo Hyvärinen a b Jarmo Hurri a a Neural Networks Research Centre, Helsinki University of Technology, Finland b Helsinki Institute
More informationData Analysis and Manifold Learning Lecture 6: Probabilistic PCA and Factor Analysis
Data Analysis and Manifold Learning Lecture 6: Probabilistic PCA and Factor Analysis Radu Horaud INRIA Grenoble Rhone-Alpes, France Radu.Horaud@inrialpes.fr http://perception.inrialpes.fr/ Outline of Lecture
More informationFAST ALGORITHM FOR ESTIMATING MUTUAL INFORMATION, ENTROPIES AND SCORE FUNCTIONS. Dinh Tuan Pham
FAST ALGORITHM FOR ESTIMATING MUTUAL INFORMATION, ENTROPIES AND SCORE FUNCTIONS Dinh Tuan Pham Laboratoire de Modelling and Computation, CNRS, IMAG BP. 53, 384 Grenoble cedex, France ABSTRACT This papers
More informationIndependent component analysis for biomedical signals
INSTITUTE OF PHYSICS PUBLISHING Physiol. Meas. 26 (2005) R15 R39 PHYSIOLOGICAL MEASUREMENT doi:10.1088/0967-3334/26/1/r02 TOPICAL REVIEW Independent component analysis for biomedical signals Christopher
More informationFinal Report For Undergraduate Research Opportunities Project Name: Biomedical Signal Processing in EEG. Zhang Chuoyao 1 and Xu Jianxin 2
ABSTRACT Final Report For Undergraduate Research Opportunities Project Name: Biomedical Signal Processing in EEG Zhang Chuoyao 1 and Xu Jianxin 2 Department of Electrical and Computer Engineering, National
More informationBlind separation of instantaneous mixtures of dependent sources
Blind separation of instantaneous mixtures of dependent sources Marc Castella and Pierre Comon GET/INT, UMR-CNRS 7, 9 rue Charles Fourier, 9 Évry Cedex, France marc.castella@int-evry.fr, CNRS, I3S, UMR
More informationComparison of Fast ICA and Gradient Algorithms of Independent Component Analysis for Separation of Speech Signals
K. Mohanaprasad et.al / International Journal of Engineering and echnolog (IJE) Comparison of Fast ICA and Gradient Algorithms of Independent Component Analsis for Separation of Speech Signals K. Mohanaprasad
More informationDimensionality Reduction. CS57300 Data Mining Fall Instructor: Bruno Ribeiro
Dimensionality Reduction CS57300 Data Mining Fall 2016 Instructor: Bruno Ribeiro Goal } Visualize high dimensional data (and understand its Geometry) } Project the data into lower dimensional spaces }
More informationTo appear in Proceedings of the ICA'99, Aussois, France, A 2 R mn is an unknown mixture matrix of full rank, v(t) is the vector of noises. The
To appear in Proceedings of the ICA'99, Aussois, France, 1999 1 NATURAL GRADIENT APPROACH TO BLIND SEPARATION OF OVER- AND UNDER-COMPLETE MIXTURES L.-Q. Zhang, S. Amari and A. Cichocki Brain-style Information
More information