Independent Component Analysis (ICA). Bhaskar D. Rao, University of California, San Diego. Email: brao@ucsd.edu


References

1. Hyvärinen, A., Karhunen, J., & Oja, E. (2004). Independent Component Analysis (Vol. 46). John Wiley & Sons.
2. Hyvärinen, A., & Oja, E. (2000). Independent component analysis: algorithms and applications. Neural Networks, 13(4-5), 411-430.
3. Adali, T., Anderson, M., & Fu, G.-S. (2014). Diversity in independent component and vector analyses: Identifiability, algorithms, and applications in medical imaging. IEEE Signal Processing Magazine, 31(3), 18-33.
4. Novey, M., & Adali, T. (2008). Complex ICA by negentropy maximization. IEEE Transactions on Neural Networks, 19(4), 596-609.

ICA Instantaneous Mixing Model: Assume N sensors and D sources with D ≤ N that are linearly mixed:

x[n] = A s[n] + z[n], i.e.,

\begin{bmatrix} x_1[n] \\ x_2[n] \\ \vdots \\ x_N[n] \end{bmatrix} =
\begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1D} \\ a_{21} & a_{22} & \cdots & a_{2D} \\ \vdots & & & \vdots \\ a_{N1} & a_{N2} & \cdots & a_{ND} \end{bmatrix}
\begin{bmatrix} s_1[n] \\ s_2[n] \\ \vdots \\ s_D[n] \end{bmatrix} + z[n]

Problem: Both the mixing matrix A and the sources s[n] are unknown. From the measurements x[n], n = 0, 1, ..., L-1, recover A and s[n].

Assumptions:
- Sources are zero mean, i.e., E(s_l[n]) = 0, l = 1, ..., D
- Sources are uncorrelated, i.e., E(s[n] s^T[n]) = diag(P_l)
- Sources are independent, i.e., p(s_1, s_2, ..., s_D) = ∏_{l=1}^{D} p_l(s_l)
- Sources are non-Gaussian distributed, except for possibly one

ICA relies on the independence and non-Gaussianity assumptions on the sources.
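A minimal sketch of this instantaneous mixing model in Python (the specific sources, dimensions, and noise level are illustrative assumptions, not values from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
L, D, N = 1000, 2, 3                      # samples, sources, sensors (D <= N)

# Zero-mean, independent, non-Gaussian sources: one sub-Gaussian (uniform),
# one super-Gaussian (Laplacian).
s = np.vstack([rng.uniform(-1, 1, L), rng.laplace(0, 1, L)])
s -= s.mean(axis=1, keepdims=True)        # enforce the zero-mean assumption

A = rng.standard_normal((N, D))           # unknown N x D mixing matrix
z = 0.01 * rng.standard_normal((N, L))    # additive noise z[n]
x = A @ s + z                             # observations x[n] = A s[n] + z[n]
```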

ICA and Applications. A more general model is convolutive mixing: x_l[n] = ∑_{k=1}^{D} h_{lk}[n] * s_k[n], where * denotes convolution (a sketch follows this list). ICA has several practical applications:
- The cocktail party problem: separation of voices, music, or sounds
- Sensor array processing, e.g., radar
- Biomedical signal processing with multiple sensors: EEG, ECG, MEG, fMRI
- Telecommunications, e.g., multiuser detection in CDMA
- Financial and other time series
- Noise removal from signals and images
- Feature extraction for images and signals
- Brain modelling
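A hedged sketch of the convolutive mixing model; the filter length K and the random filters h_lk below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
L, D, N, K = 1000, 2, 2, 5        # samples, sources, sensors, filter length

s = rng.laplace(size=(D, L))              # independent non-Gaussian sources
h = rng.standard_normal((N, D, K))        # mixing filters h_lk[n] (assumed)

# x_l[n] = sum_k (h_lk * s_k)[n]: convolution replaces the scalar gains a_lk.
x = np.zeros((N, L))
for l in range(N):
    for k in range(D):
        x[l] += np.convolve(s[k], h[l, k])[:L]
```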

Cocktail Party Problem

ICA and EEG Processing [1]

[1] Makeig, S., Bell, A. J., Jung, T.-P., & Sejnowski, T. J. (1996). Independent component analysis of electroencephalographic data. In Advances in Neural Information Processing Systems (pp. 145-151).

Principal Component Analysis (PCA). PCA is a data representation/compression technique: it finds an efficient low-dimensional orthonormal basis to represent a random vector in a mean-squared sense.

R_xx = E(x[n] x^H[n]) = A P A^H + σ_z² I, where P = diag(P_l)

The eigen-decomposition of R_xx should allow us to find the number of sources D and the noise variance. Let us assume N = D and σ_z² = 0.

Problem: Find an orthonormal basis W for a p-dimensional subspace (p ≤ N), i.e., W ∈ C^{N×p} with W^H W = I_{p×p}, such that E(‖W^H x[n]‖²) is maximized:

max_{W ∈ C^{N×p}, W^H W = I_{p×p}} tr(W^H R_xx W)

Solution (PCA): W = [q_1, ..., q_p], where q_l is the eigenvector of R_xx corresponding to the l-th largest eigenvalue. The approximation of x[n] is obtained as x̂[n] = W W^H x[n].
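A sketch of PCA via the eigen-decomposition of the sample covariance; the synthetic data x is a placeholder assumption (any zero-mean N × L data matrix would do):

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, L = 3, 2, 1000
# Placeholder data with genuine low-dimensional structure: D latent sources
# plus weak noise, so the top-p eigenvalues dominate.
x = rng.standard_normal((N, D)) @ rng.standard_normal((D, L))
x += 0.1 * rng.standard_normal((N, L))

R_xx = (x @ x.conj().T) / L               # sample estimate of E(x[n] x[n]^H)

eigvals, eigvecs = np.linalg.eigh(R_xx)   # eigh: ascending eigenvalues
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # reorder descending

p = D
W = eigvecs[:, :p]                        # top-p eigenvectors, W^H W = I
x_hat = W @ (W.conj().T @ x)              # approximation x̂[n] = W W^H x[n]
```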

PCA and Whitening. If E = [q_1, ..., q_N], then R_xx = E Λ E^H is an eigen-decomposition, where Λ = diag(λ_l) contains the ordered eigenvalues, i.e., λ_1 ≥ λ_2 ≥ ... ≥ λ_N ≥ 0. Then B = Λ^{-1/2} E^H is a whitening matrix, i.e., B R_xx B^H = I_{N×N}, and y[n] = B x[n] has uncorrelated components.

If B is a whitening matrix, then QB is also a whitening matrix for any orthonormal matrix Q, and y_Q[n] = Q B x[n] also has uncorrelated components.

Implications:
- Gaussian sources are not separable: if the sources s[n] are Gaussian, then y[n] and y_Q[n] both represent independent Gaussian sources, so the answer is non-unique. Uncorrelatedness alone is not enough to separate the sources.
- We need independent and non-Gaussian sources: the goal is then to find Q, the remaining unknown.
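A short numerical check of the whitening construction and its non-uniqueness; the Laplacian sources and the random orthonormal Q are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
N, L = 3, 5000
x = rng.standard_normal((N, N)) @ rng.laplace(size=(N, L))   # mixed sources

lam, E = np.linalg.eigh((x @ x.T) / L)    # R_xx = E diag(lam) E^H
B = np.diag(lam ** -0.5) @ E.T            # whitening matrix B = Λ^(-1/2) E^H
y = B @ x                                 # whitened data

# Any orthonormal Q gives another whitening matrix QB.
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
y_Q = (Q @ B) @ x

print(np.allclose((y @ y.T) / L, np.eye(N)))      # True: cov(y) = I
print(np.allclose((y_Q @ y_Q.T) / L, np.eye(N)))  # True: cov(y_Q) = I
```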

Ambiguities/Indeterminacies in the ICA problem x[n] = A s[n]:

1. We cannot determine the variances (energies) of the independent components (scaling ambiguity). The reason is that we can always write x[n] = A D D^{-1} s[n], where D is a nonsingular diagonal matrix. Therefore, the magnitudes of the s_i[n] can be freely normalized, typically to unit variance. This still leaves the sign of s_i[n] undetermined. Fortunately, it is often the waveform of s_i[n] that is of interest.

2. We cannot determine the order of the independent components (permutation ambiguity). This is because we can always change the order of the terms in the sum and call any one of the independent components the first one.

Mathematically, this means that instead of the original s[n], we can only recover D P s[n], where D is any nonsingular diagonal scaling matrix and P is any permutation matrix. Note that the elements of the vector D P s[n] are independent if and only if the elements of s[n] are independent. A numerical check of both ambiguities follows below.
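A quick numerical check of both ambiguities; the particular scaling matrix and permutation below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
s = rng.laplace(size=(3, 4))              # original sources
A = rng.standard_normal((3, 3))           # original mixing matrix

Dg = np.diag([2.0, -0.5, 3.0])            # arbitrary scaling (with a sign flip)
P = np.eye(3)[[2, 0, 1]]                  # arbitrary permutation matrix

A2 = A @ np.linalg.inv(P @ Dg)            # compensated mixing matrix
s2 = (P @ Dg) @ s                         # rescaled and reordered sources

print(np.allclose(A @ s, A2 @ s2))        # True: identical observations x[n]
```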

ICA Methods. Assumption: all random variables are zero mean, the number of sensors equals the number of sources, i.e., N = D, and A is square and invertible.

- Maximum likelihood estimation: assume a certain non-Gaussian prior on the sources and compute an ML estimate of the mixing matrix A. Once A is known, the sources can be demixed.
- Independence criteria: random variables Y_1 and Y_2 being independent implies p_{Y_1 Y_2}(y_1, y_2) = p_{Y_1}(y_1) p_{Y_2}(y_2), and also E(G(Y_1) H(Y_2)) = E(G(Y_1)) E(H(Y_2)) for arbitrary functions G(·) and H(·) (non-linear decorrelation).
- Kullback-Leibler divergence: KL(p‖q) = ∫ p(y) ln(p(y)/q(y)) dy ≥ 0. Use KL(p(s_1, ..., s_D) ‖ p(s_1) p(s_2) ... p(s_D)) to measure independence.
- Non-Gaussianity: maximize the distance from Gaussianity, via
  - Kurtosis: kurt(X) = E(X⁴) - 3 (E(X²))²
  - Neg-entropy: J(X) = H(X_Gauss) - H(X) ≥ 0, where H(X) is the differential entropy of the random vector X and X_Gauss is a Gaussian random vector with the same covariance matrix as X

A kurtosis-based sketch follows below.
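As one concrete instance of the non-Gaussianity route, here is a sketch of a one-unit kurtosis-based fixed-point iteration on whitened data, in the style of FastICA; the sub-Gaussian sources and the iteration cap are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
N, L = 2, 10000

# Two sub-Gaussian sources (uniform and binary), mixed by a random A.
s = np.vstack([rng.uniform(-1, 1, L), np.sign(rng.standard_normal(L))])
s -= s.mean(axis=1, keepdims=True)
x = rng.standard_normal((N, N)) @ s

# Whiten first (previous slide); the remaining unknown is orthonormal.
lam, E = np.linalg.eigh((x @ x.T) / L)
y = np.diag(lam ** -0.5) @ E.T @ x

def kurt(u):
    return np.mean(u ** 4) - 3 * np.mean(u ** 2) ** 2

# One-unit fixed-point iteration that extremizes the kurtosis of w^T y:
# w <- E{y (w^T y)^3} - 3w, then renormalize (valid for whitened y).
w = rng.standard_normal(N)
w /= np.linalg.norm(w)
for _ in range(100):
    w_new = (y * (w @ y) ** 3).mean(axis=1) - 3 * w
    w_new /= np.linalg.norm(w_new)
    if abs(w_new @ w) > 1 - 1e-10:        # converged (up to sign)
        w = w_new
        break
    w = w_new

print(kurt(w @ y))    # markedly non-Gaussian: well away from 0
```

Since the data are whitened, each candidate demixing vector w has unit norm, and a fixed point of this update corresponds to a local extremum of the kurtosis of w^T y, i.e., one recovered source.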