Real and Complex Independent Subspace Analysis by Generalized Variance

Neural Information Processing Group, Department of Information Systems, Eötvös Loránd University, Budapest, Hungary
ICA Research Network 2006

K-Independent Subspace Analysis (K-ISA)

Cocktail party problem: groups of people / music bands.
ISA nicknames: MICA, group ICA, IVA.
Throughout, $\mathbb{K} \in \{\mathbb{R}, \mathbb{C}\}$.

K-ISA Equations

The observation $z$ is a mixture of independent components:
$$z(t) = A s(t), \qquad s(t) = [s^1(t); \ldots; s^M(t)],$$
where the $s^m(t) \in \mathbb{K}^{d_m}$ are i.i.d. sampled random variables in time, $s^i$ is independent of $s^j$ if $i \neq j$, and the mixing matrix $A \in \mathbb{K}^{D \times D}$ is invertible, with $D := \dim(s)$. Goal: estimate $\hat{s}$. The special case $d_m = 1$ gives $\mathbb{R}$-ICA / $\mathbb{C}$-ICA.
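A minimal sketch of this generative model, assuming $\mathbb{K} = \mathbb{R}$ and sources drawn uniformly from $d$-dimensional spheres (coordinates dependent within a subspace, subspaces independent of one another); the concrete source distribution and all names below are our illustration, not code from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
M, d, T = 3, 2, 5000              # number of subspaces, subspace dim, samples
D = M * d                         # D := dim(s)

# s^m(t): i.i.d. in time, uniform on the surface of a d-dimensional sphere,
# so coordinates are dependent within a subspace, subspaces are independent.
s = rng.normal(size=(T, M, d))
s /= np.linalg.norm(s, axis=2, keepdims=True)
s = s.reshape(T, D)

A = rng.normal(size=(D, D))       # invertible mixing matrix (almost surely)
z = s @ A.T                       # observations z(t) = A s(t)
```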

Ambiguities of the K-ISA Model

Hidden components can be determined only up to permutation and up to invertible linear transformations within the subspaces. Whitening assumption: $E[s] = 0$, $\mathrm{cov}[s] = I_D$; $E[z] = 0$, $\mathrm{cov}[z] = I_D$. Lessened ambiguities: invertible → orthogonal ($\mathbb{R}$) / unitary ($\mathbb{C}$).
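A sketch of the whitening preprocessing assumed here, for the real case: standard centering plus decorrelation by the inverse square root of the covariance matrix.

```python
import numpy as np

def whiten(z):
    """Center z (shape (T, D)) and decorrelate it so that cov = I_D."""
    z = z - z.mean(axis=0)
    cov = np.cov(z, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    cov_inv_sqrt = eigvec @ np.diag(eigval ** -0.5) @ eigvec.T
    return z @ cov_inv_sqrt.T    # whitened observations: cov = I_D
```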

f-Uncorrelatedness

Whitening gives second-order uncorrelatedness only. Our approach: independence is approximated as uncorrelatedness over many functions ($f \in \mathcal{F}$). Formally:
1. Estimate the hidden source in a feedforward architecture: $y(t) = W z(t)$.
2. Estimate the f-covariance matrix empirically after applying a function $f$ ($\in \mathcal{F}$): $C(f, T) = \widehat{\mathrm{cov}}[f(y), f(y)]$ (see the sketch after this list).
3. Choose the cost function of $W$ so that it drives the blocks of $C(f, T)$ outside the block-diagonal to zero, for all $f \in \mathcal{F}$.
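A sketch of step 2, assuming an illustrative function set $\mathcal{F}$ (the concrete functions are our choice; the slides do not fix them):

```python
import numpy as np

F = [np.cos, np.sin, np.tanh]     # illustrative function set F (assumption)

def f_covariance(y, f):
    """C(f, T): empirical covariance matrix of f(y), with y of shape (T, D)."""
    return np.cov(f(y), rowvar=False)

# Usage: with the estimates y(t) = W z(t) stacked into a T-by-D array y,
# C = f_covariance(y, np.tanh) gives one f-covariance matrix.
```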

Cost for ISA Based on Joint f-Decorrelation

Cost function for ISA:
$$J(\mathcal{F}, T, W) := \sum_{f \in \mathcal{F}} \big\| \mathcal{M}[C(f, T, W)] \big\|^2 \;\to\; \min_{W \text{ orthogonal}},$$
where $\mathcal{M}$ picks out the f-covariances between different subspaces. $J$ should be minimized to solve the ISA problem. Too difficult: optimization on the Stiefel / flag manifold [Nishimori et al., 2006]! Reduce the task further.
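A sketch of evaluating this cost, assuming the Frobenius norm for $\|\cdot\|$ and equal subspace dimensions $d$ (both assumptions on our part):

```python
import numpy as np

def isa_cost(W, z_white, F, d, M):
    """J(F, T, W): squared Frobenius norm of the blocks of C(f, T, W)
    outside the block-diagonal, summed over all f in F."""
    y = z_white @ W.T                     # y(t) = W z(t)
    J = 0.0
    for f in F:
        C = np.cov(f(y), rowvar=False)    # C(f, T, W)
        for m in range(M):
            for n in range(M):
                if m != n:                # off-block-diagonal blocks only
                    J += np.sum(C[m*d:(m+1)*d, n*d:(n+1)*d] ** 2)
    return J
```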

The K-ISA Separation Theorem

Essence: K-ISA = K-ICA + permutation search (with the K-ISA cost). Formally:

Theorem. Let $H$ denote Shannon's differential entropy, and suppose that the $u := s^m$ components in K-ISA satisfy
$$H\left( \sum_{i=1}^{d} w_i u_i \right) \ge \sum_{i=1}^{d} |w_i|^2 H(u_i), \qquad \forall w \in \mathbb{K}^d,\ \|w\| = 1.$$
Then $W_{\mathrm{ISA}} = P W_{\mathrm{ICA}}$ with a permutation matrix $P \in \mathbb{R}^{D \times D}$.

References: [Cardoso, 1998], [Szabó et al., 2006a].
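A sketch of the reduction the theorem licenses: run one-dimensional ICA, then search over permutations of the estimated components using the ISA cost. Here FastICA from scikit-learn stands in for the ICA step and a greedy pairwise-swap search for the permutation search; both are placeholder choices of ours, not necessarily the authors' procedure.

```python
import numpy as np
from sklearn.decomposition import FastICA

def block_cost(y, F, d, M):
    """ISA cost of ordered components y (the J of the previous slide, W = I)."""
    J = 0.0
    for f in F:
        C = np.cov(f(y), rowvar=False)
        for m in range(M):
            for n in range(M):
                if m != n:
                    J += np.sum(C[m*d:(m+1)*d, n*d:(n+1)*d] ** 2)
    return J

def isa_via_ica(z, F, d, M, sweeps=5):
    """K-ISA = K-ICA + permutation search (greedy pairwise swaps)."""
    D = d * M
    ica = FastICA(n_components=D, whiten="unit-variance", random_state=0)
    y = ica.fit_transform(z)              # one-dimensional ICA estimates
    perm = np.arange(D)
    best = block_cost(y, F, d, M)
    for _ in range(sweeps):
        for i in range(D):
            for j in range(i + 1, D):
                q = perm.copy()
                q[i], q[j] = q[j], q[i]
                c = block_cost(y[:, q], F, d, M)
                if c < best:
                    perm, best = q, c
    return y[:, perm]                     # components grouped into subspaces
```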

Simulations

Test databases (scalable in $d$):
- d-geom ($M = 4$),
- d-spherical ($M = 3$).
(Illustrations of the test densities are omitted here.)

Performance measure: the normalized Amari error $r \in [0, 1]$, which measures the block-permutation property of $WA$ (see the sketch below).
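A sketch of this performance measure, assuming the usual Amari-index normalization applied block-wise (the slides do not spell out the exact normalization):

```python
import numpy as np

def amari_error(G, d, M):
    """Normalized Amari error r in [0, 1] of G = W A with d-by-d blocks;
    r = 0 exactly when G is a block-permutation matrix."""
    b = np.abs(G).reshape(M, d, M, d).sum(axis=(1, 3))  # block "energies"
    row = (b.sum(axis=1) / b.max(axis=1) - 1.0).sum()
    col = (b.sum(axis=0) / b.max(axis=0) - 1.0).sum()
    return (row + col) / (2.0 * M * (M - 1))
```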

Results-1: d-geom

[Figure: log-log plot of the Amari error $r$ against the number of samples $T$ (1,000 to 30,000), for $d = 10, 20, \ldots, 80$, i.e., $D = 40, 80, \ldots, 320$.]

Results-2: d-spherical

[Figure: log-log plot of the Amari error $r$ against the number of samples $T$ (1,000 to 30,000), for $d = 20, 30, \ldots, 110$, i.e., $D = 60, 90, \ldots, 330$.]

Connection to Other Techniques ($d = 1$)

Like KC (kernel covariance), we use a function set $\mathcal{F}$ [Gretton et al., 2003]. For $\mathcal{F} = \{f\}$, f-decorrelation is equivalent to minimizing the cost
$$0 \le Q_W(f, T) := -\frac{1}{2} \log \frac{\det[C(f, T)]}{\prod_{m=1}^{M} \det[C^{m,m}(f, T)]}.$$
This is exactly the cost function of KGV (kernel generalized variance) [Bach and Jordan, 2002]. With an RNN architecture (instead of feedforward), $\mathbb{K} = \mathbb{R}$ gives rise to self-organization (gradient rule) [Meyer-Bäse et al., 2006].
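A sketch of evaluating $Q_W(f, T)$ for a single function $f$, using log-determinants for numerical stability ($C^{m,m}$ denote the diagonal $d \times d$ blocks; nonnegativity follows from Fischer's inequality for positive definite matrices):

```python
import numpy as np

def q_cost(y, f, d, M):
    """Q_W(f, T) = -0.5 * log( det C / prod_m det C^{m,m} ), C = cov[f(y)]."""
    C = np.cov(f(y), rowvar=False)
    logdet_C = np.linalg.slogdet(C)[1]
    logdet_blocks = sum(
        np.linalg.slogdet(C[m*d:(m+1)*d, m*d:(m+1)*d])[1] for m in range(M)
    )
    return -0.5 * (logdet_C - logdet_blocks)
```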

References-1

Nishimori, Y., Akaho, S., and Plumbley, M. D. (2006). Riemannian optimization method on the flag manifold for independent subspace analysis. In Independent Component Analysis and Blind Signal Separation (ICA 2006), volume 3889 of LNCS, pages 295-302. Springer.

Bach, F. R. and Jordan, M. I. (2002). Kernel independent component analysis. JMLR, 3:1-48.

Gretton, A., Herbrich, R., and Smola, A. (2003). The kernel mutual information. In IEEE ICASSP, volume 4, pages 880-883.

Meyer-Bäse, A., Gruber, P., Theis, F., and Foo, S. (2006). Blind source separation based on self-organizing neural network. Engineering Applications of AI, 19:305-311.

References-2

Cardoso, J. (1998). Multidimensional independent component analysis. In Proceedings of ICASSP '98, Seattle, WA.

Szabó, Z., Póczos, B., and Lőrincz, A. (2006a). Cross-entropy optimization for independent process analysis. In Independent Component Analysis and Blind Signal Separation (ICA 2006), volume 3889 of LNCS, pages 909-916. Springer.

Szabó, Z., Póczos, B., and Lőrincz, A. (2006b). Separation theorem for K-independent subspace analysis with sufficient conditions. Technical report, Eötvös Loránd University, Budapest. http://arxiv.org/abs/math.st/0608100.

Summary

Presented the K-ISA method: joint decorrelation on a function set $\mathcal{F}$:
- with a feedforward architecture: reduction via the K-ISA Separation Theorem (K-ISA = K-ICA + permutation search with the K-ISA cost); KGV for $\mathcal{F} = \{f\}$;
- with a recurrent architecture: self-organization.

First step toward large-scale problems: a few hundred dimensions, power-law decrease of the estimation error.

Thank you for your attention!