Independent Component Analysis and Blind Source Separation


1 Independent Component Analysis and Blind Source Separation. Aapo Hyvärinen, University of Helsinki and Helsinki Institute of Information Technology

2 Blind source separation. Four source signals are shown in the figure. Due to some external circumstances, only linear mixtures of the source signals are observed. Estimate (separate) the original signals!

3 Solution by independence. Use only information on statistical independence to recover the signals: these are the independent components!

4 Independent Component Analysis (Jutten and Hérault, 1991). The observed random vector x is modelled by a linear latent variable model
$x_i = \sum_{j=1}^{m} a_{ij} s_j, \quad i = 1, \dots, n$ (1)
or in matrix form:
$x = As$ (2)
where the mixing matrix A is constant (a parameter matrix), and the $s_i$ are latent random variables called the independent components. Estimate both A and s, observing only x.
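To make the model concrete, here is a minimal Python sketch of Eq. (2) that generates observed mixtures from two nongaussian sources; the source distributions and the mixing matrix values are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10_000                                   # number of observations
# Two nongaussian sources: a Laplacian (supergaussian) and a uniform one.
s = np.vstack([rng.laplace(size=T), rng.uniform(-1, 1, size=T)])
A = np.array([[1.0, 0.5],                    # hypothetical mixing matrix
              [0.3, 1.0]])
x = A @ s                                    # observed linear mixtures, x = A s
```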

5 Basic properties of the ICA model. Must assume: the $s_i$ are mutually independent, and the $s_i$ are nongaussian. For simplicity: the matrix A is square. The $s_i$ are defined only up to a multiplicative constant, and the $s_i$ are not ordered.

6 ICA and decorrelation. First approach: decorrelate the variables. Whitening or sphering: decorrelate and normalize so that $E\{xx^T\} = I$. This is simple to do by an eigenvalue decomposition of the covariance matrix. But decorrelation uses only the correlation matrix: about $n^2/2$ equations, while A has $n^2$ elements. Not enough information!
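A minimal sketch of whitening by eigenvalue decomposition, assuming the mixtures x (shape n x T) from the previous snippet:

```python
import numpy as np

def whiten(x):
    """Center x and return whitened z with E{zz^T} = I, plus the whitening matrix."""
    xc = x - x.mean(axis=1, keepdims=True)
    cov = np.cov(xc)                       # n x n covariance matrix
    d, E = np.linalg.eigh(cov)             # eigenvalues d, eigenvectors E
    V = E @ np.diag(d ** -0.5) @ E.T       # whitening matrix V = E D^{-1/2} E^T
    return V @ xc, V

z, V = whiten(x)
print(np.round(np.cov(z), 3))              # approximately the identity matrix
```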

7 Independence is better. Fortunately, independence is stronger than uncorrelatedness. For independent variables we have, for any functions $h_1$ and $h_2$,
$E\{h_1(y_1)h_2(y_2)\} - E\{h_1(y_1)\}E\{h_2(y_2)\} = 0$. (3)
Still, decorrelation ("whitening") is usually done before ICA for various technical reasons. For example: after decorrelation and standardization, A can be considered orthogonal. Gaussian data is determined by correlations alone, so the model cannot be estimated for gaussian data.

8 Illustration of whitening. Two ICs with uniform distributions [figures]: original variables, observed mixtures, whitened mixtures. Cf. the gaussian density, which is symmetric in all directions.

9 Basic intuitive principle of ICA estimation: (a sloppy version of) the Central Limit Theorem (Donoho, 1981). Consider a linear combination $w^T x = q^T s$. A sum $q_i s_i + q_j s_j$ is more gaussian than $s_i$ alone. By maximizing the nongaussianity of $q^T s$, we can find the $s_i$. Also known as projection pursuit.

10 Marginal and joint densities, uniform distributions; marginal and joint densities, whitened mixtures of uniform ICs [figures].

11 Marginal and joint densities, supergaussian distributions; whitened mixtures of supergaussian ICs [figures].

12 Kurtosis as a nongaussianity measure. Problem: how to measure nongaussianity? Definition:
$\mathrm{kurt}(x) = E\{x^4\} - 3(E\{x^2\})^2$ (4)
If the variance is constrained to unity, this is essentially the 4th moment. Simple algebraic properties because it is a cumulant:
$\mathrm{kurt}(s_1 + s_2) = \mathrm{kurt}(s_1) + \mathrm{kurt}(s_2)$ for independent $s_1, s_2$ (5)
$\mathrm{kurt}(\alpha s_1) = \alpha^4\, \mathrm{kurt}(s_1)$ (6)
Zero for a gaussian RV, nonzero for most nongaussian RVs. Positive vs. negative kurtosis correspond to typical forms of the pdf.
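A small numerical check of Eq. (4), with unit-variance samples so the values match the standardized kurtosis; the helper name kurt() is my own.

```python
import numpy as np

def kurt(x):
    """kurt(x) = E{x^4} - 3 (E{x^2})^2 for a zero-mean sample x."""
    return np.mean(x ** 4) - 3 * np.mean(x ** 2) ** 2

rng = np.random.default_rng(1)
n = 100_000
print(kurt(rng.normal(size=n)))                    # ~ 0    (gaussian)
print(kurt(rng.laplace(scale=2**-0.5, size=n)))    # ~ +3   (supergaussian, unit variance)
print(kurt(rng.uniform(-3**0.5, 3**0.5, size=n)))  # ~ -1.2 (subgaussian, unit variance)
```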

13 Left: Laplacian pdf, positive kurtosis ("supergaussian"). Right: uniform pdf, negative kurtosis ("subgaussian"). [figures]

14 The extrema of kurtosis. By the properties of kurtosis:
$\mathrm{kurt}(w^T x) = \mathrm{kurt}(q^T s) = q_1^4\, \mathrm{kurt}(s_1) + q_2^4\, \mathrm{kurt}(s_2)$ (7)
Constrain the variance to equal unity:
$E\{(w^T x)^2\} = E\{(q^T s)^2\} = q_1^2 + q_2^2 = 1$ (8)
For simplicity, consider kurtoses equal to one. The maxima of kurtosis give the independent components (see figure). General result: the absolute value of kurtosis is maximized by the $s_i$ (Delfosse and Loubaton, 1995). Note: the extrema are orthogonal due to whitening.

15 Optimization landscape for kurtosis. The thick curve is the unit sphere, the thin curves are contours where kurtosis is constant. [figure]

16 Kurtosis as a function of the direction of projection [figure: kurtosis vs. angle of w]. For positive kurtosis, kurtosis (and its absolute value) are maximized in the directions of the independent components.

17 Case of negative kurtosis [figure: kurtosis vs. angle of w]. Kurtosis is minimized, and its absolute value maximized, in the directions of the independent components.

18 Basic ICA estimation procedure: 1. Whiten the data to give z. 2. Set the iteration count i = 1. 3. Take a random vector $w_i$. 4. Maximize the nongaussianity of $w_i^T z$, under the constraints $\|w_i\| = 1$ and $w_i^T w_j = 0$ for $j < i$ (by a suitable algorithm, see later). 5. Increment the iteration count by 1 and go back to step 3. Alternatively: maximize over all the $w_i$ in parallel, keeping them orthogonal.
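A sketch of this deflation procedure in Python, using the one-unit fixed-point update from slide 26 with the cubic (kurtosis) nonlinearity as the "suitable algorithm"; z is whitened data (n x T) as in the earlier snippet.

```python
import numpy as np

def deflation_ica(z, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n = z.shape[0]
    W = np.zeros((n, n))
    for i in range(n):                      # estimate components one by one
        w = rng.normal(size=n)
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            y = w @ z
            # fixed-point step for g(u) = u^3: w <- E{z (w^T z)^3} - 3 w
            w = (z * y ** 3).mean(axis=1) - 3 * w
            w -= W[:i].T @ (W[:i] @ w)      # orthogonalize against earlier w_j
            w /= np.linalg.norm(w)          # keep ||w_i|| = 1
        W[i] = w
    return W

W_est = deflation_ica(z)
s_est = W_est @ z                           # estimated independent components
```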

19 Why kurtosis is not optimal. It is sensitive to outliers: consider a sample of 1000 values with unit variance, with one value equal to 10. The kurtosis then equals at least $10^4/1000 - 3 = 7$. For supergaussian variables, the statistical performance is not optimal even without outliers. Other measures of nongaussianity should be considered.
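A quick numerical check of this outlier example, reusing the kurt() helper defined above:

```python
import numpy as np

rng = np.random.default_rng(2)
sample = rng.normal(size=1000)
sample /= sample.std()             # enforce (approximately) unit variance
sample[0] = 10.0                   # a single outlier
print(kurt(sample))                # roughly 10**4 / 1000 - 3 = 7
```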

20 Differential entropy as a nongaussianity measure. A generalization of the ordinary discrete Shannon entropy:
$H(x) = -E\{\log p(x)\}$ (9)
For fixed variance, it is maximized by the gaussian distribution. Often normalized to give the negentropy:
$J(x) = H(x_{\mathrm{gauss}}) - H(x)$ (10)
Good statistical properties, but computationally difficult.

21 Approximation of negentropy. Approximations of negentropy (Hyvärinen, 1998):
$J_G(x) = (E\{G(x)\} - E\{G(x_{\mathrm{gauss}})\})^2$ (11)
where G is a nonquadratic function. A generalization of the (square of) kurtosis (which corresponds to $G(x) = x^4$). A good compromise? The statistical properties are not bad (for a suitable choice of G), and it is computationally simple.
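A sketch of Eq. (11) with $G(u) = \log\cosh u$ (one of the choices mentioned on slide 24); estimating the gaussian expectation by Monte Carlo is an assumption of this sketch, not the lecture's prescription.

```python
import numpy as np

def negentropy_approx(y, n_mc=1_000_000, seed=3):
    """J_G(y) = (E{G(y)} - E{G(y_gauss)})^2 for standardized y, G = log cosh."""
    G = lambda u: np.log(np.cosh(u))
    y = (y - y.mean()) / y.std()                       # standardize
    g_gauss = G(np.random.default_rng(seed).normal(size=n_mc)).mean()
    return (G(y).mean() - g_gauss) ** 2

rng = np.random.default_rng(4)
print(negentropy_approx(rng.normal(size=100_000)))     # ~ 0 for gaussian data
print(negentropy_approx(rng.laplace(size=100_000)))    # clearly positive
```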

22 Maximum likelihood estimation (Pham and Garrat, 1997). Log-likelihood of the model (with $W = \hat{A}^{-1}$):
$L = \sum_{t=1}^{T} \sum_{i=1}^{n} \log p_{s_i}(w_i^T x(t)) + T \log|\det W|$ (12)
Equivalent to the infomax approach in neural networks. Needs estimates of the $p_{s_i}$, but these need not be exact at all. Roughly: consistent if $p_{s_i}$ is of the right type (sub- or supergaussian).

23 Maximum likelihood and nongaussianity. If W is constrained to be orthogonal (whitened data), and the densities of the components are consistently estimated:
$\lim_{T\to\infty} \frac{1}{T} L = -\sum_{i=1}^{n} H(w_i^T x) + \mathrm{const.}$ (13)
where H is differential entropy. This is a sum of nongaussianities! (Note the minus sign.) A rigorous derivation of the maximization of nongaussianities.

24 Overview of ICA estimation principles. The basic approach is maximizing the nongaussianity of the ICs, which is roughly equivalent to MLE. Basic choice: the nonquadratic function in the nongaussianity measure. Kurtosis: fourth power. Entropy/likelihood: log of the density. Approximation of entropy: $G(s) = \log\cosh s$ or others. One-by-one estimation vs. estimation of the whole model. Estimates constrained to be white vs. no constraint.

25 Algorithms (1). Adaptive gradient methods. Gradient methods for one-by-one estimation are straightforward. Stochastic gradient ascent for the likelihood (Bell and Sejnowski, 1995):
$\Delta W \propto (W^{-1})^T + g(Wx)x^T$ (14)
with $g = (\log p_s)'$. Problem: needs matrix inversion! Better: natural/relative gradient ascent of the likelihood (Amari, 1998; Cardoso and Laheld, 1996):
$\Delta W \propto [I + g(y)y^T]W$ (15)
with $y = Wx$. Obtained by multiplying the gradient by $W^T W$.
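A minimal batch sketch of the natural-gradient update, Eq. (15), with $g(y) = -\tanh(y)$; this nonlinearity is an assumption of the sketch (a common choice for supergaussian sources), not specified on the slide.

```python
import numpy as np

def natural_gradient_ica(x, lr=0.01, n_epochs=200):
    """Natural-gradient ascent of the ICA likelihood on centered data x (n x T)."""
    n, T = x.shape
    W = np.eye(n)
    for _ in range(n_epochs):
        y = W @ x
        g = -np.tanh(y)                    # score function for a supergaussian prior
        # batch average of update (15): W <- W + lr [I + E{g(y) y^T}] W
        W += lr * (np.eye(n) + (g @ y.T) / T) @ W
    return W
```

For subgaussian sources the sign of the nonlinearity has to be changed, which is the "right type (sub- or supergaussian)" condition from the previous slide.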

26 Algorithms (2). The FastICA fixed-point algorithm (Hyvärinen, 1999). An approximate Newton method in block (batch) mode. No matrix inversion, but still quadratic (or cubic) convergence. No parameters to be tuned. For a single IC (whitened data):
$w \leftarrow E\{x\, g(w^T x)\} - E\{g'(w^T x)\}\, w$, then normalize $w$,
where g is the derivative of G. For the likelihood:
$W \leftarrow W + D_1[D_2 + E\{g(y)y^T\}]W$, then orthonormalize $W$.
The FastICA MATLAB package is available on the WWW (The FastICA Team, 1998).
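The one-unit fixed-point update as a Python sketch, here with $G(u) = \log\cosh u$, so $g = \tanh$ and $g'(u) = 1 - \tanh^2 u$; z is whitened data (n x T).

```python
import numpy as np

def fastica_one_unit(z, n_iter=100, tol=1e-10, seed=5):
    rng = np.random.default_rng(seed)
    w = rng.normal(size=z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = w @ z
        # w <- E{z g(w^T z)} - E{g'(w^T z)} w, with g = tanh
        w_new = (z * np.tanh(y)).mean(axis=1) - (1 - np.tanh(y) ** 2).mean() * w
        w_new /= np.linalg.norm(w_new)         # stay on the unit sphere
        converged = abs(w_new @ w) > 1 - tol   # direction unchanged (up to sign)
        w = w_new
        if converged:
            break
    return w

w1 = fastica_one_unit(z)
ic1 = w1 @ z                                   # one estimated independent component
```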

27 Convergence of FastICA [figure: kurtosis vs. iteration count]. Vectors after 1 and 2 iterations, and the values of kurtosis.

28 Convergence of FastICA (2) [figure: kurtosis vs. iteration count]. Vectors after 1 and 2 iterations, and the values of kurtosis.

29 Stability/reliability analysis. Are the components an algorithm gives reliable, really there? All optimization methods are prone to getting stuck in local minima [figure panels a, b]. Reliability should be analyzed by running the algorithm many times from random initial points: the Icasso software package (Himberg et al., 2004). The statistical reliability of components could also be analyzed by bootstrapping. Computation of p-values is still a subject for future research.

30 Relations to other methods (1): projection pursuit (Friedman, 1987; Huber, 1985). Projection pursuit is a method for visualization and exploratory data analysis. It attempts to show the clustering structure of the data by finding interesting projections. PCA is not designed to find clustering structure. Interestingness is usually measured by nongaussianity. For example, bimodal distributions are very nongaussian.

31 Illustration of projection pursuit. The projection pursuit direction is horizontal, the principal component vertical. [figure]

32 Relations to other methods (2). Factor analysis: ICA is a nongaussian (usually noise-free) version. Blind deconvolution: obtained by constraining the mixing matrix. Principal component analysis: often the same applications, but very different statistical principles.

33 Basic ICA estimation: conclusions. ICA is very simple as a model: a linear nongaussian latent variable model. Estimation is not so simple, due to nongaussianity: the objective functions cannot be quadratic. Estimation is by maximizing the nongaussianity of the independent components, which is more or less equivalent to maximum likelihood estimation. Algorithms: adaptive (natural gradient) vs. block/batch mode (FastICA). Choice of nonlinearity: cubic (kurtosis) vs. non-polynomial functions. For more information, see e.g. (Hyvärinen and Oja, 2000; Hyvärinen et al., 2001b; Cardoso, 1998a; Amari and Cardoso, 1997).

34 Blind source separation using time dependencies

35 Using autocorrelations for ICA estimation. Take the basic linear mixture model
$x(t) = As(t)$ (16)
This cannot be estimated in general (e.g., for gaussian RVs). Usually in ICA, we assume the $s_i$ to be nongaussian: higher-order statistics provide the missing information. Alternatively: assume the $s_i$ are time-dependent signals, and use time correlations to give more information. For example, a lagged covariance matrix measures the covariances of lagged signals:
$C^x_\tau = E\{x(t)\, x(t-\tau)^T\}$ (17)

36 The AMUSE algorithm for using autocorrelations (Tong et al., 1991; Molgedey and Schuster, 1994). Basic principle: decorrelate each signal in $y = Wx$ with the other signals, lagged as well as not lagged. In other words: $E\{y_i(t)y_j(t-\tau)\} = 0$ for all $i \neq j$. To do this: 1. Whiten the data to obtain $z(t) = Vx(t)$. 2. Find an orthogonal transformation W so that the lagged covariance matrix of $y(t) = Wz(t)$ is diagonal. This is a matrix diagonalization problem,
$C^x_\tau = E\{x(t)x(t-\tau)^T\} = E\{As(t)s(t-\tau)^T A^T\} = A C^s_\tau A^T$,
for a (more or less) symmetric matrix.
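A compact AMUSE sketch under the usual assumptions (a single lag, a symmetrized lagged covariance, diagonalization by eigendecomposition); x is an (n x T) array of observed signals.

```python
import numpy as np

def amuse(x, tau=1):
    xc = x - x.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(xc))
    V = E @ np.diag(d ** -0.5) @ E.T              # whitening matrix
    z = V @ xc
    C = z[:, :-tau] @ z[:, tau:].T / (z.shape[1] - tau)
    C = (C + C.T) / 2                             # symmetrize the lagged covariance
    _, U = np.linalg.eigh(C)                      # orthogonal diagonalizer
    return U.T @ z, U.T @ V                       # estimated sources, unmixing matrix

s_est, W_amuse = amuse(x)    # works when the sources have distinct autocorrelations
```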

37 Pros and cons of separation by autocorrelations. Very fast to compute: a single eigenvalue decomposition, like PCA. Can only separate ICs with different autocorrelations, because the lagged covariance matrix must have distinct eigenvalues. Some improvement can be achieved by using several lags in the algorithm (Belouchrani et al., 1997), but if the signals have identical Fourier spectra, autocorrelations simply cannot separate them.

38 Combining nongaussianity and autocorrelations. The best results should be obtained by using both kinds of information. E.g., model the temporal structure of the signals with, say, ARMA models. Basic case: the linear non-gaussian autoregressive model (Hyvärinen, 2001b)
$s_i(t) = a\, s_i(t-1) + n_i(t)$ (18)
Straightforward formulation of the likelihood. Use a parametric model for the distribution of the innovation process.

39 Estimation using variance nonstationarity (Matsuoka et al., 1995). An alternative to autocorrelations (and nongaussianity): the variance changes slowly over time, similar to autoregressive conditional heteroscedasticity (ARCH) models, for each independent component. This gives enough information to estimate the model (Pham and Cardoso, 2001; Hyvärinen, 2001a).

40 Unifying autoregressive model (Hyvärinen, 2005). A simple model that incorporates the three properties of nongaussianity, distinct autocorrelations, and a smoothly changing nonstationary variance. Model each $s_i$ by an autoregressive model in which the innovation term $n_i(t)$ is nongaussian and its variance can be nonstationary:
$s_i(t) = \sum_{\tau>0} \alpha_{\tau i}\, s_i(t-\tau) + n_i(t)$ (19)

41 Coding complexity as a general theoretical framework (Pajunen, 1998). A more general approach: minimize coding (Kolmogoroff) complexity. Find a decomposition $y = Wx$ so that the $y_i$ are easy to code. For whitened data z and an orthogonal W: minimize the sum of the coding lengths of the $y = Wz$. If only marginal distributions are used, the coding length is given by entropy, i.e., nongaussianity. If only autocorrelations are used, the coding length is related to the autocorrelations. Signals are easy to code if they are nongaussian, or have time dependencies, or have nonstationary variances.

42 Convolutive ICA. Often the signals do not arrive at the sensors at the same time. Or: latent events are not immediately manifested in the observed variables. There may be echoes as well (multi-path phenomena). Include convolution in the model:
$x_i(t) = \sum_{j=1}^{n} a_{ij}(t) * s_j(t) = \sum_{j=1}^{n} \sum_{k} a_{ij}(k)\, s_j(t-k), \quad i = 1, \dots, n$ (20)
In theory: estimation by the same principles as ordinary ICA. In practice: a huge number of parameters, since the (de)convolving filters may be very long, so special methods may need to be used. FastICA can be adapted (Douglas et al., 2005).
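A tiny illustration of the convolutive mixing model of Eq. (20), where each observation is a sum of filtered sources; the filter taps here are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(6)
T = 1000
s = rng.laplace(size=(2, T))                        # two nongaussian sources
a = np.array([[[1.0, 0.5, 0.2], [0.3, 0.2, 0.1]],   # a[i][j] = FIR filter from s_j to x_i
              [[0.4, 0.1, 0.0], [1.0, 0.6, 0.3]]])
x_conv = np.zeros((2, T))
for i in range(2):
    for j in range(2):
        # x_i(t) += sum_k a_ij(k) s_j(t - k)
        x_conv[i] += np.convolve(s[j], a[i, j])[:T]
```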

43 Modelling dependencies between components

44 Relaxing independence. For most data sets, the estimated components are not very independent. In fact, independent components cannot be found in general by a linear transformation. We attempt to model some of the remaining dependencies. The basic models group the components: multidimensional ICA and independent subspace analysis.

45 Multidimensional ICA (Cardoso, 1998b). One approach to relaxing independence: the $s_i$ can be divided into n-tuples, such that the $s_i$ inside a given n-tuple may be dependent on each other, while dependencies between different n-tuples are not allowed. Every n-tuple corresponds to a subspace.

46 Invariant-feature subspaces (Kohonen, 1996). Linear filters (as in ICA) necessarily lack any invariance. Invariant-feature subspaces are an abstract approach to representing invariant features. Principle: an invariant feature is a linear subspace in a feature space. The value of the invariant feature is given by the (squared) norm of the projection on that subspace:
$\sum_{i=1}^{k} (w_i^T x)^2$ (21)
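Eq. (21) as a one-liner; the orthonormal filter matrix below is random and purely illustrative.

```python
import numpy as np

def subspace_energy(Ws, x):
    """sum_i (w_i^T x)^2 for the k subspace filters stacked as rows of Ws."""
    return np.sum((Ws @ x) ** 2)

rng = np.random.default_rng(7)
Ws = np.linalg.qr(rng.normal(size=(4, 4)))[0][:2]   # 2 orthonormal filters in R^4
x_vec = rng.normal(size=4)
print(subspace_energy(Ws, x_vec))
```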

47 Independent Subspace Analysis (Hyvärinen and Hoyer, 2000). A combination of multidimensional ICA and invariant-feature subspaces. The probability density inside each subspace is spherically symmetric, i.e., it depends only on the norm of the projection. This simplifies the model considerably. The nature of the invariant features is not specified.

48 [Figure: network interpretation of ISA. The input I is passed through linear filters $\langle w_i, I\rangle$, the filter outputs are squared, and the squares are summed ($\Sigma$) within each subspace.]

49 Problem: dependencies still remain. A linear decomposition often does not give independence, even for subspaces. The remaining dependencies could be visualized, or else utilized. The components can be decorrelated, so only higher-order correlations are interesting. How to visualize them? E.g., using topographic order.

50 Extending the model to include topography. Instead of having unordered components, they are arranged on a two-dimensional lattice [figure: near-by components are dependent, distant ones independent]. The components are typically sparse, but not independent. Near-by components have higher-order correlations.

51 Dependence through local variances. Related to ARCH models, but here the variance variables are shared. The components are independent given their variances. In our model, the variances are not independent; instead, they are correlated for near-by components, e.g., generated by another ICA model with topographic mixing. [Figure: independent components vs. topographic variance dependence.]

52 Two signals that are independent given their variances. [figure]

53 Topographic ICA model (Hyvärinen et al., 2001a). [Figure: schematic of the generative model, from variance-generating variables $u_i$ through the nonlinearity $\phi$ and variances $\sigma_i$ to the sources $s_i$ and the mixing matrix A.] Variance-generating variables $u_i$ are generated randomly and mixed linearly inside their topographic neighbourhoods. The mixtures are transformed using a nonlinearity $\phi$, thus giving the variances $\sigma_i$ of the $s_i$. Finally, ordinary linear mixing gives the observed data.

54 Approximation of the likelihood. The likelihood of the model is intractable. Approximation:
$\sum_{t=1}^{T} \sum_{j=1}^{n} G\left(\sum_{i=1}^{n} h(i,j)\,(w_i^T x(t))^2\right) + T\log|\det W|$ (22)
where $h(i,j)$ is the neighborhood function and G a nonlinear function. A generalization of independent subspace analysis. A function of local energies only!
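A direct transcription of Eq. (22) as a function; the choice $G(u) = -\sqrt{u + \epsilon}$ is an assumption of this sketch (a typical sparse choice in this model family), as is the small constant for numerical stability.

```python
import numpy as np

def topo_ica_loglik(W, x, h, G=lambda u: -np.sqrt(u + 1e-9)):
    """Approximate topographic ICA log-likelihood of Eq. (22).

    W: (n x n) unmixing matrix, x: (n x T) data, h: (n x n) neighborhood matrix.
    """
    y2 = (W @ x) ** 2                 # (w_i^T x(t))^2, shape (n, T)
    local_energy = h.T @ y2           # row j: sum_i h(i, j) (w_i^T x(t))^2
    T = x.shape[1]
    return G(local_energy).sum() + T * np.log(abs(np.linalg.det(W)))
```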

55 Example of blind source separation with topographic ICA

56 Independent subspace analysis vs. topographic ICA. In ISA, single components are not independent, but subspaces are. In topographic ICA, the dependencies are modelled continuously; there is no strict division into subspaces. Topographic ICA is a generalization of ISA, incorporating the invariant-feature subspace principle as invariant-feature neighbourhoods.

57 Double-blind source separation (Hyvärinen and Hurri, 2004). Using time dependencies can help in separating independent components. Actually, it is not necessary to know or model how the variances depend on each other. Theorem: maximization of
$\sum_{i,j} \left[\operatorname{cov}\left((w_i^T x(t))^2,\, (w_j^T x(t-\Delta t))^2\right)\right]^2$ (23)
under the constraint of orthogonality of W gives the original sources. Assumption 1: the sources are dependent only through their variances, as in topographic ICA. Assumption 2: x(t) is spatially and temporally whitened. Assumption 3: the matrix $K_{ij} = \operatorname{cov}(s_i^2(t), s_j^2(t-\Delta t))$ is of full rank.
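The objective of Eq. (23) is easy to write down directly; below is a sketch that evaluates it for a candidate W with orthonormal rows on spatio-temporally whitened data x (n x T), per the theorem's assumptions.

```python
import numpy as np

def double_blind_objective(W, x, dt=1):
    """sum_{i,j} cov((w_i^T x(t))^2, (w_j^T x(t-dt))^2)^2."""
    y2 = (W @ x) ** 2                                    # squared projections
    a = y2[:, dt:] - y2[:, dt:].mean(axis=1, keepdims=True)
    b = y2[:, :-dt] - y2[:, :-dt].mean(axis=1, keepdims=True)
    C = a @ b.T / (x.shape[1] - dt)                      # lagged covariances of energies
    return np.sum(C ** 2)
```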

58 Final summary. ICA is a very simple model, and simplicity implies wide applicability. A nongaussian alternative to PCA or factor analysis. Decorrelation or whitening is only half of ICA; the other half uses the higher-order statistics of nongaussian variables. Alternatively, separation is possible using time dependencies: linear autocorrelations, or smoothly changing variances (ARCH). Since dependencies cannot always be cancelled, subspace or topographic versions may be useful. Nongaussianity is beautiful!?

References

Amari, S.-I. (1998). Natural gradient works efficiently in learning. Neural Computation, 10(2).
Amari, S.-I. and Cardoso, J.-F. (1997). Blind source separation: semiparametric statistical approach. IEEE Trans. on Signal Processing, 45(11).
Bell, A. and Sejnowski, T. (1995). An information-maximization approach to blind separation and blind deconvolution. Neural Computation, 7.
Belouchrani, A., Meraim, K. A., Cardoso, J.-F., and Moulines, E. (1997). A blind source separation technique based on second order statistics. IEEE Trans. on Signal Processing, 45(2).
Cardoso, J.-F. (1998a). Blind signal separation: statistical principles. Proceedings of the IEEE, 86(10).
Cardoso, J.-F. (1998b). Multidimensional independent component analysis. In Proc. IEEE Int. Conf. on Acoustics, Speech and Signal Processing (ICASSP'98), Seattle, WA.
Cardoso, J.-F. and Laheld, B. H. (1996). Equivariant adaptive source separation. IEEE Trans. on Signal Processing, 44(12).
Delfosse, N. and Loubaton, P. (1995). Adaptive blind separation of independent sources: a deflation approach. Signal Processing, 45.
Donoho, D. L. (1981). On minimum entropy deconvolution. In Applied Time Series Analysis II. Academic Press.
Douglas, S., Sawada, H., and Makino, S. (2005). A spatio-temporal FastICA algorithm for separating convolutive mixtures. In Proc. IEEE Int. Conf. on Acoustics, Speech and Signal Processing (ICASSP 2005), Philadelphia, PA. IEEE Press.
Friedman, J. (1987). Exploratory projection pursuit. J. of the American Statistical Association, 82(397).
Himberg, J., Hyvärinen, A., and Esposito, F. (2004). Validating the independent components of neuroimaging time-series via clustering and visualization. NeuroImage, 22(3).
Huber, P. (1985). Projection pursuit. The Annals of Statistics, 13(2).
Hyvärinen, A. (1998). New approximations of differential entropy for independent component analysis and projection pursuit. In Advances in Neural Information Processing Systems, volume 10. MIT Press.
Hyvärinen, A. (1999). Fast and robust fixed-point algorithms for independent component analysis. IEEE Transactions on Neural Networks, 10(3).
Hyvärinen, A. (2001a). Blind source separation by nonstationarity of variance: a cumulant-based approach. IEEE Transactions on Neural Networks, 12(6).
Hyvärinen, A. (2001b). Complexity pursuit: separating interesting components from time-series. Neural Computation, 13(4).
Hyvärinen, A. (2005). A unifying model for blind separation of independent sources. Signal Processing, 85(7).
Hyvärinen, A. and Hoyer, P. O. (2000). Emergence of phase and shift invariant features by decomposition of natural images into independent feature subspaces. Neural Computation, 12(7).
Hyvärinen, A., Hoyer, P. O., and Inki, M. (2001a). Topographic independent component analysis. Neural Computation, 13(7).
Hyvärinen, A. and Hurri, J. (2004). Blind separation of sources that have spatiotemporal variance dependencies. Signal Processing, 84(2).
Hyvärinen, A., Karhunen, J., and Oja, E. (2001b). Independent Component Analysis. Wiley Interscience.
Hyvärinen, A. and Oja, E. (2000). Independent component analysis: algorithms and applications. Neural Networks, 13(4-5).
Jutten, C. and Hérault, J. (1991). Blind separation of sources, part I: an adaptive algorithm based on neuromimetic architecture. Signal Processing, 24:1-10.
Kohonen, T. (1996). Emergence of invariant-feature detectors in the adaptive-subspace self-organizing map. Biological Cybernetics, 75.
Matsuoka, K., Ohya, M., and Kawamoto, M. (1995). A neural net for blind separation of nonstationary signals. Neural Networks, 8(3).
Molgedey, L. and Schuster, H. G. (1994). Separation of a mixture of independent signals using time delayed correlations. Physical Review Letters, 72.
Pajunen, P. (1998). Blind source separation using algorithmic information theory. Neurocomputing, 22.
Pham, D.-T. and Cardoso, J.-F. (2001). Blind separation of instantaneous mixtures of non stationary sources. IEEE Trans. Signal Processing, 49(9).
Pham, D.-T. and Garrat, P. (1997). Blind separation of mixture of independent sources through a quasi-maximum likelihood approach. IEEE Trans. on Signal Processing, 45(7).
The FastICA Team (1998). The FastICA MATLAB package, available on the WWW.
Tong, L., Liu, R.-W., Soon, V., and Huang, Y.-F. (1991). Indeterminacy and identifiability of blind identification. IEEE Trans. on Circuits and Systems, 38.
