Independent Component Analysis


Department of Physics

Seminar I b, 1st year, 2nd cycle

Independent Component Analysis

Author: Žiga Zaplotnik
Advisor: prof. dr. Simon Širca

Ljubljana, April 2014

Abstract

In this seminar we present a computational method for blind signal separation. Its purpose is to deduce source signals from a set of mixed signals without any information about the source signals or the mixing process. Independent component analysis is a recently developed method whose goal is to find source components which are statistically independent, or as independent as possible. We describe the basic concepts of the method using the cocktail-party problem as an example, and we discuss its implementation. Finally, we present some existing applications of independent component analysis.

Contents

1 Introduction
2 Principles of ICA estimation
  2.1 Definition of ICA model
  2.2 Constraints
  2.3 Estimating ICA model by maximizing non-gaussianity
  2.4 Ambiguities of ICA
3 Applications of ICA
  3.1 Electrocardiography
  3.2 Reflection Cancelling
4 Conclusions
References

1 Introduction

Independent component analysis (ICA) is a statistical computational method for blind signal (source) separation, that is, the separation of source signals (audio, speech, image, geophysical, medical...) from a set of mixed signals. The term 'blind' indicates that there is little or no information about the signals or the mixing process. ICA was first introduced in 1986 by Herault and Jutten [1]. Comon provided the most widely used mathematical definition of ICA in 1994 [2]; we give a more detailed description of it in the next chapter. The first fast and efficient ICA algorithm was introduced by Bell and Sejnowski in 1995 [3]. From then on, many different ICA algorithms have been published. One of the most widely used, including in industrial applications (which we discuss in the last chapter), is the FastICA algorithm, developed by Hyvärinen and Oja in 1999 [4,5,6].

As the name implies, the basic goal of ICA is to estimate a transformation of an observed mixture of signals such that the resulting source components are minimally correlated, or as independent as possible in a probabilistic sense. Furthermore, the signals have to be non-Gaussian. For example, suppose we measure the signals $x_1(t)$, $x_2(t)$ of two microphones recording the speech of two persons (sources) $s_1(t)$ and $s_2(t)$. We assume that each source reaches both microphones at the same time (there is no time/phase delay). That means that the distance between the microphones has to be much smaller than the source signal wavelength. However, it is important that the microphones amplify signals coming from certain directions more than others (like our ears do), otherwise the measured signals $x_i(t)$ are all the same. So we can assume that each of these recorded

signals is a weighted sum (linear combination) of the speech signals. We can express this as a linear equation

$$\begin{pmatrix} x_1(t) \\ x_2(t) \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} s_1(t) \\ s_2(t) \end{pmatrix}, \qquad (1)$$

which holds at every timestamp. Here, both the sources and the matrix elements are unknown. The value of $a_{ij}$ depends on the relative contribution of the source signal $s_j$ to the recorded mixture $x_i$. An illustration of this example is shown in Fig. 1 and Fig. 2. These are, of course, not realistic speech signals, but they suffice for the explanation. The original (source) signals are shown in Fig. 1, and the mixed signals could look like those in Fig. 2. The problem is to recover the data in Fig. 1 using only the data in Fig. 2. If we knew the parameters $a_{ij}$, the linear equation (1) would be trivial to solve. The point, however, is that the parameters are unknown, which makes the problem considerably more difficult.

Figure 1. Original (source) signals. Let us denote the upper signal $s_1(t)$ and the lower signal $s_2(t)$. [4]

Figure 2. Linearly mixed signals. Here we denote the upper mixed signal $x_1(t)$ and the lower signal $x_2(t)$. In each mixture, the dominant source signal (the one with the larger absolute value of the weight) from Fig. 1 can be clearly recognized. Considering the definition in Eq. (1), one can also deduce that one of the mixing weights $a_{ij}$ is negative. [4]
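To make the mixing model (1) concrete, the following is a minimal numerical sketch that generates two illustrative sources and mixes them. The waveforms and the mixing matrix are hypothetical stand-ins, not the actual signals of Figs. 1 and 2:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1000)

# Two illustrative, non-Gaussian source signals:
# a sinusoid and a sawtooth-like wave.
s1 = np.sin(2 * np.pi * 5 * t)
s2 = 2 * ((3 * t) % 1) - 1          # sawtooth in [-1, 1]
S = np.vstack([s1, s2])             # shape (2, n_samples)

# Hypothetical mixing matrix a_ij; note the negative weight,
# echoing the sign remark in the caption of Fig. 2.
A = np.array([[0.7, 0.4],
              [0.3, -0.6]])
X = A @ S                           # observed mixtures x1(t), x2(t)
```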

The above example is an instance of the famous cocktail-party problem, a problem related to the cocktail-party effect. The cocktail-party effect is the phenomenon of being able to focus attention on a particular stimulus while filtering out other stimuli; for example, one is able to focus on a single conversation in a noisy room. However, this requires hearing with both ears. People with only one functioning ear are much more distracted by interfering noise than people with two healthy ears. The reason is the following: the auditory system (the sensory system for the sense of hearing) has to be able to localize and process at least two sound sources in order to extract the signal of one sound source out of a mixture of interfering sound sources. Here localizing means turning the head towards the direction of the source of interest and thereby amplifying the signal coming from it [7]. A similar condition occurs in ICA: the number of observed sound mixtures must be at least as large as the number of independent sound components. The algorithm alone, of course, cannot localize (and so amplify) certain signals; therefore only two microphones in a room with three simultaneously speaking people are insufficient to separate the sound signals of all three speakers. This brings us to the conditions and limits of ICA. In the next section we describe a generalized cocktail-party problem, and in order to solve it we give a detailed insight into the ideas and construction of ICA.

2 Principles of ICA estimation

2.1 Definition of ICA model

Here we only describe noiseless linear independent component analysis. Assume that we observe $n$ linear mixtures $x_1, \dots, x_n$ of $n$ independent components (sources) $s_1, \dots, s_n$. Then we can define the ICA model

$$\mathbf{x} = \mathbf{A}\mathbf{s}, \qquad (2)$$

where $\mathbf{A}$ is the mixing matrix. The ICA model (2) describes how the observed data are generated by a process of mixing the components (sources). We have now dropped the time index, as (2) holds at any given time instant. With the purpose of efficiently describing ICA, we work with random variables instead of proper time signals from here on. That means we regard each mixture $x_i$, as well as each independent component $s_i$, as a random variable instead of the amplitude of a signal at a certain time. We neglect measurement errors, which could (in case they exist) be added to the right side of (2); we assume that noise is already included in the sources, so there is also no noise term on the right side of (2), since, as we said, we are performing noiseless ICA. We also assume that the signals are centered (the random variables have zero mean, $E\{s_i\} = 0$) and that the variables are of unit variance, $E\{s_i^2\} = 1$.

Let us now illustrate the ICA model in statistical terms. Consider two independent components $s_1$, $s_2$ with the uniform distributions

$$p(s_i) = \begin{cases} \dfrac{1}{2\sqrt{3}}, & |s_i| \le \sqrt{3}, \\[4pt] 0, & \text{otherwise}. \end{cases} \qquad (3)$$

The range of values of this uniform distribution was chosen so as to make the mean zero and the variance equal to one, in accordance with the preceding paragraph. The joint probability density of $s_1$ and $s_2$ is then uniform on a square, as shown in Fig. 3. Now let us mix these two independent components with the following mixing matrix (taken from the illustration in [4]),

$$\mathbf{A} = \begin{pmatrix} 2 & 3 \\ 2 & 1 \end{pmatrix}. \qquad (4)$$

The mixing gives us two mixed variables, $x_1$ and $x_2$. The mixed data now have a uniform distribution on a parallelogram (also shown in Fig. 3), but the random components $x_1$ and $x_2$ are not independent any more. This can be clearly seen if we try to predict the value of one of them, say $x_2$, based on the value of the other, $x_1$: when $x_1$ attains its maximum or minimum value, this completely determines the value of $x_2$. That means they are not independent.

Figure 3. [LEFT] The joint distribution of the independent source components $s_1$ and $s_2$ with uniform distributions. Horizontal axis: $s_1$, vertical axis: $s_2$. [RIGHT] The joint distribution of the observed mixtures $x_1$ and $x_2$. The variables $x_1$ and $x_2$ are not independent any more: when $x_1$ takes its maximum or minimum value, $x_2$ is uniquely determined. Horizontal axis: $x_1$, vertical axis: $x_2$. [4]

Our main task is to reconstruct the sources $\mathbf{s}$ based on the observed $\mathbf{x}$. We have no information about the mixing process given by the matrix $\mathbf{A}$. However, if the matrix $\mathbf{A}$ is of full rank, a separation or unmixing matrix $\mathbf{W} = \mathbf{A}^{-1}$ exists. The unmixing process can then be written as

$$\mathbf{s} = \mathbf{W}\mathbf{x} = \begin{pmatrix} \mathbf{w}_1^T \\ \vdots \\ \mathbf{w}_n^T \end{pmatrix} \mathbf{x}. \qquad (5)$$

Here $\mathbf{w}_i^T$ are the rows of $\mathbf{W}$. We attempt to find an estimate of the separation matrix $\mathbf{W}$ so that we can reconstruct $\mathbf{s}$. In order to estimate the matrix $\mathbf{W}$, several additional requirements (conditions) should be fulfilled. The constraints for the validity and solvability of the blind signal separation problem with the ICA method are discussed in the next section.
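The uniform-source illustration and the unmixing step of Eq. (5) can be reproduced numerically. A minimal sketch follows; note that inverting $\mathbf{A}$ directly is possible only because we chose it ourselves, whereas in blind separation $\mathbf{W}$ must be estimated from the mixtures alone:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Independent sources, uniform on [-sqrt(3), sqrt(3)]:
# zero mean and unit variance, as in Eq. (3).
S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, n))

A = np.array([[2.0, 3.0],
              [2.0, 1.0]])       # mixing matrix of Eq. (4)
X = A @ S                        # scatter of (x1, x2) forms a parallelogram

# The sources are uncorrelated, the mixtures are not:
print(np.corrcoef(S)[0, 1])      # close to 0
print(np.corrcoef(X)[0, 1])      # clearly nonzero

# With A known, the separation matrix of Eq. (5) is its inverse
# and the sources are recovered exactly:
W = np.linalg.inv(A)
print(np.allclose(W @ X, S))     # True
```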

2.2 Constraints

Here we describe four important constraints for the ICA method to be valid.

1) The problem is well- or over-determined. The number of measured signals $n$ has to be equal to or greater than the number of source signals $m$ ($n \ge m$). Otherwise it is impossible to separate the sources using ICA alone.

2) Mixing is instantaneous. No delay is permitted between the times at which each source signal reaches the sensors/microphones/observers/recording stations located in different places. In order to remove (time) delays, the recorded signals are aligned using the cross-correlation function (a sliding dot product, similar to convolution). For continuous signals $f$ and $g$ it is defined as

$$(f \star g)(\tau) = \int_{-\infty}^{\infty} f(t)\, g(t + \tau)\, dt. \qquad (6)$$

If $f$ and $g$ are functions that differ only by an unknown shift along the time axis, the cross-correlation can be used to find how much $g$ must be shifted along the time axis to align it with $f$: when the functions match, the value of $(f \star g)(\tau)$ is maximized. (A numerical sketch of this alignment is given at the end of this subsection.)

3) Statistical independence of the sources. The independence of variables is mathematically expressed by the statement that their joint probability density factorizes as $p(s_1, s_2, \dots, s_n) = p_1(s_1)\, p_2(s_2) \cdots p_n(s_n)$. Basically, the variables are said to be independent if information on the value of $s_i$ does not give any information on the value of $s_j$ ($i \ne j$), and vice versa. A pair of statistically independent variables $(s_i, s_j)$ is uncorrelated and has covariance $\mathrm{cov}(s_i, s_j) = 0$. Uncorrelatedness is a weaker form of independence: if the variables are independent, they are uncorrelated, but the opposite is not necessarily true. The source signals must be statistically independent or as independent as possible. This is difficult to verify in advance, because the distribution of the data is not available in real-world problems.

4) The source signals must be non-Gaussian (except at most one signal) (Fig. 4). This is the fundamental restriction for ICA to be possible. Here is an explanation of why more than one Gaussian variable makes ICA impossible. Assume that the mixing matrix is orthogonal and that the sources $s_1$ and $s_2$ are Gaussian. Then the mixtures $x_1$ and $x_2$ are Gaussian too, and their joint probability density is again Gaussian:

$$p(x_1, x_2) = \frac{1}{2\pi} \exp\!\left( -\frac{x_1^2 + x_2^2}{2} \right). \qquad (7)$$

This probability density is perfectly symmetric (Fig. 4), so it does not contain any information on the directions of the columns of the mixing matrix $\mathbf{A}$, which therefore cannot be estimated. One can prove that any orthogonal transformation of the Gaussian variables $(x_1, x_2)$ has exactly the same distribution as $(x_1, x_2)$, and that $x_1$ and $x_2$ are still independent. Thus, in the case of more than one Gaussian variable, we can only estimate $\mathbf{A}$ up to an orthogonal transformation, and the matrix is not identifiable. But if only one independent component is Gaussian, ICA is still possible. Non-Gaussianity is the key to estimating the ICA model, as we will see in the next section.

Figure 4. [LEFT] A comparison between the probability density functions of a Gaussian and a non-Gaussian signal. [RIGHT] The joint multivariate distribution of two independent Gaussian variables (Eq. 7). Horizontal axis: $x_1$, vertical axis: $x_2$.
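As referenced in constraint 2, delay estimation via cross-correlation is easy to sketch numerically. The following is a discrete analogue of Eq. (6), with np.correlate playing the role of the integral and a circular shift (np.roll) standing in for a true delay:

```python
import numpy as np

rng = np.random.default_rng(2)
n, true_shift = 500, 37

f = rng.standard_normal(n)
g = np.roll(f, true_shift)       # g is (approximately) f delayed by 37 samples

# Slide g against f and record the dot product at every lag;
# the peak of the cross-correlation gives the alignment.
corr = np.correlate(g, f, mode="full")
lag = np.argmax(corr) - (n - 1)
print(lag)                       # 37
```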

2.3 Estimating ICA model by maximizing non-gaussianity

First, we denote a one-dimensional projection of the observed signals $\mathbf{x}$ by

$$y = \mathbf{w}^T \mathbf{x} = \sum_i w_i x_i, \qquad (8)$$

where $\mathbf{w}$ is a vector to be determined. If $\mathbf{w}$ were one of the rows of the (generalized) inverse of $\mathbf{A}$, i.e. of the separation matrix $\mathbf{W}$, this linear combination would actually equal one of the independent components, $y = s_i$. The question is how to determine $\mathbf{w}$ so that it equals one of the rows of the unmixing matrix. From (2) and (8) we see that the following relation holds:

$$y = \mathbf{w}^T \mathbf{x} = \mathbf{w}^T \mathbf{A} \mathbf{s} = \mathbf{z}^T \mathbf{s}, \qquad \mathbf{z} = \mathbf{A}^T \mathbf{w}. \qquad (9)$$

That means that $y$ is a linear combination of the $s_i$ with weights $z_i$. Now we use a statement that follows from the Central Limit Theorem: a sum of two independent random variables has a distribution that is closer to Gaussian than the distribution of either of the two original random variables. In our case, this implies that $y = \mathbf{z}^T \mathbf{s}$ is more Gaussian than any of the $s_i$ in the linear combination (9). Therefore $y$ becomes least Gaussian when it in fact equals one of the independent components (which means that only one of the elements of $\mathbf{z}$ is nonzero). We would thus take as $\mathbf{w}$ a vector that minimizes the Gaussianity, or maximizes the non-Gaussianity, of $\mathbf{w}^T \mathbf{x}$ [4]. In practice, we start with some vector $\mathbf{w}$, compute the direction in which the non-Gaussianity of $\mathbf{w}^T \mathbf{x}$ grows most strongly, based on the observed mixture vector $\mathbf{x}$, and use a gradient method (for example Newton's method) to find a new vector $\mathbf{w}$. The whole algorithm is beyond the scope of this seminar but can be found in [5].
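The Central Limit Theorem statement above can be checked numerically. The sketch below uses the excess kurtosis, a non-Gaussianity measure formally introduced in Eq. (10) below, and shows that the sum of two independent uniform variables is measurably closer to Gaussian than either variable alone:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

def excess_kurtosis(y):
    """E{y^4} - 3 for a standardized sample; zero for a Gaussian."""
    y = (y - y.mean()) / y.std()
    return np.mean(y**4) - 3.0

s1 = rng.uniform(-np.sqrt(3), np.sqrt(3), n)
s2 = rng.uniform(-np.sqrt(3), np.sqrt(3), n)
y = s1 + s2                     # a simple linear combination of the sources

print(excess_kurtosis(s1))      # about -1.2: clearly non-Gaussian
print(excess_kurtosis(y))       # about -0.6: closer to 0, i.e. more Gaussian
```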

To maximize non-Gaussianity, we first need a measure of non-Gaussianity. The classical measure (favored for its computational simplicity) of the non-Gaussianity of a continuous random variable $y$ is the absolute value of the kurtosis

$$\mathrm{kurt}(y) = E\{y^4\} - 3\left(E\{y^2\}\right)^2, \qquad (10)$$

which is a fourth-order cumulant of the probability distribution. For a Gaussian random variable, the kurtosis is zero. In practice (when estimated from a measured sample), however, kurtosis is not robust, as it is very sensitive to outliers: its value may depend strongly on only a few observations in the tails of the distribution, which may not be representative at all.

The difference between the Gaussian distribution and the distribution of $y$ can instead be measured by the Shannon (information) entropy

$$H(y) = -\int p(y) \log p(y)\, dy, \qquad (11)$$

where $p(y)$ is the probability density function of $y$. The more unpredictable the variable is, the larger its entropy. An important result of information theory is that a Gaussian variable has the largest entropy among all random variables of equal variance [8]. This means entropy can be used as a measure of non-Gaussianity. It turns out to be more practical to work with the negentropy

$$J(y) = H(y_{\mathrm{gauss}}) - H(y), \qquad (12)$$

where $y_{\mathrm{gauss}}$ is a Gaussian random variable with the same variance as $y$; the negentropy is zero for a Gaussian variable and positive otherwise. The problem with using negentropy as a measure of non-Gaussianity is that $p(y)$ in (11) is not known in advance, so negentropy is computationally very difficult to evaluate. We therefore use approximations of negentropy, most often

$$J(y) \propto \left[ E\{G(y)\} - E\{G(\nu)\} \right]^2, \qquad (13)$$

where $G$ is a non-quadratic function of $y$ and $\nu$ is a standardized Gaussian variable. The following choices of $G$ have proved useful [4]: $G_1(u) = \frac{1}{a_1} \log \cosh(a_1 u)$ and $G_2(u) = -\exp(-u^2/2)$, where $1 \le a_1 \le 2$.
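Putting the pieces together, the following is a minimal one-unit fixed-point iteration in the spirit of FastICA [5], using the contrast $G_1$ with $a_1 = 1$ (so $g(u) = \tanh u$). It assumes the mixtures have been centered and whitened first, a standard preprocessing step; this is an illustrative sketch, not the complete algorithm of [5]:

```python
import numpy as np

def whiten(X):
    """Center the mixtures and transform them to unit covariance."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    return (E @ np.diag(d ** -0.5) @ E.T) @ X

def one_unit_ica(Z, n_iter=200, seed=0):
    """Fixed-point iteration maximizing the negentropy proxy of Eq. (13)
    with G(u) = log cosh(u), i.e. g(u) = tanh(u), g'(u) = 1 - tanh(u)^2.
    Z must be whitened, shape (n_components, n_samples)."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(Z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        wz = w @ Z
        # Update rule: w <- E{z g(w^T z)} - E{g'(w^T z)} w, then renormalize.
        w_new = (Z * np.tanh(wz)).mean(axis=1) - (1 - np.tanh(wz) ** 2).mean() * w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < 1e-10:   # converged (up to sign)
            return w_new
        w = w_new
    return w

# Usage with the mixtures X from the earlier sketches:
#   Z = whiten(X); w = one_unit_ica(Z); y = w @ Z   # one estimated component
```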

2.4 Ambiguities of ICA

By looking at (2), it is obvious that the following ambiguities hold (Fig. 5):

1) The variances (energies) of the independent components cannot be determined. The vector of sources $\mathbf{s}$ and the mixing matrix $\mathbf{A}$ are both unknown, so any scalar multiplier of a source $s_i$ could always be cancelled by dividing the corresponding column $\mathbf{a}_i$ of $\mathbf{A}$ by the same scalar. The magnitudes (amplitudes) of the independent components are therefore unknown.

2) Sign ambiguity. We can multiply any independent component by $-1$ (phase reversal) without affecting the model.

3) The order of the independent components cannot be determined. The estimated source signals may be recovered in a different order. The reason is again that both $\mathbf{s}$ and $\mathbf{A}$ are unknown.

Figure 5. The estimates of the original source signals shown in Fig. 1, made using only the observed signals in Fig. 2. Considering the above-mentioned ambiguities of the ICA model, the original signals were estimated very accurately, up to multiplicative signs. [4]

3 Applications of ICA

In this chapter we describe some applications of ICA. The most basic application, the cocktail-party problem as an example of blind source separation, was already explained in the Introduction. There are numerous other practical applications of ICA in different research domains, such as:

- Biomedical signal processing (magnetoencephalography, electroencephalography, electrocardiography).
- Geophysical signal processing. ICA was applied to seismic signals of tremors recorded at Stromboli volcano [10].
- Meteorological signal processing. For example, an ICA-based study of the relation between the El Niño southern oscillation (ENSO) and the Atlantic tropical sea surface temperature (SST) anomaly has been made [11].
- Sonar and radar signal processing.
- Telecommunications: separation of the user's own signal from the interfering signals of other users in CDMA (Code-Division Multiple Access) mobile communications. CDMA is used in the 3G mobile phone standard [12].
- Image processing (image denoising, reflection cancelling, separation of light sources, face recognition, feature extraction).
- Watermarking [13].
- Clustering or cluster analysis.
- Text mining. Recently it has been found that ICA is a powerful tool for analyzing text document data as well, if the text documents are represented in a suitable numerical form. For example, the purpose of one study was to find the topics of documents and to group them accordingly.
- Other applications where factor analysis and principal component analysis are currently used.

We discuss some of the above-mentioned applications in the following subsections.

3.1 Electrocardiography

Routinely recorded electrocardiograms (ECGs) are often corrupted by different types of artifacts. The term 'artifact' comes from the word 'artificial' and means that the signal is recorded from sources other than the electrical signals of the heart. Examples of artifacts in ECGs are electrical interference from a nearby electrical appliance and the patient's muscle tremors resulting from movement, speaking, deep respiration, etc. The presence of artifacts in ECGs is very common, and knowledge of them is necessary to prevent misinterpretation of the heart's rhythm, which can lead to wrong diagnoses. Even better is to remove noise and artifacts from the ECGs almost completely. This can be done with ICA [14]. An important difficulty here is one of the ICA ambiguities, the undetermined order of the independent components: a trained physician is still needed to interpret the deduced independent components manually.

Figure 6. [ABOVE] Electrocardiogram of a pregnant woman. A weak and noisy signal of the child's heartbeat, mixed with the signal of the mother's heartbeat and other artifacts, is visible in several of the recorded channels. [BELOW] Independent components (sources) of the mixed signals in the picture above. The child's heartbeat can be seen in two of the independent components. It is faster than the mother's heartbeat, which is displayed in all the other ICs except one, an artifact probably generated by respiration during the measurement. [9]

3.2 Reflection Cancelling

When we take a picture through a window, the observed image is often a linear superposition of two images: the image of the scene beyond the window and the image of the scene reflected by the window. In such cases we cannot view the scene clearly, due to reflections from dielectric surfaces (e.g. glass). Since reflections off a semi-reflecting medium such as a glass window are partially polarized at most angles, the strength of the reflection can be manipulated with a polarizer. However, the reflection can only be eliminated completely when the viewing angle and the incident light direction are in a particular configuration, given by the Brewster angle. By taking images through a linear polarizer at two or more orientations and applying ICA to them, we can separate the reflections completely [15].

Let us take the example of a photograph of a painting behind a glass window. The final image is a linear combination of the light reflected by the painting and the light reflected directly by the glass in front of the painting (Fig. 7). The amount of light at a single point of the image can be expressed as

$$I_1 = a_1 P + b_1 R, \qquad (14)$$

where $P$ is the amount of light contributed by the painting, $R$ the amount contributed by the reflection, and $a_1$, $b_1$ are multiplicative constants. By changing the orientation of the polarizer, we obtain two new multiplicative constants $a_2$ and $b_2$, because the amount of light from each of the sources changes. That means we have a linear mixing problem where the observations are $[I_1, I_2]^T$, the independent components are $[P, R]^T$, and the mixing matrix contains the polarizer constants:

$$\begin{pmatrix} I_1 \\ I_2 \end{pmatrix} = \begin{pmatrix} a_1 & b_1 \\ a_2 & b_2 \end{pmatrix} \begin{pmatrix} P \\ R \end{pmatrix}. \qquad (15)$$

We then apply ICA to (15) in order to obtain the independent components $P$ and $R$. The results are shown in Fig. 8.

Figure 7. [LEFT] Renoir's On the Terrace, Sheila, and Sheila's reflection. [RIGHT] A photograph of a painting behind glass contains a superposition of the light that is reflected by the painting and the light that is reflected directly off the glass [15].
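A hedged numerical sketch of this setup, with synthetic stand-ins for the two photographs and scikit-learn's FastICA implementation doing the separation; the polarizer constants $a_i$, $b_i$ here are hypothetical:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(4)
h, w = 64, 64

# Synthetic stand-ins for the painting P and the reflection R
# (any two sufficiently independent, non-Gaussian images would do).
P = rng.uniform(0.0, 1.0, (h, w))
R = rng.exponential(1.0, (h, w))

# Two exposures through the polarizer at different orientations, Eq. (15).
I1 = 0.8 * P + 0.3 * R
I2 = 0.4 * P + 0.7 * R

# Each pixel is one sample of the two-dimensional mixture vector.
X = np.column_stack([I1.ravel(), I2.ravel()])   # shape (h * w, 2)

S = FastICA(n_components=2, random_state=0).fit_transform(X)
P_est = S[:, 0].reshape(h, w)
R_est = S[:, 1].reshape(h, w)
# Up to the order, sign and scale ambiguities of Sec. 2.4, P_est and
# R_est recover the painting and the reflection.
```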

Figure 8. [LEFT] A pair of images of Renoir's On the Terrace with a reflection of Sheila, photographed through a linear polarizer at orthogonal orientations. [RIGHT] The pair of recovered ICs [15].

4 Conclusions

We have seen that ICA is a very general-purpose and widely applicable computational method with great potential for future use. For now, the ICA method is restricted to linear and only mildly nonlinear problems. However, research in the field of independent component analysis is still very active, which gives hope for further improvements of the method and its algorithmic implementations.

References

[1] J. Herault and C. Jutten, Signal Processing 24, 1-10 (1986).
[2] P. Comon, Signal Processing 36, 287-314 (1994).
[3] A.J. Bell and T.J. Sejnowski, Neural Computation 7, 1129-1159 (1995).
[4] A. Hyvärinen and E. Oja, Neural Networks 13, 411-430 (2000).
[5] A. Hyvärinen, IEEE Trans. Neural Networks 10, 626-634 (1999).
[6] A. Hyvärinen, Neural Computing Surveys 2, (1999).
[7] J.B. Fritz and M. Elhilali, Curr. Opin. Neurobiol. 17, (2007).
[8] A. Papoulis, Probability, Random Variables and Stochastic Processes, 3rd edn. (McGraw-Hill, New York, 1991).
[9] S. Širca and A. Horvat, Computational Methods for Physicists (Springer-Verlag, 2012).
[10] A. Ciaramella, Nonlinear Processes in Geophysics 11, (2004).
[11] F. Aires, A. Chedin and J.P. Nadal, J. Geophys. Res. 105 (D13), (2000).
[12] T. Ristaniemi and J. Joutsensalo, Signal Processing 82, (2002).
[13] S. Bounkong, D. Lowe and D. Saad, Journal of Machine Learning Research 4, (2003).
[14] T. He, G. Clifford and L. Tarassenko, Neural Comput. & Applic. 15, (2006).
[15] H. Farid and E.H. Adelson, IEEE Comp. Soc. Conf. on Comp. Vis. and Patt. Rec. 1, (1999).
