Independent Component Analysis (ICA)
Data Mining — SECTION 5A: ICA (Mills 2017)

Independent Component Analysis (ICA) (see "Independent Component Analysis: Algorithms and Applications", Hyvärinen and Oja (2000)) is a variation of Principal Component Analysis (PCA) and a strong competitor to Factor Analysis. ICA attempts to decompose complex data into independent subparts; this is also known as the blind source separation problem or the cocktail party problem. It attempts to determine the source signals S given only the observed mixtures X. (It is necessary to assume independence of the source signals, i.e. the value of one signal gives no information about the others.)

Using the singular value decomposition X = U D V^T and writing S = sqrt(N) U and A^T = D V^T / sqrt(N), we can write X = S A^T, so each column of X is a linear combination of the columns of S. Since U is orthogonal, and assuming that the columns of X each have mean zero, the columns of S have zero mean, are uncorrelated, and have unit variance.

We have

    X_i = sum_{j=1}^p a_ij S_j,   i = 1, ..., p

or, writing X and S as column vectors,

    X = A S = A R^T R S = A* S*

for any orthogonal p x p matrix R.

ICA assumes the S_i are statistically independent (which determines all the cross moments) rather than merely uncorrelated (which determines only the second-order cross moments). Independence implies uncorrelatedness, so ICA constrains the estimation procedure to give uncorrelated estimates of the independent components; this reduces the number of free parameters and simplifies the problem. The extra moment conditions identify A uniquely.

NOTE: In Factor Analysis with q < p we have

    X_i = sum_{j=1}^q a_ij S_j + e_i,   i = 1, ..., p

or X = A S + e, where the S_j are the common factors and the e_i represent unique factors. ICA can be viewed as another Factor Analysis rotation method (just like varimax or quartimax); it starts essentially from a Factor Analysis solution and looks for rotations that lead to independent components.
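The whitening step described above can be sanity-checked numerically. The sketch below is purely illustrative and not from the notes (it is Python rather than the notes' R, the signal mix weights are made up, and the 2x2 closed-form eigendecomposition replaces the SVD): it mixes a sine and a sawtooth, whitens the mixtures, and confirms the whitened covariance is the identity.

```python
import math

def cov2(x, y):
    """Means and the 2x2 (population) covariance of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cxx = sum((a - mx) ** 2 for a in x) / n
    cyy = sum((b - my) ** 2 for b in y) / n
    cxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return cxx, cxy, cyy, mx, my

# Two non-Gaussian sources, as in the notes: a sine and a sawtooth.
t = range(1000)
s1 = [math.sin(i / 20) for i in t]
s2 = [((i % 200) - 100) / 100 for i in t]

# Mix them with an arbitrary (non-orthogonal) 2x2 matrix.
x1 = [0.6 * a + 0.4 * b for a, b in zip(s1, s2)]
x2 = [0.3 * a + 0.9 * b for a, b in zip(s1, s2)]

# Whiten: with C = E D E^T, set Z = D^(-1/2) E^T (X - mean), so Cov(Z) = I.
cxx, cxy, cyy, mx, my = cov2(x1, x2)
theta = 0.5 * math.atan2(2 * cxy, cxx - cyy)        # eigenvector angle of C
c, s = math.cos(theta), math.sin(theta)
d1 = cxx * c * c + 2 * cxy * s * c + cyy * s * s    # eigenvalues of C
d2 = cxx * s * s - 2 * cxy * s * c + cyy * c * c
z1 = [(c * (a - mx) + s * (b - my)) / math.sqrt(d1) for a, b in zip(x1, x2)]
z2 = [(-s * (a - mx) + c * (b - my)) / math.sqrt(d2) for a, b in zip(x1, x2)]

wxx, wxy, wyy, _, _ = cov2(z1, z2)
print(round(wxx, 6), round(wxy, 6), round(wyy, 6))  # identity covariance
```

After whitening, any remaining freedom in the model is exactly the orthogonal rotation R discussed above, which is what ICA goes on to estimate.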
In Factor Analysis, the S_j and e_i are generally assumed to be Gaussian; orthogonal transformations A S of Gaussians are still Gaussian, so we can estimate the model only up to an orthogonal transformation. Thus A is not identifiable for independent Gaussian components. (If just one component is Gaussian, the ICA model can still be estimated.)

We actually do not want Gaussian source variables (we allow at most one Gaussian source), because if the S_i are Gaussian and the mixing matrix A is orthogonal, the X_i will also be Gaussian and uncorrelated with unit variance. The joint density is then completely symmetric, so it carries no information about the directions of the columns of the mixing matrix A, and A cannot be estimated.

We avoid this identifiability problem by assuming the S_i are independent and non-Gaussian, so (because A is orthogonal)

    S = A^{-1} X = A^T X.

We assume X has been whitened (i.e. sphered) via the SVD so that Cov(X) = I; then A is orthogonal, and solving the ICA problem means finding an orthogonal A such that the components of S = A^T X are independent and non-Gaussian.

Writing Y = W^T X, where X = A S, and setting Z = A^T W, we obtain

    Y = W^T X = W^T A S = (A^T W)^T S = Z^T S,

which can be more Gaussian than any of the S_i, and is least Gaussian when it equals one of the S_i (i.e. when only one of the elements of Z is nonzero). We want to find W so as to maximize the non-Gaussianity of Y; this corresponds (in the transformed coordinate system) to a Z which has only one nonzero component. Thus Y = W^T X = Z^T S is one of the independent components.

Equivalently, finding the A that minimizes the mutual information I(A^T X) means looking for the orthogonal transformation that gives the most independence between its components. This is equivalent to minimizing the sum of the entropies of the separate components of Y, which is equivalent to maximizing their departures from Gaussianity (since, for fixed variance, Gaussian variables have maximum entropy).
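The claim that mixtures are "more Gaussian" than the sources can be made concrete with excess kurtosis, which is 0 for a Gaussian. The following check is illustrative only (Python rather than the notes' R; the equal 1/sqrt(2) weighting is an arbitrary choice): both a sine and a sawtooth have strongly negative excess kurtosis, while their mixture lands much closer to 0.

```python
import math

def excess_kurtosis(x):
    """Fourth standardized moment minus 3; equals 0 for a Gaussian."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    return m4 / var ** 2 - 3.0

t = range(1000)
s1 = [math.sin(i / 20) for i in t]           # sine: excess kurtosis near -1.5
s2 = [((i % 200) - 100) / 100 for i in t]    # sawtooth: near -1.2
mix = [(a + b) / math.sqrt(2) for a, b in zip(s1, s2)]

k1, k2, km = map(excess_kurtosis, (s1, s2, mix))
# The mixture's kurtosis is nearer 0 than either source's:
# adding independent signals pushes the sum toward Gaussianity (CLT).
print(round(k1, 2), round(k2, 2), round(km, 2))
```

This is why kurtosis-like contrast functions (logcosh in the fastICA runs below) can steer the search: rotating back toward a single source maximizes the departure from Gaussianity.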
There are two indeterminacies:

1. We cannot determine the variances of the independent components. Since S and A are both unknown, a scalar multiple of one S_i can be cancelled out by dividing the corresponding column a_i of A by the same scalar. Thus we fix the magnitudes of the independent components: since they are random variables, we assume each has unit variance and, since they have been centered, this means E(S_i^2) = 1. Note that we can multiply an independent component by -1 without affecting the model, so there is also an ambiguity of sign.

2. We cannot determine the order of the independent components. Since S and A are both unknown, we are free to change the order of the terms, setting any one of them first. Thus a permutation matrix P and its inverse can be substituted in the model to give X = (A P^{-1})(P S), where A P^{-1} is the new unknown mixing matrix to be solved for by ICA and the elements of P S are the original independent S_i in a different (i.e. permuted) order.

Read some required files:

    drive <- "D:"
    code.dir <- paste(drive, "DATA/Data Mining R-Code", sep = "/")
    data.dir <- paste(drive, "DATA/Data Mining Data", sep = "/")
    source(paste(code.dir, "BorderHist.r", sep = "/"))
    source(paste(code.dir, "WaveIO.r", sep = "/"))
    library(fastICA)

We will create and display two signals (Figure 16):

    S.1 <- sin((1:1000)/20)
    S.2 <- rep((((1:200) - 100)/100), 5)
    S <- cbind(S.1, S.2)
    plot(S.1)
    plot(S.2)
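Both indeterminacies are easy to verify numerically. The sketch below is illustrative only (Python, with made-up matrix entries, and using the column convention X = A S rather than the notes' X = S A): rescaling a source while dividing the matching column of A, or permuting the sources together with the columns of A, leaves the observed X unchanged.

```python
# S holds 2 sources (rows) x 3 samples; A mixes them: X = A S.
S = [[1.0, -2.0, 0.5],
     [0.3,  0.8, -1.0]]
A = [[0.6, 0.4],
     [0.3, 0.9]]

def matmul(A, B):
    """Plain nested-loop matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

X = matmul(A, S)

# Indeterminacy 1: scale source 1 by c, divide column 1 of A by c.
c = 2.5
S_scaled = [[c * v for v in S[0]], S[1]]
A_scaled = [[A[0][0] / c, A[0][1]], [A[1][0] / c, A[1][1]]]
X_scaled = matmul(A_scaled, S_scaled)

# Indeterminacy 2: swap the sources and swap the columns of A.
S_perm = [S[1], S[0]]
A_perm = [[A[0][1], A[0][0]], [A[1][1], A[1][0]]]
X_perm = matmul(A_perm, S_perm)

same = lambda P, Q: all(abs(p - q) < 1e-12
                        for rp, rq in zip(P, Q) for p, q in zip(rp, rq))
print(same(X, X_scaled), same(X, X_perm))   # True True
```

This is why ICA output is only defined up to sign, scale, and ordering: any algorithm must pin the scale down (unit variance) and accept an arbitrary order.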
Figure 16. Original signals
and rotate them:

    a <- pi/4
    A <- matrix(c(cos(a), sin(a), -sin(a), cos(a)), 2, 2)
    X <- S %*% A
    plot(X[,1])
    plot(X[,2])

Figure 17. Rotated signals

We display the original and rotated signals with their histograms:

    border.hist(S.1, S.2)
    border.hist(X[,1], X[,2])

Figure 18. Border histograms of the original (left) and rotated signals.
Now start with the mixed signals and observe what happens to the histograms as we rotate the axes onto which the signals are projected:

    b <- pi/36
    W <- matrix(c(cos(b), -sin(b), sin(b), cos(b)), 2, 2)
    XX <- X
    for (i in 1:9) {
      XX <- XX %*% W
      border.hist(XX[,1], XX[,2])
      readline("Press Enter...")
    }
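Because the mixing in this two-signal demo is itself a rotation, rotating back through the mixing angle must recover the sources exactly: the inverse of an orthogonal matrix is its transpose. A quick illustrative check of that fact (Python, not from the notes):

```python
import math

t = range(1000)
s1 = [math.sin(i / 20) for i in t]
s2 = [((i % 200) - 100) / 100 for i in t]

a = math.pi / 4
# Mix: rotate each (s1, s2) pair through angle a (an orthogonal map).
x1 = [math.cos(a) * u - math.sin(a) * v for u, v in zip(s1, s2)]
x2 = [math.sin(a) * u + math.cos(a) * v for u, v in zip(s1, s2)]

# The inverse of a rotation is its transpose: rotate back through -a.
r1 = [ math.cos(a) * u + math.sin(a) * v for u, v in zip(x1, x2)]
r2 = [-math.sin(a) * u + math.cos(a) * v for u, v in zip(x1, x2)]

err = max(max(abs(p - q) for p, q in zip(r1, s1)),
          max(abs(p - q) for p, q in zip(r2, s2)))
print(err < 1e-12)   # True: sources recovered exactly
```

The loop above searches over candidate rotations in exactly this family; ICA's job is to find the one that makes the projections least Gaussian, without knowing the mixing angle.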
Figure 19. Effect of rotating the projection plane
We see that for the fully mixed signals the histograms appear nearly Gaussian. As we move through the different projections the histograms move away from normality. The resulting signals are:

    plot(XX[,1])
    plot(XX[,2])

Figure 20. Result of the ICA

Now consider what happens for 3 signals: a sine function, a sawtooth, and a pair of exponentials.

    S.1 <- sin((1:1000)/20)
    S.2 <- rep((((1:200) - 100)/100), 5)
    S.3 <- rep(c(exp(seq(0, .99, .01)), -exp(seq(0, .99, .01))), 5)
    S <- cbind(S.1, S.2, S.3)
    A <- matrix(runif(9), 3, 3)   # Set a random mixing matrix
    X <- S %*% A
Do an ICA on the mixed data:

    a <- fastICA(X, 3, alg.typ = "parallel", fun = "logcosh", alpha = 1,
                 method = "R", row.norm = FALSE, maxit = 200, tol = 1e-04,
                 verbose = TRUE)
    Whitening
    Symmetric FastICA using logcosh approx. to neg-entropy function
    Iteration 1 tol = ...
    Iteration 2 tol = ...
    Iteration 3 tol = ...
    Iteration 4 tol = ...e-06

We then plot the original, mixed, and recovered data:

    oldpar <- par(mfcol = c(3, 3), mar = c(2, 2, 2, 1))
    plot(1:1000, S[,1], type = "l", main = "Original Signals", xlab = "", ylab = "")
    for (i in 2:3) plot(1:1000, S[,i], type = "l", xlab = "", ylab = "")
    plot(1:1000, X[,1], type = "l", main = "Mixed Signals", xlab = "", ylab = "")
    for (i in 2:3) plot(1:1000, X[,i], type = "l", xlab = "", ylab = "")
    plot(1:1000, a$S[,1], type = "l", main = "ICA source estimates", xlab = "", ylab = "")
    for (i in 2:3) plot(1:1000, a$S[,i], type = "l", xlab = "", ylab = "")
    par(oldpar)

Figure 21. Original, mixed and recovered
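The fastICA call hides the actual estimation. As a rough sketch of what the "parallel"/"logcosh" options do, the following Python code runs the one-unit fixed-point update w <- E[x g(w'x)] - E[g'(w'x)] w with g = tanh and checks that the recovered projection matches one source up to sign. This is an illustration under simplifying assumptions, not the package's implementation: the sources are standardized and exactly decorrelated by construction, so the orthogonal mix is already white and no separate whitening pass is needed.

```python
import math

def standardize(x):
    """Center to zero mean, scale to unit (population) variance."""
    n = len(x)
    m = sum(x) / n
    sd = math.sqrt(sum((v - m) ** 2 for v in x) / n)
    return [(v - m) / sd for v in x]

t = range(1000)
s1 = standardize([math.sin(i / 20) for i in t])
raw = standardize([((i % 200) - 100) / 100 for i in t])
# Remove the small in-sample correlation with s1, so the sources are
# exactly uncorrelated and an orthogonal mix leaves the data white.
r = sum(a * b for a, b in zip(s1, raw)) / len(raw)
s2 = standardize([b - r * a for a, b in zip(s1, raw)])

ang = 0.7   # arbitrary mixing rotation
x = [(math.cos(ang) * u - math.sin(ang) * v,
      math.sin(ang) * u + math.cos(ang) * v) for u, v in zip(s1, s2)]
n = len(x)

# One-unit fixed point, g = tanh: w <- E[x g(w.x)] - E[g'(w.x)] w, normalize.
w = (1.0, 0.0)
for _ in range(50):
    g = [math.tanh(w[0] * u + w[1] * v) for u, v in x]
    Egp = sum(1 - gi * gi for gi in g) / n        # E[g'(w.x)]
    w_new = (sum(u * gi for (u, _), gi in zip(x, g)) / n - Egp * w[0],
             sum(v * gi for (_, v), gi in zip(x, g)) / n - Egp * w[1])
    nrm = math.hypot(*w_new)
    w = (w_new[0] / nrm, w_new[1] / nrm)

y = [w[0] * u + w[1] * v for u, v in x]
corr1 = abs(sum(p * q for p, q in zip(y, s1)) / n)
corr2 = abs(sum(p * q for p, q in zip(y, s2)) / n)
print(round(max(corr1, corr2), 3))   # near 1: one source recovered up to sign
```

The "parallel" (symmetric) variant the notes use runs this update for all rows of W at once and re-orthogonalizes them each iteration; the idea per component is the same.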
Repeat the process with four signals:

    S.1 <- sin((1:1000)/20)
    S.2 <- rep((((1:200) - 100)/100), 5)
    s.3 <- tan(seq(-pi/2 + .1, pi/2 - .1, .0118))
    S.3 <- rep(s.3, 4)
    S.4 <- rep(c(exp(seq(0, .99, .01)), -exp(seq(0, .99, .01))), 5)
    S <- cbind(S.1, S.2, S.3, S.4)
    (A <- matrix(runif(16), 4, 4))
         [,1] [,2] [,3] [,4]
    [1,]  ...
    [2,]  ...
    [3,]  ...
    [4,]  ...
    X <- S %*% A
    a <- fastICA(X, 4, alg.typ = "parallel", fun = "logcosh", alpha = 1,
                 method = "R", row.norm = FALSE, maxit = 200, tol = 1e-04,
                 verbose = TRUE)
    Centering
    Whitening
    Symmetric FastICA using logcosh approx. to neg-entropy function
    Iteration 1 tol = ...
    Iteration 2 tol = ...
    Iteration 3 tol = ...
    Iteration 4 tol = ...
    Iteration 5 tol = ...e-05

    oldpar <- par(mfcol = c(4, 3), mar = c(2, 2, 2, 1))
    plot(1:1000, S[,1], type = "l", main = "Original Signals", xlab = "", ylab = "")
    for (i in 2:4) plot(1:1000, S[,i], type = "l", xlab = "", ylab = "")
    plot(1:1000, X[,1], type = "l", main = "Mixed Signals", xlab = "", ylab = "")
    for (i in 2:4) plot(1:1000, X[,i], type = "l", xlab = "", ylab = "")
    plot(1:1000, a$S[,1], type = "l", main = "ICA source estimates", xlab = "", ylab = "")
    for (i in 2:4) plot(1:1000, a$S[,i], type = "l", xlab = "", ylab = "")
    par(oldpar)
Figure 22.
For this example we will look at three mixtures of the 4 signals (note the warning messages):

    A <- matrix(runif(12), 4, 3)
    X <- S %*% A
    a <- fastICA(X, 4, alg.typ = "parallel", fun = "logcosh", alpha = 1,
                 method = "R", row.norm = FALSE, maxit = 200, tol = 1e-04,
                 verbose = TRUE)
    n.comp is too large
    n.comp set to 3
    Centering
    Whitening
    Symmetric FastICA using logcosh approx. to neg-entropy function
    Iteration 1 tol = ...
    Iteration 2 tol = ...
    Iteration 3 tol = ...e-05

    oldpar <- par(mfcol = c(4, 3), mar = c(2, 2, 2, 1))
    plot(1:1000, S[,1], type = "l", main = "Original Signals", xlab = "", ylab = "")
    for (i in 2:4) plot(1:1000, S[,i], type = "l", xlab = "", ylab = "")
    plot(1:1000, X[,1], type = "l", main = "Mixed Signals", xlab = "", ylab = "")
    for (i in 2:3) plot(1:1000, X[,i], type = "l", xlab = "", ylab = "")
    plot(0, type = "n")   # Dummy to fill
    plot(1:1000, a$S[,1], type = "l", main = "ICA source estimates", xlab = "", ylab = "")
    for (i in 2:3) plot(1:1000, a$S[,i], type = "l", xlab = "", ylab = "")
    plot(0, type = "n")   # Dummy to fill
    par(oldpar)
Figure 23.
The next example uses ICA on sounds. This is a demonstration found at the Laboratory of Computer and Information Science (CIS) of the Department of Computer Science and Engineering at Helsinki University of Technology.

For this example we will need to read and write .wav files. A .wav file has the basic structure described in the next function:

    read.wav <- function(d.file) {
      zz <- file(d.file, "rb")                        # Open binary file for reading
      # RIFF chunk
      RIFF <- readChar(zz, 4)                         # Word "RIFF" (4)
      file.len <- readBin(zz, integer(), 1)           # Number of bytes in file (4)
      WAVE <- readChar(zz, 4)                         # Word "WAVE" (4)
      # FORMAT chunk
      fmt <- readChar(zz, 4)                          # "fmt " (4)
      len.of.format <- readBin(zz, integer(), 1)      # Format length (4)
      f.one <- readBin(zz, integer(), 1, size = 2)    # Number 1 (2)
      Channel.numbs <- readBin(zz, integer(), 1, size = 2)   # Number of channels (2)
      Sample.Rate <- readBin(zz, integer(), 1)        # Sample rate (4)
      Bytes.P.Sec <- readBin(zz, integer(), 1)        # Bytes/sec (4)
      Bytes.P.Sample <- readBin(zz, integer(), 1, size = 2)  # Bytes/sample (2)
      Bits.P.Sample <- readBin(zz, integer(), 1, size = 2)   # Bits/sample (2)
      # DATA chunk
      DATA <- readChar(zz, 4)                         # Word "data" (4)
      data.len <- readBin(zz, integer(), 1)           # Length of data (4)
      bias <- 2^(Bits.P.Sample - 1)
      wav.data <- rep(0, data.len)                    # Create a place to store data
      # Read data based on above parameters
      wav.data <- readBin(zz, integer(), data.len, size = Bytes.P.Sample,
                          signed = FALSE)
      close(zz)                                       # Close the file
      wav.data <- wav.data - bias                     # Shift based on bias
      # Return the information for R
      list(RIFF = RIFF, File.Len = file.len, WAVE = WAVE, format = fmt,
           len.of.format = len.of.format, f.one = f.one,
           Channel.numbs = Channel.numbs, Sample.Rate = Sample.Rate,
           Bytes.P.Sec = Bytes.P.Sec, Bytes.P.Sample = Bytes.P.Sample,
           Bits.P.Sample = Bits.P.Sample, DATA = DATA,
           data.len = data.len, data = wav.data)
    }

Set up variables for the data and create the file names for the input, mixed, and output files:

    numb.source <- 9
    in.file <- matrix(0, numb.source, 1)
    mix.file <- matrix(0, numb.source, 1)
    out.file <- matrix(0, numb.source, 1)
    for (i in 1:numb.source) {
      in.file[i,] <- paste(data.dir, "/source", i, ".wav", sep = "")
      mix.file[i,] <- paste(data.dir, "/m", i, ".wav", sep = "")
      out.file[i,] <- paste(data.dir, "/s", i, ".wav", sep = "")
    }
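The RIFF layout that read.wav walks through can be cross-checked with Python's standard library (illustrative only; the sample byte values are arbitrary): write a minimal 8-bit mono WAV with the wave module, then re-parse the header fields with struct. Note how the 8-bit samples carry the bias 2^(bits-1) = 128 that read.wav subtracts.

```python
import io
import struct
import wave

# Build a tiny 8-bit mono 8 kHz WAV in memory.
buf = io.BytesIO()
w = wave.open(buf, "wb")
w.setnchannels(1)
w.setsampwidth(1)        # 1 byte/sample -> 8 bits/sample
w.setframerate(8000)
w.writeframes(bytes([128, 200, 60, 128]))
w.close()                # patches the header lengths; buf stays open
raw = buf.getvalue()

# Parse the same fields the R read.wav walks through (all little-endian).
riff, file_len, wave_id = struct.unpack_from("<4sI4s", raw, 0)
fmt_id, fmt_len, audio_fmt, n_chan, rate, byte_rate, block_align, bits = \
    struct.unpack_from("<4sIHHIIHH", raw, 12)
data_id, data_len = struct.unpack_from("<4sI", raw, 36)

print(riff, wave_id, fmt_id, data_id)    # b'RIFF' b'WAVE' b'fmt ' b'data'
print(n_chan, rate, bits, data_len)      # 1 8000 8 4
# 8-bit WAV samples are unsigned; subtract the bias to center them.
samples = [b - 128 for b in raw[44 : 44 + data_len]]
print(samples)                           # [0, 72, -68, 0]
```

This mirrors the header the wav.char output below reports for the demo files: 1 channel, 8000 samples/sec, 8 bits/sample.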
Read all the source files into a list:

    in.wav <- NULL
    for (m in 1:numb.source) {
      in.wav <- c(in.wav, list(read.wav(in.file[m,])))
    }
We can look at the characteristics of a file with:

    wav.char <- function(wav) {
      cat("RIFF", wav$RIFF, "\n")
      cat("Length", wav$File.Len, "\n")
      cat("Wave", wav$WAVE, "\n")
      cat("Format", wav$format, "\n")
      cat("Format Length", wav$len.of.format, "\n")
      cat("One", wav$f.one, "\n")
      cat("Number of Channels", wav$Channel.numbs, "\n")
      cat("Sample Rate", wav$Sample.Rate, "\n")
      cat("Bytes/Sec", wav$Bytes.P.Sec, "\n")
      cat("Bytes/Sample", wav$Bytes.P.Sample, "\n")
      cat("Bits/Sample", wav$Bits.P.Sample, "\n")
      cat("Data", wav$DATA, "\n")
      cat("Data Length", wav$data.len, "\n")
    }
    wav.char(in.wav[[1]])
    RIFF RIFF
    Length ...
    Wave WAVE
    Format fmt
    Format Length 16
    One 1
    Number of Channels 1
    Sample Rate 8000
    Bytes/Sec 8000
    Bytes/Sample 1
    Bits/Sample 8
    Data data
    Data Length ...

Set up a random matrix for mixing:

    A <- matrix(runif(numb.source*numb.source), numb.source, numb.source)

We will create a matrix (5000 x 9) that has one source in each column:

    mixed <- NULL
    for (i in 1:numb.source) {
      mixed <- cbind(mixed, in.wav[[i]]$data)
    }

We multiply by the 9 x 9 mixing matrix to produce a new (5000 x 9) matrix in which each column is a mixture of the 9 columns of the original matrix:

    mixed <- mixed %*% A
We now plot the resulting wave forms (Figure 24):

    old.par <- par(mfcol = c(numb.source, 1))
    par(mar = c(2, 2, 2, 2) + 0.1)
    plot(mixed[,1], type = "l", main = "Mixed")
    for (m in 2:numb.source) {
      plot(mixed[,m], type = "l")
    }
    if (dev.cur()[[1]] != 1) bringToTop(which = dev.cur())
    par(old.par)

Figure 24. Mixed signals
In order to save a signal as a .wav file, we need the header information. We cheat a little by simply using the in.wav headers and replacing their data parts with the mixed data. The first part of the following code creates a mixed list from the in list and the second part does the data replacement:

    mix.wav <- NULL
    for (m in 1:numb.source) {
      mix.wav <- c(mix.wav, list(in.wav[[m]]))
    }
    for (m in 1:numb.source) {
      mix.wav[[m]]$data <- mixed[,m]
      write.wav(mix.file[m,], mix.wav[[m]])
    }

Use the sound library to play the mixed sounds:

    # Play them
    library(sound)
    play(mix.file[1,])
    play(mix.file[2,])
    play(mix.file[3,])
    play(mix.file[4,])
    play(mix.file[5,])
    play(mix.file[6,])
    play(mix.file[7,])
    play(mix.file[8,])
    play(mix.file[9,])

We will use fastICA to unmix the signals, then save and play the results as we did for the mixed signals:

    # Unmix them
    mixed.all <- NULL
    for (i in 1:numb.source) {
      mixed.all <- cbind(mixed.all, mixed[,i])
    }
    ICA.wavs <- fastICA(mixed.all, numb.source, alg.typ = "parallel",
                        fun = "logcosh", alpha = 1, method = "R",
                        row.norm = FALSE, maxit = 200, tol = 1e-04,
                        verbose = TRUE)
    # Save them
    new.wav <- NULL
    for (m in 1:numb.source) {
      new.wav <- c(new.wav, list(in.wav[[m]]))
    }
    for (m in 1:numb.source) {
      new.wav[[m]]$data <- 5*ICA.wavs$S[,m]
      write.wav(out.file[m,], new.wav[[m]])
    }
    # Play them
    play(out.file[1,])
    play(out.file[2,])
    play(out.file[3,])
    play(out.file[4,])
    play(out.file[5,])
    play(out.file[6,])
    play(out.file[7,])
    play(out.file[8,])
    play(out.file[9,])

    # Plot them
    old.par <- par(mfcol = c(numb.source, 3))
    par(mar = c(2, 2, 2, 2) + 0.1)
    plot(in.wav[[1]]$data, type = "l", main = "Original")
    for (m in 2:numb.source) plot(in.wav[[m]]$data, type = "l")
    plot(mixed[,1], type = "l", main = "Mixed")
    for (m in 2:numb.source) plot(mixed[,m], type = "l")
    plot(ICA.wavs$S[,1], type = "l")
    if (dev.cur()[[1]] != 1) bringToTop(which = dev.cur())
    for (m in 2:numb.source) plot(ICA.wavs$S[,m], type = "l")
    par(old.par)
Figure 25.

Play the original sources for comparison:

    # Original - play
    play(in.file[1,])
    play(in.file[2,])
    play(in.file[3,])
    play(in.file[4,])
    play(in.file[5,])
    play(in.file[6,])
    play(in.file[7,])
    play(in.file[8,])
    play(in.file[9,])
University of Illinois ECE 313: Final Exam Fall 2014 Monday, December 15, 2014, 7:00 p.m. 10:00 p.m. Sect. B, names A-O, 1013 ECE, names P-Z, 1015 ECE; Section C, names A-L, 1015 ECE; all others 112 Gregory
More informationMatrix Algebra 2.1 MATRIX OPERATIONS Pearson Education, Inc.
2 Matrix Algebra 2.1 MATRIX OPERATIONS MATRIX OPERATIONS m n If A is an matrixthat is, a matrix with m rows and n columnsthen the scalar entry in the ith row and jth column of A is denoted by a ij and
More informationDifferent Estimation Methods for the Basic Independent Component Analysis Model
Washington University in St. Louis Washington University Open Scholarship Arts & Sciences Electronic Theses and Dissertations Arts & Sciences Winter 12-2018 Different Estimation Methods for the Basic Independent
More informationUNIVERSITY of PENNSYLVANIA CIS 520: Machine Learning Final, Fall 2013
UNIVERSITY of PENNSYLVANIA CIS 520: Machine Learning Final, Fall 2013 Exam policy: This exam allows two one-page, two-sided cheat sheets; No other materials. Time: 2 hours. Be sure to write your name and
More informationICA and ISA Using Schweizer-Wolff Measure of Dependence
Keywords: independent component analysis, independent subspace analysis, copula, non-parametric estimation of dependence Abstract We propose a new algorithm for independent component and independent subspace
More informationCovariance and Correlation Matrix
Covariance and Correlation Matrix Given sample {x n } N 1, where x Rd, x n = x 1n x 2n. x dn sample mean x = 1 N N n=1 x n, and entries of sample mean are x i = 1 N N n=1 x in sample covariance matrix
More informationBlind Instantaneous Noisy Mixture Separation with Best Interference-plus-noise Rejection
Blind Instantaneous Noisy Mixture Separation with Best Interference-plus-noise Rejection Zbyněk Koldovský 1,2 and Petr Tichavský 1 1 Institute of Information Theory and Automation, Pod vodárenskou věží
More informationFinal Report For Undergraduate Research Opportunities Project Name: Biomedical Signal Processing in EEG. Zhang Chuoyao 1 and Xu Jianxin 2
ABSTRACT Final Report For Undergraduate Research Opportunities Project Name: Biomedical Signal Processing in EEG Zhang Chuoyao 1 and Xu Jianxin 2 Department of Electrical and Computer Engineering, National
More informationwhere A 2 IR m n is the mixing matrix, s(t) is the n-dimensional source vector (n» m), and v(t) is additive white noise that is statistically independ
BLIND SEPARATION OF NONSTATIONARY AND TEMPORALLY CORRELATED SOURCES FROM NOISY MIXTURES Seungjin CHOI x and Andrzej CICHOCKI y x Department of Electrical Engineering Chungbuk National University, KOREA
More informationLinear Regression Linear Regression with Shrinkage
Linear Regression Linear Regression ith Shrinkage Introduction Regression means predicting a continuous (usually scalar) output y from a vector of continuous inputs (features) x. Example: Predicting vehicle
More informationAnnouncements (repeat) Principal Components Analysis
4/7/7 Announcements repeat Principal Components Analysis CS 5 Lecture #9 April 4 th, 7 PA4 is due Monday, April 7 th Test # will be Wednesday, April 9 th Test #3 is Monday, May 8 th at 8AM Just hour long
More informationConcentration Ellipsoids
Concentration Ellipsoids ECE275A Lecture Supplement Fall 2008 Kenneth Kreutz Delgado Electrical and Computer Engineering Jacobs School of Engineering University of California, San Diego VERSION LSECE275CE
More informationLinear Algebra for Machine Learning. Sargur N. Srihari
Linear Algebra for Machine Learning Sargur N. srihari@cedar.buffalo.edu 1 Overview Linear Algebra is based on continuous math rather than discrete math Computer scientists have little experience with it
More informationSingle Channel Signal Separation Using MAP-based Subspace Decomposition
Single Channel Signal Separation Using MAP-based Subspace Decomposition Gil-Jin Jang, Te-Won Lee, and Yung-Hwan Oh 1 Spoken Language Laboratory, Department of Computer Science, KAIST 373-1 Gusong-dong,
More informationOne-unit Learning Rules for Independent Component Analysis
One-unit Learning Rules for Independent Component Analysis Aapo Hyvarinen and Erkki Oja Helsinki University of Technology Laboratory of Computer and Information Science Rakentajanaukio 2 C, FIN-02150 Espoo,
More informationMatrix Factorizations
1 Stat 540, Matrix Factorizations Matrix Factorizations LU Factorization Definition... Given a square k k matrix S, the LU factorization (or decomposition) represents S as the product of two triangular
More informationPCA, Kernel PCA, ICA
PCA, Kernel PCA, ICA Learning Representations. Dimensionality Reduction. Maria-Florina Balcan 04/08/2015 Big & High-Dimensional Data High-Dimensions = Lot of Features Document classification Features per
More informationOn Information Maximization and Blind Signal Deconvolution
On Information Maximization and Blind Signal Deconvolution A Röbel Technical University of Berlin, Institute of Communication Sciences email: roebel@kgwtu-berlinde Abstract: In the following paper we investigate
More informationLecture 44. Better and successive approximations x2, x3,, xn to the root are obtained from
Lecture 44 Solution of Non-Linear Equations Regula-Falsi Method Method of iteration Newton - Raphson Method Muller s Method Graeffe s Root Squaring Method Newton -Raphson Method An approximation to the
More informationExam, Solutions
Exam, - Solutions Q Constructing a balanced sequence containing three kinds of stimuli Here we design a balanced cyclic sequence for three kinds of stimuli (labeled {,, }, in which every three-element
More informationA6523 Modeling, Inference, and Mining Jim Cordes, Cornell University
A6523 Modeling, Inference, and Mining Jim Cordes, Cornell University Lecture 19 Modeling Topics plan: Modeling (linear/non- linear least squares) Bayesian inference Bayesian approaches to spectral esbmabon;
More informationMultiple Random Variables
Multiple Random Variables This Version: July 30, 2015 Multiple Random Variables 2 Now we consider models with more than one r.v. These are called multivariate models For instance: height and weight An
More informationCase Studies of Independent Component Analysis For CS383C Numerical Analysis of Linear Algebra Alan Oursland, Judah De Paula, Nasim Mahmood
Case Studies of Independent Component Analysis For CS383C Numerical Analysis of Linear Algebra Alan Oursland, Judah De Paula, Nasim Mahmood Introduction Our project does an Independent Component Analysis
More informationBLIND DECONVOLUTION ALGORITHMS FOR MIMO-FIR SYSTEMS DRIVEN BY FOURTH-ORDER COLORED SIGNALS
BLIND DECONVOLUTION ALGORITHMS FOR MIMO-FIR SYSTEMS DRIVEN BY FOURTH-ORDER COLORED SIGNALS M. Kawamoto 1,2, Y. Inouye 1, A. Mansour 2, and R.-W. Liu 3 1. Department of Electronic and Control Systems Engineering,
More informationAcoustic classification using independent component analysis
Rochester Institute of Technology RIT Scholar Works Theses Thesis/Dissertation Collections 2006 Acoustic classification using independent component analysis James Brock Follow this and additional works
More informationReview (Probability & Linear Algebra)
Review (Probability & Linear Algebra) CE-725 : Statistical Pattern Recognition Sharif University of Technology Spring 2013 M. Soleymani Outline Axioms of probability theory Conditional probability, Joint
More informationMeasurement, Scaling, and Dimensional Analysis Summer 2017 METRIC MDS IN R
Measurement, Scaling, and Dimensional Analysis Summer 2017 Bill Jacoby METRIC MDS IN R This handout shows the contents of an R session that carries out a metric multidimensional scaling analysis of the
More informationIndependent Component Analysis and Blind Source Separation
Independent Component Analysis and Blind Source Separation Aapo Hyvärinen University of Helsinki and Helsinki Institute of Information Technology 1 Blind source separation Four source signals : 1.5 2 3
More informationat Some sort of quantization is necessary to represent continuous signals in digital form
Quantization at Some sort of quantization is necessary to represent continuous signals in digital form x(n 1,n ) x(t 1,tt ) D Sampler Quantizer x q (n 1,nn ) Digitizer (A/D) Quantization is also used for
More informationFile: ica tutorial2.tex. James V Stone and John Porrill, Psychology Department, Sheeld University, Tel: Fax:
File: ica tutorial2.tex Independent Component Analysis and Projection Pursuit: A Tutorial Introduction James V Stone and John Porrill, Psychology Department, Sheeld University, Sheeld, S 2UR, England.
More informationDimensionality Reduction and Principle Components
Dimensionality Reduction and Principle Components Ken Kreutz-Delgado (Nuno Vasconcelos) UCSD ECE Department Winter 2012 Motivation Recall, in Bayesian decision theory we have: World: States Y in {1,...,
More informationLinear Equations in Linear Algebra
1 Linear Equations in Linear Algebra 1.7 LINEAR INDEPENDENCE LINEAR INDEPENDENCE Definition: An indexed set of vectors {v 1,, v p } in n is said to be linearly independent if the vector equation x x x
More informationBlind separation of sources that have spatiotemporal variance dependencies
Blind separation of sources that have spatiotemporal variance dependencies Aapo Hyvärinen a b Jarmo Hurri a a Neural Networks Research Centre, Helsinki University of Technology, Finland b Helsinki Institute
More informationLearning with Singular Vectors
Learning with Singular Vectors CIS 520 Lecture 30 October 2015 Barry Slaff Based on: CIS 520 Wiki Materials Slides by Jia Li (PSU) Works cited throughout Overview Linear regression: Given X, Y find w:
More information