DIMENSIONALITY REDUCTION METHODS IN INDEPENDENT SUBSPACE ANALYSIS FOR SIGNAL DETECTION. Mijail Guillemard, Armin Iske, Sara Krause-Solberg
Department of Mathematics, University of Hamburg

ABSTRACT

In the last few years, an important family of methods for single-channel signal separation has been developed using tools from time-frequency analysis. Given a mixture of signals $s = \sum_i s_i$, the task is to estimate the components $s_i$ under some assumptions on their time-frequency or statistical characteristics. A well-known strategy, sometimes denominated independent subspace analysis (ISA), is to reduce the embedding dimension of the time-frequency representation of $s$ prior to the application of independent component analysis (ICA). In these methods, a standard strategy for dimensionality reduction is principal component analysis (PCA), but other nonlinear methods have also been proposed over the last few years. In this paper, we compare different dimensionality reduction methods for single-channel signal separation in the context of ISA. Our focus is on signals with transitory components, and the objective is to detect the time behavior of each individual signal $s_i$.

Keywords: Dimensionality Reduction, Wavelet and Gabor Transforms, STFT, Signal Separation, Independent Subspace Analysis.

1. INTRODUCTION

Signal separation is a central topic in many engineering fields, and its modern development depends on new experimental and empirical insights together with modern mathematical tools. In the last decade, different approaches have been proposed for dealing with the blind source separation of single-channel signals. A strategy proposed in [2, 3] integrates the well-known method of independent component analysis (ICA) with time-frequency transforms. A crucial step in this framework is to reduce the dimensionality of the data prior to the application of ICA. In recent developments of data analysis, new strategies for dimensionality reduction have been inspired by geometrical and topological concepts [4]. New algorithms based on differential geometry include Whitney embedding based methods, Isomap, LTSA, Laplacian eigenmaps, and Riemannian normal coordinates, to mention but a few. In parallel developments, probabilistic conditions and numerical algorithms (e.g. persistent homology) have provided new tools for reconstructing the homology of a manifold $M \subset \mathbb{R}^n$ from a finite dataset $X = \{x_i\}_{i=1}^m$. These techniques have also delivered new strategies for cluster analysis of point cloud data. The objective of this paper is to use these new developments in the framework of time-frequency signal separation based on independent subspace analysis. Our main contributions are the evaluation of different dimensionality reduction techniques, together with clustering techniques for improving the quality of the signal separation.

The outline of this paper is as follows. In Section 2, we describe the basic elements of our modulation map framework based on dimensionality reduction, and in Section 3 the basic elements of time-frequency independent subspace analysis. Computational experiments are reported in Section 4. An important component of our work is the use of efficient interpolation techniques based on radial basis functions, which we discuss in Section ??.

2. DIMENSIONALITY REDUCTION AND SIGNAL PROCESSING

Due to the high dimensionality of the time-frequency data, it is of interest to work with analysis techniques that combine signal processing transforms with dimensionality reduction methods.
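As a purely illustrative example of such a combination, the following sketch outlines one possible ISA-style pipeline: an STFT magnitude representation, followed by PCA for dimensionality reduction and FastICA for the independence step. The function name, parameter values, and the use of scipy/scikit-learn are our own assumptions for this sketch, not part of the method described in this paper.

```python
# Illustrative sketch only: one way to combine a time-frequency transform with
# dimensionality reduction and ICA. Names and parameters are hypothetical.
import numpy as np
from scipy.signal import stft
from sklearn.decomposition import PCA, FastICA

def isa_components(s, fs, n_components=6, nperseg=1024):
    """Return the time activations of n_components 'independent' spectral components."""
    _, frame_times, Z = stft(s, fs=fs, nperseg=nperseg)        # time-frequency representation of s
    V = np.abs(Z).T                                            # rows: time frames, columns: frequency bins
    V_red = PCA(n_components=n_components).fit_transform(V)    # reduce the embedding dimension
    activations = FastICA(n_components=n_components,
                          random_state=0).fit_transform(V_red) # statistically independent components
    return frame_times, activations                            # time behavior of each component

# Example: detect when each component is active in a synthetic two-tone mixture.
fs = 16000
t = np.arange(0, 2.0, 1 / fs)
mixture = np.sin(2 * np.pi * 440 * t) * (t < 1.0) + np.sin(2 * np.pi * 880 * t) * (t >= 0.5)
times, acts = isa_components(mixture, fs, n_components=2)
```

The clustering step that groups independent components back into sources is omitted from this sketch.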
In this setting, the basic objects are the manifold $M$, the data samples $X = \{x_i\}_{i=1}^m$ taken from $M$, and a diffeomorphism $A : \Omega \to M$, where $\Omega$ is the low-dimensional copy of $M$ to be reconstructed via dimensionality reduction. Here, the only algorithmic input is the dataset $X$, under the assumption that topological information of $M$ can be reconstructed from $X$ (see for instance [5]). The other basic object in our scheme is a signal processing map $T : M \to M_T$, which may be based on Fourier analysis, wavelet transforms, or convolution filters, together with the resulting set $M_T := \{T(p),\; p \in M\}$ of transformed data. The following diagram shows the basic situation:

$$\begin{array}{ccc} \Omega \subset \mathbb{R}^d & \xrightarrow{\;A\;} & X \subset M \subset \mathbb{R}^n \\ & & \big\downarrow{\scriptstyle T} \\ \Omega_T \subset \mathbb{R}^d & \xleftarrow{\;R\;} & T(X) \subset M_T \subset \mathbb{R}^n \end{array}$$

The main objective is to find an approximation of $\Omega_T$, denoted $\Omega'_T = R(M_T)$, by using a suitable dimensionality reduction map $R$. Some properties of $\Omega_T$ and $\Omega'_T$ may differ depending on the dimensionality reduction technique, but the target is to construct $\Omega'_T$ so that geometrical and topological properties of $\Omega_T$ are recovered.
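The diagram can be illustrated numerically with a toy example. In the sketch below, $\Omega$ is taken to be a circle, $A$ a simple hypothetical frequency-modulation map, $T$ the Fourier magnitude, and PCA the reduction map $R$; none of these specific choices are prescribed by the text above.

```python
# Toy realization of the diagram: Omega (a circle) -> M via a modulation map A,
# T(X) via the Fourier magnitude, and Omega'_T via PCA. All choices are illustrative.
import numpy as np
from sklearn.decomposition import PCA

m, n = 400, 256
alpha = np.linspace(0, 2 * np.pi, m, endpoint=False)   # parameters of Omega (a circle)
u = np.arange(n) / n                                    # time samples inside one patch

def A(a):
    """Hypothetical modulation map: a point of Omega -> a signal patch in R^n."""
    return np.sin(2 * np.pi * (40.0 + 4.0 * np.cos(a)) * u + np.sin(a))

X = np.stack([A(a) for a in alpha])                     # X, samples of M in R^n
TX = np.abs(np.fft.rfft(X, axis=1))                     # T(X), samples of M_T
Omega_T = PCA(n_components=2).fit_transform(TX)         # R(T(X)), approximation of Omega_T
# A scatter plot of Omega_T shows a closed curve: in this simple example the circular
# topology of Omega survives the transform T and the reduction R.
```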
In Section ??, we use a particular modulation map $A$ and study the geometrical effects incurred by several dimensionality reduction maps $R : M_T \to \Omega_T$.

Dimensionality Reduction

Since real-life data is multifaceted and complex, a given dataset remains very high-dimensional even after discretization on a set of observations. Analyzing and interpreting such datasets poses mathematical and computational challenges and may cause traditional statistical methods to fail. We observe that in many cases not all information contained in the data points is relevant for understanding the underlying characteristics or properties of the data. Moreover, low-dimensional datasets are much easier to handle for classification, visualization, or compression. As a consequence, we would like to reduce the dimensionality of the data; this is where dimensionality reduction comes in. Dimensionality reduction is an embedding of the data into a significant manifold of lower dimension within the higher-dimensional space, in order to encode the important information of the dataset. This lower dimension should ideally correspond to the intrinsic dimensionality of the data [?].

Mathematically, the problem can be formulated as follows. Let $X = \{x_i\}_{i=1}^n \subset \mathbb{R}^D$ be a dataset of dimensionality $D$, represented as an $n \times D$ matrix. The data is assumed to lie on or near a (smooth) manifold $M$ of dimension $d$ embedded in $D$-dimensional space, where the intrinsic dimensionality $d$ of the data satisfies $d \ll D$. Then there exists a (non-)linear mapping from $X$ to the manifold. This transformation maps the dataset $X$ of dimensionality $D$ into a new dataset $Y$ of dimensionality $d$, preserving the main structure of the data. In this setting, usually neither the parameter $d$ nor the manifold $M$ is known. There are two major types of dimensionality reduction methods: linear and nonlinear ones. In this context, linearity refers to the idea that each data point on the manifold is a linear combination of the original data points, i.e. we assume the manifold $M$ to be linear [?]. Nonlinear techniques are mainly based on at least one of the following principles [?]: 1. preservation of global properties or structures of the dataset in the low-dimensional dataset; 2. preservation of local properties or structures; 3. composition of linear techniques. The choice of a technique depends strongly on the concrete problem setting.

ICA

The input of the ICA algorithm is point cloud data, defined as a finite sequence of vectors, written in matrix form as $X = (x_1 \dots x_m) \in \mathbb{R}^{n \times m}$. The objective is to find a sequence of source signals $S = (s_1 \dots s_m) \in \mathbb{R}^{n \times m}$, assuming a linear dependence between $X$ and $S$. Denoting the mixing matrix by $W \in \mathbb{R}^{n \times n}$, this property can be expressed as
$$X = WS, \quad \text{with } X = (x_1 \dots x_m) \text{ and } S = (s_1 \dots s_m).$$
In this equation, the mixing matrix $W$ and the source signals $S$ are the unknown variables to be found by the ICA procedure. The second core assumption of the ICA algorithm is the statistical independence of the signals $\{s_i\}_{i=1}^n$. A general strategy for this problem can be described with the following measure for a set of random variables $Y = \{y_i\}_{i=1}^n$:
$$I(Y) = D\Big(P_Y, \prod_i P_{Y_i}\Big), \quad \text{with } D(f,g) = \int f(x)\, \log\Big(\frac{f(x)}{g(x)}\Big)\, dx.$$
The measure $I$ quantifies the degree of statistical independence by comparing the joint distribution $P_Y$ with the product of the marginal distributions $P_{Y_i}$. The comparison function $D$ used in the measure $I$ is the Kullback-Leibler distance, also known as relative entropy.
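To make the mixing model concrete, the following toy sketch generates two synthetic sources, mixes them with a known matrix $W$, and recovers them with FastICA. FastICA is used here only as a practical stand-in for minimizing the measure $I$; the data, the mixing matrix, and all names are illustrative assumptions.

```python
# Toy sketch of X = W S and its approximate inversion. FastICA serves as a
# practical surrogate for minimizing I(A^{-1} X); everything here is synthetic.
import numpy as np
from sklearn.decomposition import FastICA

u = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * np.pi * u),                 # source s_1
          np.sign(np.sin(3 * np.pi * u))]        # source s_2, independent of s_1
W = np.array([[1.0, 0.6],
              [0.4, 1.0]])                       # mixing matrix
X = S @ W.T                                      # observed mixtures (rows are samples,
                                                 # i.e. the transpose of the paper's X)

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)                     # estimated sources, up to permutation and scale
W_hat = ica.mixing_                              # estimated mixing matrix
```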
The function $I$ allows us to express the ICA algorithm as an optimization problem whose solution space is the general linear group, defined as the set of $n \times n$ invertible matrices, $GL(n, \mathbb{R}) = \{A \in \mathbb{R}^{n \times n},\ \det(A) \neq 0\}$. With $f(A) := I(A^{-1}X)$ and $X = (x_1 \dots x_m) \in \mathbb{R}^{n \times m}$, the problem reads
$$\min f(A) \quad \text{with } A \in GL(n, \mathbb{R}).$$

3. TIME-FREQUENCY ISA

In our problem, we consider a bandlimited signal $f \in L^2(\mathbb{R})$ and a segmentation of its domain such that small consecutive signal patches are analyzed, as routinely performed in STFT or wavelet analysis. For instance, the set of signal patches can be defined as a dataset
$$X_f = \{x_i^f\}_{i=1}^m, \quad x_i^f = \big(f(t_{k(i-1)+j})\big)_{j=0}^{n-1} \in \mathbb{R}^n,$$
for a fixed hop size $k \in \mathbb{N}$. Here, the regular sampling grid $\{t_l\}_{l=0}^{km-k+n-1}$ is constructed by considering the Nyquist-Shannon theorem for $f$.

The fundamental problem of signal separation arises in many different applications. A particular example is the cocktail party problem, where $f = g + h$ is a mixture of two signals $g$ and $h$, and the objective is to separate $g$ and $h$ from $f$. Our concrete acoustical example is a one-channel signal $f$ composed of two different instruments (represented by $g$ and $h$). It is reasonable to obtain sample patches $x_g \in X_g$ and $x_h \in X_h$, but due to their complex frequency characteristics, an accurate separation of $f$, especially when $g$ and $h$ are played simultaneously, is a challenging problem. In the particular case of noise reduction, power spectral subtraction is a fundamental strategy which removes the noise signal $g$ from $f = g + h$ by subtracting the noise frequency content $\hat{g}_k$ from $\hat{f}_k$ at each frequency bin $k$ [?]. A basic hypothesis is that the noise and clean-signal vectors are orthogonal to each other. But this assumption is usually wrong, and a generalized approach takes into account a more accurate geometrical relation between the noise and signal vectors [?]. In our framework, we use this generalized scenario, but consider point cloud data structures instead of single frequency bins.
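For reference, the sketch below illustrates the two classical ingredients mentioned above, namely building the patch dataset $X_f$ with window length $n$ and hop size $k$, and per-bin power spectral subtraction. The parameter values and the noise-magnitude estimate are assumptions made purely for illustration.

```python
# Minimal sketch of (i) the patch dataset X_f and (ii) classical per-bin spectral
# subtraction. Window length n, hop size k, and the noise estimate are illustrative.
import numpy as np

def signal_patches(f, n=512, k=128):
    """X_f = {x_i^f} with x_i^f = (f(t_{k(i-1)+j}))_{j=0..n-1}, for a fixed hop size k."""
    m = (len(f) - n) // k + 1
    return np.stack([f[i * k : i * k + n] for i in range(m)])

def spectral_subtraction(patch, noise_mag):
    """Remove an estimated noise magnitude from every frequency bin of one patch."""
    F = np.fft.rfft(patch)
    mag = np.maximum(np.abs(F) - noise_mag, 0.0)       # floor at zero to avoid negative magnitudes
    return np.fft.irfft(mag * np.exp(1j * np.angle(F)), n=len(patch))
```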
Given a signal $s = \sum_i s_i$ containing a mixture of signals $s_i$, the accurate extraction of the components $s_i$ is the so-called signal separation problem, with different variations including the cocktail party problem.

[Diagram: the patch sets $X_g$, $X_h$ and their transforms $T(X_g)$, $T(X_h)$, together with the dimensionality-reduced sets $R(T(X_g))$, $R(T(X_h))$ in $\Omega$, for the mixture of $g$ and $h$.]

Independent subspace analysis (ISA) is a natural generalization of independent component analysis (ICA). Recall that in ICA we have a linear mixing model $X = WS$ with statistically independent source signals.

4. COMPUTATIONAL EXPERIMENTS

[Figs. 1-10: PCA and Isomap 3D projections of the frequency content of the test signals; the compared dimensionality reduction methods include PCA, Isomap, Laplacian eigenmaps, and LTSA.]

5. REFERENCES

[1] D.S. Broomhead and M. Kirby. A new approach to dimensionality reduction: Theory and algorithms. SIAM Journal on Applied Mathematics, 60(6), 2000.
[2] M.A. Casey and A. Westner. Separation of mixed audio sources by independent subspace analysis. In Proceedings of the International Computer Music Conference, 2000.
[3] D. FitzGerald, E. Coyle, and B. Lawlor. Independent subspace analysis using locally linear embedding. In Proc. DAFx, 2003.
[4] J.A. Lee and M. Verleysen. Nonlinear Dimensionality Reduction. Springer, 2007.
[5] P. Niyogi, S. Smale, and S. Weinberger. Finding the homology of submanifolds with high confidence from random samples. Discrete and Computational Geometry, 39(1), 2008.