Data-dependent operators for the spatial-spectral fusion problem


1 Data-dependent operators for the spatial-spectral fusion problem. Wien, December 3, 2012

2 Joint work with: University of Maryland: J. J. Benedetto, J. A. Dobrosotskaya, T. Doster, K. W. Duke, M. Ehler, A. Halevy, V. Rajapakse Naval Research Laboratory: D. Gillis

3 Outline: 1. Mathematical Techniques 2. ... 3. ...

4 Outline: 1. Mathematical Techniques 2. ... 3. ...

5 Data Organization and Manifold Learning. There are many techniques for data organization and manifold learning, e.g., Principal Component Analysis (PCA), Locally Linear Embedding (LLE), Isomap, genetic algorithms, and neural networks. We are interested in a subfamily of these techniques known as kernel eigenmap methods, which include kernel PCA, LLE, Hessian LLE (HLLE), and Laplacian Eigenmaps. Kernel eigenmap methods require two steps. Given a data space X of N vectors in R^D: 1. Construct an N × N symmetric, positive semi-definite kernel K from these N data points in R^D. 2. Diagonalize K and choose d << D significant eigenmaps of K. These become our new coordinates and accomplish dimensionality reduction. We are particularly interested in diffusion kernels K, which are defined by means of transition matrices.

6 Kernel Eigenmap Methods for Dimension Reduction - Kernel Construction. Kernel eigenmap methods were introduced to address complexities not resolvable by linear methods. The idea behind kernel methods is to express correlations or similarities between vectors in the data space X in terms of a symmetric, positive semi-definite kernel function K : X × X → R. Generally, there exists a Hilbert space K and a mapping Φ : X → K such that K(x, y) = ⟨Φ(x), Φ(y)⟩. One then diagonalizes K by the spectral theorem and chooses significant eigenmaps to obtain dimensionality reduction. Kernels can be constructed by many kernel eigenmap methods, including kernel PCA, LLE, HLLE, and Laplacian Eigenmaps.

7 Kernel Eigenmap Methods for Dimension Reduction - Kernel Diagonalization. The second step in kernel eigenmap methods is the diagonalization of the kernel. Let e_j, j = 1, ..., N, be the eigenvectors of the kernel matrix K, with eigenvalues λ_j. Order the eigenvalues monotonically. Choose the top d << D significant eigenvectors to map the original data points x_i ∈ R^D to (e_1(i), ..., e_d(i)) ∈ R^d, i = 1, ..., N.
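As a concrete illustration of the two steps, the following sketch builds a Gaussian (heat) kernel and keeps its top d eigenvectors as new coordinates; the kernel choice, the bandwidth t, and the toy data are illustrative assumptions, not the specific kernels used later in the talk.

```python
import numpy as np

def kernel_eigenmap(X, d, t=1.0):
    """Generic kernel eigenmap sketch: build a symmetric PSD kernel,
    diagonalize it, and keep the top d eigenvectors as coordinates.
    X : (N, D) data matrix, d : target dimension, t : kernel bandwidth."""
    # Step 1: construct an N x N symmetric kernel (here a Gaussian kernel).
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / t)

    # Step 2: diagonalize K and keep the d most significant eigenvectors.
    eigvals, eigvecs = np.linalg.eigh(K)      # ascending eigenvalues
    top = np.argsort(eigvals)[::-1][:d]       # indices of the d largest
    return eigvecs[:, top]                    # row i = new coordinates of x_i

# Example: 200 noisy points on a circle in R^3, embedded into R^2.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200)
X = np.column_stack([np.cos(theta), np.sin(theta), 0.1 * rng.standard_normal(200)])
Y = kernel_eigenmap(X, d=2, t=0.5)
print(Y.shape)  # (200, 2)
```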

8 Laplacian Eigenmaps - Theory (M. Belkin and P. Niyogi, 2003). Points close on the manifold should remain close in R^d:
|f(x) - f(y)| ≤ ‖∇f(x)‖ ‖x - y‖ + o(‖x - y‖),
arg min_{‖f‖_{L²(M)} = 1} ∫_M ‖∇f(x)‖² = arg min_{‖f‖_{L²(M)} = 1} ∫_M Δ_M(f) f.
Find eigenfunctions of the Laplace-Beltrami operator Δ_M. Use a discrete approximation of the Laplace-Beltrami operator. Proven convergence (Belkin and Niyogi).

9 Laplacian Eigenmaps - Implementation. 1. Put an edge between nodes i and j if x_i and x_j are close. Precisely, given a parameter k ∈ N, put an edge between nodes i and j if x_i is among the k nearest neighbors of x_j or vice versa. 2. Given a parameter t > 0, if nodes i and j are connected, set W_{i,j} = exp(-‖x_i - x_j‖²/t). 3. Set D_{i,i} = Σ_j W_{i,j}, and let L = D - W. Solve Lf = λDf under the constraint y^T D y = I. Let f_0, f_1, ..., f_d be the d + 1 eigenvector solutions corresponding to the first eigenvalues 0 = λ_0 ≤ λ_1 ≤ ... ≤ λ_d. Discard f_0 and use the next d eigenvectors to embed in d-dimensional Euclidean space via the map x_i → (f_1(i), f_2(i), ..., f_d(i)).
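A minimal sketch of these three steps in Python, assuming SciPy and scikit-learn are available for the kNN graph and the generalized eigenproblem; the dense solver and the toy curve are illustrative simplifications.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def laplacian_eigenmaps(X, d=2, k=10, t=1.0):
    """Steps 1-3 of the slide: kNN graph, heat-kernel weights, L = D - W,
    generalized eigenproblem L f = lambda D f; discard f_0, keep f_1, ..., f_d."""
    # Step 1: symmetric kNN graph (edge if x_i is among the k nearest neighbors
    # of x_j or vice versa), stored with Euclidean distances on the edges.
    A = kneighbors_graph(X, n_neighbors=k, mode='distance')
    A = A.maximum(A.T)

    # Step 2: heat-kernel weights W_ij = exp(-||x_i - x_j||^2 / t) on the edges.
    W = A.copy()
    W.data = np.exp(-A.data ** 2 / t)
    W = W.toarray()                      # dense is fine for a small sketch

    # Step 3: L = D - W, solve L f = lambda D f (ascending eigenvalues).
    D = np.diag(W.sum(axis=1))
    L = D - W
    vals, vecs = eigh(L, D)
    return vecs[:, 1:d + 1]              # drop the constant eigenvector f_0

# Example: embed a noisy 3-D curve into the plane.
rng = np.random.default_rng(1)
s = np.sort(rng.uniform(0, 3, 300))
X = np.column_stack([np.cos(s), np.sin(s), s]) + 0.01 * rng.standard_normal((300, 3))
Y = laplacian_eigenmaps(X, d=2, k=8, t=0.5)
print(Y.shape)  # (300, 2)
```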

10 Swiss Roll. Figure: a) Original, b) PCA, c)-f) LE. J. Shen et al., Neurocomputing, Volume 87, 2012.

11 From Laplacian to Schroedinger Eigenmaps. Consider the following minimization problem over embeddings y with rows y_i ∈ R^d:
min_{y^T D y = I} (1/2) Σ_{i,j} ‖y_i - y_j‖² W_{i,j} = min_{y^T D y = I} tr(y^T L y).
Its solution is given by the d minimal non-zero eigenvalue solutions of Lf = λDf under the constraint y^T D y = I. Similarly, for a diagonal potential V, consider the problem
min_{y^T D y = I} (1/2) Σ_{i,j} ‖y_i - y_j‖² W_{i,j} + Σ_i ‖y_i‖² V_{i,i} = min_{y^T D y = I} tr(y^T (L + V) y),   (1)
which leads to solving the equation (L + V)f = λDf.
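Relative to the Laplacian Eigenmaps sketch above, only the matrix in the generalized eigenproblem changes, from L to L + αV. The following is a hedged sketch for a symmetric positive semi-definite potential V and a weight matrix W given as dense arrays; the barrier strength alpha and the toy path graph are illustrative.

```python
import numpy as np
from scipy.linalg import eigh

def schroedinger_eigenmaps(W, V, d=2, alpha=1.0):
    """Solve (L + alpha*V) f = lambda D f for a symmetric weight matrix W and a
    symmetric positive semi-definite potential V, and return the d lowest
    generalized eigenvectors as embedding coordinates."""
    D = np.diag(W.sum(axis=1))
    L = D - W
    vals, vecs = eigh(L + alpha * V, D)      # ascending generalized eigenvalues
    # If 1 is in Null(V), lambda_0 = 0 with a constant eigenvector reappears
    # (next slide) and one may discard it as in Laplacian Eigenmaps.
    return vecs[:, :d]

# Toy example: a weighted 5-node path graph with a diagonal potential ("barrier")
# acting only on the first node, as in the semisupervised examples later on.
W = np.zeros((5, 5))
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1.0
V = np.diag([1.0, 0.0, 0.0, 0.0, 0.0])
print(schroedinger_eigenmaps(W, V, d=2, alpha=10.0))
```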

12 Properties of Schroedinger Eigenmaps. Let the data graph be connected and let V be a symmetric positive semi-definite matrix. If 1 ∈ Null(V), then λ = 0 is a simple eigenvalue of L + V with eigenvector 1.
Theorem (with M. Ehler). Let the data graph be connected, let V be symmetric positive semi-definite, and let n ≤ dim(Null(V)). Let y^(α) be a minimizer of the Schroedinger eigen-problem with potential αV, α > 0. Then the minimizer of (1) satisfies
‖y^(α)‖²_V = tr((y^(α))^T V y^(α)) ≤ C (1/α).
In particular, for V = diag(v_1, ..., v_N),
v_i ‖y_i^(α)‖² ≤ Σ_{i=1}^N v_i ‖y_i^(α)‖² ≤ C (1/α), for all i = 1, ..., N.

13 Pointwise Convergence of SE. Given n data points x_1, x_2, ..., x_n sampled independently from a uniform distribution on a smooth, compact, d-dimensional manifold M ⊂ R^D, define the operator L̂_{t,n} : C(M) → C(M) by
L̂_{t,n}(f)(x) = (1/((4πt)^{d/2} t)) [ (1/n) Σ_j f(x) exp(-‖x - x_j‖²/(4t)) - (1/n) Σ_j f(x_j) exp(-‖x - x_j‖²/(4t)) ].
Let v ∈ C(M) be a potential. For x ∈ M, let y_n(x) = arg min_{x_1, ..., x_n} ‖x - x_i‖ and define V_n : C(M) → C(M) by V_n f(x) = v(y_n(x)) f(x).
Theorem (Pointwise Convergence, with A. Halevy). Let α > 0, and set t_n = (1/n)^{1/(d+2+α)}. For f ∈ C^∞(M),
lim_{n→∞} [ L̂_{t_n,n} f(x) + V_n f(x) ] = C Δ_M f(x) + v(x) f(x)   in probability.

14 Spectral Convergence of SE - Theorem. Let L_{t,n} be the unnormalized discrete Laplacian.
Theorem (Spectral Convergence of SE, with A. Halevy). Let λ^i_{t,n} and e^i_{t,n} be the ith eigenvalue and corresponding eigenfunction of L_{t,n} + V_n. Let λ^i and e^i be the ith eigenvalue and corresponding eigenfunction of Δ_M + V. Then there exists a sequence t_n → 0 such that, in probability,
lim_{n→∞} λ^i_{t_n,n} = λ^i   and   lim_{n→∞} ‖e^i_{t_n,n} - e^i‖ = 0.

15 SE as a Semisupervised Method. Figure: Schroedinger Eigenmaps of an arc with diagonal potential V = diag(0, ..., 0, 1, 0, ..., 0), acting only at one point y_{i_0} in the middle of the arc, for α = 0.05, 0.1, 0.5, 5. This point is pushed to zero.

16 SE as a Semisupervised Method. Figure: By applying the potential to the end points of the arc, for α = 0.01, 0.05, 0.1, 1, we are able to control the dimension reduction so that we obtain an almost perfect circle.

17 Outline: 1. Mathematical Techniques 2. ... 3. ...

18 Computational Bottleneck. If N is the ambient dimension and n is the number of points, the time complexity of constructing an adjacency graph is O(Nn²). What can we do about N? What can we do about the exponent 2? What can we do about n? What can we do about the computational complexity of the eigendecomposition?

19 Numerical acceleration. Data compression via incoherent random projections; fast approximate k-nearest-neighbor algorithms, e.g., via divide-and-conquer strategies; quantization; landmarking; randomized low-rank SVD decompositions.

20 Setting for data compression. Dataset {x_1, x_2, ..., x_n} in R^N, sampled from a compact K-dimensional Riemannian manifold. Assume ‖x_i - x_j‖ ≤ A for all i, j and some A > 0. Let 0 < λ_1 ≤ λ_2 ≤ ... ≤ λ_K be the first K nonzero eigenvalues computed by LE, assumed simple, with r = min_{i ≠ j} |λ_i - λ_j|, and let f_j be a normalized eigenvector corresponding to λ_j. Use a random orthogonal projector Φ to map the points to R^M. Let f̂_j be the jth eigenvector computed by LE for the projected data set.

21 Laplacian Eigenmaps with random projections.
Theorem (with A. Halevy). Fix 0 < α < 1 and 0 < ρ < 1. If
M ≥ (4 K ln(CKN/ε) + 2 ln(1/ρ)) / (ε²/200 - ε³/3000), where ε = rα / (4An(n-1)),
then, with probability at least 1 - ρ, ‖f_j - f̂_j‖ < α. The constant C depends on properties of the manifold. Precisely, C = 1900 R V / τ^{1/3}, where R, V, and 1/τ are the geodesic covering regularity, the volume, and the condition number, respectively.
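A hedged numerical illustration of this setup (not of the theorem's constants): project the data with a random orthoprojector Φ, rerun Laplacian Eigenmaps on the projected points, and compare the leading eigenvectors. It reuses the laplacian_eigenmaps sketch given after the implementation slide; the toy manifold, the dimensions, and the rescaling of t are illustrative choices.

```python
import numpy as np
# Assumes the laplacian_eigenmaps(X, d, k, t) sketch from the implementation
# slide has already been defined.

def random_orthoprojector(N, M, seed=0):
    """Random orthoprojector Phi : R^N -> R^M with orthonormal rows, obtained
    from the QR factorization of a Gaussian matrix."""
    rng = np.random.default_rng(seed)
    Q, _ = np.linalg.qr(rng.standard_normal((N, M)))   # (N, M), orthonormal columns
    return Q.T                                         # (M, N), Phi Phi^T = I_M

# Toy data: a curve isometrically embedded in a high-dimensional ambient space R^N.
rng = np.random.default_rng(2)
n, N, M = 400, 500, 40
s = np.sort(rng.uniform(0, 3, n))
low = np.column_stack([np.cos(s), np.sin(s), s])
basis, _ = np.linalg.qr(rng.standard_normal((N, 3)))
X = low @ basis.T                                      # (n, N)

Phi = random_orthoprojector(N, M)
f = laplacian_eigenmaps(X, d=2, k=8, t=0.5)
# Squared distances shrink by roughly M/N under Phi, so rescale t accordingly.
f_hat = laplacian_eigenmaps(X @ Phi.T, d=2, k=8, t=0.5 * M / N)

# Eigenvectors are defined up to sign; compare the leading ones up to a sign flip.
err = min(np.linalg.norm(f[:, 0] - f_hat[:, 0]), np.linalg.norm(f[:, 0] + f_hat[:, 0]))
print(err)
```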

22 Outline: 1. Mathematical Techniques 2. ... 3. ...

23 Hyperspectral Satellite Imagery. Figure: Hyperspectral satellite image, courtesy of NASA.

24 Data: Pavia University. Figure: (a) Pavia University with bands [68,30,2], (b) Ground truth map, (c) Class map for purely spectral LE, (d) Class map for purely spatial LE.

25 Data: Indian Pines. Figure: (a) Indian Pines with bands [29,15,12], (b) Ground truth map, (c) Class map for purely spectral LE, (d) Class map for purely spatial LE.

26 Pure Spectral and Pure Spatial LE. Table: Classification results for spectral LE as a function of σ, for Indian Pines and Pavia University. Table: Classification results for spatial LE as a function of η, for Indian Pines and Pavia University.

27 LE and Data Integration Classification of stacked spatial/spectral embeddings: Indian Pines: 46 top spatial, 4 top spectral Laplacian eigenvectors, OA 90.82% Pavia University: 20 top spatial, 5 top spectral Laplacian eigenvectors, OA 96.46%

28 Mixing Spatial/Spectral Graphs and Weights. Let G be the kNN-graph computed using the spectral metric. Let L_f and L_s denote, respectively, the spectral and spatial Laplace operators on G. Fuse the Laplace operators into L in the following ways:
L_1(σ, η) = L_f(σ) ∘ L_s(η), where ∘ denotes entry-wise (Hadamard) multiplication;
L_2(σ, η) = L_f(σ) + L_s(η);
L_3(σ, η) = G ∘ (L_f(σ) L_s(η)), i.e., we take the usual matrix product of the two Laplacians and then overlay the sparsity structure determined by the adjacency matrix.
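A small sketch of the three fusion operators, assuming the spectral and spatial Laplacians and the 0/1 adjacency matrix of G are available as dense NumPy arrays; the helper names, the inclusion of the diagonal in the sparsity mask for L_3, and the random stand-in weights are illustrative assumptions.

```python
import numpy as np

def fuse_laplacians(L_f, L_s, adjacency):
    """Return the three fused operators from the slide: L1 = entry-wise product,
    L2 = sum, L3 = matrix product masked by the kNN sparsity (plus the diagonal)."""
    L1 = L_f * L_s                        # Hadamard (entry-wise) product
    L2 = L_f + L_s
    mask = (adjacency > 0) | np.eye(adjacency.shape[0], dtype=bool)
    L3 = np.where(mask, L_f @ L_s, 0.0)   # overlay the graph's sparsity structure
    return L1, L2, L3

# Tiny example with two random symmetric weight matrices on the same 4-node graph.
rng = np.random.default_rng(3)
A = np.array([[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]], dtype=float)

def graph_laplacian(weights):
    W = weights * A                       # weights restricted to graph edges
    W = (W + W.T) / 2
    return np.diag(W.sum(axis=1)) - W

L_f = graph_laplacian(rng.uniform(0.5, 1.0, (4, 4)))   # "spectral" weights (stand-in)
L_s = graph_laplacian(rng.uniform(0.5, 1.0, (4, 4)))   # "spatial" weights (stand-in)
L1, L2, L3 = fuse_laplacians(L_f, L_s, A)
print(L1.shape, L2.shape, L3.shape)
```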

29 Figure: Fusion operator class maps for Indian Pines: L_1 (σ = 0.8, η = 0.2, 62.73%), L_2 (σ = 0.4, η = 0.2, 59.01%), L_3 (σ = 0.6, η = 0.2, 58.12%).

30 Figure: Fusion operator class maps for Pavia University: L_1 (σ = 0.4, η = 0.2, 79.19%), L_2 (σ = 0.8, η = 0.2, 78.73%), L_3 (σ = 0.2, η = 0.2, 79.22%).

31 Fusion metric. Adjust the norm used to determine the kNN-graph and the weight matrix. For γ > 0, we define
‖x_i - x_j‖²_γ = ‖x_i - x_j‖²_f + γ ‖x_i - x_j‖²_s.
Let L_γ be the Laplacian operator defined using the kNN-graph and weight matrix determined by γ.
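A hedged sketch of the fused metric: the squared spectral and spatial distances are combined with the trade-off parameter γ by concatenating the spectral features with sqrt(γ)-scaled pixel coordinates before building the kNN graph. The split of each pixel into spectral features and (row, column) coordinates, and the fake image, are assumptions for illustration.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

def fused_knn_graph(spectra, coords, gamma, k=10):
    """Squared fused distance ||.||_gamma^2 = ||.||_f^2 + gamma * ||.||_s^2,
    realized by concatenating spectral features with sqrt(gamma)-scaled
    spatial coordinates, then building a symmetric kNN graph."""
    Z = np.hstack([spectra, np.sqrt(gamma) * coords])
    A = kneighbors_graph(Z, n_neighbors=k, mode='distance')
    return A.maximum(A.T)

# Example: a fake 20x20 image with 5 spectral bands.
rng = np.random.default_rng(4)
rows, cols = np.meshgrid(np.arange(20), np.arange(20), indexing='ij')
coords = np.column_stack([rows.ravel(), cols.ravel()]).astype(float)
spectra = rng.standard_normal((400, 5))
G = fused_knn_graph(spectra, coords, gamma=34.0, k=8)
print(G.shape)
```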

32 Fused Spatial-Spectral Classification. Table: Classification results for the fusion operators L_1, L_2, L_3 acting on the fusion graph as a function of σ, Indian Pines, η = 0.2, γ = 450. Table: Classification results for the fusion operators L_1, L_2, L_3 acting on the fusion graph as a function of σ, Pavia University, η = 0.25, γ = 34.

33 Fused Spatial-Spectral Classification. Fusion operators on fusion metric graphs: Indian Pines, L_1, L_2, L_3.

34 Fused Spatial-Spectral Classification. Fusion operators on fusion metric graphs: Pavia University, L_1, L_2, L_3.

35 Cluster Potentials. Let ι = {i_1, ..., i_m} be a collection of nodes. A cluster potential over ι is defined by taking V[ι, ι] to be the Laplacian of the path graph on i_1, ..., i_m, i.e., the tridiagonal matrix with diagonal (1, 2, ..., 2, 1) and -1 on the off-diagonals, with all other entries of V equal to zero. For such a V, the minimization problem is equivalent to
min_{y^T D y = I} (1/2) Σ_{i,j} ‖y_i - y_j‖² W_{i,j} + α Σ_{k=1}^{m-1} ‖y_{i_k} - y_{i_{k+1}}‖².

36 Schroedinger Eigenmaps with Cluster Potentials. Use k-means clustering to cluster the hyperspectral image. The clustering was initialized randomly so that the construction of the potential would be completely unsupervised. The Euclidean metric was used. Define a cluster potential V_k over each of the k = 1, ..., K clusters. The order of the pixels in each cluster is determined by their index (smallest to largest). Aggregate into one potential by summing the individual cluster potentials: V^K = Σ_{k=1}^K V_k.
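A hedged sketch of this construction: k-means on the spectra, a path-Laplacian cluster potential over each cluster with pixels ordered by index as on the previous slide, and the aggregate potential as the sum. The function names and the fake spectra are illustrative; the resulting V can be plugged, scaled by α, into the Schroedinger Eigenmaps sketch given earlier.

```python
import numpy as np
from sklearn.cluster import KMeans

def path_laplacian_potential(indices, n):
    """Cluster potential over the node list `indices`: the Laplacian of the
    path i_1 - i_2 - ... - i_m, embedded in an n x n matrix."""
    V = np.zeros((n, n))
    for a, b in zip(indices[:-1], indices[1:]):
        V[a, a] += 1.0
        V[b, b] += 1.0
        V[a, b] -= 1.0
        V[b, a] -= 1.0
    return V

def cluster_potential(spectra, n_clusters=16, seed=0):
    """Sum of path-Laplacian potentials over the k-means clusters of the spectra,
    with randomly initialized (unsupervised) k-means."""
    labels = KMeans(n_clusters=n_clusters, init='random', n_init=10,
                    random_state=seed).fit_predict(spectra)
    n = spectra.shape[0]
    V = np.zeros((n, n))
    for c in range(n_clusters):
        idx = np.sort(np.flatnonzero(labels == c))   # pixels ordered by index
        if idx.size > 1:
            V += path_laplacian_potential(idx, n)
    return V

# Example: 400 fake pixels with 5 bands; V can then be used in (L + alpha*V) f = lambda D f.
rng = np.random.default_rng(5)
spectra = rng.standard_normal((400, 5))
V = cluster_potential(spectra, n_clusters=16)
print(V.shape)
```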

37 Fused Spatial-Spectral Classification. Table: Results for the cluster potential on Pavia University as a function of K and α. Table: Results for the cluster potential on Indian Pines as a function of α and K.

38 Fused Spatial-Spectral Classification. Pavia University: dimensions 4 and 5 of the SE embedding using cluster potential V^16 for classes 2, 3, and 7.

39 Fused Spatial-Spectral Classification. Indian Pines: dimensions 17 and 22 of the SE embedding using cluster potential V^16 for classes 2, 3, and 10.
