
Spectral Clustering
Seungjin Choi
Department of Computer Science, POSTECH, Korea

Spectral Clustering?

- Spectral methods: methods using eigenvectors of some matrices; they involve an eigen-decomposition (or spectral decomposition).
- Spectral clustering methods: algorithms that cluster data points using eigenvectors of matrices derived from the data. Closely related to spectral graph partitioning.
- Pairwise (similarity-based) clustering methods: standard statistical clustering methods assume a probabilistic model that generates the observed data points; pairwise clustering methods instead define a similarity function between pairs of data points and then formulate a criterion that the clustering must optimize.

Spectral Clustering Algorithm: Bipartitioning

1. Construct the affinity matrix
     W_ij = exp(-beta ||v_i - v_j||^2)  if i ≠ j,   W_ij = 0  if i = j.
2. Calculate the graph Laplacian L = D - W, where D = diag{d_1, ..., d_n} and d_i = sum_j W_ij.
3. Compute the eigenvector of the graph Laplacian associated with the second smallest eigenvalue (the Fiedler vector, u = [u_1, ..., u_n]^T).
4. Partition the u_i by a pre-specified threshold value and assign each data point v_i to the corresponding cluster.

Two Moons Data
[Figure: the two-moons data set used as a running example.]
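As an illustration, here is a minimal NumPy sketch of the bipartitioning algorithm above; the choice beta = 10, the zero threshold, and the two-blob toy data in the usage example are assumptions for the example, not part of the slides.

```python
import numpy as np

def spectral_bipartition(V, beta=10.0, threshold=0.0):
    """Bipartition points V (n x d) via the Fiedler vector of L = D - W."""
    # 1. Affinity matrix: W_ij = exp(-beta ||v_i - v_j||^2), zero diagonal.
    sq_dists = np.sum((V[:, None, :] - V[None, :, :]) ** 2, axis=-1)
    W = np.exp(-beta * sq_dists)
    np.fill_diagonal(W, 0.0)
    # 2. Unnormalized graph Laplacian L = D - W.
    D = np.diag(W.sum(axis=1))
    L = D - W
    # 3. Fiedler vector: eigenvector of the second smallest eigenvalue.
    eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order
    fiedler = eigvecs[:, 1]
    # 4. Threshold the entries to assign clusters.
    return (fiedler > threshold).astype(int)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    V = np.vstack([rng.normal(0, 0.1, (20, 2)),
                   rng.normal(1, 0.1, (20, 2))])
    print(spectral_bipartition(V))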

Two Moons Data: k-means / Fiedler Vector / Spectral Clustering
[Figures: the two-moons data clustered by k-means, the Fiedler vector, and the spectral clustering result.]

Graphs

Consider a connected graph G(V, E), where V = {v_1, ..., v_n} and E denote a set of vertices and a set of edges, respectively, with pairwise similarity values assigned as edge weights.
- Adjacency matrix (similarity, proximity, or affinity matrix): W = [W_ij] in R^{n x n}.
- Degree of node i: d_i = sum_j W_ij.
- Volume of a vertex subset S_1: vol(S_1) = d_{S_1} = sum_{i in S_1} d_i.
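For concreteness, a small sketch of these quantities on a toy weighted graph; the 4-node adjacency matrix is an assumption for illustration.

```python
import numpy as np

# Toy weighted adjacency matrix for a 4-node graph (symmetric, zero diagonal).
W = np.array([[0.0, 0.8, 0.6, 0.1],
              [0.8, 0.0, 0.7, 0.0],
              [0.6, 0.7, 0.0, 0.2],
              [0.1, 0.0, 0.2, 0.0]])

d = W.sum(axis=1)        # node degrees d_i = sum_j W_ij
S1 = [0, 1]              # a subset of vertices
vol_S1 = d[S1].sum()     # vol(S_1) = sum_{i in S_1} d_i
print(d, vol_S1)
```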

Neighborhood Graphs

The Gaussian similarity function is given by
  w(v_i, v_j) = W_ij = exp( -||v_i - v_j||^2 / (2 sigma^2) ).
- epsilon-neighborhood graph
- k-nearest-neighbor graph

Graph Laplacian

The (unnormalized) graph Laplacian is defined as L = D - W.
1. For every vector x in R^n, we have
     x^T L x = (1/2) sum_{i=1}^n sum_{j=1}^n W_ij (x_i - x_j)^2 ≥ 0,
   so L is positive semidefinite. Indeed,
     x^T L x = x^T D x - x^T W x
             = sum_i d_i x_i^2 - sum_i sum_j W_ij x_i x_j
             = (1/2) ( sum_i d_i x_i^2 - 2 sum_i sum_j W_ij x_i x_j + sum_j d_j x_j^2 )
             = (1/2) sum_i sum_j W_ij (x_i - x_j)^2.
2. The smallest eigenvalue of L is 0 and the corresponding eigenvector is 1 = [1, ..., 1]^T, since D1 = W1, i.e., L1 = 0.
3. L has n nonnegative eigenvalues, 0 = lambda_1 ≤ lambda_2 ≤ ... ≤ lambda_n.

Normalized Graph Laplacian

Two different normalization methods are popular:
- Symmetric normalization: L_s = D^{-1/2} L D^{-1/2} = I - D^{-1/2} W D^{-1/2}.
- Normalization related to random walks: L_rw = D^{-1} L = I - D^{-1} W.
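A small NumPy sketch of the three Laplacians, assuming a dense symmetric affinity matrix W with strictly positive degrees:

```python
import numpy as np

def laplacians(W):
    """Return (L, L_s, L_rw) for a symmetric affinity matrix W."""
    d = W.sum(axis=1)                    # degrees d_i = sum_j W_ij
    D = np.diag(d)
    L = D - W                            # unnormalized Laplacian
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L_s = D_inv_sqrt @ L @ D_inv_sqrt    # symmetric: I - D^{-1/2} W D^{-1/2}
    L_rw = np.diag(1.0 / d) @ L          # random walk: I - D^{-1} W
    return L, L_s, L_rw
```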

Properties of the normalized graph Laplacians:
1. For every vector x in R^n, we have
     x^T L_s x = (1/2) sum_{i=1}^n sum_{j=1}^n W_ij ( x_i / sqrt(d_i) - x_j / sqrt(d_j) )^2.
2. L_s and L_rw are positive semidefinite and have n nonnegative real-valued eigenvalues, 0 = lambda_1 ≤ ... ≤ lambda_n.
3. lambda is an eigenvalue of L_rw with eigenvector u if and only if lambda is an eigenvalue of L_s with eigenvector D^{1/2} u.
4. lambda is an eigenvalue of L_rw with eigenvector u if and only if lambda and u solve the generalized eigenvalue problem L u = lambda D u.
5. 0 is an eigenvalue of L_rw with the constant one vector 1 as eigenvector; 0 is an eigenvalue of L_s with eigenvector D^{1/2} 1.

Unnormalized Spectral Clustering
1. Construct a neighborhood graph with corresponding adjacency matrix W.
2. Compute the unnormalized graph Laplacian L = D - W.
3. Find the k smallest eigenvectors of L and form the matrix U = [u_1, ..., u_k] in R^{n x k}.
4. Treating each row of U as a point in R^k, cluster the rows into k groups using the k-means algorithm.
5. Assign v_i to cluster j if and only if row i of U is assigned to cluster j.

Normalized Spectral Clustering: Shi-Malik
1. Construct a neighborhood graph with corresponding adjacency matrix W.
2. Compute the unnormalized graph Laplacian L = D - W.
3. Find the k smallest generalized eigenvectors u_1, ..., u_k of the problem L u = lambda D u and form the matrix U = [u_1, ..., u_k] in R^{n x k}.
4. Treating each row of U as a point in R^k, cluster the rows into k groups using the k-means algorithm.
5. Assign v_i to cluster j if and only if row i of U is assigned to cluster j.

Normalized Spectral Clustering: Ng-Jordan-Weiss
1. Construct a neighborhood graph with corresponding adjacency matrix W.
2. Compute the normalized graph Laplacian L_s = D^{-1/2} L D^{-1/2}.
3. Find the k smallest eigenvectors u_1, ..., u_k of L_s and form the matrix U = [u_1, ..., u_k] in R^{n x k}.
4. Form the matrix Ũ from U by re-normalizing each row of U to have unit norm, i.e., Ũ_ij = U_ij / ( sum_k U_ik^2 )^{1/2}.
5. Treating each row of Ũ as a point in R^k, cluster the rows into k groups using the k-means algorithm.
6. Assign v_i to cluster j if and only if row i of Ũ is assigned to cluster j.
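A minimal SciPy sketch of all three variants; the function name, the variant flag, and the use of scipy.cluster.vq.kmeans2 for the k-means step are assumptions for the example (scipy.linalg.eigh solves the generalized problem L u = lambda D u directly when given a second matrix).

```python
import numpy as np
from scipy.linalg import eigh
from scipy.cluster.vq import kmeans2

def spectral_clustering(W, k, variant="njw"):
    """Cluster n points, given an n x n affinity matrix W, into k groups."""
    d = W.sum(axis=1)
    L = np.diag(d) - W
    if variant == "unnormalized":
        _, U = eigh(L, subset_by_index=[0, k - 1])               # k smallest eigvecs of L
    elif variant == "shi-malik":
        _, U = eigh(L, np.diag(d), subset_by_index=[0, k - 1])   # L u = lambda D u
    elif variant == "njw":
        D_is = np.diag(1.0 / np.sqrt(d))
        _, U = eigh(D_is @ L @ D_is, subset_by_index=[0, k - 1])  # k smallest eigvecs of L_s
        U = U / np.linalg.norm(U, axis=1, keepdims=True)          # row-normalize to unit norm
    else:
        raise ValueError(variant)
    # Run k-means on the rows of U (row-normalized U for NJW).
    _, labels = kmeans2(U, k, minit="++")
    return labels
```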

Where does this spectral clustering algorithm come from?
- Spectral graph partitioning
- Properties of block (diagonal) matrices
- Markov random walks

Pictorial Illustration of Graph Partitioning
[Figure: a weighted graph partitioned into coherent groups.]

Graph Partitioning: Bipartitioning

Consider a connected graph G(V, E), where V = {v_1, ..., v_n} and E denote a set of vertices and a set of edges, respectively, with pairwise similarity values assigned as edge weights.
- Adjacency matrix (similarity, proximity, or affinity matrix): W = [W_ij] in R^{n x n}.
- Degree of node i: d_i = sum_j W_ij.
- Volume: vol(S_1) = d_{S_1} = sum_{i in S_1} d_i.

Graph bipartitioning takes the vertex set V apart into two coherent groups, S_1 and S_2, satisfying V = S_1 ∪ S_2 (|V| = n) and S_1 ∩ S_2 = ∅, by simply cutting the edges connecting the two parts.

Pictorial Illustration: Cut and Volume
[Figure: edges crossing the partition form cut(S_1, S_2); edges within S_1 only and within S_2 only contribute to vol(S_1) and vol(S_2), respectively.]

Graph Partitioning

The task is to find k disjoint sets S_1, ..., S_k, given G = (V, E), where S_i ∩ S_j = ∅ for i ≠ j and S_1 ∪ ... ∪ S_k = V, such that a certain cut criterion is minimized (S^c = V \ S denotes the complement of S):
1. Bipartitioning: cut(S_1, S_2) = sum_{i in S_1} sum_{j in S_2} W_ij.
2. Multiway partitioning: cut(S_1, ..., S_k) = sum_{i=1}^k cut(S_i, S_i^c).
3. Ratio cut: Rcut(S_1, ..., S_k) = sum_{i=1}^k cut(S_i, S_i^c) / |S_i|.
4. Normalized cut: Ncut(S_1, ..., S_k) = sum_{i=1}^k cut(S_i, S_i^c) / vol(S_i).

Cut: Bipartitioning

The degree of dissimilarity between S_1 and S_2 can be computed as the total weight of the edges that have been removed:
  Cut(S_1, S_2) = sum_{i in S_1} sum_{j in S_2} W_ij
                = (1/2) { sum_{i in S_1} d_i + sum_{j in S_2} d_j
                          - sum_{i in S_1} sum_{j in S_1} W_ij - sum_{i in S_2} sum_{j in S_2} W_ij }
                = (1/4) (q_1 - q_2)^T L (q_1 - q_2),
where q_j = [q_{1j}, ..., q_{nj}]^T in R^n is the indicator vector representing partition j:
  q_{ij} = 1 if i in S_j,  q_{ij} = 0 if i not in S_j,  for i = 1, ..., n and j = 1, 2.
Note that q_1 and q_2 are orthogonal, i.e., q_1^T q_2 = 0.

Introducing the bipolar indicator vector x = q_1 - q_2 in {+1, -1}^n, the cut criterion simplifies to
  Cut(S_1, S_2) = (1/4) x^T L x.
The balanced cut involves the following combinatorial optimization problem:
  arg min_x x^T L x  subject to  1^T x = 0, x in {+1, -1}^n.
Dropping the integer constraints (spectral relaxation) leads to a symmetric eigenvalue problem. The second smallest eigenvector of L corresponds to the solution, since the smallest eigenvalue of L is 0 and its associated eigenvector is 1. The second smallest eigenvector is known as the Fiedler vector.

Rcut and Unnormalized Spectral Clustering: k = 2

Define the indicator vector x = [x_1, ..., x_n]^T with entries
  x_i =  sqrt(|S^c| / |S|)  if v_i in S,
  x_i = -sqrt(|S| / |S^c|)  if v_i in S^c.
Then one can easily see that
  x^T L x = |V| Rcut(S, S^c),   x^T 1 = 0,   ||x|| = sqrt(n).
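As a quick numerical check of these identities, a sketch over a random graph (the random W and the choice of subset S are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
W = rng.random((n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

S = np.array([0, 1, 2])                     # a subset S; S^c is the rest
in_S = np.isin(np.arange(n), S)
cut = W[np.ix_(in_S, ~in_S)].sum()          # cut(S, S^c) = sum_{i in S, j in S^c} W_ij

x = np.where(in_S, 1.0, -1.0)               # bipolar indicator x = q_1 - q_2
assert np.isclose(cut, 0.25 * x @ L @ x)    # Cut(S_1, S_2) = (1/4) x^T L x

s, sc = in_S.sum(), (~in_S).sum()
f = np.where(in_S, np.sqrt(sc / s), -np.sqrt(s / sc))
rcut = cut / s + cut / sc                   # Rcut(S, S^c)
assert np.isclose(f @ L @ f, n * rcut)      # x^T L x = |V| Rcut(S, S^c)
print("cut identities verified")
```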

The ratio cut problem
  arg min_{S ⊂ V} Rcut(S, S^c)
is equivalent to
  arg min_x x^T L x  subject to  x^T 1 = 0, x_i as defined on the previous slide, ||x|| = sqrt(n).
The relaxation obtained by discarding the condition that x takes discrete values, and instead allowing x in R^n, leads to
  arg min_{x in R^n} x^T L x  subject to  x^T 1 = 0 and ||x|| = sqrt(n).
The solution is the eigenvector associated with the second smallest eigenvalue of L. Rounding the eigenvector gives an approximate solution to the ratio cut problem.

Rcut and Unnormalized Spectral Clustering: k > 2

Define the indicator matrix X = [x_1, ..., x_k] in R^{n x k}, with x_j = [x_{1,j}, ..., x_{n,j}]^T in R^n and entries
  x_{i,j} = 1 / sqrt(|S_j|)  if v_i in S_j,
  x_{i,j} = 0                if v_i not in S_j.
Then we have
  x_i^T L x_i = cut(S_i, S_i^c) / |S_i|,   X^T X = I,
and hence
  Rcut(S_1, ..., S_k) = sum_{i=1}^k x_i^T L x_i = tr{ X^T L X }.
The problem
  arg min_{S_1,...,S_k} Rcut(S_1, ..., S_k)
is therefore equivalent to
  arg min_{S_1,...,S_k} tr{ X^T L X }  subject to  X^T X = I, X as defined above.
The relaxed problem becomes
  arg min_{X in R^{n x k}} tr{ X^T L X }  subject to  X^T X = I.
We determine the k smallest eigenvectors of L to form U and run the k-means algorithm on the rows of U.

Ncut and Normalized Spectral Clustering: k = 2

Define the indicator vector x = [x_1, ..., x_n]^T with entries
  x_i =  sqrt(vol(S^c) / vol(S))  if v_i in S,
  x_i = -sqrt(vol(S) / vol(S^c))  if v_i in S^c.
Then one can easily see that
  x^T L x = vol(V) Ncut(S, S^c),   (Dx)^T 1 = 0,   x^T D x = vol(V).
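A numerical sanity check of the trace identity for k > 2; the random graph and the 3-way partition below are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 9, 3
W = rng.random((n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

labels = np.repeat(np.arange(k), n // k)     # a partition S_1, S_2, S_3
X = np.zeros((n, k))
for j in range(k):
    members = labels == j
    X[members, j] = 1.0 / np.sqrt(members.sum())   # x_{i,j} = 1/sqrt(|S_j|)

rcut = sum(W[np.ix_(labels == j, labels != j)].sum() / (labels == j).sum()
           for j in range(k))
assert np.allclose(X.T @ X, np.eye(k))             # X^T X = I
assert np.isclose(np.trace(X.T @ L @ X), rcut)     # Rcut = tr(X^T L X)
print("trace identity verified")
```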

The normalized cut problem
  arg min_{S ⊂ V} Ncut(S, S^c)
is equivalent to
  arg min_x x^T L x  subject to  (Dx)^T 1 = 0, x_i as defined on the previous slide, x^T D x = vol(V).
Relaxing the problem gives
  arg min_{x in R^n} x^T L x  subject to  (Dx)^T 1 = 0 and x^T D x = vol(V).
Define y = D^{1/2} x; then the problem is
  arg min_{y in R^n} y^T D^{-1/2} L D^{-1/2} y  subject to  y^T D^{1/2} 1 = 0 and ||y||^2 = vol(V),
where D^{-1/2} L D^{-1/2} = L_s. The solution y is given by the 2nd smallest eigenvector of L_s, implying that x is the 2nd smallest eigenvector of L_rw, or equivalently the 2nd smallest generalized eigenvector of L u = lambda D u.

Ncut and Normalized Spectral Clustering: k > 2

Define the indicator matrix X = [x_1, ..., x_k] in R^{n x k}, with x_j = [x_{1,j}, ..., x_{n,j}]^T in R^n and entries
  x_{i,j} = 1 / sqrt(vol(S_j))  if v_i in S_j,
  x_{i,j} = 0                   if v_i not in S_j.
Then we have
  x_i^T L x_i = cut(S_i, S_i^c) / vol(S_i),   x_i^T D x_i = 1  (i.e., X^T D X = I),
and hence
  Ncut(S_1, ..., S_k) = sum_{i=1}^k x_i^T L x_i = tr{ X^T L X }.
The problem
  arg min_{S_1,...,S_k} Ncut(S_1, ..., S_k)
is therefore equivalent to
  arg min_{S_1,...,S_k} tr{ X^T L X }  subject to  X^T D X = I, X as defined above.
Substituting Y = D^{1/2} X, the relaxed problem becomes
  arg min_{Y in R^{n x k}} tr{ Y^T D^{-1/2} L D^{-1/2} Y }  subject to  Y^T Y = I.
We determine the k smallest eigenvectors of L_s to form U and run the k-means algorithm on the rows of U.

Markov Random Walk View of Normalized Cut

Meila and Shi (2001):
- Probabilistic interpretation of the normalized cut.
- Data points are clustered on the basis of the eigenvectors of the resulting transition probability matrix (constructed from the edge weights).
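A quick check of the substitution y = D^{1/2} x, assuming a random affinity matrix with distinct eigenvalues: the generalized eigenvectors of L u = lambda D u and the eigenvectors of L_s agree up to the D^{1/2} map and sign.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)
n = 7
W = rng.random((n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
d = W.sum(axis=1)
L = np.diag(d) - W
D_is = np.diag(1.0 / np.sqrt(d))
L_s = D_is @ L @ D_is

# Generalized problem L u = lambda D u vs. ordinary problem L_s y = lambda y.
vals_gen, U = eigh(L, np.diag(d))
vals_sym, Y = eigh(L_s)
assert np.allclose(vals_gen, vals_sym)          # same spectrum

# y = D^{1/2} x maps one eigenvector onto the other (up to sign and scale).
x = U[:, 1]                                     # 2nd smallest generalized eigenvector
y = np.sqrt(d) * x
y = y / np.linalg.norm(y)
assert min(np.linalg.norm(y - Y[:, 1]), np.linalg.norm(y + Y[:, 1])) < 1e-8
print("substitution verified")
```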

Transition Probability Matrix

We define a Markov random walk over the graph by constructing a transition probability matrix from the edge weights,
  P_ij = W_ij / sum_k W_ik,
where sum_j P_ij = 1 for all i. The random walk proceeds by successively selecting points according to P_ij, where i specifies the current location.

Ncut via Transition Probabilities

Define P(S^c | S) = P(X_{t+1} in S^c | X_t in S), where X_t is the state of the Markov random walk model at time t. The Ncut and the Markov random walk model have the following relation:
  Ncut(S, S^c) = P(S^c | S) + P(S | S^c).
The minimization of Ncut actually seeks a cut through the graph such that a random walk seldom transitions from S to S^c or vice versa.
If the graph is connected and non-bipartite (an ergodic Markov chain), then the random walk always possesses a unique stationary distribution pi = [pi_1, ..., pi_n]^T such that pi^T P = pi^T, which is given by pi_i = d_i / vol(G).

Random Walk: Properties

If we start from i_0, the distribution of the points i_t that we end up at after t steps is given by
  P(i_1 | i_0) = P_{i_0 i_1},
  P(i_2 | i_0) = sum_{i_1} P_{i_0 i_1} P_{i_1 i_2} = [P^2]_{i_0 i_2},
  P(i_3 | i_0) = sum_{i_1} sum_{i_2} P_{i_0 i_1} P_{i_1 i_2} P_{i_2 i_3} = [P^3]_{i_0 i_3},
and in general [P^t]_{i_0 i_t}, where P^t = P P ... P and [.]_{ij} denotes the (i, j) entry of the matrix. The distribution of the points that we end up at after t random steps converges as t increases.

Stochastic Matrix

The stochastic matrix P is defined by P = D^{-1} W. To find out how P^t behaves for large t, it is useful to examine the eigen-decomposition of the following symmetric matrix:
  D^{-1/2} W D^{-1/2} = lambda_1 u_1 u_1^T + lambda_2 u_2 u_2^T + ... + lambda_n u_n u_n^T.
The symmetric matrix is related to P^t since
  (D^{-1/2} W D^{-1/2}) (D^{-1/2} W D^{-1/2}) = D^{1/2} (P P) D^{-1/2}.
This allows us to write the t-step transition probability matrix in terms of the eigenvalues/vectors of the symmetric matrix:
  P^t = D^{-1/2} (D^{-1/2} W D^{-1/2})^t D^{1/2}
      = D^{-1/2} ( lambda_1^t u_1 u_1^T + lambda_2^t u_2 u_2^T + ... + lambda_n^t u_n u_n^T ) D^{1/2},
where lambda_1 = 1 and P^infinity = D^{-1/2} (u_1 u_1^T) D^{1/2}.
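A numerical sketch of these facts (the random connected affinity is an assumption): the stationary distribution pi_i = d_i / vol(G) satisfies pi^T P = pi^T, and the eigen-expansion of D^{-1/2} W D^{-1/2} reproduces P^t.

```python
import numpy as np

rng = np.random.default_rng(3)
n, t = 6, 5
W = rng.random((n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
d = W.sum(axis=1)
P = W / d[:, None]                      # P = D^{-1} W; rows sum to 1

pi = d / d.sum()                        # pi_i = d_i / vol(G)
assert np.allclose(pi @ P, pi)          # pi^T P = pi^T

# Eigen-expansion of the symmetric matrix D^{-1/2} W D^{-1/2}.
S = W / np.sqrt(np.outer(d, d))
lam, U = np.linalg.eigh(S)
Pt_spec = np.diag(1 / np.sqrt(d)) @ (U * lam**t) @ U.T @ np.diag(np.sqrt(d))
assert np.allclose(Pt_spec, np.linalg.matrix_power(P, t))
print("stationary distribution and spectral expansion verified")
```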

Spectral Clustering: Stochastic Matrix

We are interested in the largest correction to the asymptotic limit:
  P^t ≈ P^infinity + D^{-1/2} ( lambda_2^t u_2 u_2^T ) D^{1/2}.
Note that [u_2 u_2^T]_{ij} = u_{i2} u_{j2}, and thus the largest correction term increases the probability of transitions between points that share the same sign of u_{i2} and decreases transitions across points with different signs.

Equivalence

Proposition 1. If lambda, x are solutions of P x = lambda x with P = D^{-1} W, then (1 - lambda) and x are solutions of the generalized eigenvalue problem L x = lambda D x, i.e., L x = (1 - lambda) D x.

This proposition shows the equivalence between the spectral clustering formulated by the normalized cut and the eigenvalues/vectors of the stochastic matrix P (a numerical check appears after the reading list below).
- The largest eigenvector of P is 1, containing no information.
- The second smallest eigenvector in the normalized cut corresponds to the second largest eigenvector of the stochastic matrix.
- Binary spectral clustering: divide the points into clusters based on the sign of the elements of u_2, assigning point j to cluster 1 if u_{j2} > 0 and to cluster 2 otherwise.

Suggested Further Readings
1. P. K. Chan, M. D. F. Schlag, and J. Y. Zien, "Spectral k-way ratio-cut partitioning and clustering," IEEE Trans. CAD of Integrated Circuits and Systems, 1994.
2. J. Shi and J. Malik, "Normalized cuts and image segmentation," CVPR 1997.
3. Y. Weiss, "Segmentation using eigenvectors: A unifying view," ICCV 1999.
4. M. Meila and J. Shi, "A random walks view of spectral segmentation," AISTATS 2001.
5. C. Ding et al., "A min-max cut algorithm for graph partitioning and data clustering," ICDM 2001.
6. A. Ng, M. Jordan, and Y. Weiss, "On spectral clustering: Analysis and an algorithm," NIPS 2001.
7. U. von Luxburg, "A tutorial on spectral clustering," MPI for Biological Cybernetics, TR-149, 2006.
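As referenced above, a short check of Proposition 1 under an assumed random affinity: each eigenpair (lambda, x) of P = D^{-1} W yields a generalized eigenpair (1 - lambda, x) of L x = lambda D x.

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(4)
n = 6
W = rng.random((n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
d = W.sum(axis=1)
D = np.diag(d)
L = D - W
P = W / d[:, None]

lam, X = eig(P)                 # eigenpairs of the stochastic matrix
lam, X = lam.real, X.real       # P is similar to a symmetric matrix, so real
for j in range(n):
    x = X[:, j]
    # Proposition 1: L x = (1 - lambda) D x.
    assert np.allclose(L @ x, (1 - lam[j]) * (D @ x))

# Binary spectral clustering: sign of the 2nd largest eigenvector of P.
order = np.argsort(lam)[::-1]
u2 = X[:, order[1]]
labels = (u2 > 0).astype(int)
print("Proposition 1 verified; clusters:", labels)
```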
