MATH 567: Mathematical Techniques in Data Science Clustering II
This lecture is based on U. von Luxburg, A Tutorial on Spectral Clustering, Statistics and Computing, 17 (4), 2007.

MATH 567: Mathematical Techniques in Data Science, Clustering II
Dominique Guillot
Department of Mathematical Sciences, University of Delaware
May 8, 2017
Spectral clustering: overview

Overview of spectral clustering:

1. Construct a similarity matrix measuring the similarity of pairs of objects (e.g. $s_{ij} = \exp(-\|x_i - x_j\|^2/(2\sigma^2))$).
2. Use the similarity matrix to construct a (weighted or unweighted) graph.
3. Compute eigenvectors of the (normalized or unnormalized) graph Laplacian: $L = D - W$, $L_{sym} = D^{-1/2} L D^{-1/2}$.
4. Construct a matrix containing the first $K$ eigenvectors of $L$ or $L_{sym}$ as columns.
5. Each row identifies a vertex of the graph with a point in $\mathbb{R}^K$.
6. Cluster those points using the K-means algorithm.
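Steps 1 through 3 can be sketched in a few lines of NumPy. This is an illustrative sketch, not the lecture's code: the toy data and the bandwidth `sigma` are arbitrary choices.

```python
import numpy as np

# Steps 1-3 on a toy data set: Gaussian similarity matrix, weighted graph,
# and the two Laplacians. sigma is a bandwidth parameter to be tuned.
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 2))
sigma = 1.0

# Step 1: s_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)).
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
S = np.exp(-sq_dists / (2 * sigma**2))

# Step 2: fully connected weighted graph (no self-loops).
W = S - np.diag(np.diag(S))

# Step 3: unnormalized and symmetric normalized Laplacians.
d = W.sum(axis=1)
D = np.diag(d)
L = D - W
L_sym = np.diag(d**-0.5) @ L @ np.diag(d**-0.5)

assert np.allclose(L, L.T) and np.allclose(L_sym, L_sym.T)
assert np.allclose(L @ np.ones(10), 0)   # rows of L sum to zero
```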
Example

Let us try to cluster the following graph. The slide displays the adjacency matrix $A$, the Laplacian $L$, and the second eigenvector $v_2 = (\ldots)^T$; the numerical entries were lost in transcription.
The unnormalized Laplacian

Proposition: The matrix $L$ satisfies the following properties:

1. For any $f \in \mathbb{R}^n$: $f^T L f = \frac{1}{2} \sum_{i,j=1}^n w_{ij} (f_i - f_j)^2$.
2. $L$ is symmetric and positive semidefinite.
3. $0$ is an eigenvalue of $L$ with associated constant eigenvector $\mathbf{1}$.

Proof: To prove (1),
$$f^T L f = f^T D f - f^T W f = \sum_{i=1}^n d_i f_i^2 - \sum_{i,j=1}^n w_{ij} f_i f_j = \frac{1}{2}\left(\sum_{i=1}^n d_i f_i^2 - 2\sum_{i,j=1}^n w_{ij} f_i f_j + \sum_{j=1}^n d_j f_j^2\right) = \frac{1}{2}\sum_{i,j=1}^n w_{ij}(f_i - f_j)^2.$$
(2) follows from (1). (3) is easy.
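The three properties are easy to check numerically. The random weight matrix below is an arbitrary example, not from the lecture:

```python
import numpy as np

# Numerical check of property (1): f^T L f = (1/2) sum_ij w_ij (f_i - f_j)^2,
# on a random symmetric weight matrix with non-negative entries.
rng = np.random.default_rng(1)
A = rng.random((6, 6))
W = np.triu(A, 1) + np.triu(A, 1).T        # symmetric, zero diagonal
D = np.diag(W.sum(axis=1))
L = D - W

f = rng.standard_normal(6)
lhs = f @ L @ f
rhs = 0.5 * sum(W[i, j] * (f[i] - f[j]) ** 2
                for i in range(6) for j in range(6))
assert np.isclose(lhs, rhs)

# Properties (2) and (3): L is PSD and the constant vector is a 0-eigenvector.
assert np.all(np.linalg.eigvalsh(L) >= -1e-10)
assert np.allclose(L @ np.ones(6), 0)
```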
The unnormalized Laplacian (cont.)

Proposition: Let $G$ be an undirected graph with non-negative weights. Then:

1. The multiplicity $k$ of the eigenvalue $0$ of $L$ equals the number of connected components $A_1, \ldots, A_k$ in the graph.
2. The eigenspace of eigenvalue $0$ is spanned by the indicator vectors $\mathbf{1}_{A_1}, \ldots, \mathbf{1}_{A_k}$ of those components.

Proof: If $f$ is an eigenvector associated to $\lambda = 0$, then
$$0 = f^T L f = \frac{1}{2}\sum_{i,j=1}^n w_{ij} (f_i - f_j)^2.$$
It follows that $f_i = f_j$ whenever $w_{ij} > 0$. Thus $f$ is constant on the connected components of $G$. We conclude that the eigenspace of $0$ is contained in $\mathrm{span}(\mathbf{1}_{A_1}, \ldots, \mathbf{1}_{A_k})$. Conversely, it is not hard to see that each $\mathbf{1}_{A_i}$ is an eigenvector associated to $0$ (write $L$ in block diagonal form).
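A small numerical check of the proposition, on a hypothetical 5-vertex graph with two connected components:

```python
import numpy as np

# Graph with components {0, 1} and {2, 3, 4}: eigenvalue 0 of L should have
# multiplicity 2, and each component indicator should be a 0-eigenvector.
W = np.zeros((5, 5))
W[0, 1] = W[1, 0] = 1.0
W[2, 3] = W[3, 2] = 1.0
W[3, 4] = W[4, 3] = 1.0
L = np.diag(W.sum(1)) - W

eigvals = np.linalg.eigvalsh(L)
assert np.sum(np.abs(eigvals) < 1e-10) == 2     # multiplicity of 0 is 2

for comp in ([0, 1], [2, 3, 4]):
    ind = np.zeros(5)
    ind[comp] = 1.0
    assert np.allclose(L @ ind, 0)              # indicator is a 0-eigenvector
```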
The normalized Laplacians

Recall the two normalized Laplacians $L_{sym} = D^{-1/2} L D^{-1/2}$ and $L_{rw} = D^{-1} L = I - D^{-1} W$.

Proposition: The normalized Laplacians satisfy the following properties:

1. For every $f \in \mathbb{R}^n$, we have
$$f^T L_{sym} f = \frac{1}{2} \sum_{i,j=1}^n w_{ij} \left(\frac{f_i}{\sqrt{d_i}} - \frac{f_j}{\sqrt{d_j}}\right)^2.$$
2. $\lambda$ is an eigenvalue of $L_{rw}$ with eigenvector $u$ if and only if $\lambda$ is an eigenvalue of $L_{sym}$ with eigenvector $w = D^{1/2} u$.
3. $\lambda$ is an eigenvalue of $L_{rw}$ with eigenvector $u$ if and only if $\lambda$ and $u$ solve the generalized eigenproblem $L u = \lambda D u$.

Proof: The proof of (1) is similar to the proof of the analogous result for the unnormalized Laplacian. (2) and (3) follow easily by using appropriate rescalings.
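Claims (2) and (3) can also be verified numerically. The dense random graph below is an arbitrary example:

```python
import numpy as np

# If L_rw u = lambda u, then L_sym (D^{1/2} u) = lambda (D^{1/2} u)
# and L u = lambda D u. Check on a small random weighted graph.
rng = np.random.default_rng(2)
A = rng.random((5, 5))
W = np.triu(A, 1) + np.triu(A, 1).T
d = W.sum(1)
D = np.diag(d)
L = D - W
L_rw = np.linalg.inv(D) @ L
L_sym = np.diag(d**-0.5) @ L @ np.diag(d**-0.5)

lam, U = np.linalg.eig(L_rw)       # L_rw is not symmetric in general
l0 = lam[0].real                   # eigenvalues are real (L_rw ~ L_sym)
u = U[:, 0].real
w = np.diag(d**0.5) @ u
assert np.allclose(L_sym @ w, l0 * w, atol=1e-8)   # claim (2)
assert np.allclose(L @ u, l0 * D @ u, atol=1e-8)   # claim (3)
```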
The normalized Laplacians (cont.)

Proposition: Let $G$ be an undirected graph with non-negative weights. Then:

1. The multiplicity $k$ of the eigenvalue $0$ of both $L_{sym}$ and $L_{rw}$ equals the number of connected components $A_1, \ldots, A_k$ in the graph.
2. For $L_{rw}$, the eigenspace of eigenvalue $0$ is spanned by the indicator vectors $\mathbf{1}_{A_i}$, $i = 1, \ldots, k$.
3. For $L_{sym}$, the eigenspace of eigenvalue $0$ is spanned by the vectors $D^{1/2} \mathbf{1}_{A_i}$, $i = 1, \ldots, k$.

Proof: Similar to the proof of the analogous result for the unnormalized Laplacian.
Graph cuts

Let $G$ be a graph with (weighted) adjacency matrix $W = (w_{ij})$. We define:

- $W(A, B) := \sum_{i \in A, j \in B} w_{ij}$,
- $|A| :=$ number of vertices in $A$,
- $\mathrm{vol}(A) := \sum_{i \in A} d_i$.

Given a partition $A_1, \ldots, A_k$ of the vertices of $G$, we let
$$\mathrm{cut}(A_1, \ldots, A_k) := \frac{1}{2} \sum_{i=1}^k W(A_i, \overline{A_i}).$$

The min-cut problem consists of solving:
$$\min_{\substack{V = A_1 \cup \cdots \cup A_k \\ A_i \cap A_j = \emptyset,\ i \neq j}} \mathrm{cut}(A_1, \ldots, A_k).$$
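The quantities above can be computed directly on a small example. The 4-vertex graph below is an arbitrary illustration:

```python
import numpy as np

# W(A, B), |A|, vol(A), and cut on a small unweighted graph:
# edges (0,1), (0,2), (1,2), (2,3).
W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = W.sum(1)

def W_between(A, B):
    """W(A, B) = sum of w_ij over i in A, j in B."""
    return sum(W[i, j] for i in A for j in B)

A, Abar = [0, 1, 2], [3]
assert W_between(A, Abar) == 1.0               # only edge (2, 3) crosses
assert len(A) == 3                             # |A|
assert d[A].sum() == 7.0                       # vol(A) = 2 + 2 + 3

# cut(A, Abar) = (1/2) [W(A, Abar) + W(Abar, A)].
cut = 0.5 * (W_between(A, Abar) + W_between(Abar, A))
assert cut == 1.0
```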
Graph cuts (cont.)

The min-cut problem can be solved efficiently when $k = 2$ (see Stoer and Wagner, 1997). In practice it often does not lead to satisfactory partitions: in many cases, the solution of min-cut simply separates one individual vertex from the rest of the graph. We would like clusters to have a reasonably large number of points. We therefore modify the min-cut problem to enforce such constraints.
Balanced cuts

The two most common objective functions used as replacements for the min-cut objective are:

1. RatioCut (Hagen and Kahng, 1992):
$$\mathrm{RatioCut}(A_1, \ldots, A_k) := \sum_{i=1}^k \frac{W(A_i, \overline{A_i})}{|A_i|} = \sum_{i=1}^k \frac{\mathrm{cut}(A_i, \overline{A_i})}{|A_i|}.$$
2. Normalized cut (Shi and Malik, 2000):
$$\mathrm{Ncut}(A_1, \ldots, A_k) := \sum_{i=1}^k \frac{W(A_i, \overline{A_i})}{\mathrm{vol}(A_i)} = \sum_{i=1}^k \frac{\mathrm{cut}(A_i, \overline{A_i})}{\mathrm{vol}(A_i)}.$$

Note: both objective functions take larger values when the clusters $A_i$ are small, so the resulting clusters are more balanced. However, the resulting problems are NP-hard; see Wagner and Wagner (1993).
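Both objectives are straightforward to evaluate. Here is a sketch on an arbitrary 4-vertex graph, with the unbalanced partition $\{0, 1, 2\}$ vs. $\{3\}$:

```python
import numpy as np

# RatioCut and Ncut for the partition {0, 1, 2} vs {3} of a small graph
# with edges (0,1), (0,2), (1,2), (2,3).
W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = W.sum(1)
parts = ([0, 1, 2], [3])

def cut_val(A):
    """cut(A, Abar) = W(A, Abar) for a bipartition (W is symmetric)."""
    Abar = [i for i in range(4) if i not in A]
    return sum(W[i, j] for i in A for j in Abar)

ratio_cut = sum(cut_val(A) / len(A) for A in parts)
ncut = sum(cut_val(A) / d[A].sum() for A in parts)

assert np.isclose(ratio_cut, 1/3 + 1/1)   # = 4/3, inflated by the tiny {3}
assert np.isclose(ncut, 1/7 + 1/1)        # = 8/7, vol({3}) = 1
```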
Spectral clustering

Spectral clustering provides a way to relax the RatioCut and the Normalized cut problems.

Strategy:

1. Express the original problem as a linear algebra problem involving discrete/combinatorial constraints.
2. Relax/remove the constraints.

RatioCut with $k = 2$: solve $\min_{A \subset V} \mathrm{RatioCut}(A, \overline{A})$.

Given $A \subset V$, let $f \in \mathbb{R}^n$ be given by
$$f_i := \begin{cases} \sqrt{|\overline{A}|/|A|} & \text{if } v_i \in A, \\ -\sqrt{|A|/|\overline{A}|} & \text{if } v_i \in \overline{A}. \end{cases}$$
Relaxing RatioCut

Let $L = D - W$ be the (unnormalized) Laplacian of $G$. Then
$$\begin{aligned}
f^T L f &= \frac{1}{2} \sum_{i,j=1}^n w_{ij} (f_i - f_j)^2 \\
&= \frac{1}{2} \sum_{i \in A,\, j \in \overline{A}} w_{ij} \left(\sqrt{\frac{|\overline{A}|}{|A|}} + \sqrt{\frac{|A|}{|\overline{A}|}}\right)^2 + \frac{1}{2} \sum_{i \in \overline{A},\, j \in A} w_{ij} \left(-\sqrt{\frac{|\overline{A}|}{|A|}} - \sqrt{\frac{|A|}{|\overline{A}|}}\right)^2 \\
&= W(A, \overline{A}) \left(\frac{|\overline{A}|}{|A|} + \frac{|A|}{|\overline{A}|} + 2\right) \\
&= W(A, \overline{A}) \left(\frac{|A| + |\overline{A}|}{|A|} + \frac{|A| + |\overline{A}|}{|\overline{A}|}\right) \\
&= |V| \, \mathrm{RatioCut}(A, \overline{A}),
\end{aligned}$$
since $|A| + |\overline{A}| = |V|$ and $W(A, \overline{A}) = W(\overline{A}, A)$.
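The identity $f^T L f = |V| \, \mathrm{RatioCut}(A, \overline{A})$ is easy to confirm numerically; the random weighted graph and the choice of $A$ below are arbitrary:

```python
import numpy as np

# Check f^T L f = |V| * RatioCut(A, Abar) on a random weighted graph.
rng = np.random.default_rng(3)
n = 8
M = rng.random((n, n))
W = np.triu(M, 1) + np.triu(M, 1).T
L = np.diag(W.sum(1)) - W

A = [0, 1, 2]
Abar = [i for i in range(n) if i not in A]

# f_i = sqrt(|Abar|/|A|) on A and -sqrt(|A|/|Abar|) on Abar.
f = np.empty(n)
f[A] = np.sqrt(len(Abar) / len(A))
f[Abar] = -np.sqrt(len(A) / len(Abar))

cut = sum(W[i, j] for i in A for j in Abar)
ratio_cut = cut / len(A) + cut / len(Abar)
assert np.isclose(f @ L @ f, n * ratio_cut)
```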
Relaxing RatioCut (cont.)

We showed:
$$f^T L f = \frac{1}{2} \sum_{i,j=1}^n w_{ij} (f_i - f_j)^2 = |V| \, \mathrm{RatioCut}(A, \overline{A}).$$

Moreover, note that
$$\sum_{i=1}^n f_i = \sum_{i \in A} \sqrt{\frac{|\overline{A}|}{|A|}} - \sum_{i \in \overline{A}} \sqrt{\frac{|A|}{|\overline{A}|}} = |A| \sqrt{\frac{|\overline{A}|}{|A|}} - |\overline{A}| \sqrt{\frac{|A|}{|\overline{A}|}} = \sqrt{|A|\,|\overline{A}|} - \sqrt{|A|\,|\overline{A}|} = 0.$$

Thus $f \perp \mathbf{1}$. Finally,
$$\|f\|_2^2 = \sum_{i=1}^n f_i^2 = |A| \frac{|\overline{A}|}{|A|} + |\overline{A}| \frac{|A|}{|\overline{A}|} = |\overline{A}| + |A| = |V| = n.$$

Thus, we have shown that the RatioCut problem is equivalent to
$$\min_{A \subset V} f^T L f \quad \text{subject to } f \perp \mathbf{1},\ \|f\| = \sqrt{n},\ f_i \text{ defined as above.}$$
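The two constraints can be sanity-checked for any split sizes; the sizes below are arbitrary:

```python
import numpy as np

# For |A| = a and |Abar| = b = n - a, the vector f satisfies
# sum(f) = 0 (so f is orthogonal to the all-ones vector) and ||f||^2 = n.
n, a = 10, 4
b = n - a
f = np.concatenate([np.full(a, np.sqrt(b / a)),
                    np.full(b, -np.sqrt(a / b))])
assert np.isclose(f.sum(), 0.0)   # f perp 1
assert np.isclose(f @ f, n)       # ||f||^2 = n
```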
Relaxing RatioCut (cont.)

We have:
$$\min_{A \subset V} f^T L f \quad \text{subject to } f \perp \mathbf{1},\ \|f\| = \sqrt{n},\ f_i \text{ defined as above.}$$

This is a discrete optimization problem since the entries of $f$ can only take two values: $\sqrt{|\overline{A}|/|A|}$ and $-\sqrt{|A|/|\overline{A}|}$. The problem is NP-hard.

The natural relaxation of the problem is to remove the discreteness condition on $f$ and solve
$$\min_{f \in \mathbb{R}^n} f^T L f \quad \text{subject to } f \perp \mathbf{1},\ \|f\| = \sqrt{n}.$$
Relaxing RatioCut (cont.)

Using properties of the Rayleigh quotient, it is not hard to show that the solution of
$$\min_{f \in \mathbb{R}^n} f^T L f \quad \text{subject to } f \perp \mathbf{1},\ \|f\| = \sqrt{n}$$
is an eigenvector of $L$ corresponding to the second smallest eigenvalue. Clearly, if $f^*$ is the solution of the relaxed problem, then
$$(f^*)^T L f^* \leq |V| \min_{A \subset V} \mathrm{RatioCut}(A, \overline{A}).$$

How do we get the clusters from $f^*$? We could set
$$\begin{cases} v_i \in A & \text{if } f_i^* \geq 0, \\ v_i \in \overline{A} & \text{if } f_i^* < 0. \end{cases}$$

More generally, we cluster the coordinates of $f^*$ using K-means. This is the unnormalized spectral clustering algorithm for $k = 2$. The above process can be generalized to $k \geq 2$ clusters.
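The sign-based rule above can be sketched on a classic example, two triangles joined by a single edge (an arbitrary choice of graph):

```python
import numpy as np

# k = 2: take the eigenvector of L for the second-smallest eigenvalue
# (the Fiedler vector) and split vertices by the sign of its coordinates.
# Graph: triangles {0,1,2} and {3,4,5} joined by the edge (2, 3).
W = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    W[i, j] = W[j, i] = 1.0
L = np.diag(W.sum(1)) - W

eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order
f = eigvecs[:, 1]                      # second eigenvector
cluster = (f >= 0).astype(int)

# The two triangles land in different clusters.
assert len(set(cluster[:3])) == 1 and len(set(cluster[3:])) == 1
assert cluster[0] != cluster[3]
```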
Unnormalized spectral clustering: summary

The unnormalized spectral clustering algorithm (the slide reproduces the algorithm box from von Luxburg, 2007).
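The summarized algorithm can be sketched end to end. This is a minimal illustration specialized to $K = 2$, with a simple farthest-pair initialization standing in for K-means++ (a production version would use a library K-means):

```python
import numpy as np

def spectral_clustering_2(W, iters=20):
    """Unnormalized spectral clustering for K = 2 on weight matrix W."""
    L = np.diag(W.sum(1)) - W
    _, vecs = np.linalg.eigh(L)
    U = vecs[:, :2]                             # rows embed vertices in R^2

    # Tiny K-means: initialize with a farthest pair, then Lloyd iterations.
    c0 = U[0]
    c1 = U[np.argmax(((U - c0) ** 2).sum(-1))]
    centers = np.stack([c0, c1])
    for _ in range(iters):
        labels = np.argmin(((U[:, None] - centers[None]) ** 2).sum(-1), 1)
        for j in range(2):
            if np.any(labels == j):
                centers[j] = U[labels == j].mean(0)
    return labels

# Two cliques joined by a weak bridge should be recovered as the clusters.
W = np.zeros((8, 8))
W[:4, :4] = 1.0
W[4:, 4:] = 1.0
np.fill_diagonal(W, 0.0)
W[3, 4] = W[4, 3] = 0.1
labels = spectral_clustering_2(W)
assert len(set(labels[:4])) == 1 and len(set(labels[4:])) == 1
assert labels[0] != labels[4]
```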
Normalized spectral clustering

Relaxing the RatioCut leads to unnormalized spectral clustering. By relaxing the Ncut problem, we obtain the normalized spectral clustering algorithm of Shi and Malik (2000). (Source: von Luxburg, 2007.)

Note: The solutions of $L u = \lambda D u$ are the eigenvectors of $L_{rw}$. See von Luxburg (2007) for details.
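For $k = 2$, the Shi-Malik flavor amounts to solving the generalized eigenproblem $L u = \lambda D u$ and splitting by sign; a sketch on the same kind of two-clique toy graph (an arbitrary example):

```python
import numpy as np
from scipy.linalg import eigh

# Normalized spectral clustering (Shi-Malik) for k = 2: second generalized
# eigenvector of L u = lambda D u, i.e. an eigenvector of L_rw = D^{-1} L.
W = np.zeros((8, 8))
W[:4, :4] = 1.0
W[4:, 4:] = 1.0
np.fill_diagonal(W, 0.0)
W[3, 4] = W[4, 3] = 0.1          # weak bridge between the cliques
D = np.diag(W.sum(1))
L = D - W

eigvals, eigvecs = eigh(L, D)    # generalized symmetric eigenproblem
u = eigvecs[:, 1]                # second-smallest generalized eigenvalue
cluster = (u >= 0).astype(int)

assert len(set(cluster[:4])) == 1 and len(set(cluster[4:])) == 1
assert cluster[0] != cluster[4]
```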
The normalized clustering algorithm of Ng et al.

Another popular variant of the spectral clustering algorithm was provided by Ng, Jordan, and Weiss (2002). The algorithm uses $L_{sym}$ instead of $L$ (unnormalized clustering) or $L_{rw}$ (Shi and Malik's normalized clustering). See von Luxburg (2007) for details.
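The distinctive step of the Ng-Jordan-Weiss variant is normalizing each row of the eigenvector matrix of $L_{sym}$ to unit length before running K-means. A sketch on an arbitrary two-clique graph, checking that the normalized rows concentrate at two well-separated points:

```python
import numpy as np

# Ng-Jordan-Weiss embedding: first k eigenvectors of L_sym, rows normalized
# to unit length. Graph: two 4-cliques joined by a weak bridge at (3, 4).
W = np.zeros((8, 8))
W[:4, :4] = 1.0
W[4:, 4:] = 1.0
np.fill_diagonal(W, 0.0)
W[3, 4] = W[4, 3] = 0.1
d = W.sum(1)
L_sym = np.eye(8) - np.diag(d**-0.5) @ W @ np.diag(d**-0.5)

_, vecs = np.linalg.eigh(L_sym)
U = vecs[:, :2]                                      # k = 2 eigenvectors
T = U / np.linalg.norm(U, axis=1, keepdims=True)     # row normalization

# Rows in the same clique nearly coincide; rows across cliques are far apart,
# so K-means on the rows of T would recover the two cliques.
assert np.dot(T[1], T[2]) > 0.99
assert np.dot(T[5], T[6]) > 0.99
assert abs(np.dot(T[0], T[7])) < 0.5
```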
More informationCOMPSCI 514: Algorithms for Data Science
COMPSCI 514: Algorithms for Data Science Arya Mazumdar University of Massachusetts at Amherst Fall 2018 Lecture 8 Spectral Clustering Spectral clustering Curse of dimensionality Dimensionality Reduction
More informationAn Introduction to Spectral Graph Theory
An Introduction to Spectral Graph Theory Mackenzie Wheeler Supervisor: Dr. Gary MacGillivray University of Victoria WheelerM@uvic.ca Outline Outline 1. How many walks are there from vertices v i to v j
More informationMa/CS 6b Class 23: Eigenvalues in Regular Graphs
Ma/CS 6b Class 3: Eigenvalues in Regular Graphs By Adam Sheffer Recall: The Spectrum of a Graph Consider a graph G = V, E and let A be the adjacency matrix of G. The eigenvalues of G are the eigenvalues
More informationLinear Spectral Hashing
Linear Spectral Hashing Zalán Bodó and Lehel Csató Babeş Bolyai University - Faculty of Mathematics and Computer Science Kogălniceanu 1., 484 Cluj-Napoca - Romania Abstract. assigns binary hash keys to
More informationThe f-adjusted Graph Laplacian: a Diagonal Modification with a Geometric Interpretation
The f-adjusted Graph Laplacian: a Diagonal Modification with a Geometric Interpretation Sven Kurras Ulrike von Luxburg Department of Computer Science, University of Hamburg, Germany Gilles Blanchard Department
More informationData Mining and Analysis: Fundamental Concepts and Algorithms
: Fundamental Concepts and Algorithms dataminingbook.info Mohammed J. Zaki 1 Wagner Meira Jr. 2 1 Department of Computer Science Rensselaer Polytechnic Institute, Troy, NY, USA 2 Department of Computer
More informationGraph Partitioning Algorithms and Laplacian Eigenvalues
Graph Partitioning Algorithms and Laplacian Eigenvalues Luca Trevisan Stanford Based on work with Tsz Chiu Kwok, Lap Chi Lau, James Lee, Yin Tat Lee, and Shayan Oveis Gharan spectral graph theory Use linear
More informationRelations Between Adjacency And Modularity Graph Partitioning: Principal Component Analysis vs. Modularity Component Analysis
Relations Between Adjacency And Modularity Graph Partitioning: Principal Component Analysis vs. Modularity Component Analysis Hansi Jiang Carl Meyer North Carolina State University October 27, 2015 1 /
More informationMath 3191 Applied Linear Algebra
Math 9 Applied Linear Algebra Lecture 9: Diagonalization Stephen Billups University of Colorado at Denver Math 9Applied Linear Algebra p./9 Section. Diagonalization The goal here is to develop a useful
More informationMarkov Chains and Spectral Clustering
Markov Chains and Spectral Clustering Ning Liu 1,2 and William J. Stewart 1,3 1 Department of Computer Science North Carolina State University, Raleigh, NC 27695-8206, USA. 2 nliu@ncsu.edu, 3 billy@ncsu.edu
More informationGraphs in Machine Learning
Graphs in Machine Learning Michal Valko Inria Lille - Nord Europe, France TA: Pierre Perrault Partially based on material by: Ulrike von Luxburg, Gary Miller, Doyle & Schnell, Daniel Spielman October 10th,
More informationEigenvalue problems and optimization
Notes for 2016-04-27 Seeking structure For the past three weeks, we have discussed rather general-purpose optimization methods for nonlinear equation solving and optimization. In practice, of course, we
More informationCS224W: Social and Information Network Analysis Jure Leskovec, Stanford University
CS224W: Social and Information Network Analysis Jure Leskovec Stanford University Jure Leskovec, Stanford University http://cs224w.stanford.edu Task: Find coalitions in signed networks Incentives: European
More informationPart I: Preliminary Results. Pak K. Chan, Martine Schlag and Jason Zien. Computer Engineering Board of Studies. University of California, Santa Cruz
Spectral K-Way Ratio-Cut Partitioning Part I: Preliminary Results Pak K. Chan, Martine Schlag and Jason Zien Computer Engineering Board of Studies University of California, Santa Cruz May, 99 Abstract
More informationOn the inverse matrix of the Laplacian and all ones matrix
On the inverse matrix of the Laplacian and all ones matrix Sho Suda (Joint work with Michio Seto and Tetsuji Taniguchi) International Christian University JSPS Research Fellow PD November 21, 2012 Sho
More informationSpectral Graph Theory
Spectral Graph Theory Aaron Mishtal April 27, 2016 1 / 36 Outline Overview Linear Algebra Primer History Theory Applications Open Problems Homework Problems References 2 / 36 Outline Overview Linear Algebra
More informationSpectral and Electrical Graph Theory. Daniel A. Spielman Dept. of Computer Science Program in Applied Mathematics Yale Unviersity
Spectral and Electrical Graph Theory Daniel A. Spielman Dept. of Computer Science Program in Applied Mathematics Yale Unviersity Outline Spectral Graph Theory: Understand graphs through eigenvectors and
More informationCS168: The Modern Algorithmic Toolbox Lectures #11 and #12: Spectral Graph Theory
CS168: The Modern Algorithmic Toolbox Lectures #11 and #12: Spectral Graph Theory Tim Roughgarden & Gregory Valiant May 2, 2016 Spectral graph theory is the powerful and beautiful theory that arises from
More informationComputer Vision Group Prof. Daniel Cremers. 14. Clustering
Group Prof. Daniel Cremers 14. Clustering Motivation Supervised learning is good for interaction with humans, but labels from a supervisor are hard to obtain Clustering is unsupervised learning, i.e. it
More informationSpectral Graph Reduction for Efficient Image and Streaming Video Segmentation
Spectral Graph Reduction for Efficient Image and Streaming Video Segmentation Fabio Galasso 1, Margret Keuper 2, Thomas Brox 2, Bernt Schiele 1 1 Max Planck Institute for Informatics, Germany 2 University
More informationCS 556: Computer Vision. Lecture 13
CS 556: Computer Vision Lecture 1 Prof. Sinisa Todorovic sinisa@eecs.oregonstate.edu 1 Outline Perceptual grouping Low-level segmentation Ncuts Perceptual Grouping What do you see? 4 What do you see? Rorschach
More informationMarch 13, Paper: R.R. Coifman, S. Lafon, Diffusion maps ([Coifman06]) Seminar: Learning with Graphs, Prof. Hein, Saarland University
Kernels March 13, 2008 Paper: R.R. Coifman, S. Lafon, maps ([Coifman06]) Seminar: Learning with Graphs, Prof. Hein, Saarland University Kernels Figure: Example Application from [LafonWWW] meaningful geometric
More information1 Last time: least-squares problems
MATH Linear algebra (Fall 07) Lecture Last time: least-squares problems Definition. If A is an m n matrix and b R m, then a least-squares solution to the linear system Ax = b is a vector x R n such that
More informationORIE 6334 Spectral Graph Theory September 8, Lecture 6. In order to do the first proof, we need to use the following fact.
ORIE 6334 Spectral Graph Theory September 8, 2016 Lecture 6 Lecturer: David P. Williamson Scribe: Faisal Alkaabneh 1 The Matrix-Tree Theorem In this lecture, we continue to see the usefulness of the graph
More informationDamped random walks and the characteristic polynomial of the weighted Laplacian on a graph
Damped random walks and the characteristic polynomial of the weighted Laplacian on a graph arxiv:mathpr/0506460 v1 Jun 005 MADHAV P DESAI and HARIHARAN NARAYANAN September 11, 006 Abstract For λ > 0, we
More informationCS 664 Segmentation (2) Daniel Huttenlocher
CS 664 Segmentation (2) Daniel Huttenlocher Recap Last time covered perceptual organization more broadly, focused in on pixel-wise segmentation Covered local graph-based methods such as MST and Felzenszwalb-Huttenlocher
More informationClustering by weighted cuts in directed graphs
Clustering by weighted cuts in directed graphs Marina Meilă Department of Statistics University of Washington mmp@stat.washington.edu William Pentney Dept. of Computer Science & Engineering University
More informationClustering using Mixture Models
Clustering using Mixture Models The full posterior of the Gaussian Mixture Model is p(x, Z, µ,, ) =p(x Z, µ, )p(z )p( )p(µ, ) data likelihood (Gaussian) correspondence prob. (Multinomial) mixture prior
More informationDEN: Linear algebra numerical view (GEM: Gauss elimination method for reducing a full rank matrix to upper-triangular
form) Given: matrix C = (c i,j ) n,m i,j=1 ODE and num math: Linear algebra (N) [lectures] c phabala 2016 DEN: Linear algebra numerical view (GEM: Gauss elimination method for reducing a full rank matrix
More informationSpectral Graph Theory and its Applications. Daniel A. Spielman Dept. of Computer Science Program in Applied Mathematics Yale Unviersity
Spectral Graph Theory and its Applications Daniel A. Spielman Dept. of Computer Science Program in Applied Mathematics Yale Unviersity Outline Adjacency matrix and Laplacian Intuition, spectral graph drawing
More informationFinite Frames and Graph Theoretical Uncertainty Principles
Finite Frames and Graph Theoretical Uncertainty Principles (pkoprows@math.umd.edu) University of Maryland - College Park April 13, 2015 Outline 1 Motivation 2 Definitions 3 Results Outline 1 Motivation
More informationTwo Laplacians for the distance matrix of a graph
Two Laplacians for the distance matrix of a graph Mustapha Aouchiche and Pierre Hansen GERAD and HEC Montreal, Canada CanaDAM 2013, Memorial University, June 1013 Aouchiche & Hansen CanaDAM 2013, Memorial
More informationLearning Spectral Graph Segmentation
Learning Spectral Graph Segmentation AISTATS 2005 Timothée Cour Jianbo Shi Nicolas Gogin Computer and Information Science Department University of Pennsylvania Computer Science Ecole Polytechnique Graph-based
More informationJordan Canonical Form Homework Solutions
Jordan Canonical Form Homework Solutions For each of the following, put the matrix in Jordan canonical form and find the matrix S such that S AS = J. [ ]. A = A λi = λ λ = ( λ) = λ λ = λ =, Since we have
More informationUnsupervised dimensionality reduction
Unsupervised dimensionality reduction Guillaume Obozinski Ecole des Ponts - ParisTech SOCN course 2014 Guillaume Obozinski Unsupervised dimensionality reduction 1/30 Outline 1 PCA 2 Kernel PCA 3 Multidimensional
More informationPower Grid Partitioning: Static and Dynamic Approaches
Power Grid Partitioning: Static and Dynamic Approaches Miao Zhang, Zhixin Miao, Lingling Fan Department of Electrical Engineering University of South Florida Tampa FL 3320 miaozhang@mail.usf.edu zmiao,
More information1 I A Q E B A I E Q 1 A ; E Q A I A (2) A : (3) A : (4)
Latin Squares Denition and examples Denition. (Latin Square) An n n Latin square, or a latin square of order n, is a square array with n symbols arranged so that each symbol appears just once in each row
More informationOn the Maximal Error of Spectral Approximation of Graph Bisection
To appear in Linear and Multilinear Algebra Vol. 00, No. 00, Month 20XX, 1 9 On the Maximal Error of Spectral Approximation of Graph Bisection John C. Urschel a,b,c, and Ludmil T. Zikatanov c,d, a Baltimore
More informationground state degeneracy ground state energy
Searching Ground States in Ising Spin Glass Systems Steven Homer Computer Science Department Boston University Boston, MA 02215 Marcus Peinado German National Research Center for Information Technology
More informationMATH 304 Linear Algebra Lecture 34: Review for Test 2.
MATH 304 Linear Algebra Lecture 34: Review for Test 2. Topics for Test 2 Linear transformations (Leon 4.1 4.3) Matrix transformations Matrix of a linear mapping Similar matrices Orthogonality (Leon 5.1
More informationHigher Order Learning with Graphs
Sameer Agarwal sagarwal@cs.ucsd.edu Kristin Branson kbranson@cs.ucsd.edu Serge Belongie sjb@cs.ucsd.edu Department of Computer Science and Engineering, University of California San Diego, La Jolla, CA
More informationTHE HIDDEN CONVEXITY OF SPECTRAL CLUSTERING
THE HIDDEN CONVEXITY OF SPECTRAL CLUSTERING Luis Rademacher, Ohio State University, Computer Science and Engineering. Joint work with Mikhail Belkin and James Voss This talk A new approach to multi-way
More informationMath 18, Linear Algebra, Lecture C00, Spring 2017 Review and Practice Problems for Final Exam
Math 8, Linear Algebra, Lecture C, Spring 7 Review and Practice Problems for Final Exam. The augmentedmatrix of a linear system has been transformed by row operations into 5 4 8. Determine if the system
More informationMATH 829: Introduction to Data Mining and Analysis Computing the lasso solution
1/16 MATH 829: Introduction to Data Mining and Analysis Computing the lasso solution Dominique Guillot Departments of Mathematical Sciences University of Delaware February 26, 2016 Computing the lasso
More informationDiffusion and random walks on graphs
Diffusion and random walks on graphs Leonid E. Zhukov School of Data Analysis and Artificial Intelligence Department of Computer Science National Research University Higher School of Economics Structural
More informationSpectral Generative Models for Graphs
Spectral Generative Models for Graphs David White and Richard C. Wilson Department of Computer Science University of York Heslington, York, UK wilson@cs.york.ac.uk Abstract Generative models are well known
More informationCSE 291. Assignment Spectral clustering versus k-means. Out: Wed May 23 Due: Wed Jun 13
CSE 291. Assignment 3 Out: Wed May 23 Due: Wed Jun 13 3.1 Spectral clustering versus k-means Download the rings data set for this problem from the course web site. The data is stored in MATLAB format as
More information