Spectral Clustering. Guokun Lai 2016/10
1 Spectral Clustering Guokun Lai 2016/10 1 / 37
2 Organization Graph Cut Fundamental Limitations of Spectral Clustering Ng 2002 paper (if we have time) 2 / 37
3 Notation We define an undirected weighted graph $G(V, E)$, where $V$ is the node set of $G$ and $E$ is its edge set. The adjacency matrix is $W_{ij} = E(i, j)$, $W_{ij} \ge 0$. The degree matrix $D \in \mathbb{R}^{n \times n}$ is a diagonal matrix with $D_{i,i} = \sum_{j=1}^{n} W_{i,j}$. The Laplacian matrix $L \in \mathbb{R}^{n \times n}$ is $L = D - W$. Indicator vector of a cluster: the indicator vector $I_C$ of a cluster $C$ is
$$I_{C,i} = \begin{cases} 1 & v_i \in C \\ 0 & \text{otherwise} \end{cases} \qquad (1)$$
3 / 37
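A minimal numpy sketch of these definitions (the Gaussian similarity used to build W and the function name are assumptions for illustration; the slides leave the edge weights abstract):

```python
import numpy as np

def graph_matrices(X, sigma=1.0):
    """Build W, D, and L from data points X (n x d) using a Gaussian
    similarity (an assumed choice; the slides leave the weights abstract)."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # pairwise squared distances
    W = np.exp(-sq / (2 * sigma ** 2))  # adjacency: W_ij >= 0
    np.fill_diagonal(W, 0.0)            # no self-loops
    D = np.diag(W.sum(axis=1))          # degree matrix: D_ii = sum_j W_ij
    L = D - W                           # Laplacian: L = D - W
    return W, D, L
```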
4 Graph Cut The intuition of clustering is to separate points into different groups according to their similarities. If we try to separate the node set $V$ into two disjoint sets $A$ and $B$, we define
$$\mathrm{Cut}(A, B) = \sum_{i \in A,\, j \in B} w_{ij}$$
If we split the node set into $K$ disjoint sets, then
$$\mathrm{Cut}(A_1, \dots, A_K) = \sum_{i=1}^{K} \mathrm{Cut}(A_i, \bar{A}_i)$$
where $\bar{A}$ is the complement of $A$. 4 / 37
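These quantities can be computed directly from $W$; a small sketch (the function names are illustrative, not from the slides):

```python
import numpy as np

def cut_value(W, A, B):
    """Cut(A, B): total weight of edges between node index sets A and B."""
    return W[np.ix_(sorted(A), sorted(B))].sum()

def multiway_cut(W, clusters):
    """Cut(A_1, ..., A_K) = sum_i Cut(A_i, complement of A_i)."""
    all_nodes = set(range(W.shape[0]))
    return sum(cut_value(W, A, all_nodes - set(A)) for A in clusters)
```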
5 Defect of Graph Cut The simplest idea for clustering the node set $V$ is to find a partition that minimizes the graph cut function. But this usually leads to degenerate solutions in which one subset contains only a few nodes. 5 / 37
6 Normalized Cut To overcome this defect of the graph cut, Shi proposed a new cost function that regularizes the sizes of the subsets. First, we define $\mathrm{Vol}(A) = \sum_{i \in A,\, j \in V} w(i, j)$, and we have
$$\mathrm{Ncut}(A, B) = \frac{\mathrm{cut}(A, B)}{\mathrm{Vol}(A)} + \frac{\mathrm{cut}(A, B)}{\mathrm{Vol}(B)}$$
6 / 37
7 Relation between NCut and Spectral Clustering Given a vertex subset $A_i \subset V$, we define the vector $f_i = \frac{1}{\sqrt{\mathrm{Vol}(A_i)}} I_{A_i}$. Then we can write the optimization problem as
$$\min_{A_i} \mathrm{NCut} = \frac{1}{2} \sum_{i=1}^{k} f_i^T L f_i = \frac{1}{2} \mathrm{Tr}(F^T L F) \qquad (2)$$
$$\text{s.t. } f_i = \frac{1}{\sqrt{\mathrm{Vol}(A_i)}} I_{A_i}, \quad F^T D F = I$$
7 / 37
8 Optimization Because of the constraint $f_i = \frac{1}{\sqrt{\mathrm{Vol}(A_i)}} I_{A_i}$, the optimization problem is NP-hard. So we can relax this constraint and let the $f_i$ range over $\mathbb{R}^n$. Then the optimization problem is
$$\min_{f_i} \mathrm{Tr}(F^T L F) \quad \text{s.t. } F^T D F = I \qquad (3)$$
The solution $F$ consists of the eigenvectors corresponding to the $k$ smallest eigenvalues of $D^{-1} L$. Based on $F$, we recover the clusters $A_i$ with the k-means algorithm. 8 / 37
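A compact sketch of this relaxed procedure (scipy's generalized eigensolver for $L f = \lambda D f$, which is equivalent to the eigenproblem of $D^{-1} L$, and scikit-learn's k-means are assumed library choices, not prescribed by the slides):

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def normalized_spectral_clustering(W, k):
    """Relaxed NCut: solve the generalized eigenproblem L f = lambda D f,
    then run k-means on the rows of F."""
    D = np.diag(W.sum(axis=1))
    L = D - W
    # eigenvectors for the k smallest generalized eigenvalues
    _, F = eigh(L, D, subset_by_index=[0, k - 1])
    return KMeans(n_clusters=k, n_init=10).fit_predict(F)
```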
9 Unnormalized Laplacian Matrix Similar to the above approach, we can prove that the eigenvectors of the unnormalized Laplacian matrix are the relaxed solution for
$$\mathrm{RatioCut}(A, B) = \frac{\mathrm{cut}(A, B)}{|A|} + \frac{\mathrm{cut}(A, B)}{|B|}$$
We can set $f_i = \frac{1}{\sqrt{|A_i|}} I_{A_i}$ and get the relaxed optimization problem
$$\min_{f_i} \mathrm{Tr}(F^T L F) \quad \text{s.t. } F^T F = I \qquad (4)$$
9 / 37
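The RatioCut relaxation differs only in the eigenproblem; a sketch under the same assumed library choices as above:

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def unnormalized_spectral_clustering(W, k):
    """RatioCut relaxation: k-means on the k smallest eigenvectors of L = D - W."""
    L = np.diag(W.sum(axis=1)) - W
    _, F = eigh(L, subset_by_index=[0, k - 1])
    return KMeans(n_clusters=k, n_init=10).fit_predict(F)
```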
10 Approximation The solution from the spectral method is only an approximation to the Normalized Cut objective function, and there is no bound on the gap between them. We can easily construct cases where the solution of the relaxed problem is very different from that of the original problem. 10 / 37
11 Experimental Results from the Shi Paper 11 / 37
12 Organization Graph Cut Fundamental Limitations of Spectral Clustering Ng 2002 paper (if we have time) 12 / 37
13 Fundamental Limitations of Spectral Clustering As mentioned above, spectral clustering approximately solves the Normalized Graph Cut objective function. But is the Normalized Graph Cut a good criterion for all situations? 13 / 37
14 Limitation of NCut The NCut function is more likely to capture the global structure of the graph. But sometimes we may want to extract local features. For example, the Normalized Graph Cut cannot separate the Gaussian cluster from the band-shaped cluster shown on the slide. 14 / 37
15 Limitation of Spectral Clustering Next we analyze the spectral method from the viewpoint of a random walk process. We define the Markov transition matrix $M = D^{-1} W$, with eigenvalues $\lambda_i$ and eigenvectors $v_i$. The random walk process on the graph converges to a unique equilibrium distribution $\pi_s$. Then we can find the relationship between the eigenvectors and the diffusion distance between points:
$$\sum_j \lambda_j^{2t} \left( v_j(x) - v_j(y) \right)^2 = \left\| p(z, t \mid x) - p(z, t \mid y) \right\|^2_{L_2(1/\pi_s)}$$
So we see that the spectral method aims to capture the major patterns of the random walk on the whole graph. 15 / 37
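A small numerical check of this identity (the toy random graph, the eigenvector normalization, and the route to the eigenpairs of $M$ through the symmetric matrix $D^{-1/2} W D^{-1/2}$ are assumptions of this sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
# A small random symmetric weight matrix (toy assumption)
W = rng.random((6, 6)); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)

d = W.sum(axis=1); vol = d.sum()
M = W / d[:, None]                 # Markov transition matrix M = D^{-1} W
pi = d / vol                       # equilibrium distribution pi_s

# Eigenpairs of M via the similar symmetric matrix N = D^{-1/2} W D^{-1/2}
N = W / np.sqrt(np.outer(d, d))
lam, Phi = np.linalg.eigh(N)
V = np.sqrt(vol) * Phi / np.sqrt(d)[:, None]   # pi-normalized eigenvectors v_j of M

t, x, y = 3, 0, 1
Mt = np.linalg.matrix_power(M, t)              # rows are p(., t | x)
lhs = np.sum(lam ** (2 * t) * (V[x] - V[y]) ** 2)
rhs = np.sum((Mt[x] - Mt[y]) ** 2 / pi)        # squared L2(1/pi_s) distance
assert np.allclose(lhs, rhs)                   # the two sides agree
```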
16 Limitation of Spectral Clustering But this method will fail in situations where the scales of the clusters are very different. 16 / 37
17 Self-Tuning Spectral Clustering One way to handle the above case is to accelerate the random walk process in low-density areas. Suppose we define the affinity between nodes as
$$A_{i,j} = \exp\left( -\frac{d(v_i, v_j)^2}{\sigma_i \sigma_j} \right)$$
where $\sigma_i = d(v_i, v_k)$ and $v_k$ is the k-th nearest neighbor of $v_i$. 17 / 37
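A sketch of this locally scaled affinity (the default k = 7 follows the self-tuning paper's suggestion, but treat the exact value, and the dense-distance implementation, as assumptions):

```python
import numpy as np

def self_tuning_affinity(X, k=7):
    """Locally scaled affinity A_ij = exp(-d(v_i, v_j)^2 / (sigma_i * sigma_j)),
    with sigma_i the distance from v_i to its k-th nearest neighbor."""
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    Dist = np.sqrt(D2)
    # sigma_i: distance to the k-th nearest neighbor (column 0 is the point itself)
    sigma = np.sort(Dist, axis=1)[:, k]
    A = np.exp(-D2 / np.outer(sigma, sigma))
    np.fill_diagonal(A, 0.0)
    return A
```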
18 Result of Self-Tuning Spectral Clustering 18 / 37
19 Failure case 19 / 37
20 Another solution The paper proposed another solution: split the graph into two subsets recursively. The stopping criterion is based on the relaxation time of the graph, $\tau_V = 1/(1 - \lambda_2)$. If the sizes of the two subsets after splitting are comparable, we expect $\tau_V \gg \tau_1 + \tau_2$; otherwise, we expect $\max(\tau_1, \tau_2) \gg \min(\tau_1, \tau_2)$. If the partition satisfies either condition, we accept the separation and continue to split the subsets; if not, we stop. But the paper did not address how to deal with the K-way clustering problem. 20 / 37
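A sketch of the relaxation-time computation this criterion relies on (the symmetrization trick for the eigenvalues of $M = D^{-1} W$ is an assumed implementation detail):

```python
import numpy as np

def relaxation_time(W):
    """tau = 1 / (1 - lambda_2), where lambda_2 is the second-largest
    eigenvalue of M = D^{-1} W, computed via the similar symmetric
    matrix D^{-1/2} W D^{-1/2}."""
    d = W.sum(axis=1)
    N = W / np.sqrt(np.outer(d, d))
    lam = np.linalg.eigvalsh(N)        # ascending; lam[-1] = 1 for a connected graph
    return 1.0 / (1.0 - lam[-2])
```

The acceptance test would then compare relaxation_time on the whole graph against the relaxation times of the two induced subgraphs.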
21 Tong Zhang 2007 paper This paper gives an upper bound on the expected error for semi-supervised learning tasks on graphs. Owing to the limited room in this presentation, I will just introduce one interesting conclusion of the paper. 21 / 37
22 S-Normalized Laplacian Matrix We define the S-normalized Laplacian matrix as $L_S = S^{-1/2} L S^{-1/2}$, where $S$ is a diagonal matrix. According to the analysis in this paper, the best choice of $S$ is $S_{i,i} = |C_j|$, where $|C_j|$ is the size of the cluster $j$ containing node $i$. So this approach aims to solve the different-scale cluster problem that spectral clustering cannot deal with. We can see this is similar to self-tuning spectral clustering: it renormalizes the adjacency matrix as
$$\hat{W}_{ij} = \frac{W_{ij}}{\sqrt{|C_i| |C_j|}}$$
22 / 37
23 S-Normalized Laplacian Matrix But we do not know $|C_j|$, so the author proposed a method to compute it approximately. We can define $K^{-1} = \alpha I + L_S$, $\alpha \in \mathbb{R}$. In the ideal case, in which we have $q$ disjoint connected components, we can prove that as $\alpha \to 0$,
$$\alpha K = \sum_{j=1}^{q} \frac{1}{|C_j|} v_j v_j^T + O(\alpha)$$
where $v_j$ is the indicator vector of cluster $j$. So if we take a small $\alpha$, we can assume $K_{i,i} \propto 1/|C_j|$, and then set $S_{i,i} \propto 1/K_{i,i}$. 23 / 37
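A toy illustration of this size estimate in the ideal case (the two-clique graph, the value of α, and starting from $S = I$ so that $L_S = L$ are all assumptions of this sketch):

```python
import numpy as np

# Ideal case: two disjoint cliques of very different sizes (toy assumption)
sizes = [4, 12]
n = sum(sizes)
W = np.zeros((n, n)); start = 0
for s in sizes:
    W[start:start + s, start:start + s] = 1.0
    start += s
np.fill_diagonal(W, 0.0)

L = np.diag(W.sum(axis=1)) - W          # using L itself as L_S (i.e., S = I initially)
alpha = 1e-3
K = np.linalg.inv(alpha * np.eye(n) + L)
# alpha * K_ii ~ 1 / |C_j|, so cluster size is roughly 1 / (alpha * K_ii)
est_sizes = 1.0 / (alpha * np.diag(K))
print(np.round(est_sizes))              # ~ [4 4 4 4 12 12 ... 12]
S = np.diag(1.0 / np.diag(K))           # S_ii proportional to 1 / K_ii
```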
24 Comparison 24 / 37
25 Organization Graph Cut Fundamental Limitations of Spectral Clustering Ng 2002 paper (if we have time) 25 / 37
26 Ng 2002 paper This paper analyzed the spectral clustering problem based on matrix perturbation theory. It obtains an error bound for the spectral clustering algorithm under several assumptions. 26 / 37
27 Algorithm Define the weighted adjacency matrix $W$, and construct the matrix $L = D^{-1/2} W D^{-1/2}$. Find $x_1, \dots, x_k$, the $k$ largest eigenvectors of $L$, and form the matrix $X = [x_1 \dots x_k] \in \mathbb{R}^{n \times k}$. Normalize every row of $X$ to have unit length, $Y_{ij} = X_{ij} / (\sum_j X_{ij}^2)^{1/2}$. Treating each row of $Y$ as a point in $\mathbb{R}^k$, cluster the rows into $k$ clusters via k-means. 27 / 37
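A direct transcription of these four steps (numpy and scikit-learn's k-means are assumed library choices):

```python
import numpy as np
from sklearn.cluster import KMeans

def ng_jordan_weiss(W, k):
    """The algorithm as stated on this slide:
    1. L = D^{-1/2} W D^{-1/2}
    2. X = k largest eigenvectors of L
    3. Y = X with rows normalized to unit length
    4. k-means on the rows of Y"""
    d = W.sum(axis=1)
    L = W / np.sqrt(np.outer(d, d))                # D^{-1/2} W D^{-1/2}
    _, vecs = np.linalg.eigh(L)                    # ascending eigenvalues
    X = vecs[:, -k:]                               # k largest eigenvectors
    Y = X / np.linalg.norm(X, axis=1, keepdims=True)
    return KMeans(n_clusters=k, n_init=10).fit_predict(Y)
```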
28 Ideal Case Assume the graph $G$ contains $K$ clusters and does not contain any cross-cluster edges. In this case, the matrix $L$ contains exactly $K$ eigenvectors with eigenvalue 1. 28 / 37
29 Y Matrix of Ideal Case After running the algorithm on this graph, we get the $Y$ matrix shown on the slide, where $R$ is any rotation matrix, and each row of $Y$ will naturally cluster into 3 groups. 29 / 37
30 The general case In real-world data we have cross-cluster edges. So the author analyzes the influence of the cross-cluster edges on the $Y$ matrix based on matrix perturbation theory. 30 / 37
31 The general case Assumption 1: There exists $\delta > 0$ so that, for the second largest eigenvalue $\lambda_2^{(i)}$ of each cluster $i = 1, \dots, k$, we have $\lambda_2^{(i)} \le 1 - \delta$. Assumption 2: There is some fixed $\epsilon_1 > 0$ so that for every $i_1, i_2 \in \{1, \dots, k\}$, $i_1 \ne i_2$, we have
$$\sum_{j \in S_{i_1}} \sum_{k \in S_{i_2}} \frac{W_{jk}^2}{\hat{d}_j \hat{d}_k} \le \epsilon_1$$
where $\hat{d}_i$ is the degree of node $i$ within its own cluster. The intuition of this inequality is to limit the weight of the cross-cluster edges compared to the weight of the intra-cluster edges. 31 / 37
32 The general case Assumption 3: There is some fixed $\epsilon_2 > 0$ so that for every $j \in S_i$, we have
$$\sum_{k \notin S_i} \frac{W_{jk}^2}{\hat{d}_j^2} \le \epsilon_2 \left( \sum_{k,l \in S_i} \frac{W_{kl}^2}{\hat{d}_k \hat{d}_l} \right)^{1/2}$$
The intuition of this inequality is also to limit the weight of the cross-cluster edges compared to the weight of the intra-cluster edges. Assumption 4: There is some constant $C > 0$ so that for every $i = 1, \dots, k$, $j = 1, \dots, n_i$, we have $\hat{d}_j \ge \left( \sum_{k=1}^{n_i} \hat{d}_k \right) / (C n_i)$. The intuition of this inequality is that no point in a cluster is too much less connected than the other points in the same cluster. 32 / 37
33 The general case If all of the assumptions hold, set
$$\epsilon = \sqrt{k(k-1)\epsilon_1 + k\epsilon_2^2}$$
If $\delta > (2 + \sqrt{2})\,\epsilon$, there exist $k$ orthogonal vectors $r_1, \dots, r_k$ so that
$$\frac{1}{n} \sum_{i=1}^{k} \sum_{j=1}^{n_i} \left\| y_j^{(i)} - r_i \right\|_2^2 \le 4C(4 + 2\sqrt{k})^2 \frac{\epsilon^2}{(\delta - \sqrt{2}\,\epsilon)^2}$$
33 / 37
34 Liu's 2016 paper Motivation: the original graph-based semi-supervised learning problem can be formalized as
$$\min_{f} \sum_i l(f_i, y_i) + f^T L f$$
We can enrich the label propagation patterns through a spectral transformation, which is called ST-enhanced semi-supervised learning:
$$\min_{f} \sum_i l(f_i, y_i) + f^T \sigma(L) f$$
34 / 37
35 Spectral Transform We can write $L = \sum_i \lambda_i \phi_i \phi_i^T$ and define $\theta_i = \sigma(\lambda_i)^{-1}$, where $\sigma(x)$ should be a non-decreasing function. Substituting into the objective function,
$$\min_{f} C(f; \theta) = \sum_i l(f_i, y_i) + \gamma \sum_{i=1}^{m} \theta_i^{-1} \langle \phi_i, f \rangle^2$$
where $\theta_1 \ge \theta_2 \ge \dots \ge \theta_m \ge 0$. 35 / 37
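A sketch of applying such a spectral transformation through the eigendecomposition (the example σ is an assumed illustrative choice, not one from the paper):

```python
import numpy as np

def spectral_transform(L, sigma):
    """sigma(L) = sum_i sigma(lambda_i) * phi_i phi_i^T,
    computed from the eigendecomposition of L."""
    lam, Phi = np.linalg.eigh(L)
    return (Phi * sigma(lam)) @ Phi.T

# Example with an assumed transform: shifting the spectrum is a simple
# non-decreasing sigma (illustrative only).
# L_sigma = spectral_transform(L, lambda x: x + 0.1)
```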
36 Joint optimization We can try to jointly optimize the eigenvalue weights $\theta$ and the labels $f$, so we have
$$\min_{\theta} \left( \min_{f} C(f; \theta) \right) + \tau \|\theta\|_1$$
We can prove that this function is convex in $\theta$. The optimization process can be described as follows: first, with $\theta$ fixed, optimize the convex problem over $f$; after that, optimize $\theta$ over its domain. 36 / 37
37 Proof of convexity We can rewrite the objective function using the dual form of $C(f; \theta)$, which is $C^*(u; \theta)$:
$$\min_{\theta} \left( \max_{u} C^*(u; \theta) \right) + \tau \|\theta\|_1$$
where $C^*(u; \theta) = -w^*(-u) - \frac{1}{4\gamma} \sum_i \theta_i \langle \phi_i, u \rangle^2$ and $w^*$ is the conjugate function of the loss $l$. For each fixed $u$, $C^*(u; \theta)$ is affine in $\theta$, so the objective is the point-wise maximum of a set of convex functions, and hence it is still convex in $\theta$. 37 / 37