Graph Partitioning Algorithms and Laplacian Eigenvalues
Luca Trevisan (Stanford)

Based on work with Tsz Chiu Kwok, Lap Chi Lau, James Lee, Yin Tat Lee, and Shayan Oveis Gharan
spectral graph theory

Use linear algebra to:
- understand/characterize graph properties
- design efficient graph algorithms
linear algebra and graphs

Take an undirected graph G = (V, E) and consider the matrix A[i, j] := weight of edge (i, j).

eigenvalues ↔ combinatorial properties
eigenvectors ↔ algorithms for combinatorial problems
fundamental theorem of spectral graph theory

G = (V, E) undirected, A adjacency matrix, d_v degree of v, D diagonal matrix of degrees,

L := I − D^{−1/2} A D^{−1/2}

L is symmetric, all its eigenvalues are real, and

0 = λ_1 ≤ λ_2 ≤ ... ≤ λ_n ≤ 2

λ_k = 0 iff G has at least k connected components
λ_n = 2 iff G has a bipartite connected component
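As a quick sanity check of the theorem above, the normalized Laplacian and its spectrum can be computed directly; a minimal sketch with numpy (the two-triangle example graph and the function name are my own, not from the talk):

```python
import numpy as np

def normalized_laplacian(A):
    """L = I - D^{-1/2} A D^{-1/2}; assumes no isolated vertices (all degrees > 0)."""
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(d)
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

# Two disjoint triangles: 2 connected components, so lambda_1 = lambda_2 = 0.
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    A[u, v] = A[v, u] = 1.0

lam = np.linalg.eigvalsh(normalized_laplacian(A))  # sorted ascending
print(np.round(lam, 3))  # all eigenvalues lie in [0, 2]; exactly two of them are 0
```

Counting the (numerically) zero eigenvalues recovers the number of connected components, exactly as the theorem predicts.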
[Figure: an example graph with normalized Laplacian spectrum 0, 2/3, 2/3, 2/3, 2/3, 2/3, 5/3, 5/3, 5/3, 5/3]

[Figure: a disconnected example graph with spectrum 0, 0, .69, .69, 1.5, 1.5, 1.8, 1.8 (two zero eigenvalues: two connected components)]
eigenvalues as solutions to optimization problems

If M ∈ R^{n×n} is symmetric, and λ_1 ≤ λ_2 ≤ ... ≤ λ_n are its eigenvalues counted with multiplicities, then

λ_k = min_{A k-dim subspace of R^n} max_{x ∈ A} (x^T M x) / (x^T x)

and the eigenvectors of λ_1, ..., λ_k are a basis of an optimal A.
why eigenvalues relate to connected components

If G = (V, E) is undirected and L is its normalized Laplacian, then

(x^T L x) / (x^T x) = ( Σ_{(u,v)∈E} (x_u − x_v)² ) / ( Σ_v d_v x_v² )

If λ_1 ≤ λ_2 ≤ ... ≤ λ_n are the eigenvalues of L with multiplicities,

λ_k = min_{A k-dim subspace of R^V} max_{x ∈ A} ( Σ_{(u,v)∈E} (x_u − x_v)² ) / ( Σ_v d_v x_v² )

Note: Σ_{(u,v)∈E} (x_u − x_v)² = 0 iff x is constant on connected components.
if G has k connected components

λ_k = min_{A k-dim subspace of R^V} max_{x ∈ A} ( Σ_{(u,v)∈E} (x_u − x_v)² ) / ( Σ_v d_v x_v² )

Consider the space of all vectors that are constant on each connected component; it has dimension at least k, and so it witnesses λ_k = 0.
if λ_k = 0

The eigenvectors x^(1), ..., x^(k) of λ_1, ..., λ_k are all constant on connected components.

The mapping F(v) := (x_v^(1), x_v^(2), ..., x_v^(k)) is constant on connected components.

It maps to at least k distinct points: if F took fewer than k distinct values, the linearly independent vectors x^(1), ..., x^(k) would lie in a subspace of dimension < k.

So G has at least k connected components.
conductance

vol(S) := Σ_{v∈S} d_v

φ(S) := |E(S, V∖S)| / vol(S)

φ(G) := min_{S ⊆ V : 0 < vol(S) ≤ vol(V)/2} φ(S)

In regular graphs, conductance is the same as edge expansion and, up to a factor of 2, the same as non-uniform sparsest cut.
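The definitions translate directly into code; a small sketch in pure Python (unweighted graphs given as adjacency lists; the function name and the 6-cycle example are my own):

```python
def conductance(adj, S):
    """phi(S) = (# edges leaving S) / vol(S), with vol(S) the sum of degrees in S."""
    S = set(S)
    vol = sum(len(adj[v]) for v in S)
    cut = sum(1 for v in S for u in adj[v] if u not in S)
    return cut / vol

# 6-cycle: the arc {0, 1, 2} cuts 2 edges and has volume 6
adj = {v: [(v - 1) % 6, (v + 1) % 6] for v in range(6)}
print(conductance(adj, {0, 1, 2}))  # 2/6 = 0.333...
```

A single vertex of the cycle has conductance 2/2 = 1, so contiguous arcs are the low-conductance sets here, as expected.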
conductance: examples

[Figure: disconnected graph with spectrum 0, 0, .69, .69, 1.5, 1.5, 1.8, 1.8] φ(G) = φ({0, 1, 2}) = 0/6 = 0

[Figure: graph with spectrum 0, .055, .055, .211, ...] φ(G) = 2/18

[Figure: graph with spectrum 0, 2/3, 2/3, 2/3, 2/3, 2/3, 5/3, 5/3, 5/3, 5/3] φ(G) = 5/15
finding S of small conductance

G is a social network: S is a set of users more likely to be friends with each other than with anyone else.

G is a similarity graph on the items of a data set: S is a set of items more similar to each other than to the other items. [Shi, Malik]: image segmentation.

G is an input of an NP-hard optimization problem: recurse on S and V∖S, and use dynamic programming to combine the solutions in time 2^{O(|E(S, V∖S)|)}.
conductance versus λ_2

Recall: λ_2 = 0 ⟺ G disconnected ⟺ φ(G) = 0

Cheeger inequality (Alon, Milman, Dodziuk, mid-1980s):

λ_2 / 2 ≤ φ(G) ≤ √(2 λ_2)
Cheeger inequality

λ_2 / 2 ≤ φ(G) ≤ √(2 λ_2)

Constructive proof of φ(G) ≤ √(2 λ_2): given the eigenvector x of λ_2, sort the vertices v according to x_v; some set S occurring as an initial segment or final segment of the sorted order has

φ(S) ≤ √(2 λ_2) ≤ 2 √(φ(G))

This sweep algorithm can be implemented in Õ(|V| + |E|) time.
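The sweep itself is elementary once an (approximate) eigenvector is in hand; a sketch in pure Python (unweighted adjacency lists; the score vector x stands in for the eigenvector of λ_2 and is just an assumed input here, and the example graph is my own):

```python
def sweep(adj, x):
    """Sort vertices by x_v and return the best threshold cut among all
    proper nonempty prefixes of the sorted order."""
    order = sorted(adj, key=lambda v: x[v])
    total_vol = sum(len(adj[v]) for v in adj)
    S, vol, cut = set(), 0, 0
    best_phi, best_S = float("inf"), None
    for v in order[:-1]:                         # proper, nonempty prefixes only
        S.add(v)
        vol += len(adj[v])
        # each edge from v into S stops being cut; each edge from v out of S starts
        cut += sum(-1 if u in S else 1 for u in adj[v])
        phi = cut / min(vol, total_vol - vol)    # normalize by the smaller side
        if phi < best_phi:
            best_phi, best_S = phi, set(S)
    return best_S, best_phi

# two triangles joined by a single edge (2-3): the cut {0, 1, 2} has phi = 1/7
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
       3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
x = {0: -1.0, 1: -0.9, 2: -0.8, 3: 0.8, 4: 0.9, 5: 1.0}
S, phi = sweep(adj, x)
print(sorted(S), phi)  # [0, 1, 2] and phi = 1/7
```

Normalizing by the smaller of vol(S) and vol(V∖S) while scanning prefixes covers both the initial-segment and final-segment cuts in one pass.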
applications

λ_2 / 2 ≤ φ(G) ≤ √(2 λ_2)

- rigorous analysis of the sweep algorithm (practical performance is usually better than the worst-case analysis)
- to certify φ(G) ≥ Ω(1), it is enough to certify λ_2 ≥ Ω(1)
- the mixing time of the random walk is O((1/λ_2) · log |V|), and so it is also O((1/φ²(G)) · log |V|)
Lee, Oveis Gharan, T, 2012

From the fundamental theorem of spectral graph theory: λ_k = 0 ⟺ G has ≥ k connected components.

We prove: λ_k small ⟺ G has k disjoint sets of small conductance.
order-k conductance

Define the order-k conductance

φ_k(G) := min_{S_1, ..., S_k disjoint} max_{i=1,...,k} φ(S_i)

Note: φ(G) = φ_2(G), and φ_k(G) = 0 ⟺ G has ≥ k connected components.
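For very small graphs the definition can be evaluated by brute force, which is handy for checking intuition (pure Python, exponential time; the function names and example are my own, not an algorithm from the talk):

```python
from itertools import combinations

def conductance(adj, S):
    """phi(S) = (# edges leaving S) / vol(S) for an unweighted adjacency list."""
    S = set(S)
    cut = sum(1 for v in S for u in adj[v] if u not in S)
    return cut / sum(len(adj[v]) for v in S)

def phi_k(adj, k):
    """Order-k conductance: min over k disjoint nonempty sets of max_i phi(S_i)."""
    best = float("inf")

    def rec(remaining, chosen):
        nonlocal best
        if len(chosen) == k:
            best = min(best, max(conductance(adj, S) for S in chosen))
            return
        for r in range(1, len(remaining) + 1):
            for T in combinations(remaining, r):
                rec([v for v in remaining if v not in T], chosen + [set(T)])

    rec(list(adj), [])
    return best

# two disjoint triangles: phi_2 = 0 (each triangle is a whole component),
# while phi_3 > 0 because a third disjoint set must split a triangle
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1],
       3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(phi_k(adj, 2))  # 0.0
```

The disjoint sets in the definition need not cover V, which the enumeration above respects by allowing leftover vertices.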
order-k conductance: examples

[Figure: disconnected graph with spectrum 0, 0, .69, .69, 1.5, 1.5, 1.8, 1.8]

φ_3(G) = max{0, 2/6, 2/4} = 1/2

[Figure: graph with spectrum 0, .055, .055, .211, ...]

φ_3(G) = max{2/12, 2/12, ...}
higher-order Cheeger inequality

λ_k = 0 ⟺ φ_k = 0

(Lee, Oveis Gharan, T, 2012):

λ_k / 2 ≤ φ_k(G) ≤ O(k²) √λ_k

The upper bound is constructive: in nearly-linear time we can find k disjoint sets, each of expansion at most O(k²) √λ_k.

Also:

φ_k(G) ≤ O(√(log k)) · √λ_{1.1k}   (Lee, Oveis Gharan, T, 2012)
φ_k(G) ≤ O(√(log k)) · √λ_{O(k)}   (Louis, Raghavendra, Tetali, Vempala, 2012)

The factor O(√(log k)) is tight.
spectral clustering

The following algorithm works well in practice (but to solve what problem?):
- Compute the k eigenvectors x^(1), ..., x^(k) of the k smallest eigenvalues of L
- Define the mapping F : V → R^k, F(v) := (x_v^(1), x_v^(2), ..., x_v^(k))
- Apply k-means to the points that the vertices are mapped to
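A compact version of this pipeline; a sketch with numpy (the farthest-first seeding and the short Lloyd loop are a generic stand-in for whatever k-means implementation one prefers, not the specific choices of the talk):

```python
import numpy as np

def spectral_clustering(A, k, iters=20):
    """Embed vertices by the k bottom eigenvectors of the normalized
    Laplacian, then run a small k-means (Lloyd) loop on the embedded points."""
    d = A.sum(axis=1)
    dis = 1.0 / np.sqrt(d)                       # assumes no isolated vertices
    L = np.eye(len(A)) - dis[:, None] * A * dis[None, :]
    _, vecs = np.linalg.eigh(L)                  # eigenvalues in ascending order
    F = vecs[:, :k]                              # row v is F(v) in R^k

    # deterministic farthest-first seeding, then plain Lloyd iterations
    centers = [F[0]]
    for _ in range(1, k):
        dist = np.min([((F - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(F[int(np.argmax(dist))])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((F[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = F[labels == j].mean(axis=0)
    return labels

# two disjoint triangles separate perfectly with k = 2
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    A[u, v] = A[v, u] = 1.0
labels = spectral_clustering(A, 2)
print(labels)  # vertices 0-2 share one label, vertices 3-5 the other
```

On a disconnected graph the bottom-k eigenvectors are constant on components, so the embedded points collapse onto k distinct locations and any reasonable k-means run separates them.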
[Figure: spectral embedding of a random graph, k = 3]
spectral clustering vs. the provable algorithms

The LOT and LRTV algorithms find k low-conductance sets if they exist. The first step is the spectral embedding; then comes a geometric partitioning step, but not k-means.
improved Cheeger inequality

λ_2 / 2 ≤ φ(G) ≤ √(2 λ_2)

(Kwok, Lau, Lee, Oveis Gharan, T, 2013):

φ(G) ≤ O(k) · λ_2 / √λ_k

and the upper bound holds for the sweep algorithm: the nearly-linear time sweep algorithm finds S such that, for every k,

φ(S) ≤ O(k / √λ_k) · φ(G)
proof structure

1. We find a 2k-valued vector y such that ‖x − y‖² ≤ O(λ_2 / λ_k), where x is the eigenvector of λ_2.

2. For every k-valued vector y, applying the sweep algorithm to x finds a set S such that φ(S) ≤ O(k) · (λ_2 + √λ_2 · ‖x − y‖).

So the sweep algorithm finds S with φ(S) ≤ O(k) · λ_2 / √λ_k.
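Chaining the two steps (take the 2k-valued y from step 1 and feed it into step 2, with 2k in place of k, which only changes the constant) yields the stated bound; spelled out:

```latex
\phi(S) \;\le\; O(k)\left(\lambda_2 + \sqrt{\lambda_2}\,\|x-y\|\right)
        \;\le\; O(k)\left(\lambda_2 + \sqrt{\lambda_2}\cdot O\!\left(\sqrt{\tfrac{\lambda_2}{\lambda_k}}\right)\right)
        \;=\; O(k)\,\frac{\lambda_2}{\sqrt{\lambda_k}}
```

The last equality uses λ_k ≤ 2: then λ_2 ≤ √2 · λ_2 / √λ_k, so the second term dominates the first.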
intuition for the first result

Given x, we find a 2k-valued vector y such that ‖x − y‖² ≤ O(R(x) / λ_k).

Suppose R(x) = 0 and λ_k > 0. Then:
- x is constant on connected components (because R(x) = 0)
- G has < k connected components (because λ_k > 0)
- so x is at distance zero from a vector (itself) that is (k − 1)-valued
idea of proof of the first result

- Consider x as an embedding of the vertices into the line.
- Partition the points the vertices are mapped to into 2k intervals, minimizing the 2k-means squared-distance cost.
- Round each point to the closest interval boundary.
- If the rounded vector is far from the original vector, a lot of points are far from a boundary; then we construct a witness that λ_k is small by finding k disjointly supported vectors of small Rayleigh quotient.
intuition for the second result

For every k-valued vector y, applying the sweep algorithm to x finds a set S such that

φ(S) ≤ O(k) · (R(x) + √R(x) · ‖x − y‖)

It was known that:
- φ(S) ≤ √(2 R(x))
- if x − y = 0, then φ(S) ≤ k · R(x)

Our proof blends the two arguments.
eigenvalue gap

If λ_k = 0 and λ_{k+1} > 0, then G has exactly k connected components.

[Tanaka 2012]: if λ_{k+1} > 5^k · λ_k, then the vertices of G can be partitioned into k sets of small conductance, each inducing a subgraph of large conductance. [Non-algorithmic proof]

[Oveis Gharan, T 2013]: the condition φ_{k+1}(G) > (1 + ɛ) · φ_k(G) is sufficient; if λ_{k+1} > O(k²) · √λ_k, then the partition can be found algorithmically.
bounded threshold rank

[Arora, Barak, Steurer], [Barak, Raghavendra, Steurer], [Guruswami, Sinop], ...: if λ_k is large, then a cut of approximately minimal conductance can be found in time exponential in k, using spectral methods [ABS] or convex relaxations [BRS, GS].

[Kwok, Lau, Lee, Oveis Gharan, T]: the nearly-linear time sweep algorithm finds a good approximation if λ_k is large.

[Oveis Gharan, T]: the Linial-Goemans relaxation (solvable in polynomial time independent of k) can be rounded better than in the Arora-Rao-Vazirani analysis if λ_k is large.
challenge: explain the practical performance of the sweep algorithm

λ_2 / 2 ≤ φ(G) ≤ φ(output of algorithm) ≤ √(2 λ_2)

(Kwok, Lau, Lee, Oveis Gharan, T): φ(output of algorithm) ≤ O(k) · λ_2 / √λ_k

(Spielman, Teng): in bounded-degree planar graphs, λ_2 ≤ O(1/n), so φ(output of algorithm) ≤ O(1/√n), matching the planar separator theorem.

Can we find interesting classes of graphs for which λ_2 ≪ φ(G)?
84 lucatrevisan.wordpress.com/tag/expanders/
More informationConvex and Semidefinite Programming for Approximation
Convex and Semidefinite Programming for Approximation We have seen linear programming based methods to solve NP-hard problems. One perspective on this is that linear programming is a meta-method since
More informationU.C. Berkeley Better-than-Worst-Case Analysis Handout 3 Luca Trevisan May 24, 2018
U.C. Berkeley Better-than-Worst-Case Analysis Handout 3 Luca Trevisan May 24, 2018 Lecture 3 In which we show how to find a planted clique in a random graph. 1 Finding a Planted Clique We will analyze
More informationSpectral Clustering. Spectral Clustering? Two Moons Data. Spectral Clustering Algorithm: Bipartioning. Spectral methods
Spectral Clustering Seungjin Choi Department of Computer Science POSTECH, Korea seungjin@postech.ac.kr 1 Spectral methods Spectral Clustering? Methods using eigenvectors of some matrices Involve eigen-decomposition
More informationCSC2420 Spring 2017: Algorithm Design, Analysis and Theory Lecture 11
CSC2420 Spring 2017: Algorithm Design, Analysis and Theory Lecture 11 Allan Borodin March 27, 2017 1 / 1 Announcements and todays agenda Announcements 1 Assignment 2 was due today. I just saw some messages
More information1 Review: symmetric matrices, their eigenvalues and eigenvectors
Cornell University, Fall 2012 Lecture notes on spectral methods in algorithm design CS 6820: Algorithms Studying the eigenvalues and eigenvectors of matrices has powerful consequences for at least three
More informationCombinatorial Optimization
Combinatorial Optimization 2017-2018 1 Maximum matching on bipartite graphs Given a graph G = (V, E), find a maximum cardinal matching. 1.1 Direct algorithms Theorem 1.1 (Petersen, 1891) A matching M is
More information1 T 1 = where 1 is the all-ones vector. For the upper bound, let v 1 be the eigenvector corresponding. u:(u,v) E v 1(u)
CME 305: Discrete Mathematics and Algorithms Instructor: Reza Zadeh (rezab@stanford.edu) Final Review Session 03/20/17 1. Let G = (V, E) be an unweighted, undirected graph. Let λ 1 be the maximum eigenvalue
More informationInverse Power Method for Non-linear Eigenproblems
Inverse Power Method for Non-linear Eigenproblems Matthias Hein and Thomas Bühler Anubhav Dwivedi Department of Aerospace Engineering & Mechanics 7th March, 2017 1 / 30 OUTLINE Motivation Non-Linear Eigenproblems
More informationTHE HIDDEN CONVEXITY OF SPECTRAL CLUSTERING
THE HIDDEN CONVEXITY OF SPECTRAL CLUSTERING Luis Rademacher, Ohio State University, Computer Science and Engineering. Joint work with Mikhail Belkin and James Voss This talk A new approach to multi-way
More information1 Adjacency matrix and eigenvalues
CSC 5170: Theory of Computational Complexity Lecture 7 The Chinese University of Hong Kong 1 March 2010 Our objective of study today is the random walk algorithm for deciding if two vertices in an undirected
More informationDissertation Defense
Clustering Algorithms for Random and Pseudo-random Structures Dissertation Defense Pradipta Mitra 1 1 Department of Computer Science Yale University April 23, 2008 Mitra (Yale University) Dissertation
More informationFour graph partitioning algorithms. Fan Chung University of California, San Diego
Four graph partitioning algorithms Fan Chung University of California, San Diego History of graph partitioning NP-hard approximation algorithms Spectral method, Fiedler 73, Folklore Multicommunity flow,
More informationGraph Partitioning by Spectral Rounding: Applications in Image Segmentation and Clustering
Graph Partitioning by Spectral Rounding: Applications in Image Segmentation and Clustering David A. Tolliver Robotics Institute, Carnegie Mellon University, Pittsburgh, PA 15213. tolliver@ri.cmu.edu Gary
More informationGraph Sparsification I : Effective Resistance Sampling
Graph Sparsification I : Effective Resistance Sampling Nikhil Srivastava Microsoft Research India Simons Institute, August 26 2014 Graphs G G=(V,E,w) undirected V = n w: E R + Sparsification Approximate
More informationOn a Cut-Matching Game for the Sparsest Cut Problem
On a Cut-Matching Game for the Sparsest Cut Problem Rohit Khandekar Subhash A. Khot Lorenzo Orecchia Nisheeth K. Vishnoi Electrical Engineering and Computer Sciences University of California at Berkeley
More informationGraphs in Machine Learning
Graphs in Machine Learning Michal Valko INRIA Lille - Nord Europe, France Partially based on material by: Ulrike von Luxburg, Gary Miller, Doyle & Schnell, Daniel Spielman January 27, 2015 MVA 2014/2015
More informationTight Size-Degree Lower Bounds for Sums-of-Squares Proofs
Tight Size-Degree Lower Bounds for Sums-of-Squares Proofs Massimo Lauria KTH Royal Institute of Technology (Stockholm) 1 st Computational Complexity Conference, 015 Portland! Joint work with Jakob Nordström
More informationA local algorithm for finding dense bipartite-like subgraphs
A local algorithm for finding dense bipartite-like subgraphs Pan Peng 1,2 1 State Key Laboratory of Computer Science, Institute of Software, Chinese Academy of Sciences 2 School of Information Science
More informationSpace-Variant Computer Vision: A Graph Theoretic Approach
p.1/65 Space-Variant Computer Vision: A Graph Theoretic Approach Leo Grady Cognitive and Neural Systems Boston University p.2/65 Outline of talk Space-variant vision - Why and how of graph theory Anisotropic
More informationAditya Bhaskara CS 5968/6968, Lecture 1: Introduction and Review 12 January 2016
Lecture 1: Introduction and Review We begin with a short introduction to the course, and logistics. We then survey some basics about approximation algorithms and probability. We also introduce some of
More informationLecture: Flow-based Methods for Partitioning Graphs (1 of 2)
Stat260/CS294: Spectral Graph Methods Lecture 10-02/24/2015 Lecture: Flow-based Methods for Partitioning Graphs (1 of 2) Lecturer: Michael Mahoney Scribe: Michael Mahoney Warning: these notes are still
More informationIntroduction to Spectral Graph Theory and Graph Clustering
Introduction to Spectral Graph Theory and Graph Clustering Chengming Jiang ECS 231 Spring 2016 University of California, Davis 1 / 40 Motivation Image partitioning in computer vision 2 / 40 Motivation
More informationConvergence of Random Walks
Graphs and Networks Lecture 9 Convergence of Random Walks Daniel A. Spielman September 30, 010 9.1 Overview We begin by reviewing the basics of spectral theory. We then apply this theory to show that lazy
More informationUnsupervised Learning
The Complexity of Unsupervised Learning Santosh Vempala, Georgia Tech Unsupervised learning } Data (imagine cool images here) } with no labels (or teachers) } How to } Understand it/find patterns in it?
More informationMachine Learning for Data Science (CS4786) Lecture 11
Machine Learning for Data Science (CS4786) Lecture 11 Spectral clustering Course Webpage : http://www.cs.cornell.edu/courses/cs4786/2016sp/ ANNOUNCEMENT 1 Assignment P1 the Diagnostic assignment 1 will
More informationapproximation algorithms I
SUM-OF-SQUARES method and approximation algorithms I David Steurer Cornell Cargese Workshop, 201 meta-task encoded as low-degree polynomial in R x example: f(x) = i,j n w ij x i x j 2 given: functions
More informationInvertibility and Largest Eigenvalue of Symmetric Matrix Signings
Invertibility and Largest Eigenvalue of Symmetric Matrix Signings Charles Carlson Karthekeyan Chandrasekaran Hsien-Chih Chang Alexandra Kolla Abstract The spectra of signed matrices have played a fundamental
More informationPositive Semi-definite programing and applications for approximation
Combinatorial Optimization 1 Positive Semi-definite programing and applications for approximation Guy Kortsarz Combinatorial Optimization 2 Positive Sem-Definite (PSD) matrices, a definition Note that
More informationPetar Maymounkov Problem Set 1 MIT with Prof. Jonathan Kelner, Fall 07
Petar Maymounkov Problem Set 1 MIT 18.409 with Prof. Jonathan Kelner, Fall 07 Problem 1.2. Part a): Let u 1,..., u n and v 1,..., v n be the eigenvalues of A and B respectively. Since A B is positive semi-definite,
More information