Lecture 13 Spectral Graph Algorithms

COMS 4995-3: Advanced Algorithms (March 6, 2017)

Lecture 13: Spectral Graph Algorithms

Instructor: Alex Andoni. Scribe: Srikar Varadaraj.

1 Introduction

Today's topics:

- Finish the proof from last lecture
- Example of a random walk; stationary distributions
- Linear algebra overview: the Spectral Theorem
- Define the Rayleigh quotient and prove a property of it

2 Shortest Augmenting Path Algorithm - Max Flow

To recap, the shortest augmenting path algorithm always chooses the shortest remaining augmenting path in the residual graph $G_f$ (a concrete sketch appears after the proof of Lemma 3 below). We want to bound its running time. Every iteration takes time $O(m)$ by BFS, so we need to answer: how many iterations do we need?

Definition 1. Fix the current flow $f$. The residual graph is denoted $G_f$. We define $d_f(s, v)$ to be the distance from $s$ to $v$ in $G_f$.

Claim 2. Let $P$ be the shortest $s$-$t$ augmenting path in $G_f$. Let $f'$ be $f$ after augmenting with the path $P$, and let $d'(s, v) = d_{f'}(s, v)$. Then we claim that $d'(s, v) \ge d_f(s, v)$ for all $v$.

Proof. If $d_f(s, t) \ge n$ then we are done trivially. We would like to prove that $d_f(s, t)$ increases by 1 every time we augment by the shortest path; however, we will not be able to show this. Instead, we assert and prove the following lemma:

Lemma 3. We use at most $mn/2$ augmenting paths until we are done, and every augmenting path saturates at least one edge.

We want to find out how many times an edge $(v, w)$ in the graph can be saturated. Let $d(s, v)$ be the distance from $s$ to $v$ before the saturation. Since $P$ is a shortest path, before the augmentation we have $d(s, w) = d(s, v) + 1$. Note that before the edge $v \to w$ can be saturated again, the reverse edge $w \to v$ must first appear on some shortest augmenting path. Now consider the distances $d'(s, w)$ and $d'(s, v)$ at that later time. Then
$$d'(s, v) = d'(s, w) + 1 \ge d(s, w) + 1 = d(s, v) + 2,$$
where the inequality uses Claim 2. So, the distance to $v$ increases by 2 between consecutive saturations of the edge. Since all distances are less than $n$, every edge $v \to w$ can be saturated at most $n/2$ times. Since there are $m$ edges in total, and each augmenting path saturates at least one edge, the total number of augmenting paths is upper bounded by $mn/2$. After this many augmenting paths, the algorithm must have terminated.
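To make the algorithm concrete, here is a minimal Python sketch of the shortest augmenting path algorithm; the dict-of-dicts capacity representation and the function name are our own illustration choices, not from the lecture.

```python
from collections import deque

def shortest_augmenting_path_max_flow(cap, s, t):
    """Max flow via shortest augmenting paths, found by BFS in G_f.

    cap: dict-of-dicts of residual capacities, cap[u][v] >= 0.
    Mutates cap into the final residual graph; returns the flow value.
    """
    flow = 0
    while True:
        # BFS: shortest s-t path in the residual graph G_f (edges with c > 0).
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, c in cap.get(u, {}).items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow  # no augmenting path left: done
        # Recover the path and its bottleneck capacity.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        b = min(cap[u][v] for u, v in path)
        # Augment: at least one edge on the path becomes saturated.
        for u, v in path:
            cap[u][v] -= b
            rev = cap.setdefault(v, {})
            rev[u] = rev.get(u, 0) + b
        flow += b

# Example: two disjoint unit-capacity s-t paths give flow 2.
cap = {"s": {"a": 1, "b": 1}, "a": {"t": 1}, "b": {"t": 1}}
print(shortest_augmenting_path_max_flow(cap, "s", "t"))  # 2
```

Each outer iteration is one BFS plus one augmentation, i.e., $O(m)$ work, matching the analysis above.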

Each of the at most $mn/2$ iterations takes $O(m)$ time, so the running time is $O(m \cdot mn/2) = O(m^2 n)$, as required. If we do more work per iteration of the algorithm, we can decrease the number of iterations. The best currently known algorithm runs in time $\tilde{O}(m\sqrt{n})$, where $\tilde{O}(n) = n(\log n)^{O(1)}$. For capacities $U = O(1)$, the best known algorithm runs in time $\tilde{O}(m^{10/7})$.

3 Spectral Graph Theory

Observation 4. The adjacency matrix of an undirected graph $G$, denoted by $A_G$, has $(A_G)_{ij} = 1$ iff there is an edge $i \sim j$. The adjacency matrix of an undirected graph is symmetric. We only consider undirected graphs in this lecture, since the theory we develop works well for them. Note that for directed graphs, the adjacency matrix need not be symmetric.

3.1 Motivation

We begin the discussion of adjacency matrices and spectral graph theory with a few definitions and a motivating example.

Definition 5. The diffusion operator $D$ is defined on a graph to have the following properties:

- Fix a mass vector $x \in \mathbb{R}^n$, which assigns a weight, or mass, to every vertex in a graph $G$ at time $t = 0$.
- Time proceeds in discrete steps. In each step, for every $i \in [n]$, the mass $x_i$ is distributed equally amongst the neighbors of vertex $i$. So, if the vertices with indices $a_1, \ldots, a_k$ are adjacent to the vertex with index $i$, then $x_{a_j} \leftarrow x_{a_j} + x_i/k$, where $x_{a_j}$ denotes the mass at the vertex with index $a_j$.

Note that we assume no vertex has an edge to itself.

Definition 6. The matrix $D_G$ is defined for a graph $G$ as
$$D_G = \begin{pmatrix} d_1 & & \\ & \ddots & \\ & & d_n \end{pmatrix},$$
where $d_i$ is the degree of node $i$.
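Definitions 5 and 6 translate directly into a few lines of NumPy; a minimal sketch (the function name is our own; we assume a simple undirected graph with no isolated vertices, so every $d_i > 0$):

```python
import numpy as np

def diffusion_step(A, x):
    """Apply one step of the diffusion operator D of Definition 5.

    A: symmetric 0/1 adjacency matrix, no self-loops; x: mass vector.
    Vertex i sends x_i / d_i to each of its d_i neighbors.
    """
    d = A.sum(axis=1)   # d_i = degree of vertex i (the diagonal of D_G)
    return A @ (x / d)  # equivalently A_G @ inv(D_G) @ x
```

Example 7 below traces exactly this update by hand.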

Example 7. Consider the path graph $G$: $1 - 2 - 3 - 4 - 5$. $A_G$ is:
$$A_G = \begin{pmatrix} 0&1&0&0&0\\ 1&0&1&0&0\\ 0&1&0&1&0\\ 0&0&1&0&1\\ 0&0&0&1&0 \end{pmatrix}.$$
$D_G$ is:
$$D_G = \begin{pmatrix} 1&&&&\\ &2&&&\\ &&2&&\\ &&&2&\\ &&&&1 \end{pmatrix}.$$
Let $x_0 = (0, 1, 0, 0, 0)^T$ at time $t = 0$. We apply the diffusion operator $D$ repeatedly. At time $t = 1$, $x_1 = (\frac12, 0, \frac12, 0, 0)^T$. Now, note that $A_G D_G^{-1} x_0 = x_1$. Also, at time $t = 2$, $x_2 = (0, \frac12 + \frac14, 0, \frac14, 0)^T = (0, \frac34, 0, \frac14, 0)^T$. Indeed,
$$D_G^{-1} x_1 = (\tfrac12, 0, \tfrac14, 0, 0)^T,$$
and so $A_G D_G^{-1} x_1 = (0, \frac34, 0, \frac14, 0)^T$, which is simply $x_2$. Hence, we see that
$$A_G D_G^{-1} x_t = x_{t+1}.$$
Also note that the distribution $x = (\frac18, \frac14, \frac14, \frac14, \frac18)^T$ is a stationary distribution, in the sense that $x = A_G D_G^{-1} x$.

The example above motivates our discussion of spectral graph theory, since knowing properties of the matrices $A_G$ and $D_G$ easily gives us the value of $x_t$, as well as properties of stationary distributions (which we did not discuss in depth). We begin with a review of the basic linear algebra required.

3.2 Spectral Theorem and Spectral Decomposition

Definition 8. A nonzero vector $v$ is an eigenvector of the matrix $M \in \mathbb{R}^{n \times n}$ if $Mv = \lambda v$ for some $\lambda \in \mathbb{R}$; $\lambda$ is referred to as an eigenvalue of the matrix $M$.

Observation 9. $Mv = \lambda v \iff (M - \lambda I)v = 0 \iff \det(M - \lambda I) = 0$. Hence, all the eigenvalues of $M$ satisfy the equation $\det(M - \lambda I) = 0$.
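Example 7 and the definitions above are easy to check numerically; a short NumPy sketch (our own illustration, using np.linalg.eigh for the symmetric matrix $A_G$):

```python
import numpy as np

# The path graph 1-2-3-4-5 from Example 7 (0-indexed below).
A = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)  # degrees (1, 2, 2, 2, 1)

# Stationary distribution x = A_G D_G^{-1} x: proportional to the degrees.
x = d / d.sum()  # (1/8, 1/4, 1/4, 1/4, 1/8)
assert np.allclose(A @ (x / d), x)

# Definition 8 / Observation 9: M v = lambda v and det(M - lambda I) = 0.
lam, V = np.linalg.eigh(A)  # columns of V are orthonormal eigenvectors
for i in range(len(lam)):
    assert np.allclose(A @ V[:, i], lam[i] * V[:, i])
    assert abs(np.linalg.det(A - lam[i] * np.eye(5))) < 1e-9
```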

$\det(M - \lambda I) = 0$ is a polynomial equation of degree $n$ in $\lambda$. This implies that there are $n$ eigenvalues, counted with multiplicity. The following theorem gives us a special property of symmetric matrices, which is especially relevant to adjacency matrices of undirected graphs.

Theorem 10 (Spectral Theorem). For a symmetric matrix $M$, there exist $n$ orthonormal eigenvectors $v_1, \ldots, v_n$, i.e., $\|v_i\| = 1$ and $v_i^T v_j = 0$ for $i \ne j$. The eigenvector $v_i$ has corresponding eigenvalue $\lambda_i$.

Some observations:

- If $\lambda_i = \lambda_{i+1}$, then $v_i + v_{i+1}$ is also an eigenvector with eigenvalue $\lambda_i$.
- The Spectral Theorem implies that there exists a spectral decomposition:
$$M = \sum_{i=1}^n \lambda_i v_i v_i^T. \qquad (1)$$

3.3 Rayleigh quotient and properties

Definition 11. The Rayleigh quotient is defined as
$$R(x) = \frac{x^T M x}{\|x\|^2}.$$

Observation 12. Notice that for the eigenvectors $v_i$,
$$R(v_i) = \frac{v_i^T M v_i}{\|v_i\|^2} = \frac{v_i^T \lambda_i v_i}{\|v_i\|^2} = \frac{\lambda_i \|v_i\|^2}{\|v_i\|^2} = \lambda_i. \qquad (2)$$

Let $x^* = \operatorname{argmax}_x R(x)$, i.e., a vector that maximizes the Rayleigh quotient.

Theorem 13. Let the eigenvalues be ordered as $\lambda_n \ge \lambda_{n-1} \ge \lambda_{n-2} \ge \cdots \ge \lambda_1$. Then $R(x^*) = \lambda_n$.

Proof. Every vector can be written $v = \sum_{i=1}^n a_i v_i$, since $\{v_i\}$ forms an orthonormal basis. Then,
$$R(v) = \frac{v^T M v}{\|v\|^2} = \frac{\left(\sum_{i=1}^n a_i v_i\right)^T M \left(\sum_{i=1}^n a_i v_i\right)}{a_1^2 + \cdots + a_n^2} = \frac{\left(\sum_{i=1}^n a_i v_i\right)^T \left(\sum_{i=1}^n a_i \lambda_i v_i\right)}{a_1^2 + \cdots + a_n^2} = \frac{\sum_{i=1}^n \lambda_i a_i^2}{a_1^2 + \cdots + a_n^2} \le \frac{\sum_{i=1}^n \lambda_n a_i^2}{a_1^2 + \cdots + a_n^2} = \lambda_n. \qquad (3)$$
Hence, $R(v) \le \lambda_n$ for every $v$, and since $R(v_n) = \lambda_n$ by Observation 12, we conclude $R(x^*) = \lambda_n$. Similarly, we can also conclude that $R(v) \ge \lambda_1$.

Hence, the eigenvalues of the matrix $M$ give us deep properties regarding the Rayleigh quotient. We will see in later lectures how the Rayleigh quotient is useful and relate it to properties of graphs.
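Observation 12 and Theorem 13 can likewise be sanity-checked numerically; a minimal sketch on a random symmetric matrix (our own choice of test matrix, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def rayleigh(M, x):
    """Rayleigh quotient R(x) = x^T M x / ||x||^2 (Definition 11)."""
    return (x @ M @ x) / (x @ x)

B = rng.standard_normal((6, 6))
M = (B + B.T) / 2           # symmetrize so the Spectral Theorem applies
lam, V = np.linalg.eigh(M)  # eigenvalues sorted ascending: lam[-1] = lambda_n

# Observation 12: R(v_i) = lambda_i for every eigenvector v_i.
for i in range(6):
    assert np.isclose(rayleigh(M, V[:, i]), lam[i])

# Theorem 13: lambda_1 <= R(x) <= lambda_n for every nonzero x.
for _ in range(1000):
    x = rng.standard_normal(6)
    assert lam[0] - 1e-9 <= rayleigh(M, x) <= lam[-1] + 1e-9
```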