
Spectral K-Way Ratio-Cut Partitioning
Part I: Preliminary Results

Pak K. Chan, Martine Schlag and Jason Zien
Computer Engineering Board of Studies
University of California, Santa Cruz

May, 99

Abstract

Recent research on partitioning has focused on the ratio-cut cost metric, which maintains a balance between the size of the cut and the sizes of the partitions without fixing the partition sizes a priori. Iterative approaches and spectral approaches to two-way ratio-cut partitioning have yielded high-quality partitioning results. In this paper we develop a spectral approach to multi-way ratio-cut partitioning which provides a generalization of the ratio-cut cost metric to k-way partitioning and a lower bound on this cost metric. Our approach uses the Lanczos algorithm to find the k smallest eigenvalue/eigenvector pairs of the Laplacian of the graph. The eigenvectors are used to construct an orthogonal projection that maps each vertex of the graph, viewed as a point in n-dimensional space, into a k-dimensional subspace. We exploit the (near) orthogonality of the projected points to effect high-quality clustering of points in the k-dimensional subspace. An efficient algorithm is presented for coercing the points in the k-dimensional subspace into k partitions. Advancement over the current work is evidenced by the results of experiments on the standard MCNC benchmarks.

1 Introduction

We present a method for k-way partitioning based on spectral techniques, extending the techniques of Hagen and Kahng [1]. The k-way partition problem can be formulated as the search for a projection of n-dimensional space onto a k-dimensional subspace, mapping the n unit-vector basis (the n nodes of a graph) into k distinct points (the partitions) so as to minimize the weighted quadratic displacement. This is essentially the formulation given by Barnes [2, 3]. However, unlike Barnes' formulation, we do not assume any pre-determined partition sizes. By Hall's result [4], using the k eigenvectors of the graph's Laplacian corresponding to the smallest k eigenvalues provides a projection which minimizes the weighted quadratic displacement under the orthonormality constraint. In the case of a partition, this amounts to the number of edges cut. We show that the projection provided by a k-way partition can be reformulated as an orthonormal projection. This reformulation no longer minimizes the number of edges cut, but rather a new cost metric which incorporates the sizes of the partitions of a k-way partition P = {P_1, P_2, ..., P_k}:

    cost(P) = \sum_{h=1}^{k} E_h / |P_h|.

Here, E_h denotes the number of edges between nodes in partition P_h and nodes outside the partition. Interestingly, in the case of k = 2, this is the ratio-cut metric defined by Cheng and Wei scaled by the number of nodes in the graph [5]. The sum of the smallest k eigenvalues provides a lower bound on this cost metric. A geometric interpretation of the eigenvectors provides a method for transforming the eigenvector solution into a partition. More significantly, this formulation provides a heuristic for identifying the natural number of partitions, k, a priori. Scaling the cost function above by n/(k(k-1)), to offset the influence of k on this cost metric (fewer nodes per partition and higher expected degrees), provides a means of comparing partitions across different k's.
2 Definitions

Given an undirected graph G with n nodes v_1, v_2, ..., v_n, the adjacency matrix of G is the n x n matrix A(G) = [a_ij] defined by

    a_ij = the number of edges between v_i and v_j.

If G is simple (no loops or parallel edges), then all of the entries in A(G) are 1's or 0's and there are 0's along the diagonal. The degree matrix of G is the n x n matrix D(G) = [d_ij]

defined by

    d_ij = degree(v_i) if i = j, and d_ij = 0 if i != j.

The Laplacian of G (also known as the Kirchhoff matrix or the admittance matrix) is the n x n matrix Q(G) = D(G) - A(G). A k-partition of the nodes of G, P = {P_1, P_2, ..., P_k}, is represented by an n x k assignment matrix Y(P) = [y_ih] where

    y_ih = 1 if v_i is in P_h, and y_ih = 0 if v_i is not in P_h.

The rows of Y sum to 1 and column h sums to |P_h|. A k-partition of G, P = {P_1, P_2, ..., P_k}, can also be represented by the n x n partition matrix P = [p_ij] (defined in Barnes' SIAM paper [2]) where

    p_ij = 1 if v_i and v_j are in the same partition, and p_ij = 0 otherwise.

Given a simple graph G and a partition P of its nodes, the partition graph G_P is the graph with k vertices u_1, u_2, ..., u_k in which the number of edges between u_g and u_h, for g != h, is

    \sum_{v_i \in P_g} \sum_{v_j \in P_h} a_ij = \sum_{i=1}^{n} \sum_{j=1}^{n} a_ij y_ig y_jh,

where Y is the assignment matrix of P. This is the graph whose nodes are the partitions and whose interconnections are inherited from the edges of G.

3 Graph partitioning and some basic properties of the Laplacian

Observation 1 (from Mohar, "The Laplacian Spectrum of Graphs" [6])

    \sum_{i=1}^{n} \lambda_i = 2|E(G)| = \sum_{i=1}^{n} degree(v_i).

This follows from the Handshaking Lemma (the sum of the degrees of all nodes in an undirected graph is twice the number of edges) and the fact that the trace of a symmetric matrix (the sum of its diagonal entries) is the sum of its eigenvalues.
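These definitions, and Observation 1, are easy to check numerically on a small example (a 6-vertex graph of our own choosing, used only for illustration):

```python
import numpy as np

# Toy graph: two triangles {0,1,2} and {3,4,5} joined by the edge (2,3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

D = np.diag(A.sum(axis=1))   # degree matrix D(G)
Q = D - A                    # Laplacian Q(G) = D(G) - A(G)

lam = np.linalg.eigvalsh(Q)  # eigenvalues in non-decreasing order
# Observation 1: sum of eigenvalues = 2|E(G)| = sum of degrees (here 14).
print(lam.sum(), 2 * len(edges))
```

The smallest eigenvalue of any Laplacian is 0 (the all-ones vector is in its null space), which also shows up in the computed spectrum.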

Observation 2 If Y is an n x k matrix then Y^T Q(G) Y is a k x k matrix whose gh-th component is

    (1/2) \sum_{i=1}^{n} \sum_{j=1}^{n} a_ij (y_ig - y_jg)(y_ih - y_jh).

Proof: Since Y^T Q(G) Y = Y^T (D(G) - A(G)) Y = Y^T D(G) Y - Y^T A(G) Y, the gh-th component of Y^T Q(G) Y is

      \sum_i y_ig d_ii y_ih - \sum_i y_ig (\sum_j a_ij y_jh)
    = \sum_i y_ig y_ih (\sum_j a_ij) - \sum_i \sum_j a_ij y_ig y_jh
    = \sum_i \sum_j a_ij y_ig y_ih - \sum_i \sum_j a_ij y_ig y_jh
    = (1/2) [ \sum_i \sum_j a_ij y_ig y_ih - 2 \sum_i \sum_j a_ij y_ig y_jh + \sum_i \sum_j a_ij y_jg y_jh ]
    = (1/2) \sum_i \sum_j a_ij (y_ig y_ih - y_ig y_jh - y_jg y_ih + y_jg y_jh)
    = (1/2) \sum_i \sum_j a_ij (y_ig - y_jg)(y_ih - y_jh),

using d_ii = \sum_j a_ij and the symmetry a_ij = a_ji (which gives \sum_ij a_ij y_jg y_jh = \sum_ij a_ij y_ig y_ih and lets the middle term be split as \sum_ij a_ij y_ig y_jh + \sum_ij a_ij y_jg y_ih).

Observation 3 If Y is the assignment matrix for P then Y^T Q(G) Y = Q(G_P), the Laplacian of G_P.

Proof: First consider the g-th diagonal entry of Y^T Q(G) Y:

    (1/2) \sum_i \sum_j a_ij (y_ig - y_jg)^2.

Since Y is an assignment matrix, a_ij (y_ig - y_jg)^2 will be 1 exactly when one of v_i and v_j is in P_g and a_ij = 1. Summed over all i and j this gives twice the total number of edges from nodes in P_g to nodes not in P_g, so with the factor 1/2 the g-th diagonal entry is the degree of u_g in G_P. Now consider an off-diagonal entry (h != g). The gh-th component of Y^T Q(G) Y is

    (1/2) \sum_i \sum_j a_ij (y_ig - y_jg)(y_ih - y_jh).

Note that y_ig - y_jg is non-zero if and only if exactly one of v_i or v_j is in P_g. The same holds for y_ih - y_jh with respect to P_h. Since a node cannot be simultaneously in two partitions, (y_ig - y_jg)(y_ih - y_jh) is non-zero exactly when one of v_i or v_j is in P_g and the other is in P_h. In this case, if a_ij is non-zero, the ij-th term of the summation is -1. Hence summing over all i and j gives minus twice the number of edges between nodes in partition P_g and nodes in partition P_h, and the factor 1/2 makes the gh-th entry minus the number of such edges, as required.

Observation 4 If P is a partition matrix, then its non-zero eigenvalues are \lambda(P) = {m_1, m_2, ..., m_k}, where m_i is the number of nodes in the i-th partition (E. Barnes).

Observation 5 If Y is the assignment matrix for P then Y Y^T is the partition matrix.

Proof: The ij-th element of Y Y^T is

    \sum_{h=1}^{k} y_ih y_jh.

The term y_ih y_jh will be 1 if and only if both v_i and v_j are in P_h, and hence the sum is 1 exactly when v_i and v_j are in the same partition and 0 otherwise.

Observation 6 If Y is the assignment matrix for P then Y^T Y is a diagonal matrix with |P_g| in the g-th diagonal entry.

Proof: The gh-th element of Y^T Y is

    \sum_{i=1}^{n} y_ig y_ih.

For h != g every term y_ig y_ih is 0, since node v_i can be in at most one of the two partitions; hence the off-diagonal entries are 0's. On the diagonal we have \sum_{i=1}^{n} y_ig^2, which sums to the number of nodes in P_g.

The partition problem considered by Barnes and by Donath and Hoffman was to solve for Y given k and |P_1|, |P_2|, ..., |P_k|, the sizes of the partitions [7].

4 Ratio-cut graph partitioning

Now consider modifying the definition of the partition matrix to take into account the sizes of the partitions. Specifically, a k-partition of the nodes of G, P = {P_1, P_2, ..., P_k}, is represented

by an n x k ratioed assignment matrix R(P) = [r_ih] where

    r_ih = 1/\sqrt{|P_h|} if v_i is in P_h, and r_ih = 0 if v_i is not in P_h.

The rows no longer necessarily sum to 1, and column h now sums to \sqrt{|P_h|}.

Observation 7 If R is the ratioed assignment matrix for P, then the g-th diagonal entry of R^T Q(G) R is the degree of u_g in G_P divided by |P_g|, and the gh-th off-diagonal entry is minus the number of edges between P_g and P_h divided by \sqrt{|P_g| |P_h|}.

Proof: By Observation 2 the g-th diagonal entry of R^T Q(G) R is

    (1/2) \sum_i \sum_j a_ij (r_ig - r_jg)^2.

Since R is a ratioed assignment matrix, a_ij (r_ig - r_jg)^2 is non-zero only when exactly one of v_i and v_j is in P_g and a_ij = 1. Summed over all i and j these terms count twice the total number of edges from nodes in P_g to nodes not in P_g, and each non-zero term is

    (r_ig - r_jg)^2 = (1/\sqrt{|P_g|})^2 = 1/|P_g|.

Hence the g-th diagonal entry is the degree of u_g in G_P divided by |P_g|. Now consider an off-diagonal entry (h != g). The gh-th component of R^T Q(G) R is

    (1/2) \sum_i \sum_j a_ij (r_ig - r_jg)(r_ih - r_jh).

Note that r_ig - r_jg is non-zero if and only if exactly one of v_i or v_j is in P_g, and the same holds for r_ih - r_jh with respect to P_h. Since a node cannot be simultaneously in two partitions, (r_ig - r_jg)(r_ih - r_jh) is non-zero exactly when one of v_i or v_j is in P_g and the other is in P_h. In this case, if a_ij is non-zero, the ij-th term of the summation is

    -1 / (\sqrt{|P_g|} \sqrt{|P_h|}).

Hence summing over all i and j gives minus twice the number of edges between nodes in partition P_g and nodes in partition P_h divided by \sqrt{|P_g| |P_h|}, and the factor 1/2 gives the claimed entry.

Given a k-partition of G, the n x n ratioed partition matrix is P_R = [rp_ij] where

    rp_ij = 1/|P_g| if v_i and v_j both belong to P_g, and rp_ij = 0 otherwise.
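Observations 3 and 7 can both be verified numerically on a small example of our own (two triangles joined by one edge, partitioned into the two triangles):

```python
import numpy as np

# Toy graph: triangles {0,1,2} and {3,4,5} joined by the edge (2,3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
Q = np.diag(A.sum(axis=1)) - A

# Assignment matrix Y and ratioed assignment matrix R for {0,1,2}, {3,4,5}.
Y = np.zeros((6, 2))
Y[[0, 1, 2], 0] = 1
Y[[3, 4, 5], 1] = 1
R = Y / np.sqrt(3)   # both partitions have |P_h| = 3

QP = Y.T @ Q @ Y     # Observation 3: the Laplacian of G_P (one cut edge)
M = R.T @ Q @ R      # Observation 7: the same entries scaled by partition sizes

print(QP)  # [[ 1. -1.] [-1.  1.]]
print(M)   # [[ 1/3 -1/3] [-1/3  1/3]]
```

Here G_P has two nodes joined by the single cut edge, so Q(G_P) = [[1, -1], [-1, 1]], and dividing by |P_g| = 3 (diagonal) and sqrt(|P_g||P_h|) = 3 (off-diagonal) gives M.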

Observation 8 If R is a ratioed assignment matrix for P then R R^T = P_R, the ratioed partition matrix.

Proof: The ij-th element of R R^T is

    \sum_{h=1}^{k} r_ih r_jh.

The term r_ih r_jh will be non-zero if and only if both v_i and v_j are in P_h; hence the sum is 1/|P_h| when v_i and v_j are in the same partition P_h and 0 otherwise.

Observation 9 If R is the ratioed assignment matrix for P then R^T R = I_k, the k x k identity matrix.

Proof: The gh-th element of R^T R is

    \sum_{i=1}^{n} r_ig r_ih.

For h != g every term r_ig r_ih is 0, since node v_i can be in at most one of the two partitions; hence the off-diagonal entries are 0's. On the diagonal we have \sum_{i=1}^{n} r_ig^2, which sums to 1 since there are |P_g| non-zero terms, each equal to 1/|P_g|. Hence the ratioed assignment matrix meets the constraint R^T R = I_k.

5 Relation between the eigenvalues and ratio cuts

Define the k-way ratio-cut cost metric of a k-partition P of G to be

    cost(P) = \sum_{h=1}^{k} degree(u_h) / |P_h|.

(Recall that u_h is the node in the partition graph corresponding to P_h, so degree(u_h) is the number of edges in G with exactly one endpoint in P_h.)

Observation 10 If k = 2, cost(P) is the ratio-cut cost metric defined by Wei and Cheng scaled by n, the number of nodes in the graph.
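Observations 8 and 9 amount to two matrix identities that a quick numeric check confirms (same hypothetical 2-partition of 6 nodes as in our earlier examples):

```python
import numpy as np

# Ratioed assignment matrix for the 2-partition {0,1,2}, {3,4,5}.
R = np.zeros((6, 2))
R[[0, 1, 2], 0] = 1 / np.sqrt(3)
R[[3, 4, 5], 1] = 1 / np.sqrt(3)

PR = R @ R.T    # Observation 8: ratioed partition matrix, entries 1/|P_h|
I2 = R.T @ R    # Observation 9: the k x k identity

print(PR[0, 1])  # 1/3 -- nodes 0 and 1 share a partition of size 3
print(PR[0, 3])  # 0   -- nodes 0 and 3 are in different partitions
print(I2)        # 2 x 2 identity
```

The columns of R are orthonormal, which is exactly the constraint R^T R = I_k used in the next section.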

Proof: If k = 2 and E_c is the number of edges cut in P = {P_1, P_2}, then degree(u_1) = degree(u_2) = E_c and we have

    cost(P) = degree(u_1)/|P_1| + degree(u_2)/|P_2|
            = E_c/|P_1| + E_c/|P_2|
            = E_c (1/|P_1| + 1/|P_2|)
            = E_c (|P_1| + |P_2|) / (|P_1| |P_2|)
            = n E_c / (|P_1| |P_2|).

Observation 11 If R is the ratioed assignment matrix associated with P, then cost(P) = trace(R^T Q(G) R). This follows from Observation 7. Minimizing cost(P) amounts to finding a ratioed assignment matrix which minimizes the sum of the diagonal entries of R^T Q R.

Observation 12 The n x k matrix which minimizes trace(Y^T Q(G) Y), subject to the constraint Y^T Y = I_k, is the n x k matrix whose k columns consist of the k eigenvectors of Q corresponding to the k smallest eigenvalues of Q. (Eigenvalues may be repeated, but eigenvectors may not.)

Proof: If we take the partial derivatives, with respect to the variables in Y, of the matrix Lagrangian Y^T Q(G) Y - \Lambda (Y^T Y - I_k) and equate them to zero, this constrains each column g of Y to be an eigenvector; no other columns are constrained. Thus the minimal solution will consist of k eigenvectors of Q, and these k eigenvectors must be distinct to satisfy Y^T Y = I_k. If V is formed with any k eigenvectors of Q then Q V = V \Lambda_k, where \Lambda_k is the diagonal matrix formed with the eigenvalues corresponding to the k eigenvectors in V. This means that

    V^T Q V = V^T V \Lambda_k = I_k \Lambda_k = \Lambda_k,

and the sum of the diagonal entries of V^T Q V is the sum of the eigenvalues corresponding to the k eigenvectors in V. It follows that this sum is minimized by selecting the eigenvectors corresponding to the k smallest eigenvalues of Q.
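Observations 11 and 12 give, respectively, cost(P) as a trace and an eigenvector matrix attaining the unconstrained minimum of that trace; on our toy two-triangle graph the eigenvalue sum visibly lower-bounds the partition cost:

```python
import numpy as np

edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
Q = np.diag(A.sum(axis=1)) - A

# Ratioed assignment matrix for the natural 2-partition {0,1,2}, {3,4,5}.
R = np.zeros((6, 2))
R[[0, 1, 2], 0] = 1 / np.sqrt(3)
R[[3, 4, 5], 1] = 1 / np.sqrt(3)

cost = np.trace(R.T @ Q @ R)    # Observation 11: cost(P) = 2/3

lam, X = np.linalg.eigh(Q)      # eigen-decomposition, eigenvalues ascending
V = X[:, :2]                    # eigenvectors of the 2 smallest eigenvalues
bound = np.trace(V.T @ Q @ V)   # Observation 12: equals lam[0] + lam[1]

print(cost, bound)              # the eigenvalue sum sits below the cost
```

Since R satisfies R^T R = I_2, its trace can never be below the minimum over all orthonormal Y, which is exactly the sum of the two smallest eigenvalues.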

Observation 13 The sum of the smallest k eigenvalues is a lower bound on the minimum cost(P) of any k-partition of G. In the case k = 2, since \lambda_1 + \lambda_2 = 0 + \lambda_2 = \lambda_2, we obtain

    \lambda_2 <= cost(P) = n E_c / (|P_1| |P_2|),

which amounts to the result of Hagen and Kahng [1]:

    \lambda_2 / n <= E_c / (|P_1| |P_2|).

This is obviously not a tight bound for a graph with a large number of nodes.

Observation 14 By Fiedler's inequality [8], \lambda_2 >= |C| (2 - 2 cos(\pi/n)), where C is a minimum cut set of the graph G. Hence an upper bound on the optimal ratio-cut partitioning is

    \lambda_2 / ((n - 1)(2 - 2 cos(\pi/n))).

6 A new k-way spectral ratio-cut method

The problem of finding an assignment is solved as follows:

1. Calculate all, or enough (?), of the eigenvalues of Q(G).
2. Select k. See Section 7 below.
3. Find the eigenvectors corresponding to the k smallest eigenvalues.
4. Construct V, an n x k matrix whose columns are these k eigenvectors.
5. Compute Z = V V^T.
6. Construct a matrix P = [p_ij] from Z where

    p_ij = 1 if z_ij >= 1/n, and p_ij = 0 otherwise.

This is a heuristic for finding a partition whose partition matrix P is as "close" to Z as possible. This method of coercing Z into a partition matrix does not always work. See Section 8 for alternatives.
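The six steps above can be sketched end to end. This is a minimal illustration under our own assumptions (dense eigensolver instead of Lanczos, the toy two-triangle graph, and a hypothetical helper name `spectral_kway`), not the authors' implementation:

```python
import numpy as np

def spectral_kway(A, k):
    """Sketch of the six-step method for a given k (names are ours)."""
    n = len(A)
    Q = np.diag(A.sum(axis=1)) - A     # Step 1: Laplacian (eigenvalues of Q)
    w, X = np.linalg.eigh(Q)           # eigenvalues in ascending order
    V = X[:, :k]                       # Steps 3-4: k smallest eigenvectors
    Z = V @ V.T                        # Step 5: orthogonal projector
    P = (Z >= 1.0 / n).astype(int)     # Step 6: coerce Z into a 0/1 matrix
    # Read partitions off P: vertices with identical rows share a block.
    blocks = {}
    for i in range(n):
        blocks.setdefault(tuple(P[i]), []).append(i)
    return list(blocks.values())

# Toy graph: two triangles joined by the edge (2,3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

parts = spectral_kway(A, 2)
print(parts)  # -> [[0, 1, 2], [3, 4, 5]]
```

On this graph the thresholded P is exactly a valid partition matrix; as the text warns, the coercion in Step 6 is not guaranteed to produce one in general.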

7 Selecting the number of partitions k

In order to compare the costs of partitions with different sizes, we scale our cost metric by n/(k(k-1)):

    scost(P) = n cost(P) / (k(k-1)).

The quantity n/k is the average number of nodes in a balanced k-way partition, and k-1 is the degree of a node in the complete graph with k nodes.

Observation 15 For a random graph with edge probability f, and any partition P of size k,

    scost(P) = f n^2 / k,

which is a generalization of the result of Cheng and Wei for bipartitioning [5]. (We are not yet sure what this means; perhaps it is the average degree of the partition graph per partition. But it will result in k = n being selected, if the eigenvalue-based estimate below is accurate.)

We would like to select the k that produces the smallest scost(P),

    (n / (k(k-1))) \sum_{h=1}^{k} degree(u_h) / |P_h|,

but since we do not know the optimal ratio-cut partition for any k, we use the lower bound as an indicator. That is, we choose the k which minimizes

    (n / (k(k-1))) \sum_{h=1}^{k} \lambda_h,

where \lambda_1 <= \lambda_2 <= ... <= \lambda_n is the non-decreasing order of the eigenvalues of Q. Note that this is a HEURISTIC.

8 Geometrical interpretation

The n x n matrix P_R = R R^T, the ratioed partition matrix, is a projector, since P_R P_R = P_R. The n x n matrix Z = V V^T (which approximates the ratioed partition matrix P_R) also forms an orthogonal projection of R^n onto the k-dimensional subspace spanned by the eigenvectors corresponding to the k smallest eigenvalues.
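On our toy two-triangle example, the eigenvalue-based heuristic does pick out the natural number of partitions (this check is ours, not one of the paper's benchmarks):

```python
import numpy as np

# Toy graph: two triangles {0,1,2} and {3,4,5} joined by the edge (2,3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
Q = np.diag(A.sum(axis=1)) - A
lam = np.linalg.eigvalsh(Q)   # non-decreasing eigenvalues of Q

def score(k):
    """(n / (k(k-1))) * sum of the k smallest eigenvalues."""
    return n * lam[:k].sum() / (k * (k - 1))

best_k = min(range(2, n + 1), key=score)
print(best_k, score(best_k))  # k = 2 minimizes the scaled lower bound here
```

Because lam[0] = 0 and lam[1] is small (the graph nearly disconnects into the two triangles), the k = 2 score is far below the others.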

Observation 16 Z has exactly k non-zero eigenvalues, which are all 1; i.e., \lambda(V V^T) = {1, ..., 1, 0, ..., 0} with k ones.

Proof: Suppose Z x = \lambda x with \lambda != 0. Multiplying on the left by V^T and using V^T V = I_k gives

    V^T (V V^T) x = \lambda V^T x  ==>  V^T x = \lambda V^T x.

Since \lambda != 0 implies V^T x != 0 (otherwise Z x = V (V^T x) = 0), we conclude \lambda = 1. As rank(V V^T) = rank(V) = k, there are exactly k non-zero eigenvalues.

Observation 17 The non-zero eigenvalues of the ratioed partition matrix R R^T are all 1's.

Proof: Same as above, with R in place of V (R^T R = I_k by Observation 9).

Observation 18 The "distance" between the subspaces formed by the ratioed assignment matrix R and by V, as measured through the Hoffman-Wielandt inequality, is bounded from below:

    ||\lambda(R R^T) - \lambda(V V^T)|| >= 0.

Observation 19 The distance between the two equidimensional subspaces spanned by R and V is

    \sqrt{1 - \sigma_min^2(R^T V)},

where \sigma_min(R^T V) is the smallest singular value of the matrix R^T V (by the CS Decomposition; Golub and Van Loan, p. 77). This is the generalized sine of the angle between the two subspaces.

The matrix P_R projects R^n onto the k-dimensional subspace defined by the orthonormal columns of R. If the n nodes of the graph G are associated with the n unit vectors of R^n, then nodes belonging to the same partition are mapped to the same point by R. Since P_R is "close" to the projector V V^T, the nodes belonging to the same partition should have "close" projections under V. This suggests a heuristic for constructing the R which is "closest" to V: use proximity of the images of the unit vectors under V. In cases of ambiguity the distance can be verified.

9 Summary

We have presented some preliminary but new and basic results on spectral k-way ratio-cut partitioning in this report. The next report will demonstrate the effectiveness of this method and present experimental results.

References

[1] L. Hagen and A. Kahng, "New spectral methods for ratio cut partitioning and clustering," to appear in IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 1992.

[2] E. R. Barnes, "An algorithm for partitioning the nodes of a graph," SIAM Journal on Algebraic and Discrete Methods, vol. 3, pp. 541-550, Dec. 1982.

[3] E. R. Barnes, "Partitioning the nodes of a graph," in Proceedings of Graph Theory with Applications to Algorithms and Computer Science, (Wiley), pp. 57-7, 1985.

[4] K. M. Hall, "An r-dimensional quadratic placement algorithm," Management Science, vol. 17, pp. 219-229, Nov. 1970.

[5] C.-K. Cheng and Y.-C. A. Wei, "An improved two-way partitioning algorithm with stable performance," IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 10, pp. 1502-1511, Dec. 1991.

[6] B. Mohar, "The Laplacian spectrum of graphs," in Proceedings of the Sixth Quadrennial International Conference on the Theory and Applications of Graphs: Graph Theory, Combinatorics, and Applications, Vol. 2, (Wiley), pp. 871-898, 1988.

[7] W. Donath and A. Hoffman, "Lower bounds for the partitioning of graphs," IBM Journal of Research and Development, vol. 17, pp. 420-425, 1973.

[8] M. Fiedler, Special Matrices and Their Applications in Numerical Mathematics. Martinus Nijhoff Publishers, 1986.


More information

Partitioning Networks by Eigenvectors

Partitioning Networks by Eigenvectors 1 Partitioning Networks by Eigenvectors Andrew J. Seary and William D. Richards School of Communication, Simon Fraser University Burnaby, British Columbia, Canada INSNA Sunbelt XVI London, July, 1995.

More information

E2 212: Matrix Theory (Fall 2010) Solutions to Test - 1

E2 212: Matrix Theory (Fall 2010) Solutions to Test - 1 E2 212: Matrix Theory (Fall 2010) s to Test - 1 1. Let X = [x 1, x 2,..., x n ] R m n be a tall matrix. Let S R(X), and let P be an orthogonal projector onto S. (a) If X is full rank, show that P can be

More information

Problem # Max points possible Actual score Total 120

Problem # Max points possible Actual score Total 120 FINAL EXAMINATION - MATH 2121, FALL 2017. Name: ID#: Email: Lecture & Tutorial: Problem # Max points possible Actual score 1 15 2 15 3 10 4 15 5 15 6 15 7 10 8 10 9 15 Total 120 You have 180 minutes to

More information

Lecture 9: Low Rank Approximation

Lecture 9: Low Rank Approximation CSE 521: Design and Analysis of Algorithms I Fall 2018 Lecture 9: Low Rank Approximation Lecturer: Shayan Oveis Gharan February 8th Scribe: Jun Qi Disclaimer: These notes have not been subjected to the

More information

MATH 829: Introduction to Data Mining and Analysis Clustering II

MATH 829: Introduction to Data Mining and Analysis Clustering II his lecture is based on U. von Luxburg, A Tutorial on Spectral Clustering, Statistics and Computing, 17 (4), 2007. MATH 829: Introduction to Data Mining and Analysis Clustering II Dominique Guillot Departments

More information

Graph fundamentals. Matrices associated with a graph

Graph fundamentals. Matrices associated with a graph Graph fundamentals Matrices associated with a graph Drawing a picture of a graph is one way to represent it. Another type of representation is via a matrix. Let G be a graph with V (G) ={v 1,v,...,v n

More information

Lecture 12 Eigenvalue Problem. Review of Eigenvalues Some properties Power method Shift method Inverse power method Deflation QR Method

Lecture 12 Eigenvalue Problem. Review of Eigenvalues Some properties Power method Shift method Inverse power method Deflation QR Method Lecture Eigenvalue Problem Review of Eigenvalues Some properties Power method Shift method Inverse power method Deflation QR Method Eigenvalue Eigenvalue ( A I) If det( A I) (trivial solution) To obtain

More information

Math 102, Winter Final Exam Review. Chapter 1. Matrices and Gaussian Elimination

Math 102, Winter Final Exam Review. Chapter 1. Matrices and Gaussian Elimination Math 0, Winter 07 Final Exam Review Chapter. Matrices and Gaussian Elimination { x + x =,. Different forms of a system of linear equations. Example: The x + 4x = 4. [ ] [ ] [ ] vector form (or the column

More information

A New Spectral Technique Using Normalized Adjacency Matrices for Graph Matching 1

A New Spectral Technique Using Normalized Adjacency Matrices for Graph Matching 1 CHAPTER-3 A New Spectral Technique Using Normalized Adjacency Matrices for Graph Matching Graph matching problem has found many applications in areas as diverse as chemical structure analysis, pattern

More information

linearly indepedent eigenvectors as the multiplicity of the root, but in general there may be no more than one. For further discussion, assume matrice

linearly indepedent eigenvectors as the multiplicity of the root, but in general there may be no more than one. For further discussion, assume matrice 3. Eigenvalues and Eigenvectors, Spectral Representation 3.. Eigenvalues and Eigenvectors A vector ' is eigenvector of a matrix K, if K' is parallel to ' and ' 6, i.e., K' k' k is the eigenvalue. If is

More information

Lecture 1: Graphs, Adjacency Matrices, Graph Laplacian

Lecture 1: Graphs, Adjacency Matrices, Graph Laplacian Lecture 1: Graphs, Adjacency Matrices, Graph Laplacian Radu Balan January 31, 2017 G = (V, E) An undirected graph G is given by two pieces of information: a set of vertices V and a set of edges E, G =

More information

Strongly Regular Decompositions of the Complete Graph

Strongly Regular Decompositions of the Complete Graph Journal of Algebraic Combinatorics, 17, 181 201, 2003 c 2003 Kluwer Academic Publishers. Manufactured in The Netherlands. Strongly Regular Decompositions of the Complete Graph EDWIN R. VAN DAM Edwin.vanDam@uvt.nl

More information

MATH 31 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL

MATH 31 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL MATH 3 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL MAIN TOPICS FOR THE FINAL EXAM:. Vectors. Dot product. Cross product. Geometric applications. 2. Row reduction. Null space, column space, row space, left

More information

Krylov Subspace Methods to Calculate PageRank

Krylov Subspace Methods to Calculate PageRank Krylov Subspace Methods to Calculate PageRank B. Vadala-Roth REU Final Presentation August 1st, 2013 How does Google Rank Web Pages? The Web The Graph (A) Ranks of Web pages v = v 1... Dominant Eigenvector

More information

Linear Algebra and its Applications

Linear Algebra and its Applications Linear Algebra and its Applications 430 (2009) 532 543 Contents lists available at ScienceDirect Linear Algebra and its Applications journal homepage: wwwelseviercom/locate/laa Computing tight upper bounds

More information

Spectral Clustering. Zitao Liu

Spectral Clustering. Zitao Liu Spectral Clustering Zitao Liu Agenda Brief Clustering Review Similarity Graph Graph Laplacian Spectral Clustering Algorithm Graph Cut Point of View Random Walk Point of View Perturbation Theory Point of

More information

The Singular Value Decomposition

The Singular Value Decomposition The Singular Value Decomposition Philippe B. Laval KSU Fall 2015 Philippe B. Laval (KSU) SVD Fall 2015 1 / 13 Review of Key Concepts We review some key definitions and results about matrices that will

More information

DS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra.

DS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra. DS-GA 1002 Lecture notes 0 Fall 2016 Linear Algebra These notes provide a review of basic concepts in linear algebra. 1 Vector spaces You are no doubt familiar with vectors in R 2 or R 3, i.e. [ ] 1.1

More information

Space-Variant Computer Vision: A Graph Theoretic Approach

Space-Variant Computer Vision: A Graph Theoretic Approach p.1/65 Space-Variant Computer Vision: A Graph Theoretic Approach Leo Grady Cognitive and Neural Systems Boston University p.2/65 Outline of talk Space-variant vision - Why and how of graph theory Anisotropic

More information

The Matrix-Tree Theorem

The Matrix-Tree Theorem The Matrix-Tree Theorem Christopher Eur March 22, 2015 Abstract: We give a brief introduction to graph theory in light of linear algebra. Our results culminates in the proof of Matrix-Tree Theorem. 1 Preliminaries

More information

Relative Irradiance. Wavelength (nm)

Relative Irradiance. Wavelength (nm) Characterization of Scanner Sensitivity Gaurav Sharma H. J. Trussell Electrical & Computer Engineering Dept. North Carolina State University, Raleigh, NC 7695-79 Abstract Color scanners are becoming quite

More information

Linear Algebra: Characteristic Value Problem

Linear Algebra: Characteristic Value Problem Linear Algebra: Characteristic Value Problem . The Characteristic Value Problem Let < be the set of real numbers and { be the set of complex numbers. Given an n n real matrix A; does there exist a number

More information

The QR Factorization

The QR Factorization The QR Factorization How to Make Matrices Nicer Radu Trîmbiţaş Babeş-Bolyai University March 11, 2009 Radu Trîmbiţaş ( Babeş-Bolyai University) The QR Factorization March 11, 2009 1 / 25 Projectors A projector

More information

Computation of eigenvalues and singular values Recall that your solutions to these questions will not be collected or evaluated.

Computation of eigenvalues and singular values Recall that your solutions to these questions will not be collected or evaluated. Math 504, Homework 5 Computation of eigenvalues and singular values Recall that your solutions to these questions will not be collected or evaluated 1 Find the eigenvalues and the associated eigenspaces

More information

Solutions to Review Problems for Chapter 6 ( ), 7.1

Solutions to Review Problems for Chapter 6 ( ), 7.1 Solutions to Review Problems for Chapter (-, 7 The Final Exam is on Thursday, June,, : AM : AM at NESBITT Final Exam Breakdown Sections % -,7-9,- - % -9,-,7,-,-7 - % -, 7 - % Let u u and v Let x x x x,

More information

Introduction to Spectral Graph Theory and Graph Clustering

Introduction to Spectral Graph Theory and Graph Clustering Introduction to Spectral Graph Theory and Graph Clustering Chengming Jiang ECS 231 Spring 2016 University of California, Davis 1 / 40 Motivation Image partitioning in computer vision 2 / 40 Motivation

More information

Preliminary/Qualifying Exam in Numerical Analysis (Math 502a) Spring 2012

Preliminary/Qualifying Exam in Numerical Analysis (Math 502a) Spring 2012 Instructions Preliminary/Qualifying Exam in Numerical Analysis (Math 502a) Spring 2012 The exam consists of four problems, each having multiple parts. You should attempt to solve all four problems. 1.

More information

Spectral Graph Theory and its Applications. Daniel A. Spielman Dept. of Computer Science Program in Applied Mathematics Yale Unviersity

Spectral Graph Theory and its Applications. Daniel A. Spielman Dept. of Computer Science Program in Applied Mathematics Yale Unviersity Spectral Graph Theory and its Applications Daniel A. Spielman Dept. of Computer Science Program in Applied Mathematics Yale Unviersity Outline Adjacency matrix and Laplacian Intuition, spectral graph drawing

More information

Matrices, Moments and Quadrature, cont d

Matrices, Moments and Quadrature, cont d Jim Lambers CME 335 Spring Quarter 2010-11 Lecture 4 Notes Matrices, Moments and Quadrature, cont d Estimation of the Regularization Parameter Consider the least squares problem of finding x such that

More information

Linear Algebra (Review) Volker Tresp 2017

Linear Algebra (Review) Volker Tresp 2017 Linear Algebra (Review) Volker Tresp 2017 1 Vectors k is a scalar (a number) c is a column vector. Thus in two dimensions, c = ( c1 c 2 ) (Advanced: More precisely, a vector is defined in a vector space.

More information

The spectrum of the edge corona of two graphs

The spectrum of the edge corona of two graphs Electronic Journal of Linear Algebra Volume Volume (1) Article 4 1 The spectrum of the edge corona of two graphs Yaoping Hou yphou@hunnu.edu.cn Wai-Chee Shiu Follow this and additional works at: http://repository.uwyo.edu/ela

More information

Main matrix factorizations

Main matrix factorizations Main matrix factorizations A P L U P permutation matrix, L lower triangular, U upper triangular Key use: Solve square linear system Ax b. A Q R Q unitary, R upper triangular Key use: Solve square or overdetrmined

More information

Lecture 12: Introduction to Spectral Graph Theory, Cheeger s inequality

Lecture 12: Introduction to Spectral Graph Theory, Cheeger s inequality CSE 521: Design and Analysis of Algorithms I Spring 2016 Lecture 12: Introduction to Spectral Graph Theory, Cheeger s inequality Lecturer: Shayan Oveis Gharan May 4th Scribe: Gabriel Cadamuro Disclaimer:

More information

Abstract We survey some of what can be deduced about automorphisms of a graph from information on its eigenvalues and eigenvectors. Two of the main to

Abstract We survey some of what can be deduced about automorphisms of a graph from information on its eigenvalues and eigenvectors. Two of the main to Symmetry and Eigenvectors Ada Chan and Chris D. Godsil Combinatorics and Optimization University of Waterloo Waterloo, Ontario Canada N2L 3G http://bilby.uwaterloo.ca Support from a National Sciences and

More information

18.06 Professor Johnson Quiz 1 October 3, 2007

18.06 Professor Johnson Quiz 1 October 3, 2007 18.6 Professor Johnson Quiz 1 October 3, 7 SOLUTIONS 1 3 pts.) A given circuit network directed graph) which has an m n incidence matrix A rows = edges, columns = nodes) and a conductance matrix C [diagonal

More information

Recall the convention that, for us, all vectors are column vectors.

Recall the convention that, for us, all vectors are column vectors. Some linear algebra Recall the convention that, for us, all vectors are column vectors. 1. Symmetric matrices Let A be a real matrix. Recall that a complex number λ is an eigenvalue of A if there exists

More information

Linear algebra and applications to graphs Part 1

Linear algebra and applications to graphs Part 1 Linear algebra and applications to graphs Part 1 Written up by Mikhail Belkin and Moon Duchin Instructor: Laszlo Babai June 17, 2001 1 Basic Linear Algebra Exercise 1.1 Let V and W be linear subspaces

More information

Graph Embedding Techniques for Bounding Condition Numbers of Incomplete Factor Preconditioners

Graph Embedding Techniques for Bounding Condition Numbers of Incomplete Factor Preconditioners NASA/CR-01741 ICASE Report No. 97-47 Graph Embedding Techniques for Bounding Condition Numbers of Incomplete Factor Preconditioners Stephen Guattery ICASE Institute for Computer Applications in Science

More information

Lecture 13. Principal Component Analysis. Brett Bernstein. April 25, CDS at NYU. Brett Bernstein (CDS at NYU) Lecture 13 April 25, / 26

Lecture 13. Principal Component Analysis. Brett Bernstein. April 25, CDS at NYU. Brett Bernstein (CDS at NYU) Lecture 13 April 25, / 26 Principal Component Analysis Brett Bernstein CDS at NYU April 25, 2017 Brett Bernstein (CDS at NYU) Lecture 13 April 25, 2017 1 / 26 Initial Question Intro Question Question Let S R n n be symmetric. 1

More information

Maximizing the numerical radii of matrices by permuting their entries

Maximizing the numerical radii of matrices by permuting their entries Maximizing the numerical radii of matrices by permuting their entries Wai-Shun Cheung and Chi-Kwong Li Dedicated to Professor Pei Yuan Wu. Abstract Let A be an n n complex matrix such that every row and

More information

Large-scale eigenvalue problems

Large-scale eigenvalue problems ELE 538B: Mathematics of High-Dimensional Data Large-scale eigenvalue problems Yuxin Chen Princeton University, Fall 208 Outline Power method Lanczos algorithm Eigenvalue problems 4-2 Eigendecomposition

More information

Exercise Sheet 1.

Exercise Sheet 1. Exercise Sheet 1 You can download my lecture and exercise sheets at the address http://sami.hust.edu.vn/giang-vien/?name=huynt 1) Let A, B be sets. What does the statement "A is not a subset of B " mean?

More information

SVD, Power method, and Planted Graph problems (+ eigenvalues of random matrices)

SVD, Power method, and Planted Graph problems (+ eigenvalues of random matrices) Chapter 14 SVD, Power method, and Planted Graph problems (+ eigenvalues of random matrices) Today we continue the topic of low-dimensional approximation to datasets and matrices. Last time we saw the singular

More information

Summary: A Random Walks View of Spectral Segmentation, by Marina Meila (University of Washington) and Jianbo Shi (Carnegie Mellon University)

Summary: A Random Walks View of Spectral Segmentation, by Marina Meila (University of Washington) and Jianbo Shi (Carnegie Mellon University) Summary: A Random Walks View of Spectral Segmentation, by Marina Meila (University of Washington) and Jianbo Shi (Carnegie Mellon University) The authors explain how the NCut algorithm for graph bisection

More information

Announce Statistical Motivation Properties Spectral Theorem Other ODE Theory Spectral Embedding. Eigenproblems I

Announce Statistical Motivation Properties Spectral Theorem Other ODE Theory Spectral Embedding. Eigenproblems I Eigenproblems I CS 205A: Mathematical Methods for Robotics, Vision, and Graphics Doug James (and Justin Solomon) CS 205A: Mathematical Methods Eigenproblems I 1 / 33 Announcements Homework 1: Due tonight

More information

Math 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008

Math 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008 Math 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008 Exam 2 will be held on Tuesday, April 8, 7-8pm in 117 MacMillan What will be covered The exam will cover material from the lectures

More information

Applied Mathematics 205. Unit II: Numerical Linear Algebra. Lecturer: Dr. David Knezevic

Applied Mathematics 205. Unit II: Numerical Linear Algebra. Lecturer: Dr. David Knezevic Applied Mathematics 205 Unit II: Numerical Linear Algebra Lecturer: Dr. David Knezevic Unit II: Numerical Linear Algebra Chapter II.3: QR Factorization, SVD 2 / 66 QR Factorization 3 / 66 QR Factorization

More information

Data Mining and Analysis: Fundamental Concepts and Algorithms

Data Mining and Analysis: Fundamental Concepts and Algorithms : Fundamental Concepts and Algorithms dataminingbook.info Mohammed J. Zaki 1 Wagner Meira Jr. 2 1 Department of Computer Science Rensselaer Polytechnic Institute, Troy, NY, USA 2 Department of Computer

More information

On Euclidean distance matrices

On Euclidean distance matrices On Euclidean distance matrices R. Balaji and R. B. Bapat Indian Statistical Institute, New Delhi, 110016 November 19, 2006 Abstract If A is a real symmetric matrix and P is an orthogonal projection onto

More information

MATH 567: Mathematical Techniques in Data Science Clustering II

MATH 567: Mathematical Techniques in Data Science Clustering II Spectral clustering: overview MATH 567: Mathematical Techniques in Data Science Clustering II Dominique uillot Departments of Mathematical Sciences University of Delaware Overview of spectral clustering:

More information

ECS231: Spectral Partitioning. Based on Berkeley s CS267 lecture on graph partition

ECS231: Spectral Partitioning. Based on Berkeley s CS267 lecture on graph partition ECS231: Spectral Partitioning Based on Berkeley s CS267 lecture on graph partition 1 Definition of graph partitioning Given a graph G = (N, E, W N, W E ) N = nodes (or vertices), E = edges W N = node weights

More information

Eigenvalue comparisons in graph theory

Eigenvalue comparisons in graph theory Eigenvalue comparisons in graph theory Gregory T. Quenell July 1994 1 Introduction A standard technique for estimating the eigenvalues of the Laplacian on a compact Riemannian manifold M with bounded curvature

More information

A Graph-Theoretic Characterization of Controllability for Multi-agent Systems

A Graph-Theoretic Characterization of Controllability for Multi-agent Systems A Graph-Theoretic Characterization of Controllability for Multi-agent Systems Meng Ji and Magnus Egerstedt Abstract In this paper we continue our pursuit of conditions that render a multi-agent networked

More information

Worksheet for Lecture 25 Section 6.4 Gram-Schmidt Process

Worksheet for Lecture 25 Section 6.4 Gram-Schmidt Process Worksheet for Lecture Name: Section.4 Gram-Schmidt Process Goal For a subspace W = Span{v,..., v n }, we want to find an orthonormal basis of W. Example Let W = Span{x, x } with x = and x =. Give an orthogonal

More information

An Algorithmist s Toolkit September 10, Lecture 1

An Algorithmist s Toolkit September 10, Lecture 1 18.409 An Algorithmist s Toolkit September 10, 2009 Lecture 1 Lecturer: Jonathan Kelner Scribe: Jesse Geneson (2009) 1 Overview The class s goals, requirements, and policies were introduced, and topics

More information

Numerical Methods. Elena loli Piccolomini. Civil Engeneering. piccolom. Metodi Numerici M p. 1/??

Numerical Methods. Elena loli Piccolomini. Civil Engeneering.  piccolom. Metodi Numerici M p. 1/?? Metodi Numerici M p. 1/?? Numerical Methods Elena loli Piccolomini Civil Engeneering http://www.dm.unibo.it/ piccolom elena.loli@unibo.it Metodi Numerici M p. 2/?? Least Squares Data Fitting Measurement

More information