Ensuring Strong Dominance of the Leading Eigenvalues for Cluster Ensembles

H. T. Kung
School of Engineering and Applied Sciences
Harvard University
Cambridge, MA 02138

Bruce W. Suter
Air Force Research Laboratory/RITB
525 Brooks Road
Rome, NY 13441

Abstract—Spectral analysis is a popular mathematical tool in analyzing a variety of networked and distributed systems. For a special class of networks, called cluster ensembles, which are made of interconnected clusters, we can characterize those which exhibit strong dominance of the leading eigenvalues in terms of the cluster structure. For such systems, only these leading eigenvalues and their corresponding eigenvectors need to be examined in studying important properties of the underlying system. This paper establishes several bounds on eigenvalue separation ratios in terms of the number of clusters, their sizes, and the cluster interconnection topologies.

Keywords—eigenvalue; eigenvector; spectral analysis; network; cluster

I. INTRODUCTION

Spectral analysis is a widely used mathematical tool in analyzing networked and distributed systems (see, e.g., [2]). Given such a system, e.g., a sensor-target measurement system (see, e.g., [5]), it is often sufficient to examine only those dominating eigenvectors for which the corresponding eigenvalues are much larger than the rest of the eigenvalues. That is, these dominating eigenvectors alone can reveal important properties of the system, e.g., its clusters (see, e.g., [5]). From the resulting clustering structure one can detect, for example, malfunctioning or malicious sensor nodes in sensor network applications [5].

More precisely, for a given system, let A denote the adjacency matrix of the graph reflecting the underlying network topology. We are interested in the eigenstructure of A^T A, in particular, the dominance of the leading eigenvalues of A^T A.

In this paper we consider a class of network-based systems, called cluster ensembles, which are collections of clusters connected via some cluster interconnection network, as depicted in Figure 1. We show that for some cluster ensembles with regular structures, such as the alpha-beta cluster ensembles defined in Section III, we can characterize those ensembles which exhibit strong dominance of the leading eigenvalues in terms of their underlying network topologies. These characterizations can provide guidelines in designing cluster ensembles where such strong dominance is desired.

Figure 1. Cluster ensemble

II. AN EXAMPLE OF A CLUSTER ENSEMBLE

To provide a concrete example, we consider a cluster ensemble composed of three clusters interconnected via a clique network, as shown in Figure 2. We call those cluster nodes which are part of the cluster interconnection network hub nodes. Thus, in this example, nodes 10, 11 and 12 are hub nodes.

Figure 2. A three-cluster ensemble, where the cluster interconnection network is a clique

A more readable drawing of the same cluster ensemble is depicted in Figure 3, where the cluster interconnection network is placed in the center.

III. ALPHA-BETA CLUSTER ENSEMBLES

We focus on a special class of cluster ensembles, called alpha-beta cluster ensembles, which satisfy the following properties (an illustrative construction is sketched after this list):

P1. The cluster ensemble is composed of α identical clusters.

P2. Each cluster contains one hub node and β non-hub nodes. This means that the size of each cluster is β + 1.

P3. In the adjacency matrix C of each cluster, the columns of non-hub nodes are orthogonal to each other. It is possible to relax this orthogonality condition so that C^T C goes from a diagonal matrix to a diagonally dominant one, as described in [4].

The alpha-beta clusters represent a natural model of connecting multiple clusters. Figure 3 depicts an instance of an alpha-beta cluster ensemble with α = 3 and β = 3.

Figure 3. Another drawing of the cluster ensemble in Figure 2. It is an alpha-beta cluster ensemble with α = 3 and β = 3.

In what follows, the eigenvalues λ_i of a symmetric matrix P ∈ R^{n×n} are always ordered such that λ_1 ≤ λ_2 ≤ ... ≤ λ_n. When it is useful to indicate that they are the eigenvalues of a specific matrix P, we write λ_1(P) ≤ λ_2(P) ≤ ... ≤ λ_n(P) instead.

IV. PREVIEW OF AN EXEMPLAR RESULT

To give a quick sense of the work of this paper, we preview an exemplar result here. Note that for an alpha-beta cluster ensemble, the total number of nodes is α(β + 1). We use m to denote α(β + 1). Moreover, we use μ to denote the largest in-degree of a non-hub node. For the network topology in Figure 3, we can check that nodes 2, 5 and 8 are those non-hub nodes which have the largest in-degree value of 2. Thus in this case μ = 2.

Let A be the m × m adjacency matrix of the entire ensemble. Then we can show the following (Corollary 4 of Section VI):

Exemplar Theorem. λ_{m-α+1}(A^T A) / λ_{m-α}(A^T A) ≥ β/μ.

We describe some system implications of the Theorem. Suppose that we increase β while keeping α and μ fixed. Then, by the Theorem, the separation ratio between the α leading eigenvalues and the rest of the non-zero eigenvalues is at least β/μ and will therefore increase. This means that when β is large, there are α dominating leading eigenvalues, and the number of these leading eigenvalues is exactly equal to the number of clusters in the alpha-beta cluster ensemble. Thus we have succeeded in relating dominance of the leading eigenvalues to cluster size.

To establish the Theorem and the other similar results of this paper, we will carry out the following steps:

S1. Consider a general m-node, k-cluster ensemble.
S2. Partition the adjacency matrix A of the cluster ensemble and introduce the cluster interconnection submatrix B.
S3. Derive bounds for the eigenvalues of A^T A in terms of those of B.
S4. Consider the special case of alpha-beta cluster ensembles, and derive bounds for the eigenvalues of B in terms of α and β.
S5. Derive bounds on the separation ratio for the α leading eigenvalues of an alpha-beta cluster ensemble, including the result of the Exemplar Theorem described above.

In the rest of the paper, we present our work in the order of these steps.

V. PARTITIONING A^T A AND CLUSTER INTERCONNECTION SUBMATRIX

For a given m-node, k-cluster ensemble, we consider its adjacency matrix A. Since the eigenvalues of A^T A are invariant under similarity transformations, we can permute the columns of A so that its last k columns correspond to the k hub nodes. We express A^T A in a partitioned form:

    A^T A = [ T    X ]
            [ X^T  B ]

where T, X and B are (m-k) × (m-k), (m-k) × k, and k × k matrices, respectively. The B submatrix is of particular interest, as it reflects the interconnections among the hub nodes and those between the hub nodes and the other nodes in their respective clusters. We call B the cluster interconnection submatrix. In the rest of this section, we will show that the eigenvalues of B contribute to upper and lower bounds for those of A^T A (see Corollary 1).
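To make the partition concrete, the following Python/NumPy sketch (not part of the original paper) assembles the adjacency matrix of a small alpha-beta cluster ensemble, orders the hub columns last, and reads the T and B blocks off A^T A. The within-cluster wiring used here, a directed chain of non-hub nodes each of which also points to its hub, is an illustrative assumption chosen to satisfy properties P1-P3; the paper's Figures 2 and 3 are not reproduced.

```python
import numpy as np

def alpha_beta_ensemble(alpha, beta):
    """Adjacency matrix of an illustrative alpha-beta cluster ensemble.

    Assumed wiring (the paper's figures are not reproduced here): within each
    cluster the non-hub nodes form a directed chain and every non-hub node
    points to its hub; the hubs are interconnected as a clique (every hub
    points to every hub, including itself).  Columns are ordered with all
    non-hub nodes first and the alpha hub nodes last.
    """
    m = alpha * (beta + 1)
    A = np.zeros((m, m))
    hubs = [alpha * beta + c for c in range(alpha)]      # last alpha columns
    for c in range(alpha):
        nonhubs = [c * beta + i for i in range(beta)]
        for i in range(beta - 1):
            A[nonhubs[i], nonhubs[i + 1]] = 1            # chain among non-hub nodes
        for v in nonhubs:
            A[v, hubs[c]] = 1                            # non-hub -> its own hub
        for c2 in range(alpha):
            A[hubs[c2], hubs[c]] = 1                     # clique among hubs (incl. self-loop)
    return A

alpha, beta = 3, 3
A = alpha_beta_ensemble(alpha, beta)
G = A.T @ A                                              # the matrix A^T A studied in the paper
k = alpha
T, B = G[:-k, :-k], G[-k:, -k:]                          # (m-k)x(m-k) block and the k x k submatrix B
print("B =\n", B)                                        # comes out as alpha*J + beta*I here

mu = int(A[:, :alpha * beta].sum(axis=0).max())          # largest in-degree of a non-hub node
eigs = np.linalg.eigvalsh(G)                             # ascending order
print("separation ratio:", eigs[-alpha] / eigs[-alpha - 1], " lower bound beta/mu:", beta / mu)
```

For this assumed wiring the printed separation ratio is no smaller than β/μ, which is the behavior the Exemplar Theorem above predicts.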
Theorem 1 (Cauchy's Interlacing Theorem). Let U be an m × m symmetric matrix and V an (m-k) × (m-k) principal submatrix of U, for k < m. Then, for j = 1, ..., m-k,

    λ_j(U) ≤ λ_j(V) ≤ λ_{k+j}(U).

Proof: See, for example, page 189 of [3].
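As a quick numerical illustration (not from the paper), the sketch below draws a random symmetric matrix, extracts a principal submatrix, and checks the interlacing inequalities of Theorem 1 with the ascending eigenvalue ordering used throughout this paper.

```python
import numpy as np

rng = np.random.default_rng(0)
m, k = 8, 3
U = rng.standard_normal((m, m))
U = (U + U.T) / 2                      # a symmetric m x m matrix
V = U[:m - k, :m - k]                  # an (m-k) x (m-k) principal submatrix of U

eig_U = np.linalg.eigvalsh(U)          # ascending order, matching the paper's convention
eig_V = np.linalg.eigvalsh(V)

# Theorem 1: lambda_j(U) <= lambda_j(V) <= lambda_{k+j}(U) for j = 1, ..., m-k
for j in range(m - k):                 # 0-based index j here corresponds to j+1 in the theorem
    assert eig_U[j] - 1e-9 <= eig_V[j] <= eig_U[j + k] + 1e-9
print("interlacing verified")
```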

Theorem 2. Let G be an m × m positive semi-definite symmetric matrix partitioned as

    G = [ H    X ]
        [ X^T  F ]

where H is (m-k) × (m-k) for some 0 < k < m. Then, for j = 0, 1, ..., k-1,

    λ_{k-j}(F) ≤ λ_{m-j}(G) ≤ λ_{k-j}(F) + λ_{m-k}(H).

Proof: The stated lower bound on λ_{m-j}(G) follows from Theorem 10.1.1 of [6]. The stated upper bound on λ_{m-j}(G) can be derived from Aronszajn's inequality on page 64 of [1].

Corollary 1. Suppose that A^T A is an m × m symmetric positive semi-definite matrix and that T and B are the (m-k) × (m-k) and k × k sub-matrices, respectively, defined by

    A^T A = [ T    X ]
            [ X^T  B ].

Then, for j = 1, ..., m-k,

    λ_j(A^T A) ≤ λ_j(T) ≤ λ_{k+j}(A^T A),

and for j = 0, 1, ..., k-1,

    λ_{k-j}(B) ≤ λ_{m-j}(A^T A) ≤ λ_{k-j}(B) + λ_{m-k}(T).

Proof: The first inequality follows from Theorem 1. The second inequality follows from Theorem 2.

Corollary 2. Suppose that

    A^T A = [ T    X ]
            [ X^T  B ]

where A^T A, T and B are m × m, (m-k) × (m-k) and k × k matrices, respectively. Then

    λ_{m-k+1}(A^T A) / λ_{m-k}(A^T A) ≥ λ_1(B) / λ_{m-k}(T).

Proof: By letting j = m-k in the first inequality of Corollary 1, we have λ_{m-k}(A^T A) ≤ λ_{m-k}(T). By letting j = k-1 in the second inequality of Corollary 1, we have λ_1(B) ≤ λ_{m-k+1}(A^T A). Corollary 2 follows from the above two inequalities.

Corollary 3. Suppose that

    A^T A = [ D    X ]
            [ X^T  B ]

where A^T A, D and B are m × m, (m-k) × (m-k) and k × k matrices, respectively, and D is a diagonal matrix D = diag(d^(1), ..., d^(m-k)) with d^(1) ≤ ... ≤ d^(m-k). Then

    λ_{m-k+1}(A^T A) / λ_{m-k}(A^T A) ≥ λ_1(B) / d^(m-k).

Proof: The result follows from Corollary 2, by noting that in this case λ_{m-k}(D) = d^(m-k).

VI. ALPHA-BETA CLUSTER ENSEMBLES AND THEIR CLUSTER INTERCONNECTION NETWORKS

To establish further bounds on the eigenvalues of A^T A, we consider alpha-beta cluster ensembles as defined in Section III. Note that an alpha-beta cluster ensemble is an m-node, k-cluster ensemble with m = α(β + 1) and k = α. Let A be the adjacency matrix. Then A^T A is α(β + 1) × α(β + 1) and can be written as

    A^T A = [ D    X ]
            [ X^T  B ]

where D, X and B are αβ × αβ, αβ × α, and α × α matrices, respectively, and D is a diagonal matrix D = diag(d^(1), ..., d^(αβ)) with d^(1) ≤ ... ≤ d^(αβ). Since each diagonal element of D is the in-degree of a non-hub node, we have d^(αβ) = μ, where μ is the largest in-degree of a non-hub node.

The cluster interconnection submatrix B reflects the topology of the interconnection network connecting the hub nodes of the clusters. We consider here two specific structures, which represent two design extremes. In the first case, the hub nodes are densely connected as a clique, i.e., every hub node has a directed edge to every other hub node, including itself. In the second case, the hub nodes are sparsely connected as a ring, i.e., if the clusters are labeled as cluster 1 through α, then, for each i = 1, ..., α, the hub node in cluster i has a directed edge to the hub node in cluster (i+1) mod α, and also a directed edge to itself. These two cases are depicted in Figure 4.
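The bounds of Corollaries 1 and 2 can be checked numerically on any Gram matrix A^T A. The sketch below (an illustration, not part of the paper) uses a random 0/1 matrix; the ratio bound of Corollary 2 is verified in cross-multiplied form to avoid dividing by an eigenvalue that may be close to zero.

```python
import numpy as np

rng = np.random.default_rng(1)
m, k = 10, 3
A = (rng.random((m, m)) < 0.4).astype(float)   # a random 0/1 "adjacency" matrix
G = A.T @ A                                    # symmetric positive semi-definite

T, B = G[:-k, :-k], G[-k:, -k:]                # upper-left (m-k)x(m-k) and lower-right k x k blocks
eig_G = np.linalg.eigvalsh(G)                  # ascending order, as in the paper
eig_T = np.linalg.eigvalsh(T)
eig_B = np.linalg.eigvalsh(B)

# Corollary 1, second chain: lambda_{k-j}(B) <= lambda_{m-j}(A^T A) <= lambda_{k-j}(B) + lambda_{m-k}(T)
for j in range(k):
    assert eig_B[k - 1 - j] - 1e-9 <= eig_G[m - 1 - j] <= eig_B[k - 1 - j] + eig_T[-1] + 1e-9

# Corollary 2 in cross-multiplied form:
# lambda_{m-k+1}(A^T A) * lambda_{m-k}(T) >= lambda_1(B) * lambda_{m-k}(A^T A)
assert eig_G[m - k] * eig_T[-1] >= eig_B[0] * eig_G[m - k - 1] - 1e-9
print("Corollary 1 and 2 bounds hold for this instance")
```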

Figure 4. Two cluster interconnection networks: (a) clique and (b) ring. Note that the alpha-beta cluster ensemble of Figure 3 uses the clique interconnection network.

Theorem 3. Suppose that the hub nodes are connected as a clique. Then, for j = 1, ..., α-1,

    λ_j(B) = β, and λ_α(B) = α^2 + β.

Proof: When the hub nodes are connected as a clique, we have

    B = α J_α + β I_α,

where each entry of J_α ∈ R^{α×α} is equal to one and I_α ∈ R^{α×α} is the identity matrix. Note that the vector u = (1, 1, ..., 1)^T, with all of its components equal to 1, is an eigenvector of the cluster interconnection submatrix B with the corresponding eigenvalue being α^2 + β. Let v be any other eigenvector of B. Since v must be orthogonal to u, we have Cv = 0, where C is an α × α matrix with all its components being α. Since B - C is a diagonal matrix with all its diagonal entries being β, we have Bv = (B - C)v + Cv = (B - C)v = βv. Thus, the eigenvalues of B are λ_j(B) = β for j = 1, ..., α-1 and λ_α(B) = α^2 + β.

Theorem 4. Suppose that the hub nodes are connected as a ring. Then, for j = 1, ..., α,

    β ≤ λ_j(B) ≤ β + 4.

Proof: When the hub nodes are connected as a ring, we have

    B = (2 + β) I_α + P + P^T,

where P is the α × α cyclic permutation matrix corresponding to the ring, so that the off-diagonal entries of B are 1 exactly between hubs that are adjacent on the ring. By the Gershgorin Disc Theorem, the eigenvalues of B are bounded below by (2 + β) - 2 and above by (2 + β) + 2.

From Theorems 3 and 4, we note that when the clusters increase their size β, λ_1(B) will increase and thus, by Corollary 2, so will the eigenvalue separation. More precisely, we can establish the corollaries below.

Corollary 4. Suppose that the hub nodes in the system are connected as a clique or a ring. Then

    λ_{m-k+1}(A^T A) / λ_{m-k}(A^T A) ≥ β/μ.

Proof: The result follows from Corollary 3 and Theorems 3 and 4, while noting that d^(αβ) = μ.

Corollary 5. Suppose that the hub nodes in the system are connected as a clique. Then

    λ_m(A^T A) / λ_{m-1}(A^T A) ≥ (α^2 + β) / (β + μ).

Proof: From Theorem 2, we note that, for j = 0, 1, ..., k-1,

    λ_{k-j}(B) ≤ λ_{m-j}(A^T A) ≤ λ_{k-j}(B) + λ_{m-k}(D).

For j = 0, we have λ_k(B) ≤ λ_m(A^T A). For j = 1, we have λ_{m-1}(A^T A) ≤ λ_{k-1}(B) + λ_{m-k}(D). By Theorem 3, we have λ_k(B) = α^2 + β and λ_{k-1}(B) = β. It follows that λ_m(A^T A) ≥ λ_k(B) = α^2 + β and λ_{m-1}(A^T A) ≤ λ_{k-1}(B) + λ_{m-k}(D) = β + μ, by noting that λ_{m-k}(D) = μ. The Corollary follows from the above two inequalities.
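The following sketch (again an illustration, not part of the paper) builds the cluster interconnection submatrix B for the two interconnection topologies: the clique form B = αJ_α + βI_α from the proof of Theorem 3, and a ring form with diagonal entries 2 + β and ones between ring-adjacent hubs, which is the structure the Gershgorin argument of Theorem 4 relies on. It then checks the stated eigenvalue formulas and bounds.

```python
import numpy as np

alpha, beta = 5, 7

# Clique interconnection (Theorem 3): B = alpha*J_alpha + beta*I_alpha
B_clique = alpha * np.ones((alpha, alpha)) + beta * np.eye(alpha)
print(np.linalg.eigvalsh(B_clique))    # beta repeated alpha-1 times, then alpha**2 + beta

# Ring interconnection (Theorem 4): diagonal entries 2 + beta, ones between ring-adjacent hubs
P = np.roll(np.eye(alpha), 1, axis=1)  # cyclic permutation matrix of the ring
B_ring = (2 + beta) * np.eye(alpha) + P + P.T
eig_ring = np.linalg.eigvalsh(B_ring)
print(eig_ring)
assert np.all(eig_ring >= beta - 1e-9) and np.all(eig_ring <= beta + 4 + 1e-9)   # [beta, beta+4]
```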

Corollary 6. Suppose that the hub nodes in the system are connected as a ring. Then

    λ_m(A^T A) / λ_{m-k+1}(A^T A) ≤ (β + 4 + μ) / β.

Proof: From Corollary 1, we note that, for j = 0, 1, ..., k-1,

    λ_{k-j}(B) ≤ λ_{m-j}(A^T A) ≤ λ_{k-j}(B) + λ_{m-k}(D).

For j = 0, we have λ_m(A^T A) ≤ λ_k(B) + λ_{m-k}(D). For j = k-1, we have λ_1(B) ≤ λ_{m-k+1}(A^T A). By Theorem 4, we have λ_k(B) ≤ β + 4 and λ_1(B) ≥ β. It follows that λ_m(A^T A) ≤ λ_k(B) + λ_{m-k}(D) ≤ β + 4 + μ and λ_{m-k+1}(A^T A) ≥ λ_1(B) ≥ β. The Corollary follows from the above two inequalities.

VII. SYSTEM IMPLICATIONS

We can design a cluster ensemble system which will lead to a desired eigenvalue separation ratio between the leading eigenvalues and the rest of the non-zero eigenvalues. For example, we can form an alpha-beta cluster ensemble with a relatively large β, so that β/μ will be large, where μ is the largest in-degree of a non-hub node in a cluster. Then, when the hub nodes are connected as either a clique or a ring, Corollary 4 implies that the resulting system will exhibit strong dominance by the α leading eigenvalues. This means that the overall system behavior will essentially be governed by the corresponding α eigenvectors. For example, the values or signs of the components in these eigenvectors will reveal clustering structures of the system [7]. The resulting clusters, for example, can help detect malfunctioning or malicious sensors in a sensor network application [5].

Furthermore, suppose that the values of α and μ are fixed and the hub nodes are connected as a ring. Corollary 6 implies that the separation ratios among the largest α eigenvalues will be bounded above by a quantity which approaches 1 as β increases. Thus, for large β, there will be large separations between the α leading eigenvalues and the rest of the eigenvalues (Corollary 4), but there are no significant separations among the α leading eigenvalues themselves (Corollary 6). This means that when the hub nodes are connected as a ring, all of the corresponding α eigenvectors will be of equal significance in characterizing the system.

As a contrast, consider the case when the hub nodes are connected as a clique. Assume that the values of β and μ are fixed. Then, by Corollary 5, when (α^2 + β)/(β + μ) is large (e.g., when α is large relative to β and μ), the principal eigenvalue will exhibit strong dominance. Thus, for example, the component values of the principal eigenvector alone will basically characterize the clustering structure of the system [7].

From the above discussion, we see that by choosing proper values for α, β and μ in the design of a cluster ensemble, we can tune the dominance properties of its leading eigenvalues.

VIII. SUMMARY AND CONCLUDING REMARKS

We have motivated the problem of designing distributed systems which exhibit strong dominance of the leading eigenvalues, in terms of underlying network topologies. We have introduced a special class of networks called cluster ensembles, and established some fundamental mathematical results for ensuring such dominance for this class of networks. As future work, we plan to generalize these characterizations by allowing more general topologies than the alpha-beta ensemble, and to develop applications in system design based on the spectral properties established in this paper.

ACKNOWLEDGMENTS

This material is based on research sponsored by the Air Force Research Laboratory under agreement number FA. The U.S. Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation thereon.
The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of the Air Force Research Laboratory or the U.S. Government.

REFERENCES

[1] R. Bhatia, Matrix Analysis, Springer-Verlag, New York, 1997.
[2] F. R. K. Chung, Spectral Graph Theory, American Mathematical Society, 1997.
[3] R. A. Horn and C. R. Johnson, Matrix Analysis, Cambridge University Press, Cambridge, UK, 1985.
[4] H. T. Kung and B. W. Suter, "Hub Matrix Theory and Applications to Wireless Communications," EURASIP Journal on Advances in Signal Processing, Vol. 2007, Article ID 13659, 2007.
[5] H. T. Kung and D. Vlah, "Validating Sensors in the Field via Spectral Clustering Based on Their Measurement Data," Military Communications Conference (MILCOM 2009), October 2009.
[6] B. N. Parlett, The Symmetric Eigenvalue Problem, Prentice-Hall, Englewood Cliffs, NJ, 1980.
[7] H. T. Kung and D. Vlah, "Sign-Based Spectral Clustering," 25th Biennial Symposium on Communications, May 2010.
