Majorizations for the Eigenvectors of Graph-Adjacency Matrices: A Tool for Complex Network Design


Majorizations for the Eigenvectors of Graph-Adjacency Matrices: A Tool for Complex Network Design Rahul Dhal Electrical Engineering and Computer Science Washington State University Pullman, WA rdhal@eecs.wsu.edu Sandip Roy Electrical Engineering and Computer Science Washington State University Pullman, WA sroy@eecs.wsu.edu Yan Wan Department of Electrical Engineering University of North Texas Denton, TX yan.wan@unt.edu ABSTRACT We develop majorization results that characterize changes in eigenvector components of a graph's adjacency matrix when its topology is changed. Specifically, for general (weighted, directed) graphs, we characterize changes in dominant eigenvector components for single-row and multi-row incrementations. We also show that topology changes can be tailored to set ratios between the components of the dominant eigenvector. For more limited graph classes (specifically, undirected and reversibly-structured ones), majorizations for components of the subdominant and other eigenvectors upon graph modifications are also obtained. Keywords Algebraic Graph Theory, Network Design, Dynamical Networks, Adjacency Matrix, Eigenvectors, Majorization 1. INTRODUCTION Designers and operators of complex networks are often concerned with modifying a network's structure to shape its dynamics in certain ways. For instance, in sensor networking applications, designers may adjust the sensor-to-sensor communication topology to permit fast decision-making and thereby reduce energy consumption [1]. Similarly, in epidemic control, changing the population structure (e.g., through restrictions on travel and quarantine policies) can allow faster elimination of virus spread [2]. In general, these modifications effect complex changes in the dynamics of the network (beyond achieving the design goal of interest) that are not well understood and require new methods for characterization.
In this paper, we investigate the impact of topology changes on the spectral properties of the adjacency matrix of a network's graph, which is in many cases a crucial indicator of the network dynamics. (This work was supported by the National Science Foundation, Grant Number 1058110.) Specifically, we focus on developing majorizations for the eigenvector components of the adjacency matrices of both directed and undirected graphs, upon structured modification of the graph topology. We consider several types of topology changes for symmetric, reversible, and arbitrary directed graph topologies, and develop majorizations for the dominant and/or sub-dominant eigenvalues and eigenvectors in these cases. Graph spectra have been used extensively in various fields of computer science [3]. Among these fields, eigenvectors of adjacency matrices (or matrices related to adjacency matrices) are particularly important in the areas of web information retrieval and social-network analysis. In these domains, eigenvectors of adjacency matrices are used to define the notion of eigenvector centrality, which is a measure of a node's importance in the network's dynamics [4]. Google's PageRank algorithm and its precursor Hyperlink-Induced Topic Search (HITS) are both based on variants of this spectral notion of nodal importance. For a brief review of these and other similar algorithms based on eigenvectors of graph matrices, see [5]. For these computing applications, our results show how structural changes to the network topology will affect a node's centrality measure; as such, they provide insight into algorithm and topology modification to change centrality scores. The eigenvector majorization results that we develop rely critically on (1) classical results on the eigenvalues of non-negative matrices [6], and (2) matrix perturbation concepts.
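As a concrete illustration of eigenvector centrality, the sketch below computes the dominant right eigenvector of a small adjacency matrix by power iteration. The graph and the right-eigenvector convention here are illustrative assumptions, not taken from the paper (centrality is also often defined via the left eigenvector):

```python
import numpy as np

# Hypothetical 4-node strongly connected digraph; A[j, k] is the weight
# of the edge from node j to node k.
A = np.array([[0.0, 1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0, 1.0],
              [1.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0]])

def eigenvector_centrality(A, iters=500):
    """Power iteration: converges to the dominant (Perron) eigenvector of A,
    whose components can be read as centrality scores."""
    x = np.ones(A.shape[0])
    for _ in range(iters):
        x = A @ x
        x = x / np.linalg.norm(x)
    return x

c = eigenvector_centrality(A)
```

Because the matrix is non-negative and irreducible, the iterate stays non-negative and converges to the strictly positive Perron eigenvector.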
From one viewpoint, our results extend existing eigenvalue-majorization analyses of non-negative matrices [7] toward comparative characterizations of eigenvectors (see also [8]). From another viewpoint, our results can be viewed as providing tighter bounds on eigenvector components than would be obtained from perturbation arguments (see [7]), for the special class of non-negative matrices. The remainder of the article is organized as follows. In section 1.1, we motivate and formulate the study of eigenvector properties for adjacency matrices of graphs of three types: symmetric (or undirected), reversible (in the sense of Markov-chain reversibility), and arbitrary directed graphs. We also define graph modifications of interest for these graph classes. In section 2, we provide eigenvector majorization results for the case of symmetric weight increments in an undirected graph, wherein the edge weight between a pair of vertices is incremented. In section 3, we consider balanced weight increments in a reversible graph, wherein weights of directed edges between a pair of distinct vertices are increased in a specific way. In section 4, we consider a more general case where edge weights in an arbitrary directed graph are increased in complex ways, but focus on majorizations for the dominant eigenvector only.

1.1 Problem Formulation We consider a weighted and directed graph with n vertices, V_1, V_2, ..., V_n. For a given pair of distinct vertices, V_j and V_k, there may or may not be a directed edge from V_j to V_k. If there is a directed edge from V_j to V_k, we denote the weight of the edge as e_jk > 0. In addition, we permit a directed edge from a vertex V_j to itself, with an edge weight e_jj > 0. The topology of the graph can be represented using an n × n adjacency matrix P defined as follows:

P(j, k) = e_jk, if there is a directed edge from V_j to V_k,
        = 0, otherwise.        (1)

We denote the eigenvalues of P as λ_1, λ_2, ..., λ_n, where without loss of generality (WLOG) we assume that |λ_1| ≥ |λ_2| ≥ ... ≥ |λ_n|. From classical results on non-negative matrices, we immediately see that P will have a real eigenvalue whose magnitude is at least as large as that of any other eigenvalue; we label this eigenvalue as λ_1. In the case that the graph is strongly connected, this real eigenvalue in fact has algebraic multiplicity 1 and is strictly dominant [9]. It is worth pointing out that the remaining eigenvalues of P may be complex, may have algebraic or geometric multiplicities greater than one, and may have associated Jordan blocks of size greater than 1. We use the notation v_i for any right eigenvector associated with the eigenvalue λ_i. In this paper, we study the effect of graph edge-weight increases on the adjacency matrix's eigenvectors, for three different classes of graphs.
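For concreteness, the dominant eigenvalue λ_1 and an associated right eigenvector can be computed numerically for a small hypothetical graph (the weights below are illustrative only; they are chosen so every row sums to 0.7, which makes λ_1 easy to verify by hand):

```python
import numpy as np

# Hypothetical strongly connected weighted digraph on 3 vertices:
# P[j, k] = e_jk, the weight of the directed edge from V_{j+1} to V_{k+1}.
P = np.array([[0.2, 0.5, 0.0],
              [0.3, 0.0, 0.4],
              [0.6, 0.1, 0.0]])

eigvals, eigvecs = np.linalg.eig(P)
i = np.argmax(np.abs(eigvals))    # Perron root: the largest-magnitude eigenvalue
lam1 = eigvals[i].real
v1 = eigvecs[:, i].real
```

Since every row sums to 0.7, the all-ones vector is a right eigenvector and λ_1 = 0.7; the remaining pair of eigenvalues is complex with modulus 0.4, so λ_1 is strictly dominant, as the strong connectivity of the graph guarantees.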
For each graph class, we consider particular structured topology changes (edge-weight increments). The graph classes and topology modifications that we focus on are representative of engineered modifications of networks, and also permit substantial analysis of eigenvectors (as detailed below). Let us enumerate the particular cases considered here, in order of increasingly general graph classes. We here point out that, in these descriptions and the subsequent development of results, we denote the p-th component of a vector x as x^p. 1) Symmetric Weight Increments to Undirected (Symmetric) Graphs. As a first case, we focus on the class of graphs such that e_jk = e_kj for any pair of distinct vertices V_j and V_k. We refer to these graphs as undirected or symmetric graphs, and note that the adjacency matrix is symmetric. For undirected graphs, the eigenvalues of P are real and all are in Jordan blocks of size one. Let the eigenvalues of P in decreasing order of magnitude be |λ_1| ≥ |λ_2| ≥ ... ≥ |λ_n|, with associated eigenvectors v_1, v_2, ..., v_n. The topology change that we examine for the undirected-graph case is as follows: for a given pair of distinct vertices V_j and V_k, the edge weights e_jk and e_kj are identically incremented by a certain nonnegative quantity, say α, where α ≤ min(e_jj, e_kk). The self-loops on the vertices V_j and V_k, i.e. e_jj and e_kk, are reduced by the same amount α. That is, we consider modifications that increase the interaction strength between two vertices at the cost of self-influences or weights. The graph resulting from such a topology change is also an undirected graph. Let us denote the adjacency matrix of the resulting graph by P̄. It is easy to see that P̄ = P + ΔP, where ΔP is the following symmetric matrix:

ΔP(p, q) = −α for p = j and q = j
            α for p = j and q = k
            α for p = k and q = j        (2)
           −α for p = k and q = k
            0 otherwise

Let the eigenvalues of P̄ be λ̄_1 ≥ λ̄_2 ≥ ... ≥ λ̄_n, with associated eigenvectors v̄_1, v̄_2, ..., v̄_n.
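The increment ΔP defined above can be sketched as follows (the matrix P below is a hypothetical symmetric example with self-loops, not from the paper; the majorization results of section 2 predict that the update cannot raise the dominant eigenvalue and brings the corresponding eigenvector components closer together):

```python
import numpy as np

def symmetric_increment(n, j, k, alpha):
    """Build the symmetric Delta-P: raise e_jk = e_kj by alpha and lower
    the self-loops e_jj, e_kk by alpha (assumes alpha <= min(e_jj, e_kk))."""
    dP = np.zeros((n, n))
    dP[j, j] = dP[k, k] = -alpha   # self-loops reduced
    dP[j, k] = dP[k, j] = alpha    # mutual edge weights increased
    return dP

# Hypothetical symmetric adjacency matrix with self-loops.
P = np.array([[0.9, 0.2, 0.1],
              [0.2, 0.4, 0.3],
              [0.1, 0.3, 0.5]])
Pbar = P + symmetric_increment(3, 0, 1, alpha=0.1)
```

Note that the row sums of P̄ equal those of P, since each affected row loses α on the diagonal and gains α off the diagonal.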
We refer to this case as a Symmetric Weight Increment to an Undirected Graph. Our aim is to characterize the eigenvalues and eigenvectors of P̄ as compared to those of P. We present the results for this case in section 2. Symmetric increments to undirected graphs of the described form arise in numerous application domains, including Markov chain analysis, mitigation of disease spread, and characterization of linear consensus protocols [2, 1]; these structural modifications often arise when resources are re-distributed to improve the response properties of the network, see e.g. [2]. 2) Balanced Weight Increments to Reversible Graphs. We consider the class of graphs for which (1) the weights of the edges leaving each vertex sum to 1 (i.e., Σ_k e_jk = 1 for each j), and (2) there exists a unique vector π, with ‖π‖_1 = 1, such that π_j e_jk = π_k e_kj for each graph edge. The graphs associated with homogeneous reversible Markov chains have these properties, and so we refer to these graphs as reversible ones. We note that such reversible graphs also arise in e.g. consensus algorithms for sensor networks [1]. The adjacency matrix of a reversible graph is a stochastic matrix, for which π is a left eigenvector associated with the dominant eigenvalue at unity. In addition to these properties, the adjacency matrix P is also symmetrizable through a diagonal similarity transformation; specifically, A = D P D⁻¹ is symmetric if D = diag(π^0.5). (Here the diag(x) operator on a q-component vector x returns a q × q diagonal matrix which has the components of x along the main diagonal.) For this case, we note that the eigenvalues of P are real and nondefective. We denote them as 1 = λ_1 ≥ λ_2 ≥ ... ≥ λ_n, with v_i denoting the right eigenvector associated with eigenvalue λ_i. Let us now discuss the topology change we consider in our development.
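A standard way to obtain a reversible graph of this kind is the random walk on a weighted undirected graph, with π proportional to the vertex degrees. The sketch below (with hypothetical weights) checks detailed balance and the symmetrization A = D P D⁻¹:

```python
import numpy as np

# Hypothetical symmetric weight matrix of an undirected graph (no self-loops).
W = np.array([[0.0, 2.0, 1.0],
              [2.0, 0.0, 3.0],
              [1.0, 3.0, 0.0]])
deg = W.sum(axis=1)
P = W / deg[:, None]        # row-stochastic: random-walk transition matrix
pi = deg / deg.sum()        # reversible stationary vector, ||pi||_1 = 1

# Detailed balance pi_j e_jk = pi_k e_kj, written entrywise:
F = pi[:, None] * P         # F[j, k] = pi_j * e_jk; reversibility makes F symmetric

# Symmetrization with D = diag(pi^0.5):
D = np.diag(np.sqrt(pi))
A = D @ P @ np.linalg.inv(D)
```

Since A is symmetric and similar to P, the eigenvalues of P are real, as claimed above.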
Motivated by both Markov chain and sensor network consensus applications, we consider topology changes that maintain the reversibility structure and leave the left eigenvector associated with

the adjacency matrix unchanged. Specifically, the edge weight between two vertices V_j and V_k, namely e_jk, is incremented by an amount α > 0, while the self-loop edge weight e_jj is reduced by the same amount. Similarly, the edge weight e_kj is increased by an amount β > 0, while the edge weight e_kk is reduced by the same amount. Further, the quantities α and β are assumed to satisfy α/β = π_k/π_j. Let the adjacency matrix of the resulting graph be P̄. It is easy to see that P̄ = P + ΔP, where ΔP is the following matrix:

ΔP(p, q) = −α for p = j and q = j
            α for p = j and q = k
            β for p = k and q = j        (3)
           −β for p = k and q = k
            0 otherwise

The vector π is the left eigenvector of P̄ associated with the dominant eigenvalue at unity: since β/α = π_j/π_k, we have π^T ΔP = 0. Further,

π_j P̄(j, k) = π_j P(j, k) + π_j α
            = π_k P(k, j) + π_k β        (4)
            = π_k P̄(k, j).

Therefore, P̄ is also the adjacency matrix of a reversible graph. Let the eigenvalues of P̄ be 1 = λ̄_1 ≥ λ̄_2 ≥ ... ≥ λ̄_n, with v̄_i denoting the right eigenvector associated with eigenvalue λ̄_i. We refer to this type of topology change as Balanced Weight Increments to Reversible Graphs. Our aim is to characterize the sub-dominant eigenvalues and eigenvectors of P̄ as compared to those of P. We present the results for this case in section 3. Let us briefly discuss the relevance of the above problem to the analysis of Markov chains. For a Markov chain, the adjacency matrix serves as the transition matrix of the chain, and the entries of the transition matrix represent conditional probabilities of state transitions [10]; i.e., P(p, q) is the probability of the chain transitioning to state q given that the current state is p. The vector π specifies the steady-state state-occupancy probabilities. Thus the balanced topology changes described above can be seen as changing transition probabilities in a way that maintains the steady-state distribution as well as reversibility.
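A minimal sketch of such a balanced increment follows. The chain is hypothetical: a lazy random walk is used so that the self-loops to be decremented exist, and α is assumed small enough that no entry of P̄ goes negative:

```python
import numpy as np

def balanced_increment(P, pi, j, k, alpha):
    """Delta-P of the balanced weight increment described above: choosing
    alpha/beta = pi_k/pi_j makes pi a left eigenvector of Delta-P at zero,
    so the stationary vector and reversibility are preserved."""
    beta = alpha * pi[j] / pi[k]
    dP = np.zeros_like(P)
    dP[j, j], dP[j, k] = -alpha, alpha
    dP[k, k], dP[k, j] = -beta, beta
    return dP

# Hypothetical reversible chain: a lazy random walk on an undirected graph.
W = np.array([[0.0, 2.0, 1.0],
              [2.0, 0.0, 3.0],
              [1.0, 3.0, 0.0]])
deg = W.sum(axis=1)
P = 0.5 * np.eye(3) + 0.5 * W / deg[:, None]
pi = deg / deg.sum()

Pbar = P + balanced_increment(P, pi, 0, 1, alpha=0.1)
```

The checks of interest are that P̄ remains stochastic, that π P̄ = π, and that detailed balance still holds entrywise.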
Thus, the analysis of this case provides insight into how transition probabilities in a reversible Markov chain can be modified so as to maintain the steady state while shaping dynamical performance. 3) Arbitrary Increments to Weighted Graphs. The previous cases are limited to special graph classes (undirected, reversible), and also consider only very specially structured changes that can be made to the topologies. In this article, we also consider a class of quite general modifications to arbitrary directed graphs. For this case, we focus exclusively on characterizing the dominant eigenvector of the modified topology. Specifically, in section 4, we present two results: one concerned with incrementing weights of the edges departing from multiple vertices, and one concerned with weight changes on edges entering into a given vertex. We also present results on designing the graph topology to achieve certain ratios among the dominant eigenvector components. To avoid confusion with cases 1 and 2 above, we defer the mathematical description of arbitrary topology changes to section 4. 2. SYMMETRIC TOPOLOGY CHANGES IN AN UNDIRECTED GRAPH In this section, we develop majorization results for eigenvector components of adjacency matrices, upon application of symmetric topology changes to undirected graphs, as described in section 1.1, Case 1. First we present a preliminary theorem which indicates that, when the eigenvectors of the adjacency matrices have a certain special structure, the associated eigenvalues and the eigenvectors themselves are invariant to the defined symmetric changes in the topology (Theorem 1). This straightforward result serves as a starting point for more general spectral analyses, as given in Theorem 2. Theorem 1. Consider a symmetric weight increment between vertices V_j and V_k in an undirected graph, as described in Section 1.1, Case 1.
If v_i^j = v_i^k for some eigenvector v_i associated with the eigenvalue λ_i of P, then v_i is also an eigenvector of P̄ with eigenvalue λ_i. In other words, the eigenvalue λ_i and its eigenvector v_i are invariant to the topology change. Proof: The proof follows immediately from the fact that P̄ v_i = (P + ΔP) v_i = λ_i v_i + ΔP v_i. However, since v_i^j = v_i^k, we automatically find that ΔP v_i = 0, and so the result follows. Theorem 1 provides us with a mechanism for changing an undirected graph's structure in such a manner that particular eigenvectors of its adjacency matrix remain unchanged. The above invariance result may prove applicable for algorithm design in sensor networks, in cases where changing the underlying network topology is a primary tool for improving performance; the result shows how changes can be made that do not affect critical mode shapes, while the remaining dynamics are changed. However, the result is restrictive, in the sense that the invariance holds only when edges are added between vertices whose eigenvector components are identical. The simple result of Theorem 1 serves as a stepping stone towards a more general characterization. Specifically, for symmetric topology changes between arbitrary vertices in an undirected graph, we find that the components of the subdominant eigenvector associated with the two ends of the added edge come closer to each other; similar results hold for eigenvectors associated with other eigenvalues, in a certain sequential sense. This result is formalized in the following theorem.

Theorem 2. Consider a symmetric weight increment between vertices V_j and V_k in an undirected graph, as described in Section 1.1, Case 1. Then λ̄_1 ≤ λ_1, with equality only if v_1^j = v_1^k. Moreover, |v̄_1^k − v̄_1^j| ≤ |v_1^k − v_1^j|. Furthermore, if λ̄_q = λ_q and v̄_q = v_q for all q ≤ p, where p < n is some positive integer, then 1) λ̄_{p+1} ≤ λ_{p+1}, with equality only if v_{p+1}^j = v_{p+1}^k; and 2) |v̄_{p+1}^k − v̄_{p+1}^j| ≤ |v_{p+1}^k − v_{p+1}^j|. Proof: The majorization of eigenvalues specified in the theorem is akin to existing eigenvalue-majorization results [7], but is included here for the sake of completeness and because the eigenvector majorization follows thereof. The majorization of eigenvector components is then proved by using the Courant-Fischer Theorem to arrive at a contradiction. The topology change is structured in such a way that the adjacency matrices P and P̄ are both symmetric. Hence, the eigenvalues (as well as the corresponding eigenvectors) of both matrices can be found using the Courant-Fischer Theorem. In particular, λ_1 and λ̄_1 can be written as

λ_1 = max_x ( x^T P x )        (5)

and

λ̄_1 = max_y ( y^T P̄ y ) = max_y ( y^T (P + ΔP) y ) = max_y ( y^T P y − α (y^k − y^j)^2 ),        (6)

where the maximums are taken over vectors of unit length, i.e. ‖x‖_2 = 1 and ‖y‖_2 = 1. The vectors that achieve the maximums are the eigenvectors associated with the eigenvalues λ_1 and λ̄_1. Notice that −α (y^k − y^j)^2 ≤ 0 for any vector y, with equality only if y^j = y^k. It is therefore straightforward to see that λ̄_1 ≤ λ_1, with equality only if v_1^j = v_1^k. To prove the result on the change in eigenvector entries, we again use the Courant-Fischer Theorem (Equations 5 and 6) to arrive at a contradiction. Let us denote the vectors that achieve the maximums as x* for Equation 5 and y* for Equation 6. Also, assume that |x*^k − x*^j| < |y*^k − y*^j|. Then, according to Equation 6, we see that

x*^T P x* − α (x*^k − x*^j)^2 ≤ y*^T P y* − α (y*^k − y*^j)^2.        (7)

Noting that x*^T P x* ≥ y*^T P y*, let x*^T P x* = y*^T P y* + C for some C ≥ 0.
Thus we find that

α (x*^k − x*^j)^2 ≥ C + α (y*^k − y*^j)^2.        (8)

Equation 8 implies that |x*^k − x*^j| ≥ |y*^k − y*^j|, which contradicts our assumption. The result on the eigenvectors v_1 and v̄_1 thus follows directly from this contradiction. The Courant-Fischer Theorem also permits computation of the eigenvalues λ_2, ..., λ_n. These eigenvalues can be found by optimizing the same objectives as in Equations 5 and 6, but with the additional constraint that the vector(s) that achieve the maximum(s) should be orthogonal to the subspace spanned by the eigenvectors associated with eigenvalues larger in magnitude. The result on v̄_{p+1} and λ̄_{p+1}, when λ̄_q = λ_q and v̄_q = v_q for all q ≤ p, follows directly from this fact. We remark that results similar to Theorem 2 can be developed for the smallest (most negative) eigenvalue λ_n and the associated eigenvector v_n. However, we have focused our attention on characterizing the dominant and sub-dominant eigenvalues and their eigenvectors, since these spectral properties most often inform characterizations of complex-network dynamics and algorithms. 3. BALANCED TOPOLOGY CHANGES IN REVERSIBLE GRAPHS In section 2 we limited ourselves to the case where the original matrix and the perturbation are symmetric, as described in Section 1.1, Case 1. These eigenvector majorization results can be extended to balanced topology changes to reversible graphs, as described in Section 1.1, Case 2. This extension is made possible by the observation that a diagonal similarity transformation can be used to equivalence a reversible graph's adjacency matrix with a symmetric matrix. The following theorem presents the result for reversible topologies formally. Theorem 3. Consider a reversible graph topology, and assume that a balanced weight increment is performed between vertices V_j and V_k, as described in Section 1.1, Case 2.
Then the eigenvalues/eigenvectors before and after the perturbation have the following properties: 1) If v_i^j = v_i^k for some i = 2, ..., n, then λ̄_i = λ_i and v̄_i = v_i. 2) If v_2^j ≠ v_2^k, then λ̄_2 < λ_2 and |v̄_2^j − v̄_2^k| ≤ |v_2^j − v_2^k|. Moreover, if λ̄_q = λ_q and v̄_q = v_q for all q ≤ p, where p < n is some positive integer, then λ̄_{p+1} ≤ λ_{p+1} and |v̄_{p+1}^j − v̄_{p+1}^k| ≤ |v_{p+1}^j − v_{p+1}^k|. Proof: From the discussion in section 1.1, Case 2, we know that P̄ represents a reversible graph. Further, π is a left eigenvector associated with the unity eigenvalue of P̄. Therefore, there exists a similarity transformation matrix D = diag(π^0.5) such that A = D P D⁻¹ and Â = D P̄ D⁻¹ are symmetric matrices. Moreover, ΔA = Â − A is also symmetric, with the following structure:

ΔA(p, q) = −α for p = j and q = j
            √(αβ) for p = j and q = k
            √(αβ) for p = k and q = j        (9)
           −β for p = k and q = k
            0 otherwise

To prove the multiple statements in the theorem, we follow arguments similar to those used in the proofs of Theorems 1

and 2, while using the fact that if v_i is an eigenvector associated with an eigenvalue λ of P (or P̄), then D v_i is an eigenvector associated with the eigenvalue λ of A (or Â). The proofs are organized into different cases for ease of presentation. 1) If v_i^j = v_i^k, then ΔP v_i = 0. Therefore, P̄ v_i = P v_i = λ_i v_i. The result follows immediately. 2) The eigenvalues of P (respectively, P̄) are identical to the eigenvalues of the symmetric matrix A (respectively, Â), since the matrices are related by a similarity transform. We can thus use the Courant-Fischer Theorem to characterize the eigenvalues and eigenvectors of P̄ with respect to those of P, in analogy with the proof of Theorem 2. Using the fact that P and P̄ are stochastic matrices, and noting the form of the similarity transform, we see that the sub-dominant eigenvalues λ_2 and λ̄_2 can be written as

λ_2 = max_{x ⊥ 1_n} ( x^T A x )        (10)

and

λ̄_2 = max_{y ⊥ 1_n} ( y^T Â y ) = max_{y ⊥ 1_n} ( y^T (A + ΔA) y ) = max_{y ⊥ 1_n} ( y^T A y − (√α y^j − √β y^k)^2 ),        (11)

where we have used the notation x ⊥ 1_n to indicate that the maximum is taken over all vectors of unit length that are orthogonal to the n-component vector whose entries all equal 1/√n. The vectors x and y that achieve the maxima, subject to the constraint, are the eigenvectors associated with the eigenvalues λ_2 and λ̄_2, respectively. We notice that −(√α y^j − √β y^k)^2 ≤ 0 for any vector y, with equality only if √β y^k = √α y^j (or, equivalently, v_2^j = v_2^k). It therefore follows that λ̄_2 < λ_2 if v_2^j ≠ v_2^k. To prove the result on the components of the sub-dominant eigenvectors, we again use the Courant-Fischer Theorem (Equations 10 and 11) to arrive at a contradiction. Let us denote the vectors that achieve the maxima (subject to the constraints) as x* for Equation 10 and y* for Equation 11. Also, let us assume (to obtain a contradiction) that |√α x*^j − √β x*^k| < |√α y*^j − √β y*^k|. Then, using x* in Equation 11, we see that

x*^T A x* − (√α x*^j − √β x*^k)^2 ≤ y*^T A y* − (√α y*^j − √β y*^k)^2.        (12)

Noting that x*^T A x* ≥ y*^T A y*, let x*^T A x* = y*^T A y* + C for some C ≥ 0.
Thus we find that

λ_2 − (√α x*^j − √β x*^k)^2 ≤ λ_2 − C − (√α y*^j − √β y*^k)^2.        (13)

Equation 13 implies that |√α x*^j − √β x*^k| ≥ |√α y*^j − √β y*^k|, which contradicts our assumption. The result on the eigenvectors v_2 and v̄_2 follows from this contradiction, from the fact that v_2 = D⁻¹ x* and v̄_2 = D⁻¹ y*, and from the assumed relationship between ΔP(j, k) and ΔP(k, j). The results on v_{p+1} and v̄_{p+1}, under the condition that v̄_p = v_p and λ̄_p = λ_p for p < n, can be proved using a very similar argument. 4. ARBITRARY WEIGHT INCREMENTS IN A DIRECTED GRAPH The results developed in sections 2 and 3 consider very specific changes to undirected or reversible graph topologies. A characterization of the change in the eigenvalues and/or eigenvectors for general topology modifications in arbitrary graphs has wider applications. In this section, we present some characterizations of the dominant eigenvector for more general topology changes in arbitrary directed and weighted graphs. Note that all digraphs we consider are strongly connected, and hence the adjacency matrices are non-negative and irreducible. To avoid confusion between these results and those developed in sections 2 and 3, we use slightly different (and simplified) notations here. For the remainder of the paper, we consider a directed and weighted graph of n vertices labeled V_1, V_2, ..., V_n, with an adjacency matrix denoted as P. We use P̄ to denote the adjacency matrix of a graph resulting from a change in the topology. We assume that both P and P̄ capture strongly-connected graphs, in which case they both have positive real eigenvalues of algebraic multiplicity 1 that are larger in magnitude than all other eigenvalues, i.e. that are dominant. The dominant eigenvectors of P and P̄ are denoted as v and v̄ respectively, while the dominant eigenvalues are denoted as λ and λ̄ respectively.
For a vector x, the notation x_{p:q} specifies a (q − p + 1)-component subvector, which contains the correspondingly-indexed components of x (i.e., the components between the p-th and q-th components, inclusive). The outline of this section is as follows. In Theorem 4 we present a characterization of the dominant eigenvector when the weights of edges emerging from m distinct vertices (say V_1, ..., V_m) are arbitrarily increased. In terms of the adjacency matrix, this change in graph topology corresponds to the case where we arbitrarily increment the first m < n rows of P. In Theorem 5 we present a similar result on arbitrary weight increments to edges directed into a given vertex. This corresponds to arbitrarily incrementing a single column, say the first column (WLOG), of the graph adjacency matrix. We next present the characterization of the dominant eigenvector of the adjacency matrix when the weights of the edges emerging from m given vertices are arbitrarily increased. We stress here that the eigen-structure of the adjacency matrix is invariant to within a permutation of the vertex labels, i.e. the order of the m vertices is not important; thus we focus on incrementing the first m rows WLOG. Theorem 4. Consider arbitrarily increasing the weights of the edges emerging from vertices V_1, V_2, ..., V_m.

Then there exists i ∈ {1, 2, ..., m} such that v'_i / v'_j > v_i / v_j for j = m + 1, ..., n. In particular, when both eigenvectors are normalized to unit length, v'_i > v_i.

Proof: Incrementing rows strictly increases the dominant eigenvalue of a non-negative matrix associated with a strongly connected graph [7], so λ' = λ + Δ for some Δ > 0. To prove the majorization of the eigenvector components, we consider a particular scaling of the dominant eigenvectors. Specifically, let the dominant eigenvector

v' = [ v_{1:m} + q_1 ; v_{(m+1):n} + q_2 ]

be scaled so that q_1 is an m-component non-positive vector with q_{1i} = 0 for some i ∈ {1, ..., m}. Notice that this scaling can be achieved by scaling down the perturbed eigenvector by the largest among the ratios v_i / v'_i for i = 1, ..., m. We shall prove that q_2 is element-wise strictly negative under this scaling. To this end, consider the last n − m equations of the eigenvector equation P'v' = λ'v'. For notational convenience, partition the matrix P (and similarly P') as

P = [ P_11  P_12 ; P_21  P_22 ],

where P_11 is an m × m matrix and the dimensions of the other blocks are consistent; note that only the first m rows of P are incremented, so the last n − m rows of P' coincide with those of P. Substituting λ' = λ + Δ and the assumed form of v' into the eigenvector equation P'v' = λ'v', the last n − m rows yield

(λ'I − P_22) q_2 = P_21 q_1 − Δ v_{(m+1):n}.

It is easy to see that (λ'I − P_22) is an M-matrix. Using the facts that q_1 is entry-wise non-positive and that the inverse of an M-matrix is a non-negative matrix, we see that for the assumed scaling of v' the vector q_2 is entry-wise strictly negative. The majorization result in the theorem statement follows immediately.

The result presented in Theorem 4 gives us insight into how changes to the topology of an arbitrary directed graph affect the eigenstructure of the adjacency matrix.
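The majorization in Theorem 4 is easy to probe numerically. The sketch below (our own illustration; the matrices are arbitrary strongly connected examples, not from the paper) increments the first row of a small adjacency matrix (so m = 1) and checks that the ratio of component 1 to every other component of the dominant eigenvector grows:

```python
# Numerical illustration of Theorem 4 with m = 1: incrementing the weights
# of edges emerging from V1 (row 1 of P) raises component 1 of the dominant
# eigenvector relative to components m+1, ..., n. Matrices are illustrative.

def dominant_vec(P, iters=2000):
    """Dominant eigenvector of a non-negative irreducible P, unit 1-norm."""
    n = len(P)
    v = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(P[i][j] * v[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        v = [wi / s for wi in w]
    return v

P = [[0.0, 1.0, 2.0],       # original strongly connected graph
     [1.0, 0.0, 1.0],
     [2.0, 1.0, 0.0]]
Pp = [[0.5, 2.0, 3.0],      # row 1 arbitrarily incremented (P' >= P)
      [1.0, 0.0, 1.0],
      [2.0, 1.0, 0.0]]

v, vp = dominant_vec(P), dominant_vec(Pp)
# Theorem 4 with m = 1: v'_1 / v'_j > v_1 / v_j for j = 2, ..., n.
assert all(vp[0] / vp[j] > v[0] / v[j] for j in (1, 2))
```

With m = 1 the "there exists i" in the theorem statement is necessarily i = 1, which is why the check above can test component 1 directly.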
We now turn our attention to the case where the weights of edges into a given vertex (rather than out of a given vertex) are arbitrarily increased, in which case the entries of a given column of the adjacency matrix are increased arbitrarily. We find that, under certain conditions, arbitrarily incrementing the edge weights into the ith vertex also increases the ith component of the dominant eigenvector relative to the other components of the eigenvector. We present this result in the following theorem.

Theorem 5. Consider arbitrarily incrementing the weights of the edges incoming to the first vertex (WLOG). Let E = P' − P and Δ = λ' − λ. If (ΔI − E)v is entry-wise non-negative, then v'_1 / v'_j > v_1 / v_j for j = 2, ..., n. In particular, when both eigenvectors are normalized to unit length, v'_1 > v_1.

Proof: It is easy to see that Δ > 0 [7]. Moreover, from the Perron-Frobenius theorem for non-negative matrices, the dominant eigenvectors are entry-wise non-negative [10]. We consider the scaling of the eigenvector v' for which

v' = [ v_1 ; v_{2:n} + q ],

where q is an (n − 1)-component vector. We wish to find conditions under which q is element-wise strictly negative, using perturbation results. To find such conditions, we study the eigenvector equation P'v' = λ'v'. Substituting for λ' and v', we arrive at the following equation:

(λ'I − P') [ 0 ; q ] = −(ΔI − E)v.

If (ΔI − E)v is a non-negative vector, then it can be seen that q has strictly negative entries. To find conditions under which (ΔI − E)v is non-negative, one can use the various perturbation results for general matrices, such as the Ostrowski-Elsner theorem [7].

We would also like to point out that our results, presented in Theorems 4 and 5, allow for the design of new edges that modify the eigenvectors in a specified way. As we pointed out earlier, eigenvectors of graph-adjacency matrices play an important role in characterizing various aspects of complex networked systems and associated algorithms.
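The hypothesis of Theorem 5 can be checked directly from P, P', and the unperturbed eigenpair. The sketch below (our own illustration, with arbitrary numbers; it uses E = P' − P and Δ = λ' − λ as in the theorem) increments the first column of a small adjacency matrix, forms the vector (ΔI − E)v, and tests its entry-wise non-negativity:

```python
# A sketch (our own, with illustrative numbers) of checking the hypothesis
# of Theorem 5: increment column 1 of P (edges into V1), estimate both
# dominant eigenpairs, form E = P' - P and Delta = lambda' - lambda, and
# inspect the vector (Delta*I - E) v entry by entry.

def dominant_eigenpair(P, iters=2000):
    """Dominant eigenpair of a non-negative irreducible P; v has unit 1-norm."""
    n = len(P)
    v = [1.0 / n] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(P[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = sum(w)
        v = [wi / lam for wi in w]
    return lam, v

P = [[0.0, 1.0, 2.0],
     [1.0, 0.0, 1.0],
     [2.0, 1.0, 0.0]]
Pp = [[0.2, 1.0, 2.0],      # column 1 incremented (edges into V1)
      [1.1, 0.0, 1.0],
      [2.1, 1.0, 0.0]]

lam, v = dominant_eigenpair(P)
lamp, vp = dominant_eigenpair(Pp)
delta = lamp - lam                      # strictly positive [7]
E = [[Pp[i][j] - P[i][j] for j in range(3)] for i in range(3)]
# Hypothesis vector (Delta*I - E) v; the theorem applies when it is >= 0.
h = [delta * v[i] - sum(E[i][j] * v[j] for j in range(3)) for i in range(3)]
condition_holds = all(hi >= 0 for hi in h)
```

Note that the condition need not hold for an arbitrary increment, so checking it in this way is a genuine filter on admissible perturbations rather than a formality.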
The results developed here show how these eigenvectors change when the underlying topology of the networked system, and hence the behavior of the associated algorithms, is modified. Using these results, one can develop frameworks for designing new topologies and/or changing existing ones so as to shape various aspects of these systems.

5. REFERENCES

[1] Y. Wan, K. Namuduri, S. Akula, and M. Varanasi. Consensus building in distributed sensor networks with bipartite graph structures. In AIAA Guidance, Navigation, and Control Conference, Oregon, 2011.
[2] Y. Wan, S. Roy, and A. Saberi. Designing spatially heterogeneous strategies for control of virus spread. IET Systems Biology, 2(4):184-201, 2008.
[3] D. Cvetković and S. Simić. Graph spectra in computer science. Linear Algebra and its Applications, 434(6):1545-1562, 2011.
[4] B. Ruhnau. Eigenvector centrality: a node-centrality? Social Networks, 22(4):357-365, 2000.
[5] A. Langville and C. Meyer. A survey of eigenvector methods for web information retrieval. SIAM Review, 47(1):135-161, 2005.
[6] A. Berman and R. Plemmons. Nonnegative Matrices in the Mathematical Sciences. SIAM, 1994.
[7] G. Stewart and J. Sun. Matrix Perturbation Theory. Academic Press, New York, 1990.
[8] S. Roy, A. Saberi, and Y. Wan. Majorizations for the dominant eigenvector of a nonnegative matrix. In Proceedings of the American Control Conference, Seattle, Washington, USA, pages 1965-1966, 2008.
[9] F. Chung. Spectral Graph Theory, volume 92. American Mathematical Society, 1997.
[10] R. Gallager. Discrete Stochastic Processes. Kluwer Academic Publishers, 1996.