Abstract We survey some of what can be deduced about automorphisms of a graph from information on its eigenvalues and eigenvectors. Two of the main to

Similar documents
An algebraic proof of the Erdős-Ko-Rado theorem for intersecting families of perfect matchings

Some notes on Coxeter groups

GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory.

ALGEBRAIC COMBINATORICS. c1993 C. D. Godsil

When can perfect state transfer occur?

ELEMENTARY LINEAR ALGEBRA WITH APPLICATIONS. 1. Linear Equations and Matrices

Math 413/513 Chapter 6 (from Friedberg, Insel, & Spence)

Strongly Regular Decompositions of the Complete Graph

Real symmetric matrices/1. 1 Eigenvalues and eigenvectors

Laplacian Integral Graphs with Maximum Degree 3

Algebra Exam Topics. Updated August 2017

2 Garrett: `A Good Spectral Theorem' 1. von Neumann algebras, density theorem The commutant of a subring S of a ring R is S 0 = fr 2 R : rs = sr; 8s 2

Linear algebra and applications to graphs Part 1

Vector Space Basics. 1 Abstract Vector Spaces. 1. (commutativity of vector addition) u + v = v + u. 2. (associativity of vector addition)

Elementary linear algebra

Very few Moore Graphs

Ir O D = D = ( ) Section 2.6 Example 1. (Bottom of page 119) dim(v ) = dim(l(v, W )) = dim(v ) dim(f ) = dim(v )

INTRODUCTION TO LIE ALGEBRAS. LECTURE 2.

IRREDUCIBLE REPRESENTATIONS OF SEMISIMPLE LIE ALGEBRAS. Contents

JOHNSON SCHEMES AND CERTAIN MATRICES WITH INTEGRAL EIGENVALUES

Kernels of Directed Graph Laplacians. J. S. Caughman and J.J.P. Veerman

ALGEBRAIC MATCHING THEORY. C. D. Godsil 1

Using Laplacian Eigenvalues and Eigenvectors in the Analysis of Frequency Assignment Problems

3. Linear Programming and Polyhedral Combinatorics

Boolean degree 1 functions on some classical association schemes

Lecture 7: Positive Semidefinite Matrices

1 Directional Derivatives and Differentiability

Linear algebra 2. Yoav Zemel. March 1, 2012

Topics in Graph Theory

PROOF OF TWO MATRIX THEOREMS VIA TRIANGULAR FACTORIZATIONS ROY MATHIAS

EXERCISES. = {1, 4}, and. The zero coset is J. Thus, by (***), to say that J 4- a iu not zero, is to

Average Mixing Matrix of Trees

A Questionable Distance-Regular Graph

Math 350 Fall 2011 Notes about inner product spaces. In this notes we state and prove some important properties of inner product spaces.

Linear Algebra (part 1) : Vector Spaces (by Evan Dummit, 2017, v. 1.07) 1.1 The Formal Denition of a Vector Space

1 Fields and vector spaces

ALGEBRA QUALIFYING EXAM PROBLEMS LINEAR ALGEBRA

Detailed Proof of The PerronFrobenius Theorem

Chapter 2 Linear Transformations

5 Flows and cuts in digraphs

Elementary 2-Group Character Codes. Abstract. In this correspondence we describe a class of codes over GF (q),

Foundations of Matrix Analysis

Algebraic Methods in Combinatorics

Classification of root systems

j=1 x j p, if 1 p <, x i ξ : x i < ξ} 0 as p.

ACO Comprehensive Exam March 17 and 18, Computability, Complexity and Algorithms

Boolean Inner-Product Spaces and Boolean Matrices

Classification of semisimple Lie algebras

Spectral Theorem for Self-adjoint Linear Operators

Algebra Exam Syllabus

MATH 326: RINGS AND MODULES STEFAN GILLE

Root systems and optimal block designs

First we introduce the sets that are going to serve as the generalizations of the scalars.

4.1 Eigenvalues, Eigenvectors, and The Characteristic Polynomial

REPRESENTATION THEORY OF S n

MAT 445/ INTRODUCTION TO REPRESENTATION THEORY

Theorem 5.3. Let E/F, E = F (u), be a simple field extension. Then u is algebraic if and only if E/F is finite. In this case, [E : F ] = deg f u.

DISTINGUISHING PARTITIONS AND ASYMMETRIC UNIFORM HYPERGRAPHS

Notes on the Matrix-Tree theorem and Cayley s tree enumerator

Ma/CS 6b Class 23: Eigenvalues in Regular Graphs

Course 311: Michaelmas Term 2005 Part III: Topics in Commutative Algebra

x i e i ) + Q(x n e n ) + ( i<n c ij x i x j

Notes on the matrix exponential

LECTURE VI: SELF-ADJOINT AND UNITARY OPERATORS MAT FALL 2006 PRINCETON UNIVERSITY

ACO Comprehensive Exam 19 March Graph Theory

Applicable Analysis and Discrete Mathematics available online at GRAPHS WITH TWO MAIN AND TWO PLAIN EIGENVALUES

Math 554 Qualifying Exam. You may use any theorems from the textbook. Any other claims must be proved in details.

LINEAR ALGEBRA REVIEW

Lecture 1. Toric Varieties: Basics

MATHEMATICS 217 NOTES

Distance-regular graphs where the distance-d graph has fewer distinct eigenvalues

Spectral results on regular graphs with (k, τ)-regular sets

2.2. Show that U 0 is a vector space. For each α 0 in F, show by example that U α does not satisfy closure.

DS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra.

Chapter 2 Spectra of Finite Graphs

Math 121 Homework 5: Notes on Selected Problems

Lecture Summaries for Linear Algebra M51A

Contents. Preface for the Instructor. Preface for the Student. xvii. Acknowledgments. 1 Vector Spaces 1 1.A R n and C n 2

18.S34 linear algebra problems (2007)

11. Finitely-generated modules

ARCS IN FINITE PROJECTIVE SPACES. Basic objects and definitions

Tree-width and planar minors

3. Linear Programming and Polyhedral Combinatorics

Introduction to Association Schemes

Self-Complementary Arc-Transitive Graphs and Their Imposters

Linear Algebra, 4th day, Thursday 7/1/04 REU Info:

MATH 223A NOTES 2011 LIE ALGEBRAS 35

THE MAXIMAL SUBGROUPS AND THE COMPLEXITY OF THE FLOW SEMIGROUP OF FINITE (DI)GRAPHS

08a. Operators on Hilbert spaces. 1. Boundedness, continuity, operator norms

Math113: Linear Algebra. Beifang Chen

LOVÁSZ-SCHRIJVER SDP-OPERATOR AND A SUPERCLASS OF NEAR-PERFECT GRAPHS

Linear Algebra. Workbook

K 4 -free graphs with no odd holes

On the eigenvalues of Euclidean distance matrices

Graph coloring, perfect graphs

Groups of Prime Power Order with Derived Subgroup of Prime Order

Representations of algebraic groups and their Lie algebras Jens Carsten Jantzen Lecture III

Math 443/543 Graph Theory Notes 5: Graphs as matrices, spectral graph theory, and PageRank

1. General Vector Spaces

Bichain graphs: geometric model and universal graphs

Transcription:

Symmetry and Eigenvectors Ada Chan and Chris D. Godsil Combinatorics and Optimization University of Waterloo Waterloo, Ontario Canada N2L 3G http://bilby.uwaterloo.ca Support from a National Sciences and Engineering Council of Canada operating grant is gratefully acknowledged.

Abstract We survey some of what can be deduced about automorphisms of a graph from information on its eigenvalues and eigenvectors. Two of the main tools are convex polytopes and a number of matrix algebras that can be associated to the adjacency matrix of a graph.

Eigenvectors and Eigenpolytopes We study eigenvectors of the adjacency matrix of a graph, and how these interact with the graph's automorphisms. The treatment in the rst four sections is based loosely on [0,6].. Eigenvalues and Automorphisms Our aim in this section is to introduce the idea that the eigenvectors of the adjacency matrix of a graph are best viewed as a class of functions on the vertices of the graph. So we present some denitions and then use them to derive some classical results concerning simple eigenvalues of vertex-transitive graphs. Let X be a graph with vertex set V. If i and j are vertices in X, we write i j to denote that i is adjacent to j. Let F (V ) denote the space of real functions on V. We denote the characteristic vector of the vertex i of X by e i ; the vectors e i form the standard basis of F (V ). The adjacency operator A is the X linear operator on F (V ) dened by (Af)(i) := f(j): ji The matrix that represents A relative to the standard basis is the usual adjacency matrix A of X. The inner product of two functions f and g from F (V ) is X i2v f(i)g(i); the adjacency operator is self-adjoint relative to this inner product (equivalently, the adjacency matrix is symmetric). We denote the function on V that is constant and equal to by. A function f is an eigenvector for A if it is not zero and there is a constant such that, for each vertex i in V X f(i) = f(j); ji or Af = f. The constant is, of course, the eigenvalue of f. The graph X is regular if and only if is an eigenvector for A, with the corresponding eigenvalue equal to the valency of X. Let f be an eigenvector for A and let i be the vertex of X such that jf(i)j jf(j)j, for all j 2 V. Then jf(j)j; jjjf(i)j = j X f(j)j X ji ji

2 Chan & Godsil and therefore, jj X ji jf(j)j jf(i)j : If X is connected then equality holds if and only if jfj is a constant function. If f itself is constant, then X is regular, otherwise X is regular and bipartite. In both cases, the function jfj is an eigenvector. Moreover, by our choice of i, we can conclude that jj is less than or equal to the valency of i and, thus, the absolute value of all eigenvalues for A is bounded above by the maximum valency of X. If X is a complete graph on the vertex set V of size n, then since it is regular, it has eigenvector with eigenvalue (n? ). Let f 2 F (V ) such that P i2v all i 2 V, X ji f(j) =?f(i); f(i) = 0. Then for and therefore? is an eigenvalue for X with n? linearly independent eigenvectors. If X is a circuit on n vertices and is the primitive n th root of unity, then the function f(i) = i, for i = 0 : : : n? is an eigenvector for X with eigenvalue ( +? ). We denote the automorphism group of X by Aut(X); if i 2 V and a 2 Aut(X) then ia is the image of i under a. If f 2 F (V ) and a 2 Aut(X) then the composition f a is again a real function on X, with (f a)(i) = f(ia): The map f 7! f a is an invertible linear mapping of F (V ) onto itself. If we need to distinguish the above mapping from the automorphism a, we will denote it by ^a. Our conventions imply that babf = ^a^bf: If a 2 Sym(V ) then ^ae i = e ia?; which is probably not what we would expect. Now we show that if a 2 Sym(V ) then a 2 Aut(X) if and only if ^a and A commute. Equivalently, if a 2 Aut(X) then ^a lies in the centralizer C(A) of A. Consider for any f 2 F (V ) and i 2 V, X X (A^a)f(i) = ^af(j) = f(ja) ji ji

Symmetry and Eigenvectors 3 and (^aa)f(i) = Af(ia) = X lia f(l): Then A and ^a commute if and only if the right hand side of the above equations are equal. This occurs exactly when i j if and only if ia ja for all i 2 V ; that is a 2 Aut(X). If f is an eigenvector with eigenvalue and a 2 Aut(X) then f a is again an eigenvector with eigenvalue ; simply note that X ji(f a)(j) = X ji f(ja) = X`ia f(`) = (f a)(i): Although we have not developed much theory yet, we can use what we have to derive some results concerning the eigenvalues of vertex-transitive graphs. Assume X is a vertex-transitive graph with automorphism group G. Let be a simple eigenvalue of A, and let f be a corresponding eigenvector. If a 2 G then f a is again an eigenvector with eigenvalue, as is simple it follows that f a = cf (:) for some constant c. Now there is a least positive integer r such that a r = ; from (.) we then nd that c r =. Hence c is an r-th root of unity. As both f and f a are real functions this implies that c =.. Theorem (Petersdorf and Sachs [26]). Let X be a vertex transitive graph on n vertices with valency k, and let be a simple eigenvalue of A. Then k? is an even integer. If 6= k then jv j is even. Proof. Let f be an eigenvector for. If a 2 Aut(X) and f a = f then f is constant on the orbits of a. If f a =?f and S is an orbit of a then either f is zero on S, or f(ia) =?f(i) for each i in S. If f is constant then Af = kf, i.e, = k. So we may assume that f is not constant. As Aut(X) is vertex-transitive, it follows that there is a constant c such that f(i) = c, for all vertices i in X. We may assume without loss that c =. Let i be a vertex of X and let `i be the number of neighbours j of i such that f(j) =?f(i). Then X f(i) = f(j) = (k? 2`i)f(i): (:2) ji This proves our claim concerning.

4 Chan & Godsil Finally, as X is regular, the all ones vector is an eigenvector for A. As A is selfadjoint, eigenvectors with distinct eigenvalues are orthogonal. Hence 0 = hf; i = X i2v f(i) and thus we deduce that jv j must be even. It is an interesting exercise to show that, if X is vertex transitive and has two simple eigenvalues, neither equal to the valency k, then jv (X)j must be divisible by 4. It can also be shown that X has at least three vertices then it has at most jv (X)j=2 simple eigenvalues. This is another reasonable exercise. (Proofs of both these results are given in [6].) A natural question is how many simple eigenvalues a vertex transitive graph can have. Note that (.2) implies that `i is independent of the vertex i. Let be the partition of V (X), where vertices i and j are in the same cell of if f(i) = f(j). Then has exactly two cells, and the number of neighbours in a given cell of a vertex i of X is determined by the cell in which i lies. This means, in the terminology of Section 5, that is an equitable partition of X. 2. Eigenspaces If real-valued functions on V (X) are interesting then it is not unreasonable to consider functions on V (X) with values in the vector space R m, in particular those functions f from V to R m such that, for some constant f(i) = X ji f(j): (2:) By way of example, consider a regular dodecahedron in 3-space, with centre of mass at the origin. Let X be the -skeleton of this solid and let p be the map that associates each vertex of X with its image in R 3. From the symmetry of the solid, we see that each vertex i there is a constant c i such that c i p(i) = X ji p(j): Since all vertices of the solid are equivalent under its symmetry group, c i is independent of i and so (2.) holds.

Symmetry and Eigenvectors 5 Our aim is to generalize this example. We use A to denote the adjacency matrix of X. Let be an eigenvalue of A with multiplicity m and let U be a jv j m matrix whose columns form an orthonormal basis for the eigenspace belonging to. As each column of U is an eigenvector for A we have U = AU. Hence, if u (i) denotes the i-th row of U, then (2.) holds (with u in place of f). We call the function u from V to R m the representation belonging to. As the columns of U form an orthonormal set of vectors, U T U = I and therefore, if we dene E := U U T then E = E T and E ; if = ; E E = 0; otherwise. In fact E is the matrix representing orthogonal projection onto the column space of U. This implies that E is an invariant of the eigenspace, and does not depend on the orthonormal basis we chose to form U. Next, if x and y are vectors in R m, for some m, then hx; yi denotes their inner product. If i and j are two vertices of X then (E ) i;j = hu (i); u (j)i: This shows that the Euclidean distance between the points u (i) and u (j) is determined by i, j and. A convex polytope is the convex hull of a nite set of points in R m. A vertex of a polytope is a point in it which is an endpoint of any closed line segment that contains it and is contained in the polytope. (Equivalently, a vertex is an extreme point.) If S is a nite subset of R m then the vertices of the convex hull C of S are a subset of S. Any linear mapping of R m that xes S as a set must x the polytope generated by S, and a linear mapping xes all points in C if and only if it xes each vertex of C. The simplest example occurs when m =. If jsj 2 then C has exactly two vertices and certainly any linear mapping that xes these two vertices xes all points in C. The convex hull of the set of points u (i), for all i in V (X), is the eigenpolytope belonging to. Figure illustrates the four eigenpolytopes of the cube. The corresponding matrices U are listed in the following table. If h is an eigenvector for A with eigenvalue and S is the set of vertices on which h takes its maximum value then the convex hull of the points u (i), for i 2 S is a face of the eigenpolytope. Conversely each face can be obtained from an eigenvector, since we can dene the faces of a polytope as the set

6 Chan & Godsil 8-8 0 8 z y x z y x Figure : Eigenpolytopes of the Cube of points on which a linear function takes its maximum value. Note that the eigenvector?h determines a face disjoint from that gotten from h. Eigenpolytopes of distance-regular graphs are studied at some length in [4].

Symmetry and Eigenvectors 7 m U 3 p 8?3 3? 3 0 B @ 0 B @ 0 B @? p 8 p 8 p 8? p 8 p 8? p 8? p 8 p 8 C A p? p? p? 8 8 8 p? p? p 8 8 8 p? p p? 8 8 8? p 8 p 8 p 8 p 8? p 8? p 8 p 8? p 8 p 8 p 8 p 8? p 8 p 8 p 8 p 8 2 0 0 0? 2? 2 0 0 0 2 p? p? 2 24 p p? 3 q 24 3 8 p? p? 2 24 p? p? 2 q 24 3 8 p p? 3 24? p 2? p 24 C A C A

8 Chan & Godsil 3. Walks A walk of length r in a graph X is a sequence x 0 ; : : : ; x r of vertices from X, such that consecutive vertices are adjacent. A walk is closed if its rst and last vertices are equal. In this section we show that the geometry of each eigenpolytope of X is determined by the number of walks of certain types in X. The number of walks W i;j (X; r) of length r from vertex i to vertex j in X is equal to he i ; A r e j i: We will derive an alternative expression for this number. The walk generating function W i;j (X; t) is dened to be the formal sum X r0 W i;j (X; r)t r : The geometry of the eigenpolytopes of X is completely determined by these walk generating functions. Our main goal in this section is to demonstrate this. If is an eigenvalue of A, let E denote orthogonal projection onto the associated eigenspace. For each eigenvalue of A, we have AE = E A = E : (3:) The spectral decomposition of A is equivalent to the identity A = X E ; where we sum over all eigenvalues of A. This is a standard result, of course, which can be found in many texts. (See [2], for example.) From (3.) and the spectral decomposition we obtain, for any non-negative integer r: A r = X r E : Consequently he i ; A r e j i = X r he i ; E e j i: (3:2) As E = U U T, we obtain our next result.

Symmetry and Eigenvectors 9 3. Theorem. For any two vertices i and j in the graph X and any non-negative integer r, we have W i;j (X; r) = X r hu (i); u (j)i; where we sum over all eigenvalues of A. 3.2 Corollary. The inner product hu (i); u (j)i is determined by and the sequence W i;j (X; r) for r = 0; ; : : :. Proof. >From (3.2), we have he i ; p(a)e j i = P p()hu (i); u (j)i, for any polynomial p. Let p (x) := Y 6= (x? ) ; if x = ; (? ) = 0; if x 6=, x an eigenvalue of A, be a product over all eigenvalues of A. Then hu (i); u (j)i = he i ; p (A)e j i and the result follows. The signicance of Corollary 3.2 is that it shows that the geometry of the set of points fu (i) : i 2 V g is determined by and the walk generating functions W i;j (X; t) for all vertices i and j of X. As one useful consequence we note that, if X is a vertex-transitive graph, the walk generating function W i;i (X; t) is independent of the vertex i and therefore the length of the vector u (i) does not depend on i. Thus in this case all points u (i) for i in V lie on a sphere with centre at the origin. The same conclusion will hold more generally for graphs X with the property that the walk generating functions W i;i (X; t) are independent of i. We will call such graphs walk-regular. They need not be vertex-transitive, but nonetheless have a number of properties in common with vertex-transitive graphs. For examples, the Petersdorf-Sachs theorem (Theorem.) holds for walk-regular graphs. For if X is walkregular and is a simple eigenvalue for X then ju (i)j is independent of i, and the rest of the proof follows. (For further information, see [6,].)

0 Chan & Godsil 4. Automorphisms It follows from the spectral decomposition of A that each projection E is a polynomial in A. Therefore each automorphism of X, viewed as a linear mapping, commutes with each projection E. As E = U U T, we thus have ^gu = ^gu U T U = U U T ^gu ; which implies that u (ig) = u (i)u T ^gu : Here U T ^gu is an m m matrix that is easily shown to be orthogonal. Let us denote it by g. The mapping g 7! g is thus a representation of the automorphism group into the group of m m real orthogonal matrices; we obtain one such representation from each eigenspace of X. An automorphism g lies in the kernel of the representation if and only if ^gu = U, i.e., if and only if ^g xes each column of U. It follows that if g is the identity for each eigenvalue then ^g xes every eigenvector of A, which implies that it is the identity mapping on F (V ). Now consider the eigenpolytope of X associated to an eigenvalue of X, with multiplicity m. If m = then this is either a single point or a closed line segment in R. In the rst case the representation of Aut(X) aorded by the eigenspace is trivial (each automorphism is mapped to the identity). In the second case, the image of an automorphism either xes each endpoint of the line segment, and hence induces the identity mapping, or it swaps the two endpoints and its square is the identity. Thus the representation of Aut(X) determines a homomorphism of Aut(X) onto the group of order 2, and therefore Aut(X) contains a subgroup of index 2. If each eigenvalue of X has multiplicity, it follows that Aut(X) must be an elementary abelian 2-group. (It is an interesting problem just how large Aut(X) can be in this case. Note that we saw in Section that the eigenvalues of a vertex-transitive graph cannot all be simple, hence jv j? is an upper bound.) If m = 2 then the eigenpolytope is either a closed line segment or a convex polygon. In the rst case our previous analysis suces, in the second we obtain a representation of Aut(X) as a subgroup of a dihedral group. If each eigenvalue of X has multiplicity at most 2 then Aut(X) is therefore a subgroup of a direct product of dihedral groups. Babai, Grigoryev and Mount [] have proved that if the multiplicity of the eigenvalues are bounded by some constant, then Aut(X) can be computed in polynomial time.

Symmetry and Eigenvectors Equitable Partitions Let C ; : : : ; C m be the orbits of a group of automorphisms of X. Any two vertices in C i must have the same number of neighbours in C j, for all i; j m. Equitable partitions are partitions of V (X) with this regularity property, that are not necessarily orbit partitions. Our treatment in this and the next section follows [5] and [: Chapter 5]. 5. Basics If is a partition of a set, let jj denote the number of cells of. A partition is discrete if its cells are all singletons. Let X be a graph with vertex set V and let be a partition of V with cells C ; : : : ; C m. We say that is equitable if, for each i and j, the number of neighbours in C j of a given vertex u in C i is determined by i and j, and is independent of the choice of u in C i. If G is a subgroup of Aut(X), then the orbits of G form a partition of V with the properties that the subgraph induced by a cell of is regular and the subgraph formed by the edges joining any two cells of is bipartite and semiregular. Hence, the orbits of an automorphism group of X form an equitable partition. However, the converse is not true, Figure 2 shows an example of an equitable partition that is not an orbit partition. 3 Figure 2: An equitable partition that is not an orbit partition The quotient X= of X relative to the equitable partition is the directed graph with the cells C ; : : : ; C m of as its vertices and c i;j arcs from cell i to cell j, where c i;j is the number of vertices in C j adjacent to a given vertex in C i. The adjacency matrix of X= is the m m matrix with ij-entry equal to c i;j. Figure 3 lists four equitable partitions of the Petersen graph with their quotients. If is a partition of V, let F (V; ) denote the space of real functions on V (X) that are constant on the cells of.

2 Chan & Godsil 2 0 3 0 π 0 0 2 2 3 2 2 2 π 2 2 2 π 3 2 2 0 0 2 0 2 2 2 3 π 4 3 Figure 3: Equitable partitions of Petersen graph

Symmetry and Eigenvectors 3 5. Lemma. A subspace U of F (V ) equals to F (V; ) for some partition of V if and only if U is a subalgebra of F (V ) which contains the constant functions. Proof. If a partition of V then F (V; ) is a subspace of F (V ) that contains all the constant functions; it is also closed under multiplication and is thus a subalgebra of F (V ). Conversely, suppose U is a subalgebra of F (V ) which contains the constant functions and with basis fx ; : : : ; x m g. For each vector x i, let i be the partition of V such that vertices u; v belong to the same cell if and only if x i (u) = x i (v). Let be the partition of V whose cells are the non-empty pairwise intersections of the cells of ; : : : m. Let c = bx + b 2 x 2 + : : : + b m x m ; where the real number b is chosen large enough that c takes distinct values on vertices in distinct cells of. Suppose v ; : : : ; v r are vertices in the cells C ; : : : ; C r of respectively, and for k = ; : : : ; r let y k (v) = Y j6=k (c(v)? c(v j )) (c(v k )? c(v j )) = ; if v 2 Ck ; 0; otherwise. The functions y ; : : : ; y r are the characteristic functions of C ; : : : ; C r respectively. They all belong to U because U contains the constant functions c(v j ). Moreover, they are linearly independent in U. Since each cell of is contained in a cell of i, for i = ; : : : ; m, we can express the x i as a linear combination of y ; : : : ; y r. As a result, the y j 's form a basis of U and U is equal to F (V; ). 5.2 Lemma. Let X be a graph and let be a partition of V (X). Then is equitable if and only if F (V; ) is A-invariant. Proof. Suppose that x is the characteristic function of the subset S, a cell of of V (X). If i 2 V then (Ax)(i) equals the number of vertices in S adjacent to i. So Ax is constant on the cells of if and only if the number of neighbours in S of a vertex i is determined by the cell of in which i lies. Now the result follows at once. Equivalently, for undirected graphs, we can dene a partition to be equitable if and only if F (V; ) is A-invariant. For directed graphs, there is a choice of the algebra hai or ha; A T i used to dene equitable partitions. If and are partitions of V and each cell of is contained in a cell of, we say that is a renement of, and write. This shows that the set of all partitions of V forms

4 Chan & Godsil a partially ordered set, but in fact it is even a lattice. The meet ^ of two partitions and is the partition whose cells are all the nonempty pairwise intersections of cells of with cells of. The join _ is most easily dened as the meet of all the partitions such that and. Alternatively, dene a graph with vertex set V, where two vertices are adjacent if and only if they lie in the same cell of or of ; the connected components of this graph are the cells of _. The meet of two equitable partitions need not be equitable. An example is given in Figure 4, which is the meet of 2 and 3 of the Petersen graph in Figure 3. However the join must be. We will derive this from the next result, the proof of which is left as an exercise. Figure 4: Meet of 2 and 3 is not equitable 5.3 Lemma. Let and be partitions of the set V. Then F (V; _ ) = F (V; ) \ F (V; ). 5.4 Corollary (McKay [22, Lemma 5.3]). Let and be equitable partitions of the graph X. Then _ is equitable. Proof. Assume V = V (X). As and are equitable, both F (V; ) and F (V; ) are A- invariant. Now the above lemma implies that F (V; _ ) is A-invariant and so _ is equitable. One important consequence is that any partition has a unique maximal equitable renement it is the join of all the equitable partitions that rene.

Symmetry and Eigenvectors 5 If is a partition of V such that each cell is xed by Aut(X) then the set of equitable renements of is xed by Aut(X), and therefore the coarsest equitable renement of is xed by Aut(X). (For example, we might choose to be the partition of the vertices of X by valency, which is clearly invariant under Aut(X). If 0, the coarsest equitable renement of, is discrete then it follows that Aut(X) must be the identity group.) If is a partition with cells C ; : : : ; C r and v 2 C i, dene the prole of v to be (n ; : : : ; n r ; i), where n j is the number of neighbours v has in C j. Let be the partition such that two vertices are in the same cell if and only if their proles (relative to ) are equal. Then is a renement of which is invariant under any automorphism that xes each cell of. It can be computed from in polynomial time, hence it follows that by repeatedly computing renements in this manner, we can determine 0 in polynomial time. So it is not surprising that equitable partitions play an important role in practical graph isomorphism packages such as nauty. Further details will be found in [7]. 6. Quotients Any subset S of V can be represented by its characteristic function, which takes the value on each element of S and 0 on each element not in S. Because a partition of V is a set whose elements are sets, we can represent a partition by its characteristic matrix, which is the jv j jj matrix with the characteristic functions of the cells of for columns. 6. Lemma. Let A be the adjacency matrix of X and let be a partition of V (X) with characteristic matrix D. Then is equitable if and only if there is a matrix B such that AD = DB. Proof. The column space of D can be identied with F (V; ). Hence is equitable if and only if the column space of D is A-invariant. This holds if and only if, for each column x of D, the vector Ax is a linear combination of columns of D and so can be written as Db for some vector b. The columns of the characteristic matrix of a partition are linearly independent, therefore there is at most one matrix B such that AD = DB. Alternatively, note that D T D is invertible and diagonal with entries being the sizes of the cells, and that B = (D T D)? D T AD, thus is determined by A and D. If is equitable then B must be the adjacency matrix of X=.

6 Chan & Godsil 6.2 Theorem. Let X be a graph with adjacency matrix A and let be an equitable partition of X. Then the characteristic polynomial of A(X=) divides the characteristic polynomial of A. Proof. Suppose that v = jv (X)j and m = jj. Let x ; : : : ; x v be a basis for R v such that x ; : : : ; x m are the distinct columns of the characteristic matrix of and x m+ ; : : : ; x v are each orthogonal to x ; : : : ; x m. Let D = (x ; : : : ; x m ) and D 2 = (x m+ ; : : : ; x v ). Then the matrix representing A relative to this basis is (D T D )? 0 0 (D T 2 D 2 )? D T D T 2 A ( D D 2 ) : Since AD = D B and D T D 2 = 0, the matrix has the block diagonal form B 0 0 Y (6:) where B is the adjacency matrix of X=. Since the characteristic polynomial of the matrix in (6.) is the characteristic polynomial of A, the result follows. Note that it follows from this that, if the characteristic polynomial of A is irreducible over the rationals, then the only equitable partition of X is the discrete partition. Suppose S V (X). The distance partition of X relative to S is the partition with cells C 0 ; : : : ; C r, where C i is the set of vertices in X at distance i from S. The integer r is the covering radius of S. We say that S is a completely regular subset of X if its distance partition is equitable. For example, the leftmost cell of each of the four partitions in Figure 3 are four dierent completely regular subsets of the Petersen graph. The ball of radius r about a vertex u of X consists of all vertices in X at distance at most r from u. The packing radius of S is the maximum integer e such that the balls of radius e about the vertices in S are pairwise disjoint. A subset with packing radius e is also called an e-code. If S has packing radius e and covering radius r then e r. We call S a perfect code if r = e. (This terminology is consistent with the standard usage in coding theory.) The Johnson graph, J(n; k) has as vertices all the k-subsets of a n-set, where two k-subsets are adjacent if and only if they intersect in k? elements. Any single vertex in J(n; k) is a perfect code. Furthermore, when n = 2k and k is odd, any pair of disjoint k-subsets form a perfect (k?) 2 -code. No further perfect codes are known in the Johnson graphs. As a result, Delsarte [8, p55] states that \it is tempting to conjecture that such codes do not exist"; this question is still open.

Symmetry and Eigenvectors 7 6.3 Theorem. Let X be a regular graph. If X has a perfect -code then? is an eigenvalue of A. Proof. Assume that the valency of X is k. Suppose that C is a perfect code in X, and that c = jcj. Let be the partition with C and V n C as its cells. A vertex in C has no neighbours in C and k neighbours in V n C. A vertex in V n C is adjacent to exactly one vertex in C, and to k? vertices in V n C. Therefore is equitable, and the adjacency matrix of the quotient is 0 k k? The characteristic polynomial of this matrix is : x 2? (k? )x? k = (x? k)(x + ); whence Theorem 6.2 implies that? must be an eigenvalue of A.

8 Chan & Godsil Algebras In the next two sections, we derive results concerning some matrix algebras, and apply them to equitable partitions. The following section provides some information about one of these algebras for trees. 7. Algebras As we have seen, nding the equitable partitions of a graph is equivalent to nding the A- invariant subalgebras of F (V ) that contain the constant functions. If Y is such a subalgebra then Y is invariant under all elements of the algebra generated by A. But there is a larger algebra that we can work with. Let J denote the square matrix with all entries equal to (and order determined by the context.) A subspace of F (V ) is J -invariant if and only if it contains all constant functions, and therefore a subspace F (V; ) is invariant under A and J if and only if it is A-invariant. The operator J is a polynomial in A if and only if X is a connected regular graph. (This result was rst observed by A. J. Homan, for a proof see Biggs [3, Prop. 3.2].) Hence, if X is not regular, then ha; J i is strictly larger than hai. In particular, if ha; J i is equal to the algebra of all linear operators on F (V ), then no proper non-zero subspace of F (V ) is ha; J i-invariant, and consequently the only equitable partition of X is the discrete partition. (And X has no non-identity automorphisms.) In fact, under comparatively mild conditions, ha; J i is equal to the algebra of all linear operators on F (V ). We introduce some machinery to let us justify this claim. If S V (X), let J S be the operator on F (V ) dened by (J S f)(i) = P j2s f(j); if i 2 S; 0; otherwise. 7. Theorem. Let A be the adjacency operator of the graph X and let S be a subset of V (X) with characteristic function. Let W be the subspace of F (V ) spanned by the functions A r. Then the algebra generated by A and J S is isomorphic to the direct sum of the algebra of all linear operators on W and the algebra generated by the restriction of A to W?. Proof. The space F (V ) is a direct sum of eigenspaces of A; let ; : : : ; m denote the distinct non-zero projections of on the eigenspaces of A. As the vectors i lie in distinct

Symmetry and Eigenvectors 9 eigenspaces of A, they have distinct eigenvalues and therefore, for each i, there is a polynomial p i such that p i (A) i = i and P p i (A) j = 0 when j 6= i. This also shows that the vectors i all lie in W. Because = i i, we see that W is contained in the span of the vectors i and therefore these vectors form a basis for W. Since T i = i T T i we have p i (A)J S p j (A) r = p i (A) T kj k 2 p j (A) r = i ; if r = j; 0; otherwise. This implies that the m 2 operators p i (A)J S p j (A) span the space of all linear operators on W. As 2 W, each vector in W? is orthogonal to and therefore J S acts on W? as the zero operator. 7.2 Corollary. Let A be the adjacency operator of the graph X, let S be a subset of V (X) with characteristic function and let be the partition of V with S and V n S as its cells. If no eigenvector of A is orthogonal to then the discrete partition is the only equitable renement of. Proof. If W? is non-zero then it contains an eigenvector for A, this eigenvector is necessarily orthogonal to. Thus the hypothesis implies that W = F (V ), and therefore that A and J S generate the algebra of all linear operators on F (V ). Consequently no non-zero proper subspace of F (V ) is invariant under A and J S. This implies that there is no equitable renement of the partition with cells S and V n S. If i 2 V (X) and f is an eigenvector for A such that f(i) = 0 then the restriction of f to V n i is an eigenvector for A(X n i), with the same eigenvalue. Consequently Corollary 7.2 implies that, if A(X) and A(X n i) have no eigenvalue in common, no nonidentity automorphism of X xes i. If A has no eigenvector orthogonal to then Corollary 7.2 implies that the discrete partition is the only equitable partition of X. We leave it as an exercise to show that, if the characteristic polynomial of A is irreducible over the rationals, then A and J have no common eigenvector, and consequently X has no non-identity automorphisms. This is a classical result, rst appearing in [25]. Another proof follows from Theorem 6.2. We present an extension of Theorem 7..

20 Chan & Godsil 7.3 Theorem. Let X be a connected graph and let ; : : : ; m be characteristic vectors of subsets S ; : : : ; S m of V (X). Let W be the space spanned by all vectors of the form A r i. Then the algebra M generated by A and J Si (i = ; : : : ; m) is isomorphic to the direct sum of the algebra of all linear operators on W and the algebra generated by the restriction of A to W?. Proof. Suppose y is an eigenvector for A in W. If T i y = 0 for each i then T i A r y = 0 for each i and so y 2 W? \ W ; hence y = 0. Thus we may assume that T i i. y 6= 0 for some We now show that W is an irreducible M-module. Suppose that y lies in a submodule W of W with i 2 W. As X is connected, for any j there is an integer r such that j T A r i > 0: Consequently j 2 W, for all j, and therefore W = W. Next, let L be a linear endomorphism of W that commutes with each element of M. We aim to show that L is a scalar multiple of the identity map. For any vector, if L T = T L then L and must be parallel; i.e, is a right eigenvector for L. Similarly T and T L are parallel and therefore T is a left eigenvector. If L = c and T L = c T then c T = T L = d T ; whence we see that and T have the same eigenvalue. Suppose now that L i = 0. Because A and L commute we than have LA r i = 0 for all r and therefore, for any j, T j LA r i = 0. Since L commutes with J Sj = j T j, from above, j T L = ct j for some c. It follows that cj T Ar i = 0 for all r, and hence c = 0. Thus we have proved that L i = 0 for all i. Assume next that L i = c i i for i = ; : : : ; m and c i 6= 0 for any i. Then c i T j A r i = T j A r L i = c j T j A r i ; for all values of r. This implies that c i is independent of i; denote its common value by c.

Symmetry and Eigenvectors 2 If y 2 W and Ay = y then Ly = LAy = ALy which shows that each eigenspace of A in W is xed by L. Suppose that y ; : : : ; y r are the non-zero projections of i on the eigenspaces of A in W. Then and so i = y + + y r c i = L i = Ly + + Ly r : >From this it follows that Ly j = cy j for all j. To complete the proof that L is a scalar operator, it suces to show that each eigenspace of A in W is spanned by projections of the vectors i. Assume by way of contradiction that this is false. Then we would have an eigenspace U and a vector u in U orthogonal to the projection of each vector i onto U. This implies that i T u = 0 for all i and hence u = 0. Summing up, we have shown that W is an irreducible M-module, and that the only linear endomorphisms of W that commute with M are the scalar operators. By Schur's theorem [9, Theorem 5.3], it follows that M acts on W as the algebra of all linear endomorphisms. Because W contains each vector i, we see that J Si acts on W? as the zero operator. This completes the proof. The condition that ha; J i is the algebra of all linear operators of F (V ) is clearly very strong. Nonetheless it often holds. Some evidence for this is provided by the following table. We believe that the proportion of graphs on n vertices such that ha; J i is the full matrix algebra tends to as n increases. (This belief is based on the results in the table and on experiments with random graphs on up to 20 vertices.) jv j No. of No. of graphs with No. of asymmetric No. of asymmetric & graphs ha; J i = M nn (R) graphs connected graphs 6 78 4 4 4 7 522 46 76 72 8 6996 370 848 786 9 54354 48457 67502 65726 Figure 5 contains all four graphs of order six with the property that ha; J i generates the algebra of all linear operators of F (V ).

22 Chan & Godsil Figure 5: Graphs of order 6 with ha; J i = M nn (R) The proofs of the main results in this section (Theorem 7. and Theorem 7.3) are based on ideas from Laey [20]. 8. Walks Again Two graphs X and Y are cospectral if A(X) and A(Y ) are similar. The following result implies that cospectral graphs have the same closed walk generating function. 8. Lemma. If A and B are symmetric matrices, then the following are equivalent: i. A and B are similar. ii. A and B have the same characteristic polynomial. iii. tr(a n ) = tr(b n ), for all n 0. We let W (X; t) denote the generating function for walks in the graph X, enumerated by their length. If S V (X) then W S (X; t) is the generating function for the walks in X that start at a vertex in S, enumerated by length. If S and T are subsets of V (X) then W S;T (X; t) is the generating function for the walks in X that start at a vertex in S and nish in T. In particular, W i (X; t) is the generating function for the walks in X that start at i and W i;i (X; t) is the generating function for the walks in X that start and nish at the vertex i. The next two results can be derived from identities in [7]. 8.2 Lemma. Let X and Y be cospectral graphs. Then their complements X and Y are cospectral if and only if W (X; t) = W (Y; t). 8.3 Lemma. Let i and j be vertices in the graph X. Then Xni and Xnj are cospectral if and only if W i;i (X; t) = W j;j (X; t). Their complements are cospectral as well if and only if we also have W i (X; t) = W j (X; t).

Symmetry and Eigenvectors 23 If is the characteristic vector of S and the space spanned by the vectors A r has dimension s +, we say that S has dual degree s. The covering radius of S is the least integer r such that each vertex of X is at distance at most r from some vertex in S. It follows that (I + A) 0 ; : : : ; (I + A) r are linearly independent with increasing number of non-zero entries, thus r s. As the diameter of X is the maximum value for the covering radius of a vertex, this inequality is a generalization of the well known fact that the diameter of X is bounded above by the number of distinct eigenvalues of X, less. The terms covering radius and dual degree come from coding theory, where they refer to the covering radius and dual degree of subsets of the n-cube or,more generally, of the Hamming graph H(n; q). For more information see, e.g., [, Chapter ]. Let S be a subset of V (X) with characteristic function. Dene two vertices i and j of X to be equivalent if (A r )(i) = (A r )(j); r 0: This determines a partition of V, which we call the partition induced by S. The partition induced by V itself is the walk partition, since two vertices i and j are in the same cell if and only the generating function for the walks in X that start at i is equal to the generating function for the walks in X that start at j. The walk partition! is a renement of the partition of V (X) by valency. It is rened in turn by the coarsest equitable partition of X that renes the partition with S and its complement as its cells. This shows that the dual degree of X is a lower bound on the number of cells in this partition. It follows from the proof of Theorem 7. that the A-module generated by is an irreducible module for the algebra ha; J S i. This module has a basis consisting of vectors A r, and so the matrix M representing the action of A on this module is determined by the inner products ha r ; A A s i; r; s 0: As ha r ; A A s i = h; A r+s+ i it follows that M is determined by the numbers T A r, i.e., by the number of walks in X that start in nish in S and have length r, for each non-negative integer r. 8.4 Theorem. Let X be a graph on n vertices and let S be a subset of V (X) with dual degree n?. Then the walk generating functions W i;s (X; t), for i 2 V (X), determine X. Proof. Let be the characteristic function of S. As the dual degree of S is n?, it follows from the proof of Theorem 7. that the matrices A r;s := A r T A s ; r; s n

24 Chan & Godsil form a basis for the algebra of all n n matrices. But A r T A r = A r (A s ) T ; consequently the matrices A r;s are determined by the given walk generating functions. As these matrices form a basis, we can write A as a linear combination of them. The coecients in this linear combination are determined by the inner products ha; A r;s i = ha; A r T A s i = T A r+s+ : Hence these coecients are determined by the walk generating function W S;S (X; t); since this equals P i2s W i;s(x; t), the theorem follows. If W i;s = W j;s then he i? e j ; A r i = 0 for all non-negative integers r. But if the dual degree of S is n? then the vectors A r span R n, consequently in this case e i? e j must zero, and therefore i and j are equal. Note also that W i;s (X; t) is determined by its rst n coecients. We can order the set of all real generating functions by writing F (t) > G(t) if the rst non-zero coecient of F (t)? G(t) is positive. The point of Theorem 8.4 is that, under the given hypothesis, the graph X is determined by the ordered set of walk generating functions. Thus it provides an eective isomorphism test. 9. Trees Let D be the diagonal matrix whose i-th diagonal entry is the valency of the i-th vertex of the graph X. If X is connected then 0 is a simple eigenvalue of A? D with eigenvector ; it follows from the spectral decomposition that the all-ones matrix J is a polynomial in A?D. Hence, when X is connected, the algebra generated by A and J is contained in the algebra generated by A and D. If X is a tree we can say more about the latter algebra. What we do say has not been published before. First we need some more notation. A walk in X is closed if its rst and last vertices are the same; it is irreducible if any two vertices at distance two in the sequence are distinct. If X is a tree then any two vertices are joined by exactly one irreducible walk, and the length of this walk is the distance between the two vertices. Let A be the adjacency matrix of X, and let p r (A) denote the matrix with ij-entry equal to the number of irreducible walks of length r from i to j in X. Dene the generating

Symmetry and Eigenvectors 25 function (X; t) by (X; t) = X r p r (A)t r : Note that p 0 (A) = I and p (A) = A. As a walk of length two is either closed or irreducible, p 2 (A) = A 2? D. Our next result implies that p r (A) is a polynomial in A and D, with degree r in A. 9. Theorem. Let A be the adjacency matrix of X. We have (t 2 (D? I)? ta + I)(X; t) = (? t 2 )I: (9:) Proof. The coecient of t r in (9.) is (D? I)p r?2 (A)? Ap r? (A) + p r (A); with the understanding that r 0 and p r (A) = 0 when r < 0. Suppose that i and j are vertices of X. Assume further that r 3. Then (Ap r? (A)) i;j equals the number of walks of length r from i to j that are formed by a walk of length from i to one of its neighbours, ` say, followed by an irreducible walk of length r? from ` to j. This class of walks includes all irreducible walks of length r from i to j. It also contains the walks that start (i; `; i), followed an irreducible walk of length r? 2 from i to j with second vertex distinct from `, that is, an irreducible walk of length r? 2 from i to j with second vertex not equal to `. The number of these walks is ((D? I)p r?2 (A)) i;j. Accordingly we have proved that the coecient of t r in (t 2 (D? I)? ta + I)(X; t) is zero when r 3. We have (X; t) = I + ta + t 2 (A 2? D) + t 3 M; for some matrix M (with entries formal power series in t). Now (I? ta + t 2 (D? I))(I + ta + t 2 (A 2? D) + t 3 M) = I + t 2 (A 2? D? A 2 + D? I) + t 3 N; where the entries of N are, again, formal power series in t. It follows from this that the left and right sides of (9.) are equal as generating functions.

26 Chan & Godsil 9.2 Corollary. Let A be the adjacency matrix of the graph X, and let D be the diagonal matrix of valencies of X. Then p r (A) is a polynomial in A and D with degree r in A. If X has vertex set V, let X i denote the graph with the same vertex set, where two vertices are adjacent in X i if and only if they are at distance i in X. (Thus X = X.) Let A i be the adjacency matrix of X i, and let A 0 denote the identity matrix of order jv jjv j. If r is greater than the diameter of X then X r has no edges and A r = 0. We call the A i 's the distance matrices of X. 9.3 Corollary. If X is a tree with diameter d then ha; Di contains A 0 ; : : : ; A d. Proof. If X is a tree then two vertices are at distance r if and only if there is a unique irreducible walk of length r joining them. Therefore A r = p r (A) and X r A r t r = (? t 2 )(I? ta + t 2 (D? I))? and therefore the matrices A r are polynomials in A and D. The remaining results in this section have no bearing on the main topic of these notes, but may still be interesting. If X is a graph on n vertices, W (X; t) denotes the nn matrix with ij-entry equal to the generating function for the walks in X from i to j, counted by length. We call W (X; t) the walk generating function of X. It is not hard to show that, if A is the adjacency matrix of X, then W (X; t) = (I? ta)? : Our next result shows that, for regular graphs, (X; t) is determined by W (X; t). 9.4 Corollary. If X is a regular graph with valency k then (X; t) =? t 2 + (k? )t 2 W (X; t( + (k? )t2 )? ): Suppose that X has exactly c connected components. It follows from Theorem 9. that each entry of (X; t) is the product of a formal power series in t with the polynomial? t 2, given this it can be shown that (? t 2 ) c divides det (X; t).

Symmetry and Eigenvectors 27 9.5 Lemma. Let X be a graph. Then det(i? ta + t 2 (D? I)) =? t 2 if and only if X is a tree. Proof. We rst show that the result holds when X is a tree. We proceed by induction on the number of vertices, noting that it can be easily veried when X has two vertices. Suppose X has at least three vertices. Suppose i is a vertex in X with valency, adjacent to a vertex j. with valency at least 2. Let Y be the graph got by deleting i from X. Let D and A be respectively the valency and adjacency matrices of Y. If M := I? ta + t 2 (D? I) then M i;i =, M i;j =?t and M i;r = 0 if r 6= i; j. If we add t times column i of M to column j and delete row and column i, the resulting matrix is I? ta + t 2 (D? I), and its determinant is equal to det(m). Now we prove the converse. By Theorem 9., (X; t) = (? t 2 )(I? ta + t 2 (D? I))? : If det(i?ta+t 2 (D?I)) = (?t 2 ), this implies that the entries of (X; t) are polynomials in t and therefore there is an upper bound on the length of an irreducible walk in X. Consequently X has no cycles. From the rst part of the proof, it follows that det(i? ta + t 2 (D? I)) is (? t 2 ) c, where c is the number of components of X; this forces us to conclude that X is a tree. It is left an exercise to show that det(i? ta + t 2 (D? I)) divides (? t 2 ) n if and only if X is a forest. Finally, we note that dierentiating both sides of (9.) with respect to t yields, after some manipulation, that (I? ta + t 2 (D? I)) 0 (X; t)(i? ta + t 2 (D? I)) = (t 2 + )A? 2tD: Here 0 (X; t) is the derivative of (X; t) with respect to t. If X is a forest then 0 (X; ) is the distance matrix of X.

28 Chan & Godsil Compact Graphs Compact graphs are a class of graphs with the property that, if a non-identity automorphism exists then we can nd one in polynomial time. Equitable partitions provide a useful tool to study them with. Our treatment follows [3], with some adjustments occasioned by interesting work of Evdokimov, M. Karpinski and I. Ponomarenko [9]. 0. A Convex Polytope Let M be a matrix algebra, for example, the adjacency algebra of a graph. We dene S(M) to be the set of all doubly stochastic matrices that commute with each element of M. When M = hai, we will usually write S(A) rather than S(M). The set S(M) is convex and contains the permutation matrices that commute with each element of M. If G and H are two matrices in S(M) and F = tg + (? t)h for some t in the interval [0; ] then F i;j 6= 0 if either G i;j 6= 0 or H i;j 6= 0. From this we see that any permutation matrix in S(M) is an extreme point. Slightly modifying Evdokimov et al, we dene M to be compact if all the extreme points of S(M) are permutation matrices. Note that when M is nitely generated the set S(M) is polyhedral if M is generated by A ; : : : ; A r then S(M) consists of the matrices F such that F 0; A i F = F A i ; i = ; : : : ; r F = ; F T = : As S(M) is clearly bounded, it will therefore be a convex polytope. We say that a graph X is compact if its adjacency algebra is compact. In this case the terminology, and many of the basic results, are due to Tinhofer [30]. Why is this property signicant? One reason is that it is possible to nd extreme points of S(A) in polynomial time. If all extreme points of S(A) belonged to Aut(A) then we would be able to decide, in polynomial time, whether a graph admitted an non-identity automorphism. In general, no such algorithm is known. The complete graphs are compact, a fact which is equivalent to the statement that any doubly stochastic matrix is a convex combination of permutation matrices. Cycles and trees are compact graphs; the disjoint union of two connected non-isomorphic k-regular graphs is not compact. (For proofs of these assertions, see [30].) The complement of a

Symmetry and Eigenvectors 29 compact graph is compact, as is easily shown. The next result provides further evidence that being compact is a strong condition. 0. Lemma. A compact regular graph is vertex-transitive. Proof. Suppose X is compact and regular, with adjacency matrix A. Then jv j J 2 S(A) and, as X is compact, we may assume that J = X i a i P i where a i 0 and P i is an automorphism of X, for all i. Assume 2 V (X). Since all entries in J are equal to it follows that, for each r, there is a permutation matrix in the above sum, P say, such that P ;r =. Hence, for each vertex r of X, there is an automorphism of X that maps to r, and thus Aut(X) acts transitively on V (X) as claimed. Lemma 0. is not the full truth, we have stated it here because the argument used in its proof is important. We now oer a more precise result, from [3], but with a new proof. Note that a permutation group on a set V is generously transitive if, given any two points in X, there is a permutation that swaps them. 0.2 Lemma. If X is regular, connected and compact, then Aut(X) is generously transitive. Proof. Let G = Aut(X). As X is compact, the matrices in G span the same subspace as the matrices in S(A). As X is regular and connected, and matrix which commutes with A commutes with J, and therefore its row and columns sums are all equal. Hence the real span of the matrices in S(A) is equal to C(A), and consequently the centralizer of G in M nn (R) is C(C(A)) = hai, [28, Theorem 39.3], which is commutative. Hence G is multiplicity free, with rank equal to the number of eigenvalues of A [2, Example 2.]. Also, hai is an association scheme since it is generated by A and A = A T, all matrices in hai are symmetric and so all orbitals of G are symmetric. Hence G is generously transitive.

30 Chan & Godsil. Equitable Partitions Let be an equitable partition of X, and let D be the characteristic matrix of. Let P be dened by P = D(D T D)? D T : Then P is symmetric, P 2 = P and P and D have the same column space. Hence P is the matrix representing orthogonal projection onto the column space of D, i.e., onto F (V; ). Note that D T D is a diagonal matrix, with i-th diagonal entry equal to the size of the i-th cell of, and that P is a block-diagonal matrix with each diagonal block a multiple of the `all-ones' matrix. In particular, all entries of P are non-negative. Because 2 F (V; ) it follows that P =, which implies that each row of P sums to. Similarly each column sums to and therefore P is a doubly stochastic matrix.. Lemma. Let X be a graph and let be a partition of V (X). Then is equitable if and only if the matrix P representing orthogonal projection onto F (V; ) lies in S(A). Proof. The entries of P are non-negative, as we saw above. Because 2 F (V; ) it follows that P = and, because P is symmetric, it is therefore a doubly stochastic matrix. The column space of P is F (V; ), which is A-invariant. F (V; ) and therefore P AP = AP. Accordingly Hence each column of AP lies in P A = (AP ) T = (P AP ) T = P AP = AP; which proves that A and P commute and therefore P 2 S(A). Conversely, if P and A commute then AP = AP P = P AP; since P represents the orthogonal projection onto F (V; ), AP 2 F (V; ). As a result, F (V; ) is A-invariant, and by Lemma 5.2, is equitable..2 Lemma. If X is a compact graph, each equitable partition is an orbit partition. Proof. Let be an equitable partition of the compact graph X, and let P be the matrix representing orthogonal projection on F (V (X); ). Then P 2 S(A(X)), and hence P is a convex combination of automorphisms of X. Arguing as in the proof of Lemma 0., it follows that the cells of are the orbits of some group of automorphisms of X.