Real symmetric matrices

1 Eigenvalues and eigenvectors

We use the convention that vectors are row vectors and matrices act on the right.

Let $A$ be a square matrix with entries in a field $F$; suppose that $A$ is $n \times n$. An eigenvector of $A$ is a non-zero vector $v \in F^n$ such that $vA = \lambda v$ for some $\lambda \in F$. The scalar $\lambda$ is called an eigenvalue of $A$.

The characteristic polynomial of $A$ is the polynomial $f_A$ defined by $f_A(x) = \det(xI - A)$. The characteristic equation is the equation $f_A(x) = 0$.

Theorem 1
(a) The eigenvalues of $A$ are the roots of the equation $f_A(\lambda) = 0$.
(b) The matrix $A$ satisfies its own characteristic equation; that is, $f_A(A) = O$, where $O$ is the all-zero matrix.

Part (b) of this theorem is known as the Cayley–Hamilton Theorem.

The minimal polynomial of $A$ is the monic polynomial $m_A$ of least degree such that $m_A(A) = O$. The Cayley–Hamilton Theorem is equivalent to the statement that $m_A(x)$ divides $f_A(x)$. It is also true that $f_A(x)$ and $m_A(x)$ have the same roots (possibly with different multiplicities).

A matrix $D$ is diagonal if all its off-diagonal entries are zero. If $D$ is diagonal, then its eigenvalues are its diagonal entries, and the characteristic polynomial of $D$ is $f_D(x) = \prod_{i=1}^{n} (x - d_{ii})$, where $d_{ii}$ is the $(i,i)$ diagonal entry of $D$.

A matrix $A$ is diagonalisable if there is an invertible matrix $Q$ such that $QAQ^{-1}$ is diagonal. Note that $A$ and $QAQ^{-1}$ always have the same eigenvalues and the same characteristic polynomial.

Theorem 2
The matrix $A$ is diagonalisable if and only if its minimal polynomial has no repeated roots.
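Part (b) of Theorem 1 can be checked numerically. The following sketch (not part of the original notes; it assumes Python with NumPy, and the example matrix is arbitrary) computes the coefficients of $f_A$ and verifies that $f_A(A) = O$.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]

# np.poly(A) returns the coefficients of f_A(x) = det(xI - A),
# highest power first.
coeffs = np.poly(A)

# Evaluate f_A(A) = A^n + c_1 A^(n-1) + ... + c_n I.
fA = sum(c * np.linalg.matrix_power(A, n - i) for i, c in enumerate(coeffs))

print(np.allclose(fA, np.zeros((n, n))))   # True: f_A(A) = O
```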

2 Symmetric and orthogonal matrices

For the next few sections, the underlying field is always the field $\mathbb{R}$ of real numbers.

We use $A^\top$ to denote the transpose of the matrix $A$: that is, the matrix whose $(j,i)$ entry is the $(i,j)$ entry of $A$. The matrix $A$ is called symmetric if $A^\top = A$. The matrix $Q$ is called orthogonal if it is invertible and $Q^{-1} = Q^\top$.

The most important fact about real symmetric matrices is the following theorem.

Theorem 3
Any real symmetric matrix is diagonalisable. More precisely, if $A$ is symmetric, then there is an orthogonal matrix $Q$ such that $QAQ^{-1} = QAQ^\top$ is diagonal.
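As an illustration of Theorem 3, here is a sketch assuming NumPy (not part of the original notes). Note that np.linalg.eigh uses the column-vector convention, returning a matrix whose columns are orthonormal eigenvectors; its transpose is the $Q$ of these notes.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                    # a random real symmetric matrix

eigvals, V = np.linalg.eigh(A)       # columns of V are orthonormal eigenvectors

Q = V.T                              # Q in the row-vector convention of these notes
print(np.allclose(Q @ Q.T, np.eye(4)))              # Q is orthogonal
print(np.allclose(Q @ A @ Q.T, np.diag(eigvals)))   # Q A Q^T is diagonal
```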

The natural setting of this theorem is real inner product spaces, which we now describe.

3 Real inner product spaces

An inner product on the real vector space $V$ is a function from $V \times V$ to $\mathbb{R}$ (we write the inner product of $v$ and $w$ as $v \cdot w$), satisfying the following three conditions:

$v \cdot (a_1 w_1 + a_2 w_2) = a_1 (v \cdot w_1) + a_2 (v \cdot w_2)$;

$w \cdot v = v \cdot w$;

$v \cdot v \ge 0$, and $v \cdot v = 0$ if and only if $v = 0$.

The first two conditions imply
$$(a_1 v_1 + a_2 v_2) \cdot w = a_1 (v_1 \cdot w) + a_2 (v_2 \cdot w).$$
We summarise the first condition together with this consequence by saying that the inner product is bilinear; the second condition says that it is symmetric, and the third that it is positive definite.

For any subspace $W$ of $V$, we write $W^\perp$ for the subspace
$$W^\perp = \{v \in V : v \cdot w = 0 \text{ for all } w \in W\}.$$
This operation (read "perp") on subspaces has the properties:

$V = W \oplus W^\perp$ (this means that $V = W + W^\perp$ and $W \cap W^\perp = 0$);

$\dim(W) + \dim(W^\perp) = \dim(V)$;

if $W_1 \le W_2$, then $W_2^\perp \le W_1^\perp$;

$(W^\perp)^\perp = W$.

Any real inner product space has an orthonormal basis $e_1, \ldots, e_n$. These vectors satisfy $e_i \cdot e_i = 1$ and $e_i \cdot e_j = 0$ for $i \ne j$. Such a basis is constructed by applying the Gram–Schmidt process to an arbitrary basis.
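A minimal sketch of the Gram–Schmidt process (NumPy assumed; the helper name gram_schmidt is mine, and the rows of B are assumed linearly independent):

```python
import numpy as np

def gram_schmidt(B):
    """Rows of B form a basis; returns rows forming an orthonormal basis."""
    basis = []
    for v in B:
        w = v.astype(float).copy()
        for e in basis:
            w -= np.dot(w, e) * e          # remove the component along e
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

B = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
E = gram_schmidt(B)
print(np.allclose(E @ E.T, np.eye(3)))     # e_i . e_j = delta_ij
```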

A linear transformation $A$ of $V$ is said to be self-adjoint if $v \cdot (wA) = (vA) \cdot w$ for all vectors $v, w$. A linear transformation $Q$ which preserves the inner product (in the sense that $(vQ) \cdot (wQ) = v \cdot w$) is called orthogonal.

Now if we represent linear transformations with respect to an orthonormal basis, a transformation is self-adjoint if and only if its matrix is symmetric, and a transformation is orthogonal if and only if its matrix is orthogonal. Moreover, a transformation is orthogonal if and only if it maps an orthonormal basis to an orthonormal basis.

So, in order to prove the spectral theorem, it is enough to prove the following assertion:

Theorem 4
Let $A$ be a self-adjoint transformation of a real inner product space $V$. Then there is a decomposition $V = W_1 \oplus W_2 \oplus \cdots \oplus W_r$, and distinct scalars $\lambda_1, \ldots, \lambda_r$, with the properties
(a) $W_i \perp W_j$ for $i \ne j$;
(b) $vA = \lambda_j v$ for all $v \in W_j$.

To show this, let $\lambda$ be any eigenvalue of $A$, and $W$ the corresponding eigenspace $\{w \in V : wA = \lambda w\}$. We claim that $W^\perp$ is invariant under $A$. For take $v \in W^\perp$; we must show that $vA \in W^\perp$. But, for any $w \in W$, we have
$$(vA) \cdot w = v \cdot (wA) = v \cdot (\lambda w) = \lambda (v \cdot w) = 0;$$
so the assertion is proved. But then the restriction of $A$ to $W^\perp$ is a self-adjoint transformation. By induction on the dimension, there is a decomposition of $W^\perp$ into subspaces with the properties of the theorem. Adding $W$ to this decomposition, we obtain the result.

4 The spectral theorem

Let $W$ be a subspace of the inner product space $V$. Then $V = W \oplus W^\perp$. Define a map $P$ on $V$ by the rule that $(w + x)P = w$, where $w \in W$ and $x \in W^\perp$. (Any vector has a unique representation of this form.) We call $P$ the projection of $V$ onto $W$. Note that the image of $P$ is $W$, and $wP = w$ for all $w \in W$; hence $P^2 = P$. Moreover, it is easy to check that $P$ is self-adjoint.

Conversely, if $P$ is self-adjoint and satisfies $P^2 = P$, then $P$ is the projection of $V$ onto some subspace $W$ (namely, the image of $P$). So these conditions define projections in an intrinsic way. This means that we can recognise the matrix of a projection: it is a matrix satisfying $P^\top = P = P^2$.

The main theorem about real symmetric matrices can be re-phrased in terms of projections. In this form it is often referred to as the spectral theorem.

Theorem 5
Let $A$ be a real symmetric matrix with distinct eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_r$. Then there are projection matrices $P_1, \ldots, P_r$ satisfying
(a) $P_1 + P_2 + \cdots + P_r = I$;
(b) $P_i P_j = O$ for $i \ne j$;
(c) $A = \lambda_1 P_1 + \lambda_2 P_2 + \cdots + \lambda_r P_r$.

We take $P_i$ to be the projection onto the eigenspace $V_i$ associated with $\lambda_i$ (the set of all vectors $v$ satisfying $vA = \lambda_i v$). Since these spaces are pairwise orthogonal and satisfy $V = V_1 \oplus V_2 \oplus \cdots \oplus V_r$, conditions (a) and (b) hold. Part (c) is proved by noting that the two sides agree on any vector in $V_i$, for any $i$, and so agree everywhere.
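The projections of Theorem 5 can be assembled from an orthonormal eigenbasis: in the row convention, $P_i$ is the sum of $v^\top v$ over an orthonormal basis of $V_i$. A sketch, assuming NumPy and grouping numerically equal eigenvalues by rounding (an ad hoc choice that suits this hand-made example):

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])          # eigenvalues 4, 1, 1

eigvals, V = np.linalg.eigh(A)
projections = {}
for lam, v in zip(np.round(eigvals, 8), V.T):
    # P_i = sum of outer products over an orthonormal basis of V_i
    projections[lam] = projections.get(lam, 0) + np.outer(v, v)

Ps = list(projections.values())
print(np.allclose(sum(Ps), np.eye(3)))                                   # (a)
print(all(np.allclose(P @ R, 0) for P in Ps for R in Ps if P is not R))  # (b)
print(np.allclose(sum(l * P for l, P in projections.items()), A))        # (c)
```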

5 Commuting symmetric matrices

There is a remarkable extension of the spectral theorem to sets of symmetric matrices. The essential condition is that the matrices should commute.

Theorem 6
Let $A_1, \ldots, A_s$ be a set of symmetric matrices satisfying $A_i A_j = A_j A_i$ for all $i, j$. Then there is an orthogonal matrix $P$ such that $P A_i P^{-1}$ is diagonal for $i = 1, \ldots, s$. Equivalently, there are projections $P_1, \ldots, P_r$ such that conditions (a) and (b) of Theorem 5 hold, and for $j = 1, \ldots, s$ we have
$$A_j = \sum_{i=1}^{r} \lambda_{ij} P_i$$
for some scalars $\lambda_{ij}$.

In terms of the inner product space, the theorem reads as follows: given a set of $s$ pairwise commuting self-adjoint transformations, $\mathbb{R}^n$ is the orthogonal direct sum of subspaces $V_i$, on each of which all of the transformations act as scalars.

The proof is by induction on $s$, the spectral theorem itself being the case $s = 1$. So suppose that the theorem is true for $s - 1$. Now decompose $\mathbb{R}^n$ into eigenspaces of $A_s$, using the spectral theorem; let $V_j$ correspond to the eigenvalue $\lambda_j$ of $A_s$. We claim that $V_j$ is invariant under $A_1, \ldots, A_{s-1}$. For let $v \in V_j$ and $i \le s - 1$. Then
$$(vA_i)A_s = (vA_s)A_i = (\lambda_j v)A_i = \lambda_j (vA_i),$$
where the first equality holds because $A_i$ and $A_s$ commute. Now this equation shows that $vA_i$ is an eigenvector of $A_s$ with eigenvalue $\lambda_j$; so $vA_i \in V_j$, as claimed.

Now $V_j$ is an inner product space, and the restrictions of $A_1, \ldots, A_{s-1}$ to it are pairwise commuting self-adjoint transformations of it. So we can write $V_j$ as an orthogonal direct sum of subspaces, each of which consists of eigenvectors for $A_1, \ldots, A_{s-1}$ (and also of course for $A_s$, since all vectors in $V_j$ are eigenvectors for $A_s$). The proof is complete.

This result is crucial in the theory of association schemes.
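A numerical sketch of Theorem 6 for $s = 2$ (NumPy assumed, not from the original notes). The commuting pair is built from a shared eigenbasis; a common numerical trick, diagonalising a generic linear combination, then recovers a basis that diagonalises both.

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))    # a random orthogonal matrix
A1 = Q @ np.diag([1.0, 2.0, 3.0, 4.0]) @ Q.T
A2 = Q @ np.diag([5.0, 5.0, 7.0, 9.0]) @ Q.T        # same eigenbasis as A1

print(np.allclose(A1 @ A2, A2 @ A1))                # they commute

# For a generic coefficient t, an eigenbasis of A1 + t*A2 is a common
# eigenbasis of A1 and A2.
_, V = np.linalg.eigh(A1 + np.pi * A2)
for A in (A1, A2):
    D = V.T @ A @ V
    print(np.allclose(D, np.diag(np.diag(D))))      # diagonal in both cases
```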

6 Hermitian, normal and unitary matrices

Although our main interest lies in real symmetric matrices, there is a parallel theory over the complex numbers, which is of great importance. The definition of an inner product on a complex vector space is slightly different:

$v \cdot (a_1 w_1 + a_2 w_2) = a_1 (v \cdot w_1) + a_2 (v \cdot w_2)$;

$w \cdot v = \overline{v \cdot w}$, where the bar denotes complex conjugation;

$v \cdot v$ is a non-negative real number, and is zero if and only if $v = 0$.

The first two conditions imply

$(a_1 v_1 + a_2 v_2) \cdot w = \overline{a_1} (v_1 \cdot w) + \overline{a_2} (v_2 \cdot w)$;

$v \cdot v$ is real for any vector $v$.

As before, a linear transformation $A$ is self-adjoint if $v \cdot (wA) = (vA) \cdot w$ for any $v$ and $w$. We say that $Q$ is unitary if it preserves the inner product, that is, $(vQ) \cdot (wQ) = v \cdot w$ for all $v, w$.

If we take an orthonormal basis for the space, then the matrix representing a self-adjoint transformation is Hermitian, that is, $A^* = A$, where $A^*$ denotes the conjugate transpose of $A$; and the matrix representing a unitary transformation satisfies $Q^* = Q^{-1}$ (we also call such a matrix unitary).

Now the complex spectral theorem asserts that, if $A$ is Hermitian, then there is a unitary matrix $Q$ such that $QAQ^{-1}$ is diagonal. The reformulation in terms of projections and the extension to a set of commuting Hermitian matrices are as in the real case.

Over the real numbers, a matrix which can be diagonalised by an orthogonal matrix must necessarily be symmetric. In the complex case, however, we can go a little further. The matrix $Z$ is said to be normal if it commutes with its conjugate transpose (that is, $ZZ^* = Z^*Z$).

Theorem 7
Let $Z$ be a square matrix over $\mathbb{C}$. Then there exists a unitary matrix $Q$ such that $QZQ^{-1}$ is diagonal if and only if $Z$ is normal.

For any complex matrix $Z$ can be written uniquely in the form $Z = X + iY$, where $X$ and $Y$ are self-adjoint. (This is the analogue of the decomposition of a complex number into real and imaginary parts.) Now $Z$ is normal if and only if $XY = YX$. If this holds, then there is a unitary matrix which diagonalises both $X$ and $Y$, and hence $Z$. The converse is trivial.
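A small sketch of the decomposition $Z = X + iY$ and the normality criterion used in the proof of Theorem 7 (NumPy assumed; the example is built to be normal by taking commuting Hermitian parts):

```python
import numpy as np

X = np.array([[1.0, 1.0],
              [1.0, 0.0]])            # real symmetric, hence Hermitian
Y = X @ X                             # Hermitian and commutes with X
Z = X + 1j * Y                        # so Z should be normal

print(np.allclose(Z @ Z.conj().T, Z.conj().T @ Z))   # Z Z* = Z* Z

# Recover the Hermitian parts from Z alone:
X2 = (Z + Z.conj().T) / 2
Y2 = (Z - Z.conj().T) / 2j
print(np.allclose(X2, X), np.allclose(Y2, Y))
print(np.allclose(X2 @ Y2, Y2 @ X2))                 # XY = YX
```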

7 An application: Moore graphs

Here is an application to show how the theory of real symmetric matrices can be used in combinatorics. This analysis is due to Hoffman and Singleton [1].

Let $\Gamma$ be a connected regular graph with valency $k$, and $v$ a vertex of $\Gamma$. Each neighbour of $v$ is joined to at most $k - 1$ non-neighbours of $v$. So, if $\Gamma$ has diameter 2, then it has at most $1 + k + k(k-1) = k^2 + 1$ vertices altogether, with equality if and only if it has girth 5 (no circuits shorter than 5). Dually, if $\Gamma$ has girth 5, then it has at least $k^2 + 1$ vertices, with equality if and only if it has diameter 2.

A Moore graph of diameter 2 is a graph attaining these bounds, that is, a connected regular graph with diameter 2 and girth 5. It has $n = k^2 + 1$ vertices, where $k$ is the valency. Our question is: which Moore graphs exist?

The adjacency matrix $A$ of such a graph satisfies $A^2 = kI + (J - I - A)$, that is, $A^2 + A - (k-1)I = J$, where $J$ is the all-1 matrix. Thus $J$ commutes with $A$, and so $A$ and $J$ are simultaneously diagonalisable. Now $J$ has the all-1 vector $j$ as an eigenvector with eigenvalue $n$; all its other eigenvalues are zero. Clearly $j$ is an eigenvector of $A$ with eigenvalue $k$, occurring with multiplicity 1. The other eigenvalues of $A$ thus satisfy $\lambda^2 + \lambda - (k-1) = 0$ (since $J$ acts as zero on the orthogonal complement of $j$), so $\lambda = \frac{1}{2}(-1 \pm \sqrt{4k-3})$. If their multiplicities are $f$ and $g$, we have
$$f + g = n - 1 = k^2,$$
$$\tfrac{1}{2}(-1 + \sqrt{4k-3})\,f + \tfrac{1}{2}(-1 - \sqrt{4k-3})\,g = -k,$$
the second equation coming from the fact that $\mathrm{Trace}(A) = 0$ (since all the diagonal entries are zero, the eigenvalues of $A$, including the eigenvalue $k$, sum to zero). From these equations we find that
$$(f - g)\sqrt{4k-3} = k(k-2).$$

If $k = 2$, then $n = 5$ and the unique graph is the pentagon. Suppose not. Then $f - g$ is an integer, so $4k - 3$ must be a perfect square, indeed the square of an odd number; say $4k - 3 = (2s+1)^2$, so $k = s^2 + s + 1$. In addition, $2s + 1$ must divide $k(k-2) = (s^2+s+1)(s^2+s-1)$. It follows that $2s + 1$ divides $3 \cdot 5 = 15$ (working modulo $2s+1$ we have $4k \equiv 3$ and $4(k-2) \equiv -5$, and $2s+1$ is odd), so (since $s > 0$) we have

$2s + 1 = 3, 5, 15$;
$k = 3, 7, 57$;
$n = 10, 50, 3250$.

It is known that there is a unique 3-valent Moore graph on 10 vertices (the Petersen graph) and a unique 7-valent Moore graph on 50 vertices (the Hoffman–Singleton graph). The existence of a 57-valent Moore graph on 3250 vertices is unknown.
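For $k = 3$ the computation can be confirmed directly on the Petersen graph, here built as the Kneser graph on 2-subsets of $\{0, \ldots, 4\}$ (a sketch assuming NumPy, not part of the original notes):

```python
import numpy as np
from itertools import combinations

# Petersen graph: vertices are 2-subsets of {0,...,4}, joined when disjoint.
verts = list(combinations(range(5), 2))
n = len(verts)                                        # 10 vertices
A = np.array([[1.0 if set(u).isdisjoint(v) else 0.0 for v in verts]
              for u in verts])

k = 3
print(np.allclose(A @ A + A - (k - 1) * np.eye(n), np.ones((n, n))))  # = J

eigvals = np.round(np.linalg.eigvalsh(A), 8)
vals, mults = np.unique(eigvals, return_counts=True)
print(dict(zip(vals, mults)))     # expect {-2.0: 4, 1.0: 5, 3.0: 1}
```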

References

[1] A. J. Hoffman and R. R. Singleton, On finite Moore graphs, IBM J. Research Develop. 4 (1960), 497–504.

Peter J. Cameron
May 30, 2003