Strict diagonal dominance and a Geršgorin type theorem in Euclidean

Similar documents
Some inequalities involving determinants, eigenvalues, and Schur complements in Euclidean Jordan algebras

M. Seetharama Gowda. J. Tao. G. Ravindran

Complementarity properties of Peirce-diagonalizable linear transformations on Euclidean Jordan algebras

Key words. Complementarity set, Lyapunov rank, Bishop-Phelps cone, Irreducible cone

The eigenvalue map and spectral sets

Positive and doubly stochastic maps, and majorization in Euclidean Jordan algebras

M. Seetharama Gowda a, J. Tao b & Roman Sznajder c a Department of Mathematics and Statistics, University of Maryland

Permutation invariant proper polyhedral cones and their Lyapunov rank

SCHUR IDEALS AND HOMOMORPHISMS OF THE SEMIDEFINITE CONE

We describe the generalization of Hazan s algorithm for symmetric programming

ON POSITIVE SEMIDEFINITE PRESERVING STEIN TRANSFORMATION

Spectral sets and functions on Euclidean Jordan algebras

A PREDICTOR-CORRECTOR PATH-FOLLOWING ALGORITHM FOR SYMMETRIC OPTIMIZATION BASED ON DARVAY'S TECHNIQUE

On the game-theoretic value of a linear transformation relative to a self-dual cone

Some characterizations of cone preserving Z-transformations

L. Vandenberghe EE236C (Spring 2016) 18. Symmetric cones. definition. spectral decomposition. quadratic representation. log-det barrier 18-1

LINEAR ALGEBRA REVIEW

Elementary linear algebra

Linear Algebra. Workbook

ALGEBRA QUALIFYING EXAM PROBLEMS LINEAR ALGEBRA

On the game-theoretic value of a linear transformation on a symmetric cone

1 Last time: least-squares problems

Linear Algebra: Matrix Eigenvalue Problems

Math Camp Lecture 4: Linear Algebra. Xiao Yu Wang. Aug 2010 MIT. Xiao Yu Wang (MIT) Math Camp /10 1 / 88

Hermitian octonion matrices and numerical ranges

Linear Algebra 1. M.T.Nair Department of Mathematics, IIT Madras. and in that case x is called an eigenvector of T corresponding to the eigenvalue λ.

STRUCTURE AND DECOMPOSITIONS OF THE LINEAR SPAN OF GENERALIZED STOCHASTIC MATRICES

The Q Method for Symmetric Cone Programming

The Eigenvalue Problem over H and O. Quaternionic and Octonionic Matrices

(6) For any finite dimensional real inner product space V there is an involutive automorphism α : C (V) C (V) such that α(v) = v for all v V.

Knowledge Discovery and Data Mining 1 (VO) ( )

Recall the convention that, for us, all vectors are column vectors.

Math 443 Differential Geometry Spring Handout 3: Bilinear and Quadratic Forms This handout should be read just before Chapter 4 of the textbook.

The Jordan Algebraic Structure of the Circular Cone

Trace Inequalities for a Block Hadamard Product

2.1. Jordan algebras. In this subsection, we introduce Jordan algebras as well as some of their basic properties.

Chapter 2 Notes, Linear Algebra 5e Lay

Review of Linear Algebra

Boolean Inner-Product Spaces and Boolean Matrices

Math 315: Linear Algebra Solutions to Assignment 7

Math Bootcamp An p-dimensional vector is p numbers put together. Written as. x 1 x =. x p

Lecture 10 - Eigenvalues problem

Linear Algebra review Powers of a diagonalizable matrix Spectral decomposition

Uniqueness of the Solutions of Some Completion Problems

Lecture Notes 1: Vector spaces

Spectral inequalities and equalities involving products of matrices

Algebra I Fall 2007

MATH 423 Linear Algebra II Lecture 33: Diagonalization of normal operators.

Canonical inner products on C (n) and M(n,K)

Quantum Computing Lecture 2. Review of Linear Algebra

Properties of Matrices and Operations on Matrices

DS-GA 1002 Lecture notes 0 Fall Linear Algebra. These notes provide a review of basic concepts in linear algebra.

MATH 240 Spring, Chapter 1: Linear Equations and Matrices

Matrix Inequalities by Means of Block Matrices 1

Foundations of Matrix Analysis

Introduction to Matrix Algebra

Determining Unitary Equivalence to a 3 3 Complex Symmetric Matrix from the Upper Triangular Form. Jay Daigle Advised by Stephan Garcia

CS 246 Review of Linear Algebra 01/17/19

9.1 Eigenvectors and Eigenvalues of a Linear Map

MATH 23a, FALL 2002 THEORETICAL LINEAR ALGEBRA AND MULTIVARIABLE CALCULUS Solutions to Final Exam (in-class portion) January 22, 2003

Chapter 5 Eigenvalues and Eigenvectors

EE/ACM Applications of Convex Optimization in Signal Processing and Communications Lecture 2

Optimization Theory. A Concise Introduction. Jiongmin Yong

DAVIS WIELANDT SHELLS OF NORMAL OPERATORS

MATH 431: FIRST MIDTERM. Thursday, October 3, 2013.

NONCOMMUTATIVE POLYNOMIAL EQUATIONS. Edward S. Letzter. Introduction

Lie Algebra of Unit Tangent Bundle in Minkowski 3-Space

Ir O D = D = ( ) Section 2.6 Example 1. (Bottom of page 119) dim(v ) = dim(l(v, W )) = dim(v ) dim(f ) = dim(v )

POSITIVE MAP AS DIFFERENCE OF TWO COMPLETELY POSITIVE OR SUPER-POSITIVE MAPS

Review of Linear Algebra Definitions, Change of Basis, Trace, Spectral Theorem

Review problems for MA 54, Fall 2004.

Appendix A: Matrices

Linear Algebra review Powers of a diagonalizable matrix Spectral decomposition

A PRIMER ON SESQUILINEAR FORMS

1 Linear Algebra Problems

CLASSIFICATION OF COMPLETELY POSITIVE MAPS 1. INTRODUCTION

Linear Algebra March 16, 2019

Lecture 1. 1 Conic programming. MA 796S: Convex Optimization and Interior Point Methods October 8, Consider the conic program. min.

Linear algebra 2. Yoav Zemel. March 1, 2012

LINEAR ALGEBRA MICHAEL PENKAVA

Topics in linear algebra

j=1 u 1jv 1j. 1/ 2 Lemma 1. An orthogonal set of vectors must be linearly independent.

The GEOMETRY of the OCTONIONS

Problems in Linear Algebra and Representation Theory

Mathematical foundations - linear algebra

E2 212: Matrix Theory (Fall 2010) Solutions to Test - 1

STAT200C: Review of Linear Algebra

Linear Algebra. Min Yan

Chap 3. Linear Algebra

Lecture 7: Positive Semidefinite Matrices

AN ELEMENTARY PROOF OF THE SPECTRAL RADIUS FORMULA FOR MATRICES

MATHEMATICS 217 NOTES

2. Linear algebra. matrices and vectors. linear equations. range and nullspace of matrices. function of vectors, gradient and Hessian

Jordan Canonical Form of A Partitioned Complex Matrix and Its Application to Real Quaternion Matrices

JORDAN AND RATIONAL CANONICAL FORMS

Lecture notes on Quantum Computing. Chapter 1 Mathematical Background

Jordan-algebraic aspects of optimization:randomization

ELA

Lecture notes: Applied linear algebra Part 1. Version 2

Preliminary/Qualifying Exam in Numerical Analysis (Math 502a) Spring 2012

Transcription:

Strict diagonal dominance and a Geršgorin type theorem in Euclidean Jordan algebras Melania Moldovan Department of Mathematics and Statistics University of Maryland, Baltimore County Baltimore, Maryland 2250 and M. Seetharama Gowda Department of Mathematics and Statistics University of Maryland, Baltimore County Baltimore, Maryland 2250 October 2, 2008 ABSTRACT For complex square matrices, the Levy-Desplanques theorem asserts that a strictly diagonally dominant matrix is invertible. The well-known Geršgorin theorem on the location of eigenvalues is equivalent to this. In this article, we extend the Levy-Desplanques theorem to an object in a Euclidean Jordan algebra when its Peirce decomposition with respect to a Jordan frame is given. As a consequence, we prove a Geršgorin type theorem for the spectral eigenvalues of an object in a Euclidean Jordan algebra. Key Words: quaternions, octonions, Euclidean Jordan algebras, strict diagonal dominance, Geršgorin type theorem E-mail: melania@math.umbc.edu E-mail: gowda@math.umbc.edu

2. Introduction In matrix theory, the well-known Geršgorin theorem [0] asserts that for an n n complex matrix A = [a ij ], the spectrum (consisting of the eigenvalues) of A lies in the union of Geršgorin discs in the complex plane: σ(a) n {z C : z a ii R i (A)}, where n R i (A) := a ij ( i n). j=,j i This is equivalent to the strict diagonal dominance theorem - known as the Levy-Desplanques theorem [0] - which says that if an n n complex matrix A = [a ij ] is strictly diagonally dominant, that is, a ii > R i (A) i =, 2,...,n, () then A is invertible in C n n. In a recent paper [5], Zhang extends the Geršgorin theorem to quaternionic matrices by stating two results, one for left eigenvalues and the other one for right eigenvalues (the difference arising because of non-commutative nature of quaternions). The strict diagonal dominance result extends to quaternionic matrices, since for a quaternionic square matrix A, the following two conditions are equivalent [4]: (a) Ax = 0 x = 0. (b) A is invertible, that is, there is a quaternionic matrix B such that AB = BA = I. It is easily seen (see Section 4) that Zhang s two Geršgorin type results carry over to octonionic matrices. Furthermore, the strict diagonal dominance condition implies condition (a) above and a modified version of (b). Our objective in this paper is to prove analogs of the above results in Euclidean Jordan algebras. More precisely, we show that if (V,,, ) is a Euclidean Jordan algebra of rank r and x = r x i e i + i<j x ij

is the Peirce decomposition of x V with respect to a given Jordan frame {e,...,e r } (see Section 3 for definitions), then the strict diagonal dominance condition i r x i > R i (x) := x ki + x ij i =, 2,..., r 2 ei k= j=i+ implies the invertibility of x in V. Moreover, for any x V, we have σ sp (x) r {λ R : λ x i R i (x)}, where σ sp (x) denotes the set of all spectral eigenvalues (coming from the spectral decomposition) of x in V. As a consequence, we deduce that if each x i is positive and the strict diagonal dominance condition holds, then x is in the interior of the symmetric cone in V. Our analysis is as follows. Since the results for real/complex Hermitian matrices are known, we first prove the strict diagonal dominance result in the matrix algebras of n n quaternion Hermitian matrices, 3 3 octonion Hermitian matrices, and the Jordan spin algebra. Then we use the structure theorem - that any Euclidean Jordan algebra is essentially the product of above mentioned algebras - to cover the general case. From this, we easily deduce the Geršgorin type result mentioned above. As we shall see, the case of 3 3 octonion Hermitian matrices requires special consideration: for such matrices, the spectral eigenvalues can be different from the real left/right eigenvalues and the strict diagonal dominance result requires a a non-standard proof that avoids left/right eigenvalues. Our paper is organized as follows. In Section 2, we describe quaternions, octonions, matrices over these, and some eigenvalues properties. In Section 3, we cover Euclidean Jordan algebra concepts, examples, and all preliminary results. In Section 4, we describe Geršgorin type left/right eigenvalue results for matrices with entries from real numbers/complex numbers/quaternions/octonions. Section 5 covers the strict diagonal dominance results for matrices. In Section 6, we prove the strict diagonal dominance result in Euclidean Jordan algebras. Finally, in Section 7, we prove a Geršgorin type theorem in Euclidean Jordan algebras. 3 2. Square matrices over quaternions and octonions, and their eigenvalues Throughout this paper, we use the standard notations - R for the set of all real numbers and C for the set of all complex numbers. If F denotes the

4 set of all reals/complex numbers/quaternions/octonions, we write F n for the space of all n vectors over F and F n n for the space of all n n matrices over F. 2.. Quaternions The linear space of quaternions - denoted by H - is a 4-dimensional linear space over R with a basis {, i, j, k}. The space H can be made into an algebra by means of the conditions i 2 = j 2 = k 2 = and ijk =. For any x = x 0 + x i + x 2 j + x 3 k H, we define the real part, conjugate, and norm by Re(x) := x 0, x := x 0 x i x 2 j x 3 k, and x := xx = x 2 0 + x2 + x2 2 + x2 3. We have xx = x 2 and xy = x y, for all x, y H. It is known that H is a non-commutative, associative, normed division algebra. Let A H n n. An element λ H is a left (right) eigenvalue of A if there is a nonzero x H n such that Ax = λx (respectively, Ax = xλ). We use the notation σ l (A) (σ r (A)) for the set of all left eigenvalues of A (respectively, the right eigenvalues of A). For a matrix in H n n, we can define the conjugate and transpose in the usual way. We say that a square matrix A with quaternion entries is Hermitian if A coincides with its conjugate transpose, that is, if A = (A) T. We list below some eigenvalue properties of quaternionic matrices. Theorem. Let A H n n. Then (a) The implication [x H n, Ax = 0] x = 0 holds if and only if there is a unique B H n n such that AB = BA = I ([4], Theorem 4.3). (b) The sets σ l (A) and σ r (A) are nonempty ([4], Theorem 5.3 and 5.4). (c) The sets σ l (A) and σ r (A) may be infinite ([5], Example 3.). (d) When A is Hermitian, the right eigenvalues are always real while the left eigenvalues may not be real ([3], Lemma H, [2], Page 98). (e) When A is Hermitian, there exist real eigenvalues λ, λ 2,..., λ n and corresponding eigenvectors v, v 2,..., v n in H n such that

5 v i v j = δ ij ( i, j), n A = λ m v m vm m= n and I = v m vm. m= (Theorem H, [3]). 2.2. Octonions The linear space of octonions over R - denoted by O - is a 8-dimensional linear space with basis {, e, e 2, e 3, e 4, e 5, e 6, e 7 }. The space O becomes an algebra via the following multiplication table on the non-unit basis elements [3]: For an element e e 2 e 3 e 4 e 5 e 6 e 7 e e 3 e 2 e 5 e 4 e 7 e 6 e 2 e 3 e e 6 e 7 e 4 e 5 e 3 e 2 e e 7 e 6 e 5 e 4 e 4 e 5 e 6 e 7 e e 2 e 3 e 5 e 4 e 7 e 6 e e 3 e 2 e 6 e 7 e 4 e 5 e 2 e 3 e e 7 e 6 e 5 e 4 e 3 e 2 e x = x 0 + x e + x 2 e 2 + x 3 e 3 + x 4 e 4 + x 5 e 5 + x 6 e 6 + x 7 e 7 in O, we define the real part, conjugate, and norm by Re(x) := x 0, x = x 0 x e x 2 e 2 x 3 e 3 x 4 e 4 x 5 e 5 x 6 e 6 x 7 e 7, and x := xx. We note that xx = x 2 and xy = x y for all x and y. It is known that O is a non-commutative, non-associative, normed division algebra. In addition, O is an alternative algebra, that is, the subalgebra generated by any two elements in O is associative []. As in the case of quaternionic matrices, one can define left and right eigenvalues for an octonionic matrix. We list below few eigenvalue properties of octonionic matrices. For further details, see [3]. Theorem 2. Let A O n n. Then

6 (a) The implication [x O n, Ax = 0] x = 0 holds if and only if there exist unique B and C in O n n such that AB = CA = I ([3], Lemma 4.4, Theorem 4.3, and Corollary 4.4). (b) The sets σ l (A) and σ r (A) may be infinite. (c) When A is Hermitian (that is, when A = (A) T )), the right eigenvalues may not be real ([3], Page 360). 3. Euclidean Jordan Algebras In this section, we briefly recall concepts, properties/results, and examples from Euclidean Jordan algebra theory. For short introductions, see [8] and []. For complete details, we refer to [6]. A Euclidean Jordan Algebra [6] is a triple (V,,.,. ), where (V,.,. ) is a finite dimensional inner product space over R and (x, y) x y : V V V is a bilinear mapping satisfying the following conditions for all x, y, and z: x y = y x, x (x 2 y) = x 2 (x y), and x y, z = y, x z. In addition, we assume that there is an element e V (called the unit element) such that x e = x, for all x V. An element c V is an idempotent if c 2 = c; it is a primitive idempotent if it is nonzero and cannot be written as a sum of two nonzero idempotents. We say that a finite set {e, e 2,..., e m } of primitive idempotents in V is a Jordan frame if e i e j = 0 if i j and m e i = e. For x V, we define m(x) := min{k > 0 : e, x,..., x k are linearly dependent} and rank of V by r = max{m(x) : x V }. Theorem 3. (Spectral decomposition theorem) Let V be a Euclidean Jordan algebra with rank r. Then for every x V, there exist a Jordan frame {e, e 2,..., e r } and real numbers λ,..., λ r such that x = λ e +... + λ r e r. The numbers λ i are called the spectral eigenvalues of x. (In this paper, we have used the additional word spectral in order to distinguish these eigenvalues from the left/right eigenvalues of matrices.) These numbers are

uniquely defined even though the Jordan frame that corresponds to x may not be unique. Given the spectral eigenvalues of x, we define σ sp (x) := {λ, λ 2,..., λ r }, trace(x) := λ + λ 2 + λ r, and det(x) := λ λ 2 λ r. 7 by Corresponding to an x V, we define the Lyapunov operator L x on V L x (z) := x z. We say that two elements x and y in V operator commute if the corresponding Lyapunov operators L x and L y commute (which can happen if and only if x and y have their spectral decompositions with respect to the same Jordan frame [6]). We say that an element x is invertible in V if all the spectral eigenvalues of x are nonzero. This happens if and only if there is a y in V that operator commutes with x and x y = e. The set of squares K := {x 2 : x V } is (called) a symmetric cone. It is a self-dual closed convex cone. Let {e, e 2,..., e r } be a Jordan frame in a Euclidean Jordan algebra V. For i, j {, 2,..., r}, we define the Peirce eigenspaces V ii := {x V : x e i = x} = Re i and when i j, V ij := { x V : x e i = } 2 x = x e j. Theorem 4. (Theorem IV.2., Faraut and Koranyi [6]) The space V is the orthogonal direct sum of spaces V ij (i j). Thus, given a Jordan frame {e, e 2,...,e r }, we can write any element x V as r x = x ij, x i e i + i<j where x i R and x ij V ij. This expression is the Peirce decomposition of x with respect to {e, e 2,...,e r }.

8 Given the above Peirce decomposition of x, we define the Geršgorin radii of x: i r R i (x) := x ki + x ij, i =, 2..., r. (2) 2 ei k= j=i+ In what follows, we describe some examples of Euclidean Jordan algebras. 3.. Matrix algebras Let F denote any of the spaces R, C, H, and O. A matrix A F n n is said to be Hermitian if A := (A) T = A. Let Herm(F n n ):=set of all n n Hermitian matrices with entries from F. Given X, Y Herm(F n n ), we define X, Y := Re trace(xy ) and X Y := (XY + Y X) 2 where the trace of a matrix is the sum of its diagonal elements. (We note that when X and Y are complex, there is no need to take the real part.) It is known that Herm(R n n ), Herm(C n n ), and Herm(H n n ) are Euclidean Jordan algebras, each of rank n. Moreover, the set {E, E 2,...,E n } is a Jordan frame in each of these algebras where E i is the diagonal matrix with in the (i, i)-slot and zeros elsewhere. It is also known that Herm(O 3 3 ) is Euclidean Jordan algebra of rank 3. Furthermore, the set {E, E 2, E 3 } is a Jordan frame in this algebra. For a matrix X in any one of these algebras, it is easy to write down the Peirce decomposition with respect to {E, E 2,..., E n }. For example, in Herm(O 3 3 ), where X 2 = X = 0 a 0 a 0 0 0 0 0 p a b a q c b c r = p E + q E 2 + r E 3 + X 2 + X 3 + X 23,, X 3 = 0 0 b 0 0 0 b 0 0, and X 23 = Corresponding to this, we have (the Geršgorin radii of X): R (X) = 2 E ( X 2 + X 3 ) = a + b, 0 0 0 0 0 c 0 c 0.

9 R 2 (X) = 2 E2 ( X 2 + X 23 ) = a + c, etc. More generally, for A = [a ij ] Herm(F n n ) (with n = 3 when F = O), it is easily seen that with respect to the Jordan frame {E, E 2,..., E n }, R i (A) := n j=,j i a ij ( i n). (3) 3.2. The Jordan spin algebra L n Consider R n (n > ) where any element x is written as [ ] x0 x = x with x 0 R and x R n. The inner product in R n is the usual inner product. The Jordan product x y in R n is defined by [ ] [ ] [ ] x0 y0 x, y x y = :=. x y x 0 y + y 0 x We shall denote this Euclidean Jordan algebra (R n,,, ) by L n. We note the spectral decomposition of any x with x 0: where Thus, x = λ e + λ 2 e 2 λ := x 0 + x, λ 2 := x 0 x, e := [ ] x, and e 2 := [ 2 x 2 x x det(x) = λ λ 2 = x 2 0 x 2. ]. Now consider any Jordan frame {e, e 2 } in L n. Then there exists a unit vector u R n such that e := [ ] and e 2 u 2 := [ ]. 2 u With respect to this, any x L n has a Peirce decomposition x = x e + x 2 e 2 + x 2

0 where [ 0 x 2 = v ] for some v R n with u, v = 0. (This is easy to verify, see e.g., Lemma 2.3.4, [2].) This leads to [ ] x0 x =, x where x 0 = 2 (x + x 2 ) and x = 2 (x x 2 )u + v. Thus det(x) = x 2 0 x 2 = x x 2 v 2 = x x 2 x 2 2. (4) We finally note that as e = e 2 = 2, the Geršgorin radii of x are given by R (x) = 2 e x 2 = x 2 = R 2 (x). (5) 3.3. Simple algebras and the structure theorem A Euclidean Jordan algebra is said to be simple if it is not the direct product of two (nontrivial) Euclidean Jordan algebras. The classification theorem (Chapter V, Faraut and Koranyi [6]) says that every simple Euclidean Jordan algebra is isomorphic to one of the following: () The Jordan spin algebra L n ; (2) Herm(R n n ); (3) Herm(C n n ); (4) Herm(H n n ); (5) Herm(O 3 3 ). Furthermore, the structure theorem, see (Chapters III and V, Faraut and Koranyi [6]) says that any Euclidean Jordan algebra is a (Cartesian) product of simple Euclidean Jordan algebras.

3.4. Algebra automorphisms Given a Euclidean Jordan algebra V, an invertible linear transformation Λ : V V is said to to be an algebra automorphism if Λ(x y) = Λ(x) Λ(y) x, y V. We need the following results for our later use: () The trace and determinant are invariant under algebra automorphisms. (2) In a simple Euclidean Jordan algebra, every algebra automorphism is orthogonal (that is, it preserves the inner product), see Page 56, [6]. (3) In a simple algebra, any Jordan frame can be mapped onto any other Jordan frame by an algebra automorphism, see Theorem IV.2.5, [6]. 3.5. The algebra Herm(O 3 3 ) The algebra Herm(O 3 3 ) is crucial for our analysis. We collect below some important results that are needed. For A, B Herm(O 3 3 ), the so-called Freudenthal product [3] is defined by A B := A B 2 (Atr(B) + B tr(a)) + (tr(a)tr(b) tr(a B))I, 2 where I is the identity matrix. Recall that for a matrix A Herm(O 3 3 ), det(a) is the product of its spectral eigenvalues. In the result below (which is essentially in [3]), we express this determinant in terms of the entries of A. Lemma 5. Let A Herm(O 3 3 ) be given by A := p a b ā q c, b c r where p, q, r R and a, b, c O. Then det(a) = 3 tr ((A A) A) = pqr + 2Re( b(ac)) r a 2 q b 2 p c 2. (6) Proof. The second equality comes from direct computation, see [3]. In particular, when A is diagonal, the middle expression reduces to the product of the diagonal entries of A.

2 We prove the first equality. By the spectral decomposition theorem (see Section 3), we may write A = λ f + λ 2 f 2 + λ 3 f 3, where λ, λ 2, λ 3 are the spectral eigenvalues of A, and {f, f 2, f 3 } is a Jordan frame in Herm(O 3 3 ). As this algebra is simple, there is an algebra automorphism Λ of Herm(O 3 3 ) that maps {f, f 2, f 3 } to {E, E 2, E 3 }, where E i is a 3 3 matrix with one in the (i, i) slot and zeros elsewhere. Then Λ(A) is a diagonal matrix with λ, λ 2, λ 3 on the diagonal. Since Λ(A B) = Λ(A) Λ(B), Λ(A B) = Λ(A) Λ(B) and trλ(a) = tr(a), we have (from the second equality in (6) applied to Λ(A)), But 3 tr ((Λ(A) Λ(A)) Λ(A)) = λ λ 2 λ 3. 3 tr ((Λ(A) Λ(A)) Λ(A)) = 3 trλ ((A A) A) = tr ((A A) A). 3 Thus, det(a) = λ λ 2 λ 3 = tr ((A A) A) 3 proving the first equality in (6). For objects a, b, c O and for the matrix A given above, we let [a, b] := ab ba, [a, b, c] := (ab)c a(bc), and Φ(a, b, c) := 2 Re ([a, b]c). Also, let s(a) := pq + qr + rp a 2 b 2 c 2. (Recall that tr(a) = p+q+r.) We need the following result from [3] which was verified using Mathematica. Lemma 6 (Lemma O3, [3]) The real eigenvalues of the 3 3 octonion Hermitian matrix A satisfy the modified characteristic equation det(λi A) = λ 3 (tra)λ 2 + s(a)λ det(a) = r where r is either of the two roots of r 2 + 4Φ(a, b, c)r [a, b, c] 2 = 0. Remark. It follows from Lemma 5 that the spectral eigenvalues of A are the roots of det(λi A) = λ 3 (tra)λ 2 + s(a)λ det(a) = 0.

3 4. Geršgorin type theorems for matrices Let F denote any one of the spaces R, C, H, and O. For A = [a ij ] F n n, we let n R i (A) := a ij. j=,j i We define σ l (A) and σ r (A) in the usual way. The following two results are routine generalizations of classical Geršgorin theorem and the Geršgorin type theorems of Zhang [5]. We state them for completeness. Theorem 7. (Geršgorin type theorem for left eigenvalues) For A = [a ij ] F n n, we have σ l (A) n {λ F : λ a ii R i (A)}. Proof. Suppose λ σ l (A) and 0 x F n with Ax = λx. Let x i := max j n x j (which is nonzero). Then (Ax) i = λx i implies (λ a ii )x i = Since ab = a b in F, we have n j i,j= a ij x j. λ a i x i R i (A) x i. Thus λ x i R i (A) proving the result. In what follows, we say that elements µ and λ in F are similar (and write µ λ) if there is a nonzero z F such that µ = zλz. (Note that zλz is well defined even in O because of the alternative property.) We state the following result without proof as it is similar to that of Theorem 7 in [5]. Theorem 8. (Geršgorin theorem for right eigenvalues) Let A = [a ij ] F n n. Then for every right eigenvalue λ of A there exists µ F, µ λ such that n µ {γ F : γ a ii R i (A)}.

4 5. Strict diagonal dominance in F n n Let F be as in the previous section. For a matrix A = [a ij ] F n n, we say that A is strictly diagonally dominant if a ii > R i (A) i =, 2,..., n. Theorem 9. For A = [a ij ] F n n, consider the following statements: () A is strictly diagonally dominant. (2) The implication [x F n, Ax = 0] x = 0 holds. (3) There exist unique matrices B and C in F n n (which are equal when A is defined over F = H) such that AB = I = CA. (4) A is invertible in the Euclidean Jordan algebra Herm(H n n ). (5) A is invertible in the Euclidean Jordan algebra Herm(O 3 3 ). Then we have the following implications: () (2) (3), (3) (4) when A Herm(H n n ), and () (5) when A Herm(O 3 3 ). Proof. The implication () (2) follows immediately from Theorem 7. The equivalence of (2) and (3) is obvious when F is R or C, follows from Theorem for F = H, and from Theorem 2 for F = O. Now assume that A belongs to Herm(H n n ). (3) (4): When (3) holds, there exists a unique matrix B H n n such that AB = BA = I. By uniqueness of B, we see that B = B, which means that B is Hermitian. To prove that B is the inverse of A in the the algebra Herm(H n n ), we need only to show that A and B operator commute, that is, L A L B = L B L A, where L A (X) := AX+XA 2 (X Herm(H n n )), etc. This easily follows due to the associativity in H. (4) (3): As A Herm(H n n ), by Theorem (e), A can be expanded as A = n λ m v m vm, (7) m=

where {v m : m =,...,n} is an orthonormal basis of eigenvectors of A, with real eigenvalues λ m. In view of the properties of v m, the set {v v,...,v nv n } is a Jordan frame in Herm(Hn n ). This means that (7) is the spectral decomposition of A in Herm(H n n ). Now suppose condition (4) holds. Then each λ m is nonzero. Now define B := n m= λ m v m v m. Then, due to properties of v m and associativity in H, we have AB = BA = I; hence (3) holds. We remark that it is possible to prove the implication (4) (3) without using (7). For example, we can show that AB = BA = I when B is the inverse of A in Herm(H n n ), i.e, when B operator commutes with A and A B = I. Finally, assume that A Herm(O 3 3 ). () (5): Let A be strictly diagonally dominant. As O is non-associative, the argument of (3) (4) cannot be used here. So, we offer a different proof. Let A = p a b ā q c, b c r where p, q, r R and a, b, c O. Next, suppose that A is not invertible in Herm(O 3 3 ) which means that one of the spectral eigenvalues of A is zero, that is, det(a) = 0, see Lemma 5. Thus (from (6)), This implies that 0 = deta = pqr + 2Re( b(ac)) r a 2 q b 2 p c 2. pqr = 2Re( b(ac)) + r a 2 + q b 2 + p c 2 2 a b c + r a 2 + q b 2 + p c 2, hence p q r 2 a b c ( r a 2 + q b 2 + p c 2 ) 0. Now, as A is strictly diagonally dominant, the matrix p a b B := a q c b c r is a real symmetric strictly diagonally dominant matrix with a positive diagonal. By a well-known matrix theory result (see [0], Theorem 6..0) B is positive definite and hence det B > 0. 5

6 Therefore, p q r 2 a b c ( r a 2 + q b 2 + p c 2 ) > 0 which is clearly a contradiction. Hence A is invertible in Herm(O 3 3 ). Remark 2. The following example shows that the implication (2) (5) fails for octonion matrices. In Herm(O 3 3 ), let A = 3 e2 e 6 e 2 3 e e 6 e 3 Then, using (6) and the multiplication table for O, det(a) = 0, and so zero is a spectral eigenvalue of A. This means that A is not invertible in the algebra Herm(O 3 3 ). We claim that zero is not a left/right eigenvalue of A. Assumming the contrary, by Lemma 6, λ = 0 must satisfy. det(λi A) = λ 3 (tr(a))λ 2 + s(a)λ deta = r where r is either of the two roots of r 2 + 4Φ(e 2, e 6, e )r [e 2, e 6, e ] 2 = 0, with s(a) and Φ previously defined. Thus, 0 = det(a) = r. Now, [e 2, e 6, e ] 2 = 2e 5 2 = 4 0; hence r 0, leading to a contradiction. Thus, zero is not a real eigenvalue of A, even though, it is a spectral eigenvalue of A. In particular, we have Ax = 0 x = 0. Remark 3. In the context of Herm(R n n ) or Herm(C n n ) matrices, it is well known that if X and Y are positive semidefinite matrices, then X Y = 0 X, Y = 0 XY = 0. (8) In this remark, we will demonstrate that these equivalences continue to hold in Herm(H n n ), but that the second equivalence fails in Herm(O 3 3 ). It is known that in any Euclidean Jordan algebra V with corresponding symmetric cone K, the following two statements are equivalent, see [8], Proposition 6: (i) x K, y K, and x y = 0.

7 (ii) x K, y K, and x, y = 0. Moreover, in each case, the objects x and y operator commute. Thus, to see (8) in Herm(H n n ) (or for that matter, in Herm(R n n ) or Herm(C n n )), it is enough to show that X and Y positive semidefinite in Herm(H n n ), X, Y = 0 XY = 0. In view of the operator commutativity and the spectral decomposition theorem, this reduces to showing: If F and F 2 are two primitive idempotents in Herm(H n n ) with Re tr(f F 2 ) = 0, then F F 2 = 0. Now if F and F 2 are two primitive idempotents in Herm(H n n ), then as in (7) we can expand F and F 2 using their eigenvalues and eigenvectors: F = vv and F 2 = ww, where v and w are unit quaternion vectors. If Re tr(f F 2 ) = 0, then Re tr(vv ww ) = 0. Putting c := v w, expanding tr(vv ww ) as a sum and using the fact that Re(ab ba) = 0 for any two quaternions, we see that tr(vv ww ) = cc. Thus, 0 = Re(cc) and so v w = c = 0. From this, we get F F 2 = vv ww = 0. Thus we have (8) for quaternion Hermitian matrices. Now we claim that the second equivalence in (8) fails for octonions. Consider the matrix A given in the previous example. We write the spectral decomposition for this A: A = 0 F + λ 2 F 2 + λ 3 F 3 where {F, F 2, F 3 } is a Jordan frame in Herm(O 3 3 ) and σ sp (A) = {0, λ 2, λ 3 }. We claim that both F 2 F and F 3 F cannot be zero simultaneously. Assuming the contrary, we have F 2 F = 0 and F 3 F = 0; hence AF = 0. Now if u is any column of F, then Au = 0. By the known property of A (see the end of previous remark), we must have u = 0 proving F = 0. But this is a contradiction as F is a primitive idempotent and hence cannot be zero. Remark 4. The following example shows that the implication (5) (2) in Theorem 9 need not be true. Let A := e 2 e 6 e 2 e e 6 e

8 Then x := + e + e 2 + e 3 + e 4 e 5 e 6 e 7, x 2 := 0, and x 3 := + e e 2 e 3 + e 4 e 5 + e 6 + e 7. A x x 2 x 3 = 0, hence 0 is a left/right eigenvalue of A. By the modified characteristic equation in Lemma 6, we get det(a) = r. Solving for r from r 2 + 4Φ(e 2, e 6, e )r [e 2, e 6, e ] 2 = 0, we get r = ±2 and so det(a) 0. Hence 0 is a not a spectral eigenvalue of A. Examples in Remarks and 4 show that for matrices in Herm(O 3 3 ), the spectral eigenvalues and real left/right eigenvalues of A can be different. 6. Strict diagonal dominance in Euclidean Jordan algebras Theorem 0. Let (V,,, ) be any Euclidean Jordan algebra of rank r and r x = x i e i + i<j be the Peirce decomposition of x V with respect to a given Jordan frame {e,..., e r }. If x is strictly diagonally dominant, that is, if i r x i > R i (x) := x ki + x ij i =, 2,..., r, 2 ei then x is invertible in V. k= x ij j=i+ Proof. We first suppose that V is simple. Case : Let V is one of the matrix algebras. We note that if the Peirce decomposition of x is strictly diagonally dominant with respect to the Jordan frame {e, e 2,..., e r }, then for any algebra automorphism Λ on V, the Perice decomposition of Λ(x) is strictly diagonally dominant with respect to {Λ(e ), Λ(e 2 ),..., Λ(e r )} (as any algebra automorphism on a simple algebra is orthogonal, see Section 3.4). As V is simple, any Jordan frame can be mapped onto another (see Section 3.4). Hence we assume, without loss of generality, that the Jordan frame is the canonical one given by

{E, E 2,...,E r } where E i is the matrix with one in the (i, i) slot and zeros elsewhere. Now if x is strictly diagonally dominant with respect to this Jordan frame, we can apply Theorem 9 above and get the invertibility. Case 2: Now assume that V = L n. Let x = x e + x 2 e 2 + x 2 be the Peirce decomposition of x with respect to a Jordan frame {e, e 2 }. Given x > R (x), x 2 > R 2 (x), we have to show that x is invertible in L n. Now computation in (5) shows that R (x) = x 2 = R 2 (x). Also, from (4), det(x) = x x 2 x 2 2. We see that det(x) 0 proving the invertibility of x. Thus, we have proved the invertibility of x when V is one of the standard simple algebras. Note that the result continues to hold in each of these standard algebras when we change the inner product to a constant multiple of the trace inner product. (The reason being that the Peirce decomposition remains the same except that the norms of objects get multiplied by a constant factor.) Now, using the classification theorem (see Section 3) and the fact that in any simple algebra, the inner product is a multiple of the trace product (see Prop. III.4. in [6]), we can prove our result in any simple Euclidean Jordan algebra. Now let V be any Euclidean Jordan algebra. By the structure theorem, we can write V = V V 2 V k where each V i is simple. For notational simplicity, we let k = 2 and put r = rank(v ), r 2 = rank(v 2 ). We regard any element of V as a column vector with two components, the first component belonging to V and the second component belonging to V 2. If c is any primitive idempotent in V, then exactly one component of c is nonzero and this nonzero component is a primitive idempotent in the corresponding component algebra. By rearranging the elements, we may write {e, e 2,..., e r } = {[ g 0 ] [ g2, 0 ],..., [ gr 0 ] [ 0, h 9 ] [ ]} 0,...,, h r2 where {g, g 2,...,g r } is a Jordan frame in V and {h, h 2,..., h r2 } is a Jordan frame in V 2. Now writing the given element x as a column vector with two components u V and v V 2, we may write the Peirce decomposition of x in the form r [ gi x = u i 0 ] + i<j r [ uij 0 ] + r 2 v i [ 0 h i ] + i<j r 2 [ 0 v ij ],

20 where we have [ used ] [ the ] fact that the Peirce space V ij with respect to gi 0 any pair {, } is zero. The strict diagonal dominance of x 0 hj now implies that u and v are strictly diagonally dominant with respect to {g, g 2,...,g r } in V and {h, h 2,..., h r2 } in V 2. By our previous arguments, u and v are invertible in V and V 2 respectively. It follows that x is invertible in V. This concludes the proof of the theorem. 7. A Geršgorin type theorem in Euclidean Jordan algebras Theorem. Let V be a Euclidean Jordan algebra of rank r and x = r x i e i + i<j be the Peirce decomposition of x V with respect to a given Jordan frame {e,..., e r }. Then where R i (x) := σ sp (x) 2 ei x ij r {λ R : λ x i R i (x)}, i x ki + k= r j=i+ x ij i =, 2,..., r. Moreover, if a union of k Geršgorin intervals forms an interval that is disjoint from the remaining n k Geršgorin intervals, then there are precisely k spectral eigenvalues of x in this interval. Note. It is possible to say precisely which k spectral eigenvalues lie in the union of k Geršgorin intervals, see the proof below. Proof. Suppose that the stated inclusion fails, so that there exists a λ σ sp (x) such that λ x i > R i (x), for all i =,..., r. Then y := x λe has the Peirce decomposition r y = i λ)e i + (x i<j and hence is a strictly diagonally dominant element of V. By Theorem 0, y is invertible. Now let x = λ f +... + λ r f r x ij

be the spectral decomposition of x, where {f,..., f r } is a Jordan frame. Then y = (λ λ)f +... + (λ r λ)f r is the spectral decomposition of y. As λ σ sp (x) = {λ, λ 2,..., λ r }, λ i = λ, for some i. It follows that zero is a spectral eigenvalue of y which means that y is not invertible. This is a contradiction. Hence we have the spectral inclusion. Now for the second part of the theorem. Its proof, as in the classical case of complex matrices (see [0], Page 345), relies on continuity of eigenvalues. First suppose that V is simple. Define x(ε) := r x i e i + ε i<j with ε [0, ]. Note that x() = x and x(0) = r x ie i. Also, R i (x(ε)) R i (x) for each i and so the spectrum of x(ε) is contained in the union of Geršgorin intervals of x. Now we consider the decreasing rearrangement of spectral eigenvalues of x(ε): λ (x(ε)) := λ (x(ε)) λ 2 (x(ε)).. λ r (x(ε)) where λ (x(ε)) λ 2 (x(ε)) λ r (x(ε)). In particular, for ε = 0, x λ x 2 (x(0)) =.. In view of the continuity of λ (x(ε)) in ε (see e.g., Theorem 9 in [9]) each of the spectral eigenvalue curves joining x i and λ i (x) lies in the union of all Geršgorin intervals of x. Now consider the union of k Geršgorin intervals that form an interval (i.e., a connected set) which is disjoint from other Geršgorin intervals of x. Corresponding to the center, say, x i of a Geršgorin interval that is contained in this union, the other end of the spectral eigenvalue curve, namely, λ i (x) must also be in this union. Even the converse x r x ij, 2

22 statement holds. Thus there are exactly k eigenvalues of x that lie in this union. Now let V be a general Euclidean Jordan algebra and let k Geršgorin intervals of x form an interval that is disjoint from other Geršgorin intervals of x. Define x(ε) as in the previous case. Suppose, without loss of generality, x is the center of one of the Geršgorin intervals in this union. Then the associated primitive idempotent e (in the Peirce decomposition of x with respect to {e, e 2,..., e r }) belongs to a unique factor (simple) algebra, say, V of V. Using the continuity of spectral eigenvalues in simple algebras (as observed above), we can conclude that the spectral eigenvalue curve joining x and one of the spectral eigenvalues of x lies in this union. Conversely, each spectral eigenvalue of x that lies in this union connects to one of the centers that lies in the union. Because of this one-to-one correspondence, we see that there are exactly k spectral eigenvalues of x lying in the union. This completes the proof. Since an object x in a Euclidean Jordan algebra belongs to the interior of the symmetric cone if and only if all its spectral eigenvalues are positive, the following result is an immediate consequence of the above theorem. Corollary 2. If in the above theorem x is strictly diagonally dominant with respect to some Jordan frame and the diagonal elements x i are positive, then x is in the interior of the symmetric cone. REFERENCES J.C. Baez, The octonions, Bulletin of American Mathematical Society, 39 (2002) 45-205. 2 T. Dray, J. Janesky, C.A. Manogue, Octonionic Hermitian matrices with non-real eigenvalues, Adv. Appl. Clifford Algebras, 0 (2000) 93-26. 3 T. Dray and C.A. Manogue, The octonionic eigenvalue problem, Adv. Appl. Clifford Algebras, 8 (998) 34-364. 4 T. Dray and C.A. Manogue, The exceptional Jordan eigenvalue problem, Internat. J. Theoret. Phys. 38 (999) 290 296. 5 T. Dray, C.A. Manogue, and S. Okubo, Orthogonal eigenbases over the octonions, Algebras Groups Geom. 9 (2002) 63-80. 6 J. Faraut and A. Korányi, Analysis on Symmetric Cones, Clarendon Press, Oxford, 994.

7 H. Freudenthal, Lie groups in the foundations of geometry, Adv. Math., (964) 45-90. 8 M.S. Gowda, R. Sznajder, and J. Tao, Some P-properties for linear transformations on Euclidean Jordan algebras, Linear Alg. Appl., 393 (2004) 203-232. 9 M.S. Gowda, J. Tao, and M. Moldovan, Some inertia theorems in Euclidean Jordan algebras, Research Report, Department of Mathematics and Statistics, University of Maryland, Baltimore County, Baltimore, MD 2250, February 2008 (Revised Sept. 2008). 0 R.A. Horn and C.R. Johnson, Matrix Analysis, Cambridge University Press, Cambridge, 985. S.H. Schmieta and F. Alizadeh, Extension of primal-dual interior point algorithms to symmetric cones, Math. Prog. Series A 96 (2003) 409-438. 2 J. Tao, Some P-Properties for Linear Transformations on the Lorentz Cone, PhD thesis, UMBC, 2004. 3 Y. Tian, Matrix representations of octonions and their applications, Adv. Appl. Clifford Algebras, 0 (2000) 6-90. 4 F. Zhang, Quaternions and matrices of quaternions, Linear Algebra Appl., 25 (997) 2-57. 5 F. Zhang, Geršgorin type theorem for quaternionic matrices, Linear Algebra Appl., 424 (2007) 39-53. 23