# Strict diagonal dominance and a Geršgorin type theorem in Euclidean Jordan algebras


Melania Moldovan and M. Seetharama Gowda, Department of Mathematics and Statistics, University of Maryland, Baltimore County, Baltimore, Maryland 21250. October 2, 2008.

ABSTRACT. For complex square matrices, the Lévy–Desplanques theorem asserts that a strictly diagonally dominant matrix is invertible. The well-known Geršgorin theorem on the location of eigenvalues is equivalent to this. In this article, we extend the Lévy–Desplanques theorem to an object in a Euclidean Jordan algebra when its Peirce decomposition with respect to a Jordan frame is given. As a consequence, we prove a Geršgorin type theorem for the spectral eigenvalues of an object in a Euclidean Jordan algebra.

Key Words: quaternions, octonions, Euclidean Jordan algebras, strict diagonal dominance, Geršgorin type theorem

1. Introduction

In matrix theory, the well-known Geršgorin theorem [10] asserts that for an $n \times n$ complex matrix $A = [a_{ij}]$, the spectrum (consisting of the eigenvalues) of $A$ lies in the union of Geršgorin discs in the complex plane:

$$\sigma(A) \subseteq \bigcup_{i=1}^{n} \{z \in \mathbb{C} : |z - a_{ii}| \le R_i(A)\}, \quad \text{where} \quad R_i(A) := \sum_{j=1,\, j \ne i}^{n} |a_{ij}| \quad (1 \le i \le n).$$

This is equivalent to the strict diagonal dominance theorem, known as the Lévy–Desplanques theorem [10], which says that if an $n \times n$ complex matrix $A = [a_{ij}]$ is strictly diagonally dominant, that is,

$$|a_{ii}| > R_i(A) \quad (i = 1, 2, \ldots, n), \tag{1}$$

then $A$ is invertible in $\mathbb{C}^{n \times n}$. In a recent paper [15], Zhang extends the Geršgorin theorem to quaternionic matrices by stating two results, one for left eigenvalues and the other for right eigenvalues (the difference arising because of the non-commutative nature of quaternions). The strict diagonal dominance result extends to quaternionic matrices, since for a quaternionic square matrix $A$, the following two conditions are equivalent [14]:

(a) $Ax = 0 \Rightarrow x = 0$.

(b) $A$ is invertible, that is, there is a quaternionic matrix $B$ such that $AB = BA = I$.

It is easily seen (see Section 4) that Zhang's two Geršgorin type results carry over to octonionic matrices. Furthermore, the strict diagonal dominance condition implies condition (a) above and a modified version of (b). Our objective in this paper is to prove analogs of the above results in Euclidean Jordan algebras. More precisely, we show that if $(V, \circ, \langle \cdot, \cdot \rangle)$ is a Euclidean Jordan algebra of rank $r$ and

$$x = \sum_{i=1}^{r} x_i e_i + \sum_{i<j} x_{ij}$$

is the Peirce decomposition of $x \in V$ with respect to a given Jordan frame $\{e_1, \ldots, e_r\}$ (see Section 3 for definitions), then the strict diagonal dominance condition

$$|x_i| > R_i(x) := \frac{1}{\sqrt{2}\,\|e_i\|}\left(\sum_{k=1}^{i-1} \|x_{ki}\| + \sum_{j=i+1}^{r} \|x_{ij}\|\right) \quad (i = 1, 2, \ldots, r)$$

implies the invertibility of $x$ in $V$. Moreover, for any $x \in V$, we have

$$\sigma_{sp}(x) \subseteq \bigcup_{i=1}^{r} \{\lambda \in \mathbb{R} : |\lambda - x_i| \le R_i(x)\},$$

where $\sigma_{sp}(x)$ denotes the set of all spectral eigenvalues (coming from the spectral decomposition) of $x$ in $V$. As a consequence, we deduce that if each $x_i$ is positive and the strict diagonal dominance condition holds, then $x$ is in the interior of the symmetric cone in $V$.

Our analysis is as follows. Since the results for real/complex Hermitian matrices are known, we first prove the strict diagonal dominance result in the matrix algebras of $n \times n$ quaternion Hermitian matrices, $3 \times 3$ octonion Hermitian matrices, and the Jordan spin algebra. Then we use the structure theorem, that any Euclidean Jordan algebra is essentially the product of the above mentioned algebras, to cover the general case. From this, we easily deduce the Geršgorin type result mentioned above. As we shall see, the case of $3 \times 3$ octonion Hermitian matrices requires special consideration: for such matrices, the spectral eigenvalues can be different from the real left/right eigenvalues, and the strict diagonal dominance result requires a non-standard proof that avoids left/right eigenvalues.

Our paper is organized as follows. In Section 2, we describe quaternions, octonions, matrices over these, and some eigenvalue properties. In Section 3, we cover Euclidean Jordan algebra concepts, examples, and all preliminary results. In Section 4, we describe Geršgorin type left/right eigenvalue results for matrices with entries from real numbers/complex numbers/quaternions/octonions. Section 5 covers the strict diagonal dominance results for matrices. In Section 6, we prove the strict diagonal dominance result in Euclidean Jordan algebras.
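The classical facts recalled above are easy to check numerically. The sketch below (a hypothetical $3 \times 3$ complex matrix; NumPy assumed) verifies strict diagonal dominance, the Lévy–Desplanques conclusion, and the Geršgorin disc inclusion:

```python
import numpy as np

# Hypothetical 3x3 complex matrix, chosen to be strictly diagonally dominant:
# |a_ii| exceeds the i-th off-diagonal absolute row sum R_i(A).
A = np.array([[5.0 + 0j, 1.0, 1.0 - 1j],
              [0.5j,     4.0, 1.0],
              [1.0,      2.0, 6.0 + 1j]])

def gershgorin_radii(A):
    """R_i(A) = sum of |a_ij| over j != i."""
    return np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))

R = gershgorin_radii(A)

# Strict diagonal dominance: |a_ii| > R_i(A) for every i.
assert np.all(np.abs(np.diag(A)) > R)

# Levy-Desplanques: such a matrix is invertible (nonzero determinant).
assert abs(np.linalg.det(A)) > 1e-10

# Gershgorin: every eigenvalue lies in some disc {z : |z - a_ii| <= R_i(A)}.
for lam in np.linalg.eigvals(A):
    assert any(abs(lam - A[i, i]) <= R[i] + 1e-12 for i in range(3))
```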
Finally, in Section 7, we prove a Geršgorin type theorem in Euclidean Jordan algebras.

2. Square matrices over quaternions and octonions, and their eigenvalues

Throughout this paper, we use the standard notations: $\mathbb{R}$ for the set of all real numbers and $\mathbb{C}$ for the set of all complex numbers. If $\mathbb{F}$ denotes the

set of all reals/complex numbers/quaternions/octonions, we write $\mathbb{F}^n$ for the space of all $n$-vectors over $\mathbb{F}$ and $\mathbb{F}^{n \times n}$ for the space of all $n \times n$ matrices over $\mathbb{F}$.

2.1. Quaternions

The linear space of quaternions, denoted by $\mathbb{H}$, is a 4-dimensional linear space over $\mathbb{R}$ with a basis $\{1, i, j, k\}$. The space $\mathbb{H}$ can be made into an algebra by means of the conditions

$$i^2 = j^2 = k^2 = -1 \quad \text{and} \quad ijk = -1.$$

For any $x = x_0 + x_1 i + x_2 j + x_3 k \in \mathbb{H}$, we define the real part, conjugate, and norm by

$$\mathrm{Re}(x) := x_0, \quad \overline{x} := x_0 - x_1 i - x_2 j - x_3 k, \quad \text{and} \quad |x| := \sqrt{x\overline{x}} = \sqrt{x_0^2 + x_1^2 + x_2^2 + x_3^2}.$$

We have $x\overline{x} = |x|^2$ and $|xy| = |x|\,|y|$ for all $x, y \in \mathbb{H}$. It is known that $\mathbb{H}$ is a non-commutative, associative, normed division algebra.

Let $A \in \mathbb{H}^{n \times n}$. An element $\lambda \in \mathbb{H}$ is a left (right) eigenvalue of $A$ if there is a nonzero $x \in \mathbb{H}^n$ such that $Ax = \lambda x$ (respectively, $Ax = x\lambda$). We use the notation $\sigma_l(A)$ ($\sigma_r(A)$) for the set of all left eigenvalues of $A$ (respectively, the right eigenvalues of $A$). For a matrix in $\mathbb{H}^{n \times n}$, we can define the conjugate and transpose in the usual way. We say that a square matrix $A$ with quaternion entries is Hermitian if $A$ coincides with its conjugate transpose, that is, if $A = (\overline{A})^T$. We list below some eigenvalue properties of quaternionic matrices.

Theorem 1. Let $A \in \mathbb{H}^{n \times n}$. Then

(a) The implication $[x \in \mathbb{H}^n,\ Ax = 0] \Rightarrow x = 0$ holds if and only if there is a unique $B \in \mathbb{H}^{n \times n}$ such that $AB = BA = I$ ([14], Theorem 4.3).

(b) The sets $\sigma_l(A)$ and $\sigma_r(A)$ are nonempty ([14], Theorems 5.3 and 5.4).

(c) The sets $\sigma_l(A)$ and $\sigma_r(A)$ may be infinite ([15], Example 3.1).

(d) When $A$ is Hermitian, the right eigenvalues are always real while the left eigenvalues may not be real ([3], Lemma H; [2], Page 98).

(e) When $A$ is Hermitian, there exist real eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$ and corresponding eigenvectors $v_1, v_2, \ldots, v_n$ in $\mathbb{H}^n$ such that

$$v_i^* v_j = \delta_{ij} \quad (1 \le i, j \le n), \qquad A = \sum_{m=1}^{n} \lambda_m v_m v_m^*, \quad \text{and} \quad I = \sum_{m=1}^{n} v_m v_m^*$$

(Theorem H, [3]).

2.2. Octonions

The linear space of octonions over $\mathbb{R}$, denoted by $\mathbb{O}$, is an 8-dimensional linear space with basis $\{1, e_1, e_2, e_3, e_4, e_5, e_6, e_7\}$. The space $\mathbb{O}$ becomes an algebra via the following multiplication table on the non-unit basis elements [3]:

|       | $e_1$  | $e_2$  | $e_3$  | $e_4$  | $e_5$  | $e_6$  | $e_7$  |
|-------|--------|--------|--------|--------|--------|--------|--------|
| $e_1$ | $-1$   | $e_3$  | $-e_2$ | $e_5$  | $-e_4$ | $-e_7$ | $e_6$  |
| $e_2$ | $-e_3$ | $-1$   | $e_1$  | $e_6$  | $e_7$  | $-e_4$ | $-e_5$ |
| $e_3$ | $e_2$  | $-e_1$ | $-1$   | $e_7$  | $-e_6$ | $e_5$  | $-e_4$ |
| $e_4$ | $-e_5$ | $-e_6$ | $-e_7$ | $-1$   | $e_1$  | $e_2$  | $e_3$  |
| $e_5$ | $e_4$  | $-e_7$ | $e_6$  | $-e_1$ | $-1$   | $-e_3$ | $e_2$  |
| $e_6$ | $e_7$  | $e_4$  | $-e_5$ | $-e_2$ | $e_3$  | $-1$   | $-e_1$ |
| $e_7$ | $-e_6$ | $e_5$  | $e_4$  | $-e_3$ | $-e_2$ | $e_1$  | $-1$   |

For an element $x = x_0 + x_1e_1 + x_2e_2 + x_3e_3 + x_4e_4 + x_5e_5 + x_6e_6 + x_7e_7$ in $\mathbb{O}$, we define the real part, conjugate, and norm by

$$\mathrm{Re}(x) := x_0, \quad \overline{x} := x_0 - x_1e_1 - x_2e_2 - x_3e_3 - x_4e_4 - x_5e_5 - x_6e_6 - x_7e_7, \quad \text{and} \quad |x| := \sqrt{x\overline{x}}.$$

We note that $x\overline{x} = |x|^2$ and $|xy| = |x|\,|y|$ for all $x$ and $y$. It is known that $\mathbb{O}$ is a non-commutative, non-associative, normed division algebra. In addition, $\mathbb{O}$ is an alternative algebra, that is, the subalgebra generated by any two elements in $\mathbb{O}$ is associative [1]. As in the case of quaternionic matrices, one can define left and right eigenvalues for an octonionic matrix. We list below a few eigenvalue properties of octonionic matrices. For further details, see [3].

Theorem 2. Let $A \in \mathbb{O}^{n \times n}$. Then

(a) The implication $[x \in \mathbb{O}^n,\ Ax = 0] \Rightarrow x = 0$ holds if and only if there exist unique $B$ and $C$ in $\mathbb{O}^{n \times n}$ such that $AB = CA = I$ ([3], Lemma 4.4, Theorem 4.3, and Corollary 4.4).

(b) The sets $\sigma_l(A)$ and $\sigma_r(A)$ may be infinite.

(c) When $A$ is Hermitian (that is, when $A = (\overline{A})^T$), the right eigenvalues may not be real ([3], Page 360).

3. Euclidean Jordan Algebras

In this section, we briefly recall concepts, properties/results, and examples from Euclidean Jordan algebra theory. For short introductions, see [8] and [11]. For complete details, we refer to [6].

A Euclidean Jordan Algebra [6] is a triple $(V, \circ, \langle \cdot, \cdot \rangle)$, where $(V, \langle \cdot, \cdot \rangle)$ is a finite dimensional inner product space over $\mathbb{R}$ and $(x, y) \mapsto x \circ y : V \times V \to V$ is a bilinear mapping satisfying the following conditions for all $x$, $y$, and $z$:

$$x \circ y = y \circ x, \quad x \circ (x^2 \circ y) = x^2 \circ (x \circ y), \quad \text{and} \quad \langle x \circ y, z \rangle = \langle y, x \circ z \rangle.$$

In addition, we assume that there is an element $e \in V$ (called the unit element) such that $x \circ e = x$ for all $x \in V$.

An element $c \in V$ is an idempotent if $c^2 = c$; it is a primitive idempotent if it is nonzero and cannot be written as a sum of two nonzero idempotents. We say that a finite set $\{e_1, e_2, \ldots, e_m\}$ of primitive idempotents in $V$ is a Jordan frame if

$$e_i \circ e_j = 0 \text{ when } i \ne j \quad \text{and} \quad \sum_{i=1}^{m} e_i = e.$$

For $x \in V$, we define $m(x) := \min\{k > 0 : \{e, x, \ldots, x^k\} \text{ is linearly dependent}\}$ and the rank of $V$ by $r = \max\{m(x) : x \in V\}$.

Theorem 3. (Spectral decomposition theorem) Let $V$ be a Euclidean Jordan algebra with rank $r$. Then for every $x \in V$, there exist a Jordan frame $\{e_1, e_2, \ldots, e_r\}$ and real numbers $\lambda_1, \ldots, \lambda_r$ such that $x = \lambda_1 e_1 + \cdots + \lambda_r e_r$.

The numbers $\lambda_i$ are called the spectral eigenvalues of $x$. (In this paper, we have used the additional word "spectral" in order to distinguish these eigenvalues from the left/right eigenvalues of matrices.) These numbers are

uniquely defined even though the Jordan frame that corresponds to $x$ may not be unique. Given the spectral eigenvalues of $x$, we define

$$\sigma_{sp}(x) := \{\lambda_1, \lambda_2, \ldots, \lambda_r\}, \quad \mathrm{trace}(x) := \lambda_1 + \lambda_2 + \cdots + \lambda_r, \quad \text{and} \quad \det(x) := \lambda_1 \lambda_2 \cdots \lambda_r.$$

Corresponding to an $x \in V$, we define the Lyapunov operator $L_x$ on $V$ by

$$L_x(z) := x \circ z.$$

We say that two elements $x$ and $y$ in $V$ operator commute if the corresponding Lyapunov operators $L_x$ and $L_y$ commute (which can happen if and only if $x$ and $y$ have their spectral decompositions with respect to the same Jordan frame [6]). We say that an element $x$ is invertible in $V$ if all the spectral eigenvalues of $x$ are nonzero. This happens if and only if there is a $y$ in $V$ that operator commutes with $x$ and $x \circ y = e$. The set of squares $K := \{x^2 : x \in V\}$ is (called) a symmetric cone. It is a self-dual closed convex cone.

Let $\{e_1, e_2, \ldots, e_r\}$ be a Jordan frame in a Euclidean Jordan algebra $V$. For $i, j \in \{1, 2, \ldots, r\}$, we define the Peirce eigenspaces

$$V_{ii} := \{x \in V : x \circ e_i = x\} = \mathbb{R}e_i$$

and, when $i \ne j$,

$$V_{ij} := \left\{x \in V : x \circ e_i = \tfrac{1}{2}x = x \circ e_j\right\}.$$

Theorem 4. (Theorem IV.2.1, Faraut and Korányi [6]) The space $V$ is the orthogonal direct sum of the spaces $V_{ij}$ ($i \le j$).

Thus, given a Jordan frame $\{e_1, e_2, \ldots, e_r\}$, we can write any element $x \in V$ as

$$x = \sum_{i=1}^{r} x_i e_i + \sum_{i<j} x_{ij},$$

where $x_i \in \mathbb{R}$ and $x_{ij} \in V_{ij}$. This expression is the Peirce decomposition of $x$ with respect to $\{e_1, e_2, \ldots, e_r\}$.
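The defining identities of a Euclidean Jordan algebra can be checked numerically for complex Hermitian matrices under $X \circ Y = (XY + YX)/2$ and $\langle X, Y \rangle = \mathrm{trace}(XY)$, the algebra described in Section 3.1 below. A sketch with NumPy (the random matrices are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)

def rand_herm(n):
    """Random complex Hermitian matrix."""
    M = rng.standard_normal((n, n)) + 1j*rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

def jordan(X, Y):
    """Jordan product X o Y = (XY + YX)/2."""
    return (X @ Y + Y @ X) / 2

X, Y, Z = rand_herm(4), rand_herm(4), rand_herm(4)

# The Jordan product of Hermitian matrices is Hermitian (unlike XY itself).
assert np.allclose(jordan(X, Y), jordan(X, Y).conj().T)

# Jordan axiom: x o (x^2 o y) = x^2 o (x o y).
X2 = jordan(X, X)
assert np.allclose(jordan(X, jordan(X2, Y)), jordan(X2, jordan(X, Y)))

# Associativity of the inner product: <x o y, z> = <y, x o z>,
# with <A, B> = Re trace(AB).
ip = lambda A, B: np.real(np.trace(A @ B))
assert np.isclose(ip(jordan(X, Y), Z), ip(Y, jordan(X, Z)))
```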

Given the above Peirce decomposition of $x$, we define the Geršgorin radii of $x$:

$$R_i(x) := \frac{1}{\sqrt{2}\,\|e_i\|}\left(\sum_{k=1}^{i-1} \|x_{ki}\| + \sum_{j=i+1}^{r} \|x_{ij}\|\right), \quad i = 1, 2, \ldots, r. \tag{2}$$

In what follows, we describe some examples of Euclidean Jordan algebras.

3.1. Matrix algebras

Let $\mathbb{F}$ denote any of the spaces $\mathbb{R}$, $\mathbb{C}$, $\mathbb{H}$, and $\mathbb{O}$. A matrix $A \in \mathbb{F}^{n \times n}$ is said to be Hermitian if $A^* := (\overline{A})^T = A$. Let $\mathrm{Herm}(\mathbb{F}^{n \times n})$ denote the set of all $n \times n$ Hermitian matrices with entries from $\mathbb{F}$. Given $X, Y \in \mathrm{Herm}(\mathbb{F}^{n \times n})$, we define

$$\langle X, Y \rangle := \mathrm{Re}\,\mathrm{trace}(XY) \quad \text{and} \quad X \circ Y := \frac{XY + YX}{2},$$

where the trace of a matrix is the sum of its diagonal elements. (We note that when $X$ and $Y$ are complex, there is no need to take the real part.) It is known that $\mathrm{Herm}(\mathbb{R}^{n \times n})$, $\mathrm{Herm}(\mathbb{C}^{n \times n})$, and $\mathrm{Herm}(\mathbb{H}^{n \times n})$ are Euclidean Jordan algebras, each of rank $n$. Moreover, the set $\{E_1, E_2, \ldots, E_n\}$ is a Jordan frame in each of these algebras, where $E_i$ is the diagonal matrix with $1$ in the $(i,i)$-slot and zeros elsewhere. It is also known that $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$ is a Euclidean Jordan algebra of rank 3. Furthermore, the set $\{E_1, E_2, E_3\}$ is a Jordan frame in this algebra.

For a matrix $X$ in any one of these algebras, it is easy to write down the Peirce decomposition with respect to $\{E_1, E_2, \ldots, E_n\}$. For example, in $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$,

$$X = \begin{bmatrix} p & a & b \\ \overline{a} & q & c \\ \overline{b} & \overline{c} & r \end{bmatrix} = pE_1 + qE_2 + rE_3 + X_{12} + X_{13} + X_{23},$$

where

$$X_{12} = \begin{bmatrix} 0 & a & 0 \\ \overline{a} & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}, \quad X_{13} = \begin{bmatrix} 0 & 0 & b \\ 0 & 0 & 0 \\ \overline{b} & 0 & 0 \end{bmatrix}, \quad \text{and} \quad X_{23} = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & c \\ 0 & \overline{c} & 0 \end{bmatrix}.$$

Corresponding to this, we have (the Geršgorin radii of $X$):

$$R_1(X) = \frac{1}{\sqrt{2}\,\|E_1\|}\left(\|X_{12}\| + \|X_{13}\|\right) = |a| + |b|,$$

$$R_2(X) = \frac{1}{\sqrt{2}\,\|E_2\|}\left(\|X_{12}\| + \|X_{23}\|\right) = |a| + |c|, \quad \text{etc.}$$

More generally, for $A = [a_{ij}] \in \mathrm{Herm}(\mathbb{F}^{n \times n})$ (with $n = 3$ when $\mathbb{F} = \mathbb{O}$), it is easily seen that with respect to the Jordan frame $\{E_1, E_2, \ldots, E_n\}$,

$$R_i(A) := \sum_{j=1,\, j \ne i}^{n} |a_{ij}| \quad (1 \le i \le n). \tag{3}$$

3.2. The Jordan spin algebra $\mathcal{L}^n$

Consider $\mathbb{R}^n$ ($n > 1$) where any element $x$ is written as

$$x = \begin{bmatrix} x_0 \\ \overline{x} \end{bmatrix}$$

with $x_0 \in \mathbb{R}$ and $\overline{x} \in \mathbb{R}^{n-1}$. The inner product in $\mathbb{R}^n$ is the usual inner product. The Jordan product $x \circ y$ in $\mathbb{R}^n$ is defined by

$$x \circ y = \begin{bmatrix} x_0 \\ \overline{x} \end{bmatrix} \circ \begin{bmatrix} y_0 \\ \overline{y} \end{bmatrix} := \begin{bmatrix} \langle x, y \rangle \\ x_0\overline{y} + y_0\overline{x} \end{bmatrix}.$$

We shall denote this Euclidean Jordan algebra $(\mathbb{R}^n, \circ, \langle \cdot, \cdot \rangle)$ by $\mathcal{L}^n$. We note the spectral decomposition of any $x$ with $\overline{x} \ne 0$:

$$x = \lambda_1 e_1 + \lambda_2 e_2,$$

where

$$\lambda_1 := x_0 + \|\overline{x}\|, \quad \lambda_2 := x_0 - \|\overline{x}\|, \quad e_1 := \frac{1}{2}\begin{bmatrix} 1 \\ \frac{\overline{x}}{\|\overline{x}\|} \end{bmatrix}, \quad \text{and} \quad e_2 := \frac{1}{2}\begin{bmatrix} 1 \\ -\frac{\overline{x}}{\|\overline{x}\|} \end{bmatrix}.$$

Thus,

$$\det(x) = \lambda_1\lambda_2 = x_0^2 - \|\overline{x}\|^2.$$

Now consider any Jordan frame $\{e_1, e_2\}$ in $\mathcal{L}^n$. Then there exists a unit vector $u \in \mathbb{R}^{n-1}$ such that

$$e_1 = \frac{1}{2}\begin{bmatrix} 1 \\ u \end{bmatrix} \quad \text{and} \quad e_2 = \frac{1}{2}\begin{bmatrix} 1 \\ -u \end{bmatrix}.$$

With respect to this, any $x \in \mathcal{L}^n$ has a Peirce decomposition $x = x_1e_1 + x_2e_2 + x_{12}$,

where

$$x_{12} = \begin{bmatrix} 0 \\ v \end{bmatrix}$$

for some $v \in \mathbb{R}^{n-1}$ with $\langle u, v \rangle = 0$. (This is easy to verify; see, e.g., Lemma 2.3.4, [12].) This leads to

$$x = \begin{bmatrix} x_0 \\ \overline{x} \end{bmatrix}, \quad \text{where} \quad x_0 = \frac{1}{2}(x_1 + x_2) \quad \text{and} \quad \overline{x} = \frac{1}{2}(x_1 - x_2)u + v.$$

Thus

$$\det(x) = x_0^2 - \|\overline{x}\|^2 = x_1x_2 - \|v\|^2 = x_1x_2 - \|x_{12}\|^2. \tag{4}$$

We finally note that as $\|e_1\| = \|e_2\| = \frac{1}{\sqrt{2}}$, the Geršgorin radii of $x$ are given by

$$R_1(x) = \frac{1}{\sqrt{2}\,\|e_1\|}\|x_{12}\| = \|x_{12}\| = R_2(x). \tag{5}$$

3.3. Simple algebras and the structure theorem

A Euclidean Jordan algebra is said to be simple if it is not the direct product of two (nontrivial) Euclidean Jordan algebras. The classification theorem (Chapter V, Faraut and Korányi [6]) says that every simple Euclidean Jordan algebra is isomorphic to one of the following:

(1) The Jordan spin algebra $\mathcal{L}^n$;
(2) $\mathrm{Herm}(\mathbb{R}^{n \times n})$;
(3) $\mathrm{Herm}(\mathbb{C}^{n \times n})$;
(4) $\mathrm{Herm}(\mathbb{H}^{n \times n})$;
(5) $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$.

Furthermore, the structure theorem (Chapters III and V, Faraut and Korányi [6]) says that any Euclidean Jordan algebra is a (Cartesian) product of simple Euclidean Jordan algebras.
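The spin-algebra formulas of Section 3.2 can be reproduced numerically; a sketch with NumPy (the vector $x$ is illustrative):

```python
import numpy as np

def spin_product(x, y):
    """Jordan product in L^n: (x0, xbar) o (y0, ybar) = (<x, y>, x0*ybar + y0*xbar)."""
    x0, xb = x[0], x[1:]
    y0, yb = y[0], y[1:]
    return np.concatenate(([x0*y0 + xb @ yb], x0*yb + y0*xb))

def spectral(x):
    """Spectral decomposition x = l1*e1 + l2*e2 for xbar != 0."""
    x0, xb = x[0], x[1:]
    n = np.linalg.norm(xb)
    u = xb / n
    l1, l2 = x0 + n, x0 - n
    e1 = 0.5 * np.concatenate(([1.0],  u))
    e2 = 0.5 * np.concatenate(([1.0], -u))
    return l1, l2, e1, e2

x = np.array([3.0, 1.0, -2.0, 0.5])
l1, l2, e1, e2 = spectral(x)

# e1, e2 are idempotents (e o e = e) forming a Jordan frame (e1 o e2 = 0).
assert np.allclose(spin_product(e1, e1), e1)
assert np.allclose(spin_product(e1, e2), np.zeros(4))

# x = l1*e1 + l2*e2 and det(x) = l1*l2 = x0^2 - ||xbar||^2.
assert np.allclose(l1*e1 + l2*e2, x)
assert np.isclose(l1*l2, x[0]**2 - np.linalg.norm(x[1:])**2)
```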

3.4. Algebra automorphisms

Given a Euclidean Jordan algebra $V$, an invertible linear transformation $\Lambda : V \to V$ is said to be an algebra automorphism if

$$\Lambda(x \circ y) = \Lambda(x) \circ \Lambda(y) \quad \text{for all } x, y \in V.$$

We need the following results for later use:

(1) The trace and determinant are invariant under algebra automorphisms.
(2) In a simple Euclidean Jordan algebra, every algebra automorphism is orthogonal (that is, it preserves the inner product); see Page 56, [6].
(3) In a simple algebra, any Jordan frame can be mapped onto any other Jordan frame by an algebra automorphism; see Theorem IV.2.5, [6].

3.5. The algebra $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$

The algebra $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$ is crucial for our analysis. We collect below some important results that are needed. For $A, B \in \mathrm{Herm}(\mathbb{O}^{3 \times 3})$, the so-called Freudenthal product [3] is defined by

$$A \times B := A \circ B - \frac{1}{2}\big(A\,\mathrm{tr}(B) + B\,\mathrm{tr}(A)\big) + \frac{1}{2}\big(\mathrm{tr}(A)\mathrm{tr}(B) - \mathrm{tr}(A \circ B)\big)I,$$

where $I$ is the identity matrix. Recall that for a matrix $A \in \mathrm{Herm}(\mathbb{O}^{3 \times 3})$, $\det(A)$ is the product of its spectral eigenvalues. In the result below (which is essentially in [3]), we express this determinant in terms of the entries of $A$.

Lemma 5. Let $A \in \mathrm{Herm}(\mathbb{O}^{3 \times 3})$ be given by

$$A := \begin{bmatrix} p & a & b \\ \overline{a} & q & c \\ \overline{b} & \overline{c} & r \end{bmatrix},$$

where $p, q, r \in \mathbb{R}$ and $a, b, c \in \mathbb{O}$. Then

$$\det(A) = \frac{1}{3}\,\mathrm{tr}\big((A \times A) \circ A\big) = pqr + 2\,\mathrm{Re}(\overline{b}(ac)) - r|a|^2 - q|b|^2 - p|c|^2. \tag{6}$$

Proof. The second equality comes from direct computation; see [3]. In particular, when $A$ is diagonal, the middle expression reduces to the product of the diagonal entries of $A$.

We prove the first equality. By the spectral decomposition theorem (see Section 3), we may write $A = \lambda_1 f_1 + \lambda_2 f_2 + \lambda_3 f_3$, where $\lambda_1, \lambda_2, \lambda_3$ are the spectral eigenvalues of $A$ and $\{f_1, f_2, f_3\}$ is a Jordan frame in $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$. As this algebra is simple, there is an algebra automorphism $\Lambda$ of $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$ that maps $\{f_1, f_2, f_3\}$ to $\{E_1, E_2, E_3\}$, where $E_i$ is the $3 \times 3$ matrix with one in the $(i,i)$-slot and zeros elsewhere. Then $\Lambda(A)$ is a diagonal matrix with $\lambda_1, \lambda_2, \lambda_3$ on the diagonal. Since $\Lambda(A \circ B) = \Lambda(A) \circ \Lambda(B)$, $\Lambda(A \times B) = \Lambda(A) \times \Lambda(B)$, and $\mathrm{tr}\,\Lambda(A) = \mathrm{tr}(A)$, we have (from the second equality in (6) applied to $\Lambda(A)$)

$$\frac{1}{3}\,\mathrm{tr}\big((\Lambda(A) \times \Lambda(A)) \circ \Lambda(A)\big) = \lambda_1\lambda_2\lambda_3.$$

But

$$\frac{1}{3}\,\mathrm{tr}\big((\Lambda(A) \times \Lambda(A)) \circ \Lambda(A)\big) = \frac{1}{3}\,\mathrm{tr}\,\Lambda\big((A \times A) \circ A\big) = \frac{1}{3}\,\mathrm{tr}\big((A \times A) \circ A\big).$$

Thus

$$\det(A) = \lambda_1\lambda_2\lambda_3 = \frac{1}{3}\,\mathrm{tr}\big((A \times A) \circ A\big),$$

proving the first equality in (6).

For objects $a, b, c \in \mathbb{O}$ and for the matrix $A$ given above, we let

$$[a, b] := ab - ba, \quad [a, b, c] := (ab)c - a(bc), \quad \text{and} \quad \Phi(a, b, c) := \frac{1}{2}\,\mathrm{Re}([a, b]c).$$

Also, let

$$s(A) := pq + qr + rp - |a|^2 - |b|^2 - |c|^2.$$

(Recall that $\mathrm{tr}(A) = p + q + r$.) We need the following result from [3], which was verified using Mathematica.

Lemma 6. (Lemma O3, [3]) The real eigenvalues of the $3 \times 3$ octonion Hermitian matrix $A$ satisfy the modified characteristic equation

$$\det(\lambda I - A) = \lambda^3 - (\mathrm{tr}A)\lambda^2 + s(A)\lambda - \det(A) = r,$$

where $r$ is either of the two roots of

$$r^2 + 4\Phi(a, b, c)\,r - \|[a, b, c]\|^2 = 0.$$

Remark 1. It follows from Lemma 5 that the spectral eigenvalues of $A$ are the roots of

$$\det(\lambda I - A) = \lambda^3 - (\mathrm{tr}A)\lambda^2 + s(A)\lambda - \det(A) = 0.$$
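The octonionic quantities entering Lemma 6 can be reproduced numerically. The sketch below encodes the multiplication table of Section 2.2 through its oriented quaternionic triples (a reconstruction consistent with the examples in Remarks 2 and 4) and checks norm multiplicativity together with the associator value $\|[e_2, e_6, e_1]\|^2 = 4$ used in Remark 2:

```python
import numpy as np

# Oriented triples (i, j, k) meaning e_i e_j = e_k cyclically; a
# reconstruction of the table of Section 2.2 (an assumption, validated
# against the computations in Remarks 2 and 4).
TRIPLES = [(1,2,3), (1,4,5), (2,4,6), (3,4,7), (2,5,7), (3,6,5), (1,7,6)]
MUL = {}
for (i, j, k) in TRIPLES:
    for a, b, c in ((i, j, k), (j, k, i), (k, i, j)):
        MUL[(a, b)] = (1, c)
        MUL[(b, a)] = (-1, c)

def omul(x, y):
    """Product of octonions stored as length-8 arrays (x0, x1, ..., x7)."""
    z = np.zeros(8)
    z[0] = x[0]*y[0] - np.dot(x[1:], y[1:])   # real part; e_i e_i = -1
    z[1:] += x[0]*y[1:] + y[0]*x[1:]          # real-imaginary cross terms
    for i in range(1, 8):
        for j in range(1, 8):
            if i != j:
                s, k = MUL[(i, j)]
                z[k] += s * x[i] * y[j]
    return z

def unit(i):
    v = np.zeros(8); v[i] = 1.0
    return v

def associator(a, b, c):
    return omul(omul(a, b), c) - omul(a, omul(b, c))

# |xy| = |x||y|: O is a normed division algebra.
rng = np.random.default_rng(0)
x, y = rng.standard_normal((2, 8))
assert np.isclose(np.linalg.norm(omul(x, y)), np.linalg.norm(x)*np.linalg.norm(y))

# ||[e2, e6, e1]||^2 = ||2 e5||^2 = 4, as used in Remark 2 below.
A562 = associator(unit(2), unit(6), unit(1))
assert np.isclose(np.dot(A562, A562), 4.0)
```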

4. Geršgorin type theorems for matrices

Let $\mathbb{F}$ denote any one of the spaces $\mathbb{R}$, $\mathbb{C}$, $\mathbb{H}$, and $\mathbb{O}$. For $A = [a_{ij}] \in \mathbb{F}^{n \times n}$, we let

$$R_i(A) := \sum_{j=1,\, j \ne i}^{n} |a_{ij}|.$$

We define $\sigma_l(A)$ and $\sigma_r(A)$ in the usual way. The following two results are routine generalizations of the classical Geršgorin theorem and the Geršgorin type theorems of Zhang [15]. We state them for completeness.

Theorem 7. (Geršgorin type theorem for left eigenvalues) For $A = [a_{ij}] \in \mathbb{F}^{n \times n}$, we have

$$\sigma_l(A) \subseteq \bigcup_{i=1}^{n} \{\lambda \in \mathbb{F} : |\lambda - a_{ii}| \le R_i(A)\}.$$

Proof. Suppose $\lambda \in \sigma_l(A)$ and $0 \ne x \in \mathbb{F}^n$ with $Ax = \lambda x$. Let $|x_i| := \max_{1 \le j \le n} |x_j|$ (which is nonzero). Then $(Ax)_i = \lambda x_i$ implies

$$(\lambda - a_{ii})x_i = \sum_{j=1,\, j \ne i}^{n} a_{ij}x_j.$$

Since $|ab| = |a|\,|b|$ in $\mathbb{F}$, we have

$$|\lambda - a_{ii}|\,|x_i| \le R_i(A)\,|x_i|.$$

Thus $|\lambda - a_{ii}| \le R_i(A)$, proving the result.

In what follows, we say that elements $\mu$ and $\lambda$ in $\mathbb{F}$ are similar (and write $\mu \sim \lambda$) if there is a nonzero $z \in \mathbb{F}$ such that $\mu = z\lambda z^{-1}$. (Note that $z\lambda z^{-1}$ is well defined even in $\mathbb{O}$ because of the alternative property.) We state the following result without proof, as it is similar to that of Theorem 7 in [15].

Theorem 8. (Geršgorin theorem for right eigenvalues) Let $A = [a_{ij}] \in \mathbb{F}^{n \times n}$. Then for every right eigenvalue $\lambda$ of $A$ there exists $\mu \in \mathbb{F}$, $\mu \sim \lambda$, such that

$$\mu \in \bigcup_{i=1}^{n} \{\gamma \in \mathbb{F} : |\gamma - a_{ii}| \le R_i(A)\}.$$
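The proof of Theorem 7 uses only the norm multiplicativity $|ab| = |a|\,|b|$, which holds in all four algebras; for $\mathbb{F} = \mathbb{H}$ this can be checked directly against the Hamilton product (a sketch, NumPy assumed):

```python
import numpy as np

def qmul(x, y):
    """Hamilton product of quaternions stored as arrays (x0, x1, x2, x3),
    using i^2 = j^2 = k^2 = ijk = -1."""
    a0, a1, a2, a3 = x
    b0, b1, b2, b3 = y
    return np.array([
        a0*b0 - a1*b1 - a2*b2 - a3*b3,
        a0*b1 + a1*b0 + a2*b3 - a3*b2,
        a0*b2 - a1*b3 + a2*b0 + a3*b1,
        a0*b3 + a1*b2 - a2*b1 + a3*b0,
    ])

def qconj(x):
    return np.array([x[0], -x[1], -x[2], -x[3]])

x = np.array([1.0, 2.0, -1.0, 0.5])
y = np.array([0.0, 1.0, 3.0, -2.0])

# x * conj(x) = |x|^2 (a real quaternion).
assert np.allclose(qmul(x, qconj(x)), [np.dot(x, x), 0, 0, 0])

# The norm is multiplicative: |xy| = |x||y|.
assert np.isclose(np.linalg.norm(qmul(x, y)), np.linalg.norm(x)*np.linalg.norm(y))

# Non-commutativity: xy != yx in general.
assert not np.allclose(qmul(x, y), qmul(y, x))
```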

5. Strict diagonal dominance in $\mathbb{F}^{n \times n}$

Let $\mathbb{F}$ be as in the previous section. For a matrix $A = [a_{ij}] \in \mathbb{F}^{n \times n}$, we say that $A$ is strictly diagonally dominant if

$$|a_{ii}| > R_i(A) \quad (i = 1, 2, \ldots, n).$$

Theorem 9. For $A = [a_{ij}] \in \mathbb{F}^{n \times n}$, consider the following statements:

(1) $A$ is strictly diagonally dominant.
(2) The implication $[x \in \mathbb{F}^n,\ Ax = 0] \Rightarrow x = 0$ holds.
(3) There exist unique matrices $B$ and $C$ in $\mathbb{F}^{n \times n}$ (which are equal when $A$ is defined over $\mathbb{F} = \mathbb{H}$) such that $AB = I = CA$.
(4) $A$ is invertible in the Euclidean Jordan algebra $\mathrm{Herm}(\mathbb{H}^{n \times n})$.
(5) $A$ is invertible in the Euclidean Jordan algebra $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$.

Then we have the following implications: $(1) \Rightarrow (2) \Leftrightarrow (3)$; $(3) \Leftrightarrow (4)$ when $A \in \mathrm{Herm}(\mathbb{H}^{n \times n})$; and $(1) \Rightarrow (5)$ when $A \in \mathrm{Herm}(\mathbb{O}^{3 \times 3})$.

Proof. The implication $(1) \Rightarrow (2)$ follows immediately from Theorem 7. The equivalence of (2) and (3) is obvious when $\mathbb{F}$ is $\mathbb{R}$ or $\mathbb{C}$, follows from Theorem 1 for $\mathbb{F} = \mathbb{H}$, and from Theorem 2 for $\mathbb{F} = \mathbb{O}$.

Now assume that $A$ belongs to $\mathrm{Herm}(\mathbb{H}^{n \times n})$.

$(3) \Rightarrow (4)$: When (3) holds, there exists a unique matrix $B \in \mathbb{H}^{n \times n}$ such that $AB = BA = I$. By uniqueness of $B$, we see that $B = B^*$, which means that $B$ is Hermitian. To prove that $B$ is the inverse of $A$ in the algebra $\mathrm{Herm}(\mathbb{H}^{n \times n})$, we need only show that $A$ and $B$ operator commute, that is, $L_AL_B = L_BL_A$, where $L_A(X) := \frac{AX + XA}{2}$ ($X \in \mathrm{Herm}(\mathbb{H}^{n \times n})$), etc. This easily follows due to the associativity in $\mathbb{H}$.

$(4) \Rightarrow (3)$: As $A \in \mathrm{Herm}(\mathbb{H}^{n \times n})$, by Theorem 1(e), $A$ can be expanded as

$$A = \sum_{m=1}^{n} \lambda_m v_m v_m^*, \tag{7}$$

where $\{v_m : m = 1, \ldots, n\}$ is an orthonormal basis of eigenvectors of $A$, with real eigenvalues $\lambda_m$. In view of the properties of the $v_m$, the set $\{v_1v_1^*, \ldots, v_nv_n^*\}$ is a Jordan frame in $\mathrm{Herm}(\mathbb{H}^{n \times n})$. This means that (7) is the spectral decomposition of $A$ in $\mathrm{Herm}(\mathbb{H}^{n \times n})$. Now suppose condition (4) holds. Then each $\lambda_m$ is nonzero. Now define

$$B := \sum_{m=1}^{n} \frac{1}{\lambda_m} v_m v_m^*.$$

Then, due to the properties of the $v_m$ and associativity in $\mathbb{H}$, we have $AB = BA = I$; hence (3) holds.

We remark that it is possible to prove the implication $(4) \Rightarrow (3)$ without using (7). For example, we can show that $AB = BA = I$ when $B$ is the inverse of $A$ in $\mathrm{Herm}(\mathbb{H}^{n \times n})$, i.e., when $B$ operator commutes with $A$ and $A \circ B = I$.

Finally, assume that $A \in \mathrm{Herm}(\mathbb{O}^{3 \times 3})$.

$(1) \Rightarrow (5)$: Let $A$ be strictly diagonally dominant. As $\mathbb{O}$ is non-associative, the argument of $(3) \Rightarrow (4)$ cannot be used here. So, we offer a different proof. Let

$$A = \begin{bmatrix} p & a & b \\ \overline{a} & q & c \\ \overline{b} & \overline{c} & r \end{bmatrix},$$

where $p, q, r \in \mathbb{R}$ and $a, b, c \in \mathbb{O}$. Next, suppose that $A$ is not invertible in $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$, which means that one of the spectral eigenvalues of $A$ is zero, that is, $\det(A) = 0$; see Lemma 5. Thus (from (6)),

$$0 = \det A = pqr + 2\,\mathrm{Re}(\overline{b}(ac)) - r|a|^2 - q|b|^2 - p|c|^2.$$

This implies that

$$pqr = -2\,\mathrm{Re}(\overline{b}(ac)) + r|a|^2 + q|b|^2 + p|c|^2 \le 2|a||b||c| + r|a|^2 + q|b|^2 + p|c|^2,$$

hence

$$pqr - 2|a||b||c| - \big(r|a|^2 + q|b|^2 + p|c|^2\big) \le 0.$$

Now, as $A$ is strictly diagonally dominant, the matrix

$$B := \begin{bmatrix} p & -|a| & |b| \\ -|a| & q & |c| \\ |b| & |c| & r \end{bmatrix}$$

is a real symmetric strictly diagonally dominant matrix with a positive diagonal. By a well-known matrix theory result (see [10], Theorem 6.1.10), $B$ is positive definite and hence $\det B > 0$.
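The matrix-theory fact invoked at the end of the proof ([10], Theorem 6.1.10) is easy to check on an instance (hypothetical values):

```python
import numpy as np

# A real symmetric strictly diagonally dominant matrix with a positive
# diagonal is positive definite ([10], Theorem 6.1.10).
B = np.array([[ 5.0, -1.0, 2.0],
              [-1.0,  4.0, 1.5],
              [ 2.0,  1.5, 6.0]])
R = np.sum(np.abs(B), axis=1) - np.abs(np.diag(B))

assert np.all(np.diag(B) > R)              # strictly diagonally dominant
assert np.all(np.linalg.eigvalsh(B) > 0)   # positive definite
assert np.linalg.det(B) > 0                # in particular, det B > 0
```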

Therefore,

$$pqr - 2|a||b||c| - \big(r|a|^2 + q|b|^2 + p|c|^2\big) > 0,$$

which is clearly a contradiction. Hence $A$ is invertible in $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$.

Remark 2. The following example shows that the implication $(2) \Rightarrow (5)$ fails for octonion matrices. In $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$, let

$$A = \begin{bmatrix} \sqrt{3} & e_2 & e_6 \\ -e_2 & \sqrt{3} & e_1 \\ -e_6 & -e_1 & \sqrt{3} \end{bmatrix}.$$

Then, using (6) and the multiplication table for $\mathbb{O}$, $\det(A) = 0$, and so zero is a spectral eigenvalue of $A$. This means that $A$ is not invertible in the algebra $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$. We claim that zero is not a left/right eigenvalue of $A$. Assuming the contrary, by Lemma 6, $\lambda = 0$ must satisfy

$$\det(\lambda I - A) = \lambda^3 - (\mathrm{tr}(A))\lambda^2 + s(A)\lambda - \det A = r,$$

where $r$ is either of the two roots of $r^2 + 4\Phi(e_2, e_6, e_1)\,r - \|[e_2, e_6, e_1]\|^2 = 0$, with $s(A)$ and $\Phi$ previously defined. Thus $r = -\det(A) = 0$. Now $\|[e_2, e_6, e_1]\|^2 = \|2e_5\|^2 = 4 \ne 0$; hence $r \ne 0$, leading to a contradiction. Thus, zero is not a real eigenvalue of $A$, even though it is a spectral eigenvalue of $A$. In particular, we have $Ax = 0 \Rightarrow x = 0$.

Remark 3. In the context of $\mathrm{Herm}(\mathbb{R}^{n \times n})$ or $\mathrm{Herm}(\mathbb{C}^{n \times n})$ matrices, it is well known that if $X$ and $Y$ are positive semidefinite matrices, then

$$X \circ Y = 0 \Leftrightarrow \langle X, Y \rangle = 0 \Leftrightarrow XY = 0. \tag{8}$$

In this remark, we will demonstrate that these equivalences continue to hold in $\mathrm{Herm}(\mathbb{H}^{n \times n})$, but that the second equivalence fails in $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$. It is known that in any Euclidean Jordan algebra $V$ with corresponding symmetric cone $K$, the following two statements are equivalent (see [8], Proposition 6):

(i) $x \in K$, $y \in K$, and $x \circ y = 0$.

(ii) $x \in K$, $y \in K$, and $\langle x, y \rangle = 0$.

Moreover, in each case, the objects $x$ and $y$ operator commute. Thus, to see (8) in $\mathrm{Herm}(\mathbb{H}^{n \times n})$ (or, for that matter, in $\mathrm{Herm}(\mathbb{R}^{n \times n})$ or $\mathrm{Herm}(\mathbb{C}^{n \times n})$), it is enough to show that

$$X \text{ and } Y \text{ positive semidefinite in } \mathrm{Herm}(\mathbb{H}^{n \times n}),\ \langle X, Y \rangle = 0 \ \Rightarrow\ XY = 0.$$

In view of the operator commutativity and the spectral decomposition theorem, this reduces to showing: If $F_1$ and $F_2$ are two primitive idempotents in $\mathrm{Herm}(\mathbb{H}^{n \times n})$ with $\mathrm{Re}\,\mathrm{tr}(F_1F_2) = 0$, then $F_1F_2 = 0$. Now if $F_1$ and $F_2$ are two primitive idempotents in $\mathrm{Herm}(\mathbb{H}^{n \times n})$, then as in (7) we can expand $F_1$ and $F_2$ using their eigenvalues and eigenvectors: $F_1 = vv^*$ and $F_2 = ww^*$, where $v$ and $w$ are unit quaternion vectors. If $\mathrm{Re}\,\mathrm{tr}(F_1F_2) = 0$, then $\mathrm{Re}\,\mathrm{tr}(vv^*ww^*) = 0$. Putting $c := v^*w$, expanding $\mathrm{tr}(vv^*ww^*)$ as a sum and using the fact that $\mathrm{Re}(ab - ba) = 0$ for any two quaternions, we see that $\mathrm{Re}\,\mathrm{tr}(vv^*ww^*) = \mathrm{Re}(c\overline{c}) = |c|^2$. Thus $0 = |c|^2$, and so $v^*w = c = 0$. From this, we get $F_1F_2 = v(v^*w)w^* = 0$. Thus we have (8) for quaternion Hermitian matrices.

Now we claim that the second equivalence in (8) fails for octonions. Consider the matrix $A$ given in the previous example. We write the spectral decomposition for this $A$:

$$A = 0 \cdot F_1 + \lambda_2 F_2 + \lambda_3 F_3,$$

where $\{F_1, F_2, F_3\}$ is a Jordan frame in $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$ and $\sigma_{sp}(A) = \{0, \lambda_2, \lambda_3\}$. We claim that $F_2F_1$ and $F_3F_1$ cannot both be zero. Assuming the contrary, we have $F_2F_1 = 0$ and $F_3F_1 = 0$; hence $AF_1 = 0$. Now if $u$ is any column of $F_1$, then $Au = 0$. By the known property of $A$ (see the end of the previous remark), we must have $u = 0$, proving $F_1 = 0$. But this is a contradiction, as $F_1$ is a primitive idempotent and hence cannot be zero. Since $\{F_1, F_2, F_3\}$ is a Jordan frame, we have $F_1 \circ F_2 = 0 = F_1 \circ F_3$, and yet $F_2F_1 \ne 0$ or $F_3F_1 \ne 0$; hence the second equivalence in (8) fails in $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$.

Remark 4. The following example shows that the implication $(5) \Rightarrow (2)$ in Theorem 9 need not be true. Let

$$A := \begin{bmatrix} 1 & e_2 & e_6 \\ -e_2 & 1 & e_1 \\ -e_6 & -e_1 & 1 \end{bmatrix}.$$

Then, with

$$x_1 := 1 + e_1 + e_2 + e_3 + e_4 - e_5 - e_6 - e_7, \quad x_2 := 0, \quad \text{and} \quad x_3 := 1 + e_1 - e_2 - e_3 + e_4 - e_5 + e_6 + e_7,$$

we have

$$A\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = 0,$$

hence $0$ is a left/right eigenvalue of $A$. By the modified characteristic equation in Lemma 6, we get $-\det(A) = r$. Solving for $r$ from $r^2 + 4\Phi(e_2, e_6, e_1)\,r - \|[e_2, e_6, e_1]\|^2 = 0$, we get $r = \pm 2$, and so $\det(A) \ne 0$. Hence $0$ is not a spectral eigenvalue of $A$.

The examples in Remarks 2 and 4 show that for matrices in $\mathrm{Herm}(\mathbb{O}^{3 \times 3})$, the spectral eigenvalues and the real left/right eigenvalues of $A$ can be different.

6. Strict diagonal dominance in Euclidean Jordan algebras

Theorem 10. Let $(V, \circ, \langle \cdot, \cdot \rangle)$ be any Euclidean Jordan algebra of rank $r$ and

$$x = \sum_{i=1}^{r} x_i e_i + \sum_{i<j} x_{ij}$$

be the Peirce decomposition of $x \in V$ with respect to a given Jordan frame $\{e_1, \ldots, e_r\}$. If $x$ is strictly diagonally dominant, that is, if

$$|x_i| > R_i(x) := \frac{1}{\sqrt{2}\,\|e_i\|}\left(\sum_{k=1}^{i-1} \|x_{ki}\| + \sum_{j=i+1}^{r} \|x_{ij}\|\right) \quad (i = 1, 2, \ldots, r),$$

then $x$ is invertible in $V$.

Proof. We first suppose that $V$ is simple.

Case 1: Let $V$ be one of the matrix algebras. We note that if the Peirce decomposition of $x$ is strictly diagonally dominant with respect to the Jordan frame $\{e_1, e_2, \ldots, e_r\}$, then for any algebra automorphism $\Lambda$ on $V$, the Peirce decomposition of $\Lambda(x)$ is strictly diagonally dominant with respect to $\{\Lambda(e_1), \Lambda(e_2), \ldots, \Lambda(e_r)\}$ (as any algebra automorphism on a simple algebra is orthogonal; see Section 3.4). As $V$ is simple, any Jordan frame can be mapped onto another (see Section 3.4). Hence we may assume, without loss of generality, that the Jordan frame is the canonical one given by

$\{E_1, E_2, \ldots, E_r\}$, where $E_i$ is the matrix with one in the $(i,i)$-slot and zeros elsewhere. Now if $x$ is strictly diagonally dominant with respect to this Jordan frame, we can apply Theorem 9 above and get the invertibility.

Case 2: Now assume that $V = \mathcal{L}^n$. Let $x = x_1e_1 + x_2e_2 + x_{12}$ be the Peirce decomposition of $x$ with respect to a Jordan frame $\{e_1, e_2\}$. Given $|x_1| > R_1(x)$ and $|x_2| > R_2(x)$, we have to show that $x$ is invertible in $\mathcal{L}^n$. Now the computation in (5) shows that $R_1(x) = \|x_{12}\| = R_2(x)$. Also, from (4), $\det(x) = x_1x_2 - \|x_{12}\|^2$. Since $|x_1|\,|x_2| > \|x_{12}\|^2$, we see that $\det(x) \ne 0$, proving the invertibility of $x$.

Thus, we have proved the invertibility of $x$ when $V$ is one of the standard simple algebras. Note that the result continues to hold in each of these standard algebras when we change the inner product to a constant multiple of the trace inner product. (The reason being that the Peirce decomposition remains the same except that the norms of objects get multiplied by a constant factor.) Now, using the classification theorem (see Section 3) and the fact that in any simple algebra the inner product is a multiple of the trace product (see Prop. III.4.1 in [6]), we can prove our result in any simple Euclidean Jordan algebra.

Now let $V$ be any Euclidean Jordan algebra. By the structure theorem, we can write $V = V_1 \times V_2 \times \cdots \times V_k$, where each $V_i$ is simple. For notational simplicity, we let $k = 2$ and put $r_1 = \mathrm{rank}(V_1)$ and $r_2 = \mathrm{rank}(V_2)$. We regard any element of $V$ as a column vector with two components, the first component belonging to $V_1$ and the second component belonging to $V_2$. If $c$ is any primitive idempotent in $V$, then exactly one component of $c$ is nonzero, and this nonzero component is a primitive idempotent in the corresponding component algebra. By rearranging the elements, we may write

$$\{e_1, e_2, \ldots, e_r\} = \left\{\begin{bmatrix} g_1 \\ 0 \end{bmatrix}, \begin{bmatrix} g_2 \\ 0 \end{bmatrix}, \ldots, \begin{bmatrix} g_{r_1} \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ h_1 \end{bmatrix}, \ldots, \begin{bmatrix} 0 \\ h_{r_2} \end{bmatrix}\right\},$$

where $\{g_1, g_2, \ldots, g_{r_1}\}$ is a Jordan frame in $V_1$ and $\{h_1, h_2, \ldots, h_{r_2}\}$ is a Jordan frame in $V_2$.
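In a product algebra the verification splits across the factors; a minimal sketch for $V = \mathcal{L}^3 \times \mathcal{L}^3$ (hypothetical element), using the spin-algebra determinant formula (4):

```python
import numpy as np

def spin_det(x):
    """det in the spin algebra L^n: x0^2 - ||xbar||^2 (formula (4) with v = 0
    absorbed into xbar)."""
    return x[0]**2 - np.dot(x[1:], x[1:])

# Element of V = L^3 x L^3 stored as a pair (u, v); each component is
# invertible in its factor algebra exactly when its determinant is nonzero.
u = np.array([ 3.0, 1.0, 0.5])   # det = 9 - 1.25 = 7.75
v = np.array([-2.0, 0.3, 0.4])   # det = 4 - 0.25 = 3.75

# x = (u, v) is invertible in V iff u and v are invertible componentwise.
assert spin_det(u) != 0 and spin_det(v) != 0
```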
Now, writing the given element $x$ as a column vector with two components $u \in V_1$ and $v \in V_2$, we may write the Peirce decomposition of $x$ in the form

$$x = \sum_{i=1}^{r_1} u_i \begin{bmatrix} g_i \\ 0 \end{bmatrix} + \sum_{i<j}^{r_1} \begin{bmatrix} u_{ij} \\ 0 \end{bmatrix} + \sum_{i=1}^{r_2} v_i \begin{bmatrix} 0 \\ h_i \end{bmatrix} + \sum_{i<j}^{r_2} \begin{bmatrix} 0 \\ v_{ij} \end{bmatrix},$$

where we have used the fact that the Peirce space $V_{ij}$ with respect to any pair $\left\{\begin{bmatrix} g_i \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ h_j \end{bmatrix}\right\}$ is zero. The strict diagonal dominance of $x$ now implies that $u$ and $v$ are strictly diagonally dominant with respect to $\{g_1, g_2, \ldots, g_{r_1}\}$ in $V_1$ and $\{h_1, h_2, \ldots, h_{r_2}\}$ in $V_2$. By our previous arguments, $u$ and $v$ are invertible in $V_1$ and $V_2$, respectively. It follows that $x$ is invertible in $V$. This concludes the proof of the theorem.

7. A Geršgorin type theorem in Euclidean Jordan algebras

Theorem 11. Let $V$ be a Euclidean Jordan algebra of rank $r$ and

$$x = \sum_{i=1}^{r} x_i e_i + \sum_{i<j} x_{ij}$$

be the Peirce decomposition of $x \in V$ with respect to a given Jordan frame $\{e_1, \ldots, e_r\}$. Then

$$\sigma_{sp}(x) \subseteq \bigcup_{i=1}^{r} \{\lambda \in \mathbb{R} : |\lambda - x_i| \le R_i(x)\},$$

where

$$R_i(x) := \frac{1}{\sqrt{2}\,\|e_i\|}\left(\sum_{k=1}^{i-1} \|x_{ki}\| + \sum_{j=i+1}^{r} \|x_{ij}\|\right), \quad i = 1, 2, \ldots, r.$$

Moreover, if a union of $k$ Geršgorin intervals forms an interval that is disjoint from the remaining $r - k$ Geršgorin intervals, then there are precisely $k$ spectral eigenvalues of $x$ in this interval.

Note. It is possible to say precisely which $k$ spectral eigenvalues lie in the union of the $k$ Geršgorin intervals; see the proof below.

Proof. Suppose that the stated inclusion fails, so that there exists a $\lambda \in \sigma_{sp}(x)$ such that $|\lambda - x_i| > R_i(x)$ for all $i = 1, \ldots, r$. Then $y := x - \lambda e$ has the Peirce decomposition

$$y = \sum_{i=1}^{r} (x_i - \lambda)e_i + \sum_{i<j} x_{ij}$$

and hence is a strictly diagonally dominant element of $V$. By Theorem 10, $y$ is invertible. Now let $x = \lambda_1 f_1 + \cdots + \lambda_r f_r$

be the spectral decomposition of $x$, where $\{f_1, \ldots, f_r\}$ is a Jordan frame. Then

$$y = (\lambda_1 - \lambda)f_1 + \cdots + (\lambda_r - \lambda)f_r$$

is the spectral decomposition of $y$. As $\lambda \in \sigma_{sp}(x) = \{\lambda_1, \lambda_2, \ldots, \lambda_r\}$, we have $\lambda_i = \lambda$ for some $i$. It follows that zero is a spectral eigenvalue of $y$, which means that $y$ is not invertible. This is a contradiction. Hence we have the spectral inclusion.

Now for the second part of the theorem. Its proof, as in the classical case of complex matrices (see [10], Page 345), relies on the continuity of eigenvalues. First suppose that $V$ is simple. Define

$$x(\varepsilon) := \sum_{i=1}^{r} x_i e_i + \varepsilon \sum_{i<j} x_{ij}$$

with $\varepsilon \in [0, 1]$. Note that $x(1) = x$ and $x(0) = \sum_{i=1}^{r} x_i e_i$. Also, $R_i(x(\varepsilon)) \le R_i(x)$ for each $i$, and so the spectrum of $x(\varepsilon)$ is contained in the union of the Geršgorin intervals of $x$. Now we consider the decreasing rearrangement of the spectral eigenvalues of $x(\varepsilon)$:

$$\lambda(x(\varepsilon)) := \begin{bmatrix} \lambda_1(x(\varepsilon)) \\ \lambda_2(x(\varepsilon)) \\ \vdots \\ \lambda_r(x(\varepsilon)) \end{bmatrix}, \quad \text{where} \quad \lambda_1(x(\varepsilon)) \ge \lambda_2(x(\varepsilon)) \ge \cdots \ge \lambda_r(x(\varepsilon)).$$

In particular, for $\varepsilon = 0$, $\lambda(x(0))$ is the decreasing rearrangement of $(x_1, x_2, \ldots, x_r)$. In view of the continuity of $\lambda(x(\varepsilon))$ in $\varepsilon$ (see, e.g., Theorem 9 in [9]), each of the spectral eigenvalue curves joining $x_i$ and $\lambda_i(x)$ lies in the union of all Geršgorin intervals of $x$. Now consider the union of $k$ Geršgorin intervals that forms an interval (i.e., a connected set) which is disjoint from the other Geršgorin intervals of $x$. Corresponding to the center, say $x_i$, of a Geršgorin interval that is contained in this union, the other end of the spectral eigenvalue curve, namely $\lambda_i(x)$, must also be in this union. Even the converse

statement holds. Thus there are exactly $k$ eigenvalues of $x$ that lie in this union.

Now let $V$ be a general Euclidean Jordan algebra, and let $k$ Geršgorin intervals of $x$ form an interval that is disjoint from the other Geršgorin intervals of $x$. Define $x(\varepsilon)$ as in the previous case. Suppose, without loss of generality, that $x_1$ is the center of one of the Geršgorin intervals in this union. Then the associated primitive idempotent $e_1$ (in the Peirce decomposition of $x$ with respect to $\{e_1, e_2, \ldots, e_r\}$) belongs to a unique factor (simple) algebra, say $V_1$, of $V$. Using the continuity of spectral eigenvalues in simple algebras (as observed above), we can conclude that the spectral eigenvalue curve joining $x_1$ and one of the spectral eigenvalues of $x$ lies in this union. Conversely, each spectral eigenvalue of $x$ that lies in this union connects to one of the centers that lies in the union. Because of this one-to-one correspondence, we see that there are exactly $k$ spectral eigenvalues of $x$ lying in the union. This completes the proof.

Since an object $x$ in a Euclidean Jordan algebra belongs to the interior of the symmetric cone if and only if all its spectral eigenvalues are positive, the following result is an immediate consequence of the above theorem.

Corollary 12. If, in the above theorem, $x$ is strictly diagonally dominant with respect to some Jordan frame and the diagonal elements $x_i$ are positive, then $x$ is in the interior of the symmetric cone.

REFERENCES

[1] J.C. Baez, The octonions, Bulletin of the American Mathematical Society, 39 (2002).

[2] T. Dray, J. Janesky, and C.A. Manogue, Octonionic Hermitian matrices with non-real eigenvalues, Adv. Appl. Clifford Algebras, 10 (2000).

[3] T. Dray and C.A. Manogue, The octonionic eigenvalue problem, Adv. Appl. Clifford Algebras, 8 (1998).

[4] T. Dray and C.A. Manogue, The exceptional Jordan eigenvalue problem, Internat. J. Theoret. Phys., 38 (1999).

[5] T. Dray, C.A. Manogue, and S. Okubo, Orthogonal eigenbases over the octonions, Algebras Groups Geom., 19 (2002).

[6] J. Faraut and A. Korányi, Analysis on Symmetric Cones, Clarendon Press, Oxford, 1994.

[7] H. Freudenthal, Lie groups in the foundations of geometry, Adv. Math., 1 (1964).
[8] M.S. Gowda, R. Sznajder, and J. Tao, Some P-properties for linear transformations on Euclidean Jordan algebras, Linear Alg. Appl., 393 (2004).
[9] M.S. Gowda, J. Tao, and M. Moldovan, Some inertia theorems in Euclidean Jordan algebras, Research Report, Department of Mathematics and Statistics, University of Maryland, Baltimore County, Baltimore, MD 21250, February 2008 (Revised Sept. 2008).
[10] R.A. Horn and C.R. Johnson, Matrix Analysis, Cambridge University Press, Cambridge, 1985.
[11] S.H. Schmieta and F. Alizadeh, Extension of primal-dual interior point algorithms to symmetric cones, Math. Prog., Series A, 96 (2003).
[12] J. Tao, Some P-Properties for Linear Transformations on the Lorentz Cone, PhD thesis, UMBC, 2004.
[13] Y. Tian, Matrix representations of octonions and their applications, Adv. Appl. Clifford Algebras, 10 (2000).
[14] F. Zhang, Quaternions and matrices of quaternions, Linear Algebra Appl., 251 (1997).
[15] F. Zhang, Geršgorin type theorem for quaternionic matrices, Linear Algebra Appl., 424 (2007).
