Journal of Pure and Applied Algebra 174 (2002) 83-93
www.elsevier.com/locate/jpaa

Absolutely indecomposable symmetric matrices

Hans A. Keller a,*,1, A. Herminia Ochsenius b,1

a Hochschule Technik+Architektur Luzern, CH-6048 Horw, Switzerland
b Facultad de Matematicas, Universidad Catolica de Chile, Casilla 306, correo 22, Santiago de Chile, Chile

Received 12 May 2001; received in revised form 10 December 2001
Communicated by J. Adamek

* Corresponding author. E-mail addresses: hakeller@hta.fhz.ch (H.A. Keller), hochsen@mat.puc.cl (A.H. Ochsenius).
1 Supported by Fondecyt, Proyecto No. 1990438 and No. 7990049.
0022-4049/02/$ - see front matter © 2002 Elsevier Science B.V. All rights reserved. PII: S0022-4049(02)00037-3

Abstract

Let $A$ be a symmetric matrix of size $n \times n$ with entries in some (commutative) field $K$. We study the possibility of decomposing $A$ into two blocks by conjugation by an orthogonal matrix $T \in \mathrm{Mat}_n(K)$. We say that $A$ is absolutely indecomposable if it is indecomposable over every extension of the base field. If $K$ is formally real then every symmetric matrix $A$ diagonalizes orthogonally over the real closure of $K$. Assume that $K$ is not formally real and of level $s$. We prove that in $\mathrm{Mat}_n(K)$ there exist symmetric, absolutely indecomposable matrices if and only if $n$ is congruent to $0$, $1$ or $-1$ modulo $2s$. © 2002 Elsevier Science B.V. All rights reserved.

MSC: 15A33; 12D15

1. Introduction

The famous Spectral Theorem states that every symmetric real matrix can be put into diagonal form by means of conjugation with an orthogonal matrix. Clearly this statement carries over to matrices with entries in a real-closed field. However, for matrices over arbitrary fields the problem of diagonalization turns out to be intricate, and the process of diagonalization can run aground for arithmetical reasons and/or geometric ones. In this paper we shall focus on the more general problem of orthogonally decomposing a given symmetric matrix $A$ with entries in a field $K$ into two blocks of
smaller size,
\[
A \mapsto B := T^{-1} A T = \begin{bmatrix} B_1 & 0 \\ 0 & B_2 \end{bmatrix}, \quad\text{where } T \text{ is orthogonal}.
\]
We are interested in the geometric conditions underlying such decompositions. Therefore, we shall remove the arithmetical obstacles by extending the base field whenever it is convenient. A matrix $A$ which does not decompose orthogonally in any extension of the base field is called absolutely indecomposable. This can only happen if $K$ is a non-formally real field, and the level $s$ of $K$ turns out to play a critical role. Our main result states that in $\mathrm{Mat}_n(K)$ there exist symmetric matrices which are absolutely indecomposable if and only if $n > s$ and $n \equiv 0 \pmod{2s}$ or $n \equiv \pm 1 \pmod{2s}$. The proofs yield an explicit description of such matrices. On the other hand, if $n \le s$, we show that every symmetric matrix can be diagonalized orthogonally over some extension field of $K$. This solves a problem posed by Diarra [1].

We start by reviewing (in Section 2) some basic concepts on matrices and linear operators on the vector space $E = K^n$. In Section 3 we establish the result on diagonalization of small matrices. In Section 4 we establish preliminary results on nilpotent operators and their Jordan blocks, and in Section 5 we combine them with Witt decompositions to obtain the main result.

2. Orthogonal and semi-orthogonal decompositions

In the following $K$ is a non-formally real field of level $s$ (i.e. $s$ is the smallest integer such that $-1$ is a sum of $s$ squares in $K$). It is well known that $s$ is a power of $2$, i.e. $s = 2^u$. We always assume that $\operatorname{char} K \ne 2$. For $n \in \mathbb{N}$, we consider the vector space $E = K^n$ with canonical base $\{e_1, e_2, \ldots, e_n\}$ and the usual inner product $\langle \cdot\,,\cdot \rangle$ defined by
\[
\Big\langle \sum_{i=1}^n a_i e_i,\ \sum_{i=1}^n b_i e_i \Big\rangle = \sum_{i=1}^n a_i b_i \qquad (a_i, b_i \in K).
\]
If $n > s$ then $E$ contains isotropic vectors, i.e. vectors $x \ne 0$ with $\langle x, x \rangle = 0$. In turn, if $n \le s$ then $E$ is anisotropic.

Suppose that $A$ is a symmetric matrix in $\mathrm{Mat}_n(K)$ and $\sigma$ is the linear self-adjoint operator on $E$ induced by $A$. Our interest lies in the
subspaces of $E$ that are left invariant by $\sigma$, and to motivate the definitions we need, we shall review some basic facts.

Let $U$ be a linear subspace of $E$, $\dim U = r$ with $0 < r < n$ and $\sigma(U) \subseteq U$. If $U$ is non-degenerate, then the orthogonal space $U^\perp = \{x \in E : x \perp u \text{ for all } u \in U\}$, which is also invariant under $\sigma$, satisfies $E = U \oplus U^\perp$. Hence we can find orthogonal bases $\{u_1, u_2, \ldots, u_r\}$ and $\{u_{r+1}, u_{r+2}, \ldots, u_n\}$ of $U$ and $U^\perp$, respectively. Let $\tau : E \to E$ be the linear transformation defined by $\tau(e_i) = u_i$, $i = 1, \ldots, n$. The matrix $T$ of $\tau$ with
respect to the canonical base $\{e_i\}$ implements a decomposition
\[
A \mapsto B := T^{-1} A T = \begin{bmatrix} B_1 & 0 \\ 0 & B_2 \end{bmatrix}, \tag{1}
\]
where the blocks $B_1$ and $B_2$ are of size $r \times r$ and $(n-r) \times (n-r)$, respectively. If, in addition, the $u_i$'s are unit vectors, $\langle u_i, u_i \rangle = 1$ for all $i$, then the matrix $T$ is orthogonal in the usual sense, that is,
\[
T' T = I_n := \begin{bmatrix} 1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & 1 \end{bmatrix}, \tag{2}
\]
where $T'$ is the transpose of $T$. In that case we say that (1) is an orthogonal decomposition of $A$ into two blocks. Often, however, we cannot normalize the lengths of the vectors $u_i$, and instead of (2) we only have
\[
T' T = \begin{bmatrix} \langle u_1, u_1 \rangle & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \langle u_n, u_n \rangle \end{bmatrix}. \tag{3}
\]
An invertible matrix $T$ such that $T' T$ is in diagonal form will be called semi-orthogonal, and (1) is then called a semi-orthogonal decomposition. We say that a matrix $B$ is block-diagonal if it is of the shape
\[
B = \begin{bmatrix} B_1 & 0 \\ 0 & B_2 \end{bmatrix}.
\]

Definition. Let $A \in \mathrm{Mat}_n(K)$ be symmetric. We say that $A$ is decomposable over $K$ if there exists a semi-orthogonal matrix $P \in \mathrm{Mat}_n(K)$ such that $P^{-1} A P$ is block-diagonal; otherwise $A$ is indecomposable over $K$. The matrix $A$ is said to be absolutely indecomposable if it is indecomposable over every extension field of $K$.

Given a semi-orthogonal decomposition we can always obtain an orthogonal one by extending the base field, as is shown by the following result.

Lemma 2.1. Suppose that the symmetric matrix $A \in \mathrm{Mat}_n(K)$ admits a decomposition into blocks of size $r$ and $n-r$, respectively, by means of a semi-orthogonal matrix $P \in \mathrm{Mat}_n(K)$. Then there exists an extension field $L$ of finite degree over $K$ and an orthogonal matrix $T \in \mathrm{Mat}_n(L)$ which decomposes $A$ into two blocks of the same sizes $r$ and $n-r$.
Proof. Write $P = [p_{ij}]$. The fact that $P$ is semi-orthogonal means that
\[
P' P = \operatorname{diag}(c_1, \ldots, c_n),
\]
where $c_i := \sum_{k=1}^n p_{ki}^2$. It is clear that $c_i \ne 0$ for all $i$, because the matrix $P$ is invertible and therefore of maximal rank $n$. Consider the extension field $L := K(\sqrt{c_1}, \ldots, \sqrt{c_n})$. Clearly $[L : K] \le 2^n$. Define
\[
Q := \operatorname{diag}\Big(\frac{1}{\sqrt{c_1}}, \ldots, \frac{1}{\sqrt{c_n}}\Big) \in \mathrm{Mat}_n(L).
\]
For the matrix $T := PQ \in \mathrm{Mat}_n(L)$ we find $T' T = I_n$; thus $T$ is orthogonal. By assumption, $P^{-1} A P$ is block-diagonal, say
\[
B := P^{-1} A P = \begin{bmatrix} B_1 & 0 \\ 0 & B_2 \end{bmatrix},
\]
and it is easily checked that $T' A T = T^{-1} A T = Q^{-1} B Q$ has the same block structure as $B$. □

Lemma 2.2. If a symmetric matrix $A \in \mathrm{Mat}_n(K)$ can be diagonalized over $K$ then there is an extension field $L$ of $K$ over which $A$ diagonalizes orthogonally.

Proof. The assumption that $A$ can be diagonalized over $K$ entails that $E = K^n$ has an orthogonal base consisting of eigenvectors $u_1, \ldots, u_n$. The matrix $P$ of the linear transformation $\tau : e_i \mapsto u_i$ is semi-orthogonal. Put $c_i := \langle u_i, u_i \rangle$ and let $L := K(\sqrt{c_1}, \ldots, \sqrt{c_n})$. The matrix $T := P \operatorname{diag}(1/\sqrt{c_1}, \ldots, 1/\sqrt{c_n})$ is orthogonal and diagonalizes $A$. □

Lemma 2.3. Let $A \in \mathrm{Mat}_n(K)$ be symmetric. If the characteristic polynomial $p_A(\lambda)$ of $A$ has two different roots in $\hat K$, the algebraic closure of $K$, then there exists an extension field $L$ of finite degree $[L : K]$ over which $A$ can be decomposed orthogonally.

Proof. Let $\vartheta_1 \ne \vartheta_2$ be roots of $p_A(\lambda)$ in $\hat K$. Replacing $K$ by $K(\vartheta_1, \vartheta_2)$, if necessary, we may assume that $\vartheta_1, \vartheta_2 \in K$. Then $p_A(\lambda)$ is a product of two relatively prime polynomials in $K[\lambda]$, $p_A(\lambda) = q_1(\lambda) q_2(\lambda)$. It follows that $E = U_1 \oplus U_2$, where
\[
U_1 := \operatorname{Ker}(q_1(\sigma)) = \operatorname{Im}(q_2(\sigma)) \quad\text{and}\quad U_2 := \operatorname{Ker}(q_2(\sigma)) = \operatorname{Im}(q_1(\sigma)).
\]
The sum is orthogonal, for $\sigma$ is self-adjoint. It follows that $U_1$ is non-degenerate. Hence there is a semi-orthogonal matrix $P \in \mathrm{Mat}_n(K)$ such that $P^{-1} A P$ is block-diagonal. Now the claim follows by Lemma 2.1. □
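The normalization step in Lemmas 2.1 and 2.2 is easy to see in coordinates. The following sketch (with made-up data over $K = \mathbb{Q}$, not taken from the paper; the extension $L = \mathbb{Q}(\sqrt 2)$ is simulated by floating-point square roots) diagonalizes a symmetric matrix by a semi-orthogonal $P$ and then rescales its columns to obtain an orthogonal $T$:

```python
import numpy as np

# A has the orthogonal (but not orthonormal) eigenbasis u_1 = (1,1),
# u_2 = (1,-1); P is semi-orthogonal with P'P = diag(c_1, c_2) = diag(2, 2).
A = np.array([[2.0, 1.0], [1.0, 2.0]])
P = np.array([[1.0, 1.0], [1.0, -1.0]])   # columns: the eigenvectors u_i

c = np.diag(P.T @ P)                      # c_i = <u_i, u_i>
T = P @ np.diag(1.0 / np.sqrt(c))         # T = P * diag(1/sqrt(c_i)), Lemma 2.1

assert np.allclose(T.T @ T, np.eye(2))    # T is orthogonal over L = Q(sqrt(2))
assert np.allclose(T.T @ A @ T, np.diag([3.0, 1.0]))  # orthogonal diagonalization
```

Over an abstract field the same computation takes place symbolically in $K(\sqrt{c_1}, \ldots, \sqrt{c_n})$; the floating-point arithmetic here only stands in for that extension.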
3. Diagonalization of small symmetric matrices

In this section we deal with matrices whose size is small compared with the level of the base field. The main result (which should be compared with [1, p. 49]) is the following.

Theorem 3.1. Let $A$ be a symmetric matrix of size $n \times n$ with entries in a non-formally real field $K$ of level $s$. If $n \le s$ then there exists an extension field $L$ of finite degree $[L : K]$ over which $A$ diagonalizes orthogonally.

Proof. We show that there exists an extension field $K'$ of $K$ and a matrix $P \in \mathrm{Mat}_n(K')$ which diagonalizes $A$; then the claim follows by Lemma 2.2 (with $K'$ in place of $K$).

(i) Suppose first that $p := \operatorname{char} K \ne 0$. If $p \equiv 1 \pmod 4$ then $s = 1$, hence $n = 1$, so there is nothing to prove. The case $p = 2$ is excluded by our general assumption on the characteristic of $K$. Suppose that $p \equiv 3 \pmod 4$, so that $s = 2$. Consider a symmetric matrix
\[
A := \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}, \quad\text{where } a_{ij} \in K,\ a_{12} = a_{21} \ne 0.
\]
The characteristic polynomial is
\[
p_A(\lambda) = \det(A - \lambda I) = \lambda^2 - (a_{11} + a_{22})\lambda + (a_{11} a_{22} - a_{12} a_{21}).
\]
Its discriminant
\[
\Delta = (a_{11} + a_{22})^2 - 4(a_{11} a_{22} - a_{12} a_{21}) = (a_{11} - a_{22})^2 + (2 a_{12})^2
\]
cannot be zero because $s = 2$: if $\Delta = 0$ then $-1 = ((a_{11} - a_{22})/2a_{12})^2$ would be a square in $K$. It follows that $p_A(\lambda)$ has two different roots $\vartheta_1$ and $\vartheta_2$ in the algebraic closure of $K$. This entails that $A$ can be diagonalized over the field $K' = K(\vartheta_1, \vartheta_2)$.

(ii) Suppose that $\operatorname{char} K = 0$. Let $A \in \mathrm{Mat}_n(K)$ be symmetric. In the domain $K[\lambda]$ the characteristic polynomial $p_A(\lambda)$ has a (unique) factorization
\[
p_A(\lambda) = q_1(\lambda)^{m_1} q_2(\lambda)^{m_2} \cdots q_r(\lambda)^{m_r},
\]
where the $q_i(\lambda)$ are irreducible polynomials in $K[\lambda]$ and $m_i \ge 1$ for $i = 1, \ldots, r$. Consider the vector space $E = K^n$. The linear transformation $\sigma : E \to E$ defined by $A$ is self-adjoint. To the above factorization of $p_A(\lambda)$ there corresponds an orthogonal decomposition
\[
E = E_1 \perp E_2 \perp \cdots \perp E_r,
\]
where each subspace $E_i$ is invariant under $\sigma$, and $q_i(\lambda)^{m_i}$ is the characteristic polynomial of the restriction $\sigma_i := \sigma|_{E_i}$. In particular, $q_i(\sigma_i)^{m_i} = 0$. We claim that $q_i(\lambda)$ is equal to the minimal polynomial
of $\sigma_i$. In fact, this is obvious if $m_i = 1$. Suppose then that $m_i \ge 2$. Let the integer $t \ge 1$ be defined by the condition
that $2^{t-1} < m_i \le 2^t$. Then
\[
q_i(\sigma_i)^{2^t} = q_i(\sigma_i)^{m_i} \, q_i(\sigma_i)^{2^t - m_i} = 0.
\]
Consider the operator $\rho := q_i(\sigma_i)^{2^{t-1}} : E_i \to E_i$. Clearly $\rho$ is self-adjoint. Moreover,
\[
\rho^2 = \big[q_i(\sigma_i)^{2^{t-1}}\big]^2 = q_i(\sigma_i)^{2^t} = 0.
\]
It follows that for all $x \in E_i$ we have
\[
\langle \rho(x), \rho(x) \rangle = \langle x, \rho^2(x) \rangle = 0.
\]
Now $E_i$ is anisotropic since $n \le s$. We conclude that $\rho(x) = 0$ for all $x \in E_i$, thus $\rho = q_i(\sigma_i)^{2^{t-1}} = 0$. If $t - 1 \ge 1$ then we repeat the above argument with $\rho := q_i(\sigma_i)^{2^{t-2}}$, and so on. After some steps we arrive at the conclusion that $q_i(\sigma_i) = 0$, which shows that $q_i(\lambda)$ is indeed the minimal polynomial of $\sigma_i$.

Therefore, $q(\lambda) := q_1(\lambda)\, q_2(\lambda) \cdots q_r(\lambda)$ is the minimal polynomial of $\sigma$. Since the $q_i(\lambda)$'s are irreducible in $K[\lambda]$ and $\operatorname{char} K = 0$, we conclude that all the roots of $q(\lambda)$ in the algebraic closure $\hat K$ are simple. It follows that $A$ can be put into diagonal form by means of a semi-orthogonal matrix with entries in the field $K'$ obtained from $K$ by adjoining the roots of $q(\lambda)$. The proof is complete. □

4. Nilpotent operators

4.1. Preliminaries

Henceforth, we assume that $n > s$, so that $E = K^n$ contains isotropic vectors. The task is to determine under which conditions there exist symmetric, absolutely indecomposable matrices $A$ in $\mathrm{Mat}_n(K)$. By Lemma 2.3 it is necessary that the characteristic polynomial $p_A(\lambda) \in K[\lambda]$ has only one root $\vartheta$ in the closure $\hat K$. Replacing $K$ by $K(\vartheta)$ we may suppose that $\vartheta \in K$, thus $p_A(\lambda) = (\lambda - \vartheta)^n \in K[\lambda]$. Now $A$ is absolutely indecomposable if and only if $B := A - \vartheta I_n$ is so, hence we may restrict our attention to matrices $A$ with characteristic polynomial $p_A = \lambda^n$. Consequently $A$ is nilpotent, $A^n = 0$. We look at the corresponding operators.

4.2. Jordan blocks

Let $\sigma : E \to E$ be a nilpotent linear operator, not necessarily self-adjoint. A subspace $V$ of $E$ is called cyclic if it is generated by one vector $x$ and its images $\sigma^i(x)$, $i = 1, 2, \ldots$. Clearly a cyclic subspace is invariant under $\sigma$. A cyclic subspace $V$ which is not properly contained in another cyclic subspace is called a
Jordan block. Let $V$ be a Jordan block of dimension $m$, generated by $x$. Thus $V := \operatorname{span}\{x, \sigma(x), \ldots, \sigma^{m-1}(x)\}$ and $\sigma^m(x) = 0$. Put $v_i := \sigma^{m-i}(x)$ for $i = 1, \ldots, m$. The base $\{v_1, \ldots, v_m\}$ satisfies
\[
\sigma(v_1) = 0, \qquad \sigma(v_i) = v_{i-1} \ \text{ for } 2 \le i \le m.
\]
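These relations can be checked concretely; the following minimal sketch (with $m = 4$ and the $v_i$ taken as canonical basis vectors, an illustration not taken from the paper) realizes them with the nilpotent Jordan matrix that appears just below:

```python
import numpy as np

m = 4
J = np.eye(m, k=1)     # J_m(0): ones on the superdiagonal, zeros elsewhere
basis = np.eye(m)      # columns v_1, ..., v_m

# sigma(v_1) = 0 and sigma(v_i) = v_{i-1} for 2 <= i <= m
assert np.allclose(J @ basis[:, 0], np.zeros(m))
for i in range(1, m):
    assert np.allclose(J @ basis[:, i], basis[:, i - 1])

# sigma is nilpotent of order m: J^m = 0 but J^{m-1} != 0
assert np.allclose(np.linalg.matrix_power(J, m), 0)
assert not np.allclose(np.linalg.matrix_power(J, m - 1), 0)
```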
We call $\{v_1, \ldots, v_m\}$ a Jordan base of $V$. With respect to that base, the restricted operator $\sigma|_V$ has the familiar shape
\[
J_m(0) = \begin{bmatrix}
0 & 1 & 0 & \cdots & 0 \\
0 & 0 & 1 & \cdots & 0 \\
\vdots & & & \ddots & \vdots \\
0 & 0 & 0 & \cdots & 1 \\
0 & 0 & 0 & \cdots & 0
\end{bmatrix}.
\]

Lemma 4.1. A nilpotent operator $\sigma : E \to E$ induces a decomposition
\[
E = V_1 \oplus V_2 \oplus \cdots \oplus V_k
\]
of $E$ into a direct sum of Jordan blocks. Although the decomposition is not unique, the number of blocks and their dimensions are uniquely determined by $\sigma$.

Proof. See [2, p. 194]. □

From now on we assume that the operator $\sigma : E \to E$ is nilpotent and self-adjoint. Our aim is to study the interrelation between Jordan decompositions and orthogonal decompositions. We begin with the following simple result.

Lemma 4.2. Let $\sigma : E \to E$ be nilpotent and self-adjoint. If the space $E$ itself is a Jordan block for $\sigma$ then $\sigma$ is absolutely indecomposable.

Proof. Notice that $E$ is a Jordan block for $\sigma$ if and only if $\sigma$ is of order $n$, i.e. $\sigma^n = 0$ but $\sigma^{n-1} \ne 0$. Suppose that $\sigma$ is decomposable. Then there exists a proper non-degenerate, invariant subspace $U$, and $E = U \oplus U^\perp$. Put $m := \max\{\dim U, \dim U^\perp\}$. It follows easily that $\sigma^m = 0$. Consequently $\sigma$ is nilpotent of some order $k < n$, a contradiction. Clearly $\sigma$ remains indecomposable under field extensions. □

4.3. Interrelation between Jordan blocks of self-adjoint operators

There are some rigid rules for the inner products between base vectors of Jordan blocks, as is shown by the following technical result.

Lemma 4.3. Let $\sigma : E \to E$ be self-adjoint and nilpotent. Let $V$ and $W$ be any two Jordan blocks (possibly equal) with Jordan bases $\{v_1, v_2, \ldots, v_m\}$ and $\{w_1, w_2, \ldots, w_\ell\}$. Then
\[
\langle v_i, w_j \rangle = \langle v_{i+k}, w_{j-k} \rangle \quad\text{for } j - k \ge 1,\ i + k \le m, \tag{4}
\]
\[
\langle v_i, w_j \rangle = 0 \quad\text{if } i + j \le m. \tag{5}
\]
Proof. Notice that $v_i = \sigma(v_{i+1})$ for $1 \le i \le m-1$. Using the fact that $\sigma$ is self-adjoint we get
\[
\langle v_i, w_j \rangle = \langle \sigma(v_{i+1}), w_j \rangle = \langle v_{i+1}, \sigma(w_j) \rangle = \langle v_{i+1}, w_{j-1} \rangle.
\]
Repeating the step we arrive at (4); the conditions on $i, j, k$ make sure that all the terms in the formula are defined. In order to show (5) we apply (4) with $k = j - 1$, obtaining $\langle v_i, w_j \rangle = \langle v_{i+j-1}, w_1 \rangle$. Since $i + j \le m$ we have $v_{i+j-1} = \sigma(v_{i+j})$, hence
\[
\langle v_i, w_j \rangle = \langle \sigma(v_{i+j}), w_1 \rangle = \langle v_{i+j}, \sigma(w_1) \rangle = \langle v_{i+j}, 0 \rangle = 0,
\]
as claimed. □

Corollary 4.4. Let $\sigma : E \to E$ be self-adjoint and nilpotent. Let $V$ be a Jordan block with cyclic base $\{v_1, v_2, \ldots, v_m\}$. Then $\langle v_i, v_j \rangle = 0$ whenever $i + j \le m$. Moreover, $V$ is non-degenerate if and only if $\langle v_1, v_m \rangle \ne 0$.

Proof. We apply Lemma 4.3 with $W = V$. The first claim is clear in view of (5). In order to prove the second one, put $c := \langle v_1, v_m \rangle$. Notice that $c = \langle v_k, v_{m-k+1} \rangle$ for all $k = 1, \ldots, m$. If $c = 0$ then $\langle v_1, v_i \rangle = 0$ for all $i = 1, \ldots, m$, which shows that $V$ is degenerate. Conversely, assume that $V$ is degenerate. Thus there exists a $v \ne 0$, $v \in V \cap V^\perp$. Write $v = \sum_{i=1}^m a_i v_i$ and put $k := \max\{i : a_i \ne 0\}$. Then
\[
0 = \langle v_{m-k+1}, v \rangle = \sum_{i=1}^m a_i \langle v_{m-k+1}, v_i \rangle.
\]
Now for all $i \le k - 1$ we have $\langle v_{m-k+1}, v_i \rangle = 0$ since $(m - k + 1) + i \le m$, and if $i \ge k + 1$ then $a_i = 0$. In the above sum we are left with the term with $i = k$,
\[
0 = \langle v_{m-k+1}, v \rangle = a_k \langle v_{m-k+1}, v_k \rangle = a_k c,
\]
hence $c = 0$ as claimed. □

Theorem 4.5. Let $\sigma : E \to E$ be self-adjoint and nilpotent. Assume that $E$ is a direct sum of two or more Jordan blocks. Then $\sigma$ is decomposable.

Proof. We must find a proper non-degenerate, invariant subspace $U$ of $E$. Consider a decomposition of $E$ into Jordan blocks, $E = V_1 \oplus V_2 \oplus \cdots \oplus V_r$, where $r \ge 2$. We may assume that $\dim V_1 \ge \dim V_i$ for $i = 2, \ldots, r$. Put $m := \dim V_1$ and let $\{v_1, \ldots, v_m\}$ be a cyclic base of $V_1$. By Corollary 4.4, we have $\langle v_1, v_k \rangle = 0$ for $k = 1, \ldots, m-1$. Moreover, if $\langle v_1, v_m \rangle \ne 0$ then $V_1$ is non-degenerate, hence $\sigma$ is decomposable. So we suppose that $\langle v_1, v_m \rangle = 0$, hence $v_1 \in V_1^\perp$. Since $E$ is non-degenerate, there exists a $t \in \{2, \ldots, r\}$ such that $V_t$ is not orthogonal to $v_1$. Put $\ell := \dim V_t$ and let $\{w_1, \ldots, w_\ell\}$ be a cyclic base of the Jordan block $V_t$. Recall that $\ell \le m$. If $\ell < m$ then, by Lemma 4.3 (5), $\langle v_1, w_j \rangle = 0$ for all $j = 1, \ldots, \ell$, hence $v_1 \perp V_t$, contrary to the assumption. Therefore $\ell = m$. Now we look at $V_t$. If $\langle w_1, w_m \rangle \ne 0$ then $V_t$ is a non-degenerate, invariant subspace. So we suppose that $\langle w_1, w_m \rangle = 0$. For $i = 1, \ldots, m$ we put $u_i := v_i + w_i$. Then $U := \operatorname{span}\{u_1, \ldots, u_m\}$ is a Jordan block generated by the cyclic vector $u_m$. In particular,
$U$ is invariant under $\sigma$. By Corollary 4.4, $U$ is non-degenerate provided that $\langle u_1, u_m \rangle \ne 0$. But, using $\langle v_1, w_m \rangle = \langle v_m, w_1 \rangle$ from (4),
\[
\langle u_1, u_m \rangle = \langle v_1 + w_1, v_m + w_m \rangle = \langle v_1, v_m \rangle + \langle w_1, w_m \rangle + \langle v_1, w_m \rangle + \langle v_m, w_1 \rangle = 2 \langle v_1, w_m \rangle \ne 0.
\]
This completes the proof. □

5. Witt decompositions

The Witt index of a quadratic space measures the maximal dimension of subspaces containing only isotropic vectors. We shall see below that the existence of self-adjoint, absolutely indecomposable matrices is closely related to that index.

A subspace $V \subseteq E$ is called totally isotropic if every $x \in V$ is isotropic. Since $\operatorname{char} K \ne 2$ this entails that $\langle x, y \rangle = 0$ for all $x, y \in V$. The maximal totally isotropic subspaces all have the same dimension. This common dimension $r$ is called the Witt index of the space $E$; we write $r = \operatorname{ind}_W(E)$. Notice that $\operatorname{ind}_W(E) \le \frac12 \dim(E)$, as $E$ is non-degenerate. It is well known that $E$ can be decomposed into an orthogonal sum of $r$ hyperbolic planes $H_i$ and an anisotropic subspace $E_a$,
\[
E = \Big(\perp_{i=1}^{r} H_i\Big) \perp E_a.
\]

Theorem 5.1. Let $E := K^n$ and let $\langle \cdot\,,\cdot \rangle$ be the canonical inner product. The following assertions are equivalent:
(i) $E$ admits an operator $\sigma : E \to E$ which is self-adjoint and nilpotent of order $n$.
(ii) The Witt index of $E$ is maximal, $\operatorname{ind}_W(E) = [n/2]$.

Proof. (i) $\Rightarrow$ (ii): Suppose that $\sigma : E \to E$ is self-adjoint, $\sigma^n = 0$, $\sigma^{n-1} \ne 0$. This implies that $E$ admits a base $\{v_1, v_2, \ldots, v_n\}$ such that $\sigma(v_i) = v_{i-1}$ for $i = 2, \ldots, n$ and $\sigma(v_1) = 0$. We apply Lemma 4.3 with $W = V = E$. Put $r := [n/2]$. For $i, j = 1, \ldots, r$ we have $i + j \le 2r \le n$, hence $\langle v_i, v_j \rangle = 0$. Thus the subspace $J$ generated by $\{v_1, \ldots, v_r\}$ is totally isotropic and of maximal dimension $[n/2]$, as claimed.

In order to establish (ii) $\Rightarrow$ (i) we consider first the case where $n = \dim E$ is even, $n = 2r$. Then (ii) implies that $E$ is the orthogonal sum of $r$ hyperbolic planes, $E = \perp_{i=1}^r H_i$. Each $H_i$ has a base $\{v_i, w_i\}$ with $\langle v_i, v_i \rangle = \langle w_i, w_i \rangle = 0$ and $\langle v_i, w_i \rangle = 1$. We define an operator $\sigma : E \to E$ by
\[
\sigma(v_i) = v_{i+1} \ \text{ for } i = 1, \ldots, r-1, \qquad \sigma(v_r) = w_r, \qquad \sigma(w_i) = w_{i-1} \ \text{ for } i = 2, \ldots, r, \qquad \sigma(w_1) = 0.
\]
Clearly $\sigma$ is nilpotent of order $n$. The verification that $\sigma$ is also self-adjoint is straightforward, using the definitions. If $n$ is odd, $n = 2r + 1$, then (ii) means that $\operatorname{ind}_W(E) = r$, thus
\[
E = \Big(\perp_{i=1}^{r} H_i\Big) \perp E_a,
\]
where the $H_i$ are hyperbolic planes with bases $\{v_i, w_i\}$ as before and $E_a$ is a straight line generated by an anisotropic vector $e$ (rescaling the $w_i$ if necessary, we may assume $\langle v_i, w_i \rangle = \langle e, e \rangle$). We define $\sigma : E \to E$ by
\[
\sigma(v_i) = v_{i+1} \ \text{ for } i = 1, \ldots, r-1, \qquad \sigma(v_r) = e, \qquad \sigma(e) = w_r, \qquad \sigma(w_i) = w_{i-1} \ \text{ for } i = 2, \ldots, r, \qquad \sigma(w_1) = 0.
\]
It is readily verified that $\sigma$ is self-adjoint and nilpotent of order $n$. □

Example. Let $K$ be a field of level $s = 2$, for example the field $\mathbb{Q}_3$ of 3-adic numbers. Fix a representation $-1 = a^2 + b^2$. In the space $E = K^3$ with the canonical inner product $\langle \cdot\,,\cdot \rangle$ we consider the vectors
\[
g := [b, -a, 0]^T, \qquad v := [a, b, 1]^T, \qquad w := \Big[\frac{a}{2}, \frac{b}{2}, -\frac{1}{2}\Big]^T.
\]
Then $G := \operatorname{span}\{g\}$ is an anisotropic straight line with $\langle g, g \rangle = -1$, and $P := \operatorname{span}\{v, w\}$ is a hyperbolic plane with $\langle v, v \rangle = \langle w, w \rangle = 0$, $\langle v, w \rangle = -1$, and $E = G \perp P$. The operator $\sigma : E \to E$ defined by $v \mapsto g$, $g \mapsto w$, $w \mapsto 0$ is self-adjoint. Moreover, $E$ is generated under $\sigma$ by the cyclic vector $v$. By Lemma 4.2, $\sigma$ is absolutely indecomposable. For the matrix of $\sigma$ in the canonical base we find
\[
A = \frac{1}{2}\begin{bmatrix} -2ab & a^2 - b^2 & b \\ a^2 - b^2 & 2ab & -a \\ b & -a & 0 \end{bmatrix}.
\]

Next we compute the Witt index of the space $E = K^n$ with the canonical inner product $\langle \cdot\,,\cdot \rangle$. The authors are indebted to Professor R. Baeza for the proof of the following result (see also [1, Remark on p. 92]).

Theorem 5.2. Let $K$ be a field of level $s$. Let $E = K^n$ with the canonical inner product $\langle \cdot\,,\cdot \rangle$. Let $t \in \{0, 1, \ldots, 2s-1\}$ be defined by $n \equiv t \pmod{2s}$. If $t \le s$ then the anisotropic part $E_a$ has dimension $t$; if $t > s$ then $\dim E_a = 2s - t$.

Proof. We first deal with the case where $n \ge 2s$. Let $\{e_1, \ldots, e_n\}$ be the canonical base. Consider a subspace $F \subseteq E$ generated by $s$ vectors $e_i$. Since $-1$ is a sum of $s$ squares in the field $K$, there exists a vector $f \in F$ with $\langle f, f \rangle = -1$. Using Pfister's theory of multiplicative quadratic forms (see [3, p. 643]) we conclude that $F$ admits an orthogonal base $\{f_1, \ldots, f_s\}$ with $\langle f_i, f_i \rangle = -1$ for $i = 1, \ldots, s$.

Next, consider two subspaces $F$ and $G$, each generated by $s$ vectors $e_i$ and such that $F \cap G = \{0\}$. Clearly $F \perp G$. By what we have just seen, there are orthogonal
bases $\{f_1, \ldots, f_s\}$ and $\{g_1, \ldots, g_s\}$ of $F$ and $G$, respectively, such that $\langle f_i, f_i \rangle = -1$, $\langle g_i, g_i \rangle = +1$ for $i = 1, \ldots, s$. Then $H_i := \operatorname{span}\{f_i, g_i\}$ is a hyperbolic plane and $F \perp G = \perp_{i=1}^s H_i$.

Write $n = 2ks + t$ where $0 \le t \le 2s - 1$. Then $H := \operatorname{span}\{e_1, e_2, \ldots, e_{2ks}\}$ is a direct sum of $k$ subspaces each of which is generated by $2s$ vectors $e_i$. We have seen above that such a subspace is an orthogonal sum of hyperbolic planes. Therefore, $H$ is an orthogonal sum of hyperbolic planes. We have $E = H \perp H'$, where $H' := \operatorname{span}\{e_{2ks+1}, \ldots, e_{2ks+t}\}$; hence $\dim H' = t$.
Now if $t \le s$ then $H'$ is anisotropic and we see that $E_a = H'$ has dimension $t$. On the other hand, if $t > s$ then $H'$ admits a base $\{f_1, \ldots, f_s, g_{s+1}, \ldots, g_t\}$ with $\langle f_i, f_i \rangle = -1$ and $\langle g_j, g_j \rangle = +1$. This means that $H'$ is an orthogonal sum of $t - s$ hyperbolic planes with an anisotropic subspace of dimension $t - 2(t - s) = 2s - t$.

To finish the proof we examine the case $n < 2s$. If $n \le s$ then $t = n$ and $E_a = E$. If $s < n < 2s$ then we see as before that $E$ is an orthogonal sum of $t - s$ hyperbolic planes and an anisotropic space of dimension $2s - t$. □

Corollary 5.3. Let $E = (\perp_{i=1}^r H_i) \perp E_a$ be a Witt decomposition of the space $E = K^n$. Then
(a) $E_a = \{0\}$ if and only if $n \equiv 0 \pmod{2s}$;
(b) $\dim E_a = 1$ if and only if $n \equiv \pm 1 \pmod{2s}$.

Now we can prove our main result.

Theorem 5.4. Let $K$ be a non-formally real field of level $s$. Let $n \in \mathbb{N}$. The following conditions are equivalent:
(i) $n$ is congruent to $0$, $1$, or $-1$ modulo $2s$.
(ii) The Witt index of the space $E = K^n$ reaches its maximal value $[n/2]$.
(iii) In $\mathrm{Mat}_n(K)$ there exist symmetric, absolutely indecomposable matrices.

Proof. The equivalence (i) $\Leftrightarrow$ (ii) follows from Corollary 5.3. The implication (ii) $\Rightarrow$ (iii) is a consequence of Theorem 5.1 and Lemma 4.2. In order to show that (iii) implies (ii), suppose that the matrix $A \in \mathrm{Mat}_n(K)$ is symmetric and absolutely indecomposable. In virtue of the arguments at the beginning of Section 4 we may assume that the operator $\sigma : E \to E$ defined by $A$ is nilpotent. Consider the decomposition $E = \oplus_{i=1}^t V_i$ into Jordan blocks. If $t \ge 2$ then, by Theorem 4.5, $\sigma$ is decomposable. If $t = 1$ then $E$ is a Jordan block, so (ii) follows by Theorem 5.1. The proof is complete. □

References

[1] B. Diarra, Remarques sur les matrices orthogonales (resp. symétriques) à coefficients p-adiques, Ann. Sci. Univ. Blaise Pascal, Clermont II, Sér. Math., Fasc. 26 (1990) 31-50.
[2] N. Jacobson, Basic Algebra I, Freeman and Company, San Francisco, 1974.
[3] N. Jacobson, Basic Algebra II, Freeman and Company, San Francisco, 1980.
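The main result can be checked numerically in a small case. The sketch below uses data assumed here rather than taken from the paper: over $K = \mathbb{F}_3$ the level is $s = 2$ (since $-1 = 1^2 + 1^2$), and $n = 3 \equiv -1 \pmod{2s}$, so Theorem 5.4 predicts a symmetric, absolutely indecomposable $3 \times 3$ matrix. The matrix used is one realization of the operator $v \mapsto g \mapsto w \mapsto 0$ of the Example with $a = b = 1$, reduced mod 3; it is symmetric and nilpotent of order 3, so $E = \mathbb{F}_3^3$ is a single Jordan block and Lemma 4.2 applies.

```python
import numpy as np

p = 3  # work in F_3, a field of level 2

# Hypothetical concrete instance (assumption of this note, not from the paper):
A = np.array([[2, 0, 2],
              [0, 1, 1],
              [2, 1, 0]])

assert (A == A.T).all()                         # A is symmetric

A2 = (A @ A) % p
A3 = (A2 @ A) % p
assert not (A2 == 0).all()                      # A^2 != 0 ...
assert (A3 == 0).all()                          # ... but A^3 = 0: order 3
```

Nilpotency of order $n = 3$ means the whole space is one Jordan block, which is exactly the situation of Lemma 4.2.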