Linear Algebra 2
More on Determinants and Eigenvalues: Exercises and Thanksgiving Activities

1. Determinant of a linear transformation, change of basis.

In the solution set of Homework 1, New Series, I included a proof of the following theorem:

Theorem 1 (Formulation 1.) Let T ∈ L(V), where V is an n-dimensional vector space over the field F. Let the square n × n matrix M be the matrix of T in some basis of V. Then the square n × n matrix N is the matrix of T with respect to some other basis of V if and only if there exists an invertible n × n matrix A such that N = A^{-1}MA.

For the proof of the theorem see the posted solutions to Homework 1. Another way of phrasing it is to simply say that two matrices are matrices of the same operator T ∈ L(V) if and only if they are similar, where similarity of matrices is defined by: M is similar to N, and we'll write M ∼ N, iff there exists an invertible n × n matrix A such that N = A^{-1}MA.

Exercise 1 Show that similarity is an equivalence relation for matrices.

Exercise 2 Show that if M ∼ N, then det(M) = det(N).

Definition 1 Let T ∈ L(V), dim V = n. Define det(T) as follows. Let u_1, ..., u_n be a basis of V and let A be the matrix of T with respect to this basis. Then define det(T) = det(A).

Exercise 3 Show that this definition well defines det(T).

2. Finding Eigenvalues

If A is an n × n matrix with entries in F, we automatically identify it with the linear operator in L(F^n) having the matrix A with respect to the standard basis of F^n. The complex number λ is an eigenvalue of the matrix A if and only if det(λI − A) = 0; i.e., if and only if it is a zero of the characteristic polynomial of A, as is proved in the notes on determinants.
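The invariance claimed in Exercise 2 is easy to sanity-check numerically. Here is a minimal sketch in Python; the matrices M and A below are my own illustrative choices (not from the exercises), with exact arithmetic via fractions so nothing is lost to rounding:

```python
from fractions import Fraction

def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def det2(X):
    """Determinant of a 2 x 2 matrix."""
    return X[0][0] * X[1][1] - X[0][1] * X[1][0]

# M: matrix of some operator T in a first basis; A: an invertible
# change-of-basis matrix (det(A) = 1 here, so A^{-1} has integer entries).
M = [[2, 1], [0, 3]]
A = [[1, 1], [1, 2]]
d = det2(A)
A_inv = [[Fraction(A[1][1], d), Fraction(-A[0][1], d)],
         [Fraction(-A[1][0], d), Fraction(A[0][0], d)]]

# N = A^{-1} M A is the matrix of the same operator in the new basis.
N = matmul(A_inv, matmul(M, A))

assert N == [[3, 2], [0, 2]]          # entries are Fractions equal to these ints
assert det2(N) == det2(M) == 6        # similar matrices share the determinant
```

Note that M and N also share trace and eigenvalues (both are triangular with diagonal {2, 3}), which previews Exercise 4.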
Computing the characteristic polynomial and finding its zeros is probably the most efficient way of finding the eigenvalues of an operator.

Exercise 4 Let T ∈ L(V), dim V = n, and let u_1, ..., u_n and w_1, ..., w_n be bases of V; assume A is the matrix of T with respect to the basis u_1, ..., u_n and assume B is the matrix of T with respect to the basis w_1, ..., w_n. Prove that A and B have the same characteristic polynomial.

This allows one to define the characteristic polynomial of an operator T as the characteristic polynomial of any matrix of T with respect to a basis of the vector space.

Exercise 5 Find all the eigenvalues of the following operators/matrices.
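For a matrix with exact integer or rational entries, the characteristic polynomial can be computed mechanically, without symbolic algebra, by the Faddeev-LeVerrier recursion. A minimal sketch in Python (the function name and the test matrix are my own illustrative choices, not from these notes):

```python
from fractions import Fraction

def char_poly(A):
    """Coefficients [1, c_1, ..., c_n] of p(lambda) = det(lambda*I - A),
    via the Faddeev-LeVerrier recursion:
    M_0 = I,  c_k = -trace(A * M_{k-1}) / k,  M_k = A * M_{k-1} + c_k * I."""
    n = len(A)
    M = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    coeffs = [Fraction(1)]
    for k in range(1, n + 1):
        AM = [[sum(A[i][l] * M[l][j] for l in range(n)) for j in range(n)]
              for i in range(n)]
        c = -sum(AM[i][i] for i in range(n)) / k
        coeffs.append(c)
        M = [[AM[i][j] + (c if i == j else 0) for j in range(n)]
             for i in range(n)]
    return coeffs

# Illustrative 2 x 2 matrix with eigenvalues 5 and 2:
# p(lambda) = lambda^2 - 7*lambda + 10.
assert char_poly([[4, 1], [2, 3]]) == [1, -7, 10]
```

Once the coefficients are known, the eigenvalues are the roots of p, found by factoring or by any root finder.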
1. A =
   [ 6  19  21 ]
   [ 2  15  21 ]
   [ 2   9  15 ]

2. B =
   [ 0  1  0  0 ]
   [ 1  0  0  0 ]
   [ 0  0  2  3 ]
   [ 0  0  0  5 ]

3. T : P_n(R) → P_n(R) where (Tp)(x) = p(x + 2).

3. Algebraic and geometric dimensions, Theorem of Cayley-Hamilton.

In this part we assume that V is a finite n-dimensional vector space over C and that T ∈ L(V). Let λ be an eigenvalue of T. Then E_λ = E(λ, T) is the eigenspace of λ, defined by

E_λ = {v ∈ V : Tv = λv} = N(T − λI).

The generalized eigenspace of λ, denoted by G(λ, T) = G_λ, is defined by

G_λ = {v ∈ V : (T − λI)^k v = 0 for some k ∈ N}.

Exercise 6 Show that E_λ, G_λ are subspaces of V and E_λ is a subspace of G_λ.

Exercise 7 Assume dim G_λ = r. Prove that G_λ = {v ∈ V : (T − λI)^r v = 0}.

The dimension of E_λ is called the geometric multiplicity of λ, while the dimension of G_λ is called the algebraic multiplicity of λ. The following theorem is not at all trivial; it is the basis for the Jordan canonical form of an operator.

Theorem 2 Let λ_1, ..., λ_m be the distinct eigenvalues of T and let n_k be the algebraic multiplicity of λ_k for k = 1, ..., m. Then

1. V = G_{λ_1} ⊕ ... ⊕ G_{λ_m}. In particular n_1 + ... + n_m = n.

2. The characteristic polynomial p of T satisfies p(λ) = (λ − λ_1)^{n_1} ... (λ − λ_m)^{n_m}; in other words, the algebraic multiplicity of an eigenvalue is equal to its multiplicity as a root of the characteristic polynomial.

We will accept this theorem. I believe it is proved in Axler's book.
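To see that E_λ can be strictly smaller than G_λ, it is enough to look at a 2 × 2 Jordan block. A small check in Python; the matrix J is my own illustrative choice, not one of the exercises:

```python
# J has the single eigenvalue 2, and its characteristic polynomial is
# (lambda - 2)^2, so the algebraic multiplicity of 2 is 2.
J = [[2, 1],
     [0, 2]]

# N = J - 2I is the operator whose null space is the eigenspace E_2.
N = [[0, 1],
     [0, 0]]

# N is nonzero, so E_2 is a proper subspace of R^2; in fact E_2 is the
# span of (1, 0), so the geometric multiplicity of 2 is only 1.
assert any(N[i][j] != 0 for i in range(2) for j in range(2))
assert N[0][0] * 1 + N[0][1] * 0 == 0 and N[1][0] * 1 + N[1][1] * 0 == 0

# N^2 = 0, so (J - 2I)^2 v = 0 for EVERY v: the generalized eigenspace
# G_2 is all of R^2, matching the algebraic multiplicity 2.
N2 = [[sum(N[i][k] * N[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
assert N2 == [[0, 0], [0, 0]]
```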
Exercise 8 Verify that the theorem holds for the matrix

   [ 2  1  5  0 ]
   [ 0  2  3  0 ]
   [ 0  0  2  0 ]
   [ 0  0  0  3 ]

As a consequence we get the following theorem, known as the Cayley-Hamilton Theorem.

Theorem 3 Let p ∈ P(C) be the characteristic polynomial of the operator T ∈ L(V), dim V finite. Then p(T) = 0.

Exercise 9 Prove the Cayley-Hamilton Theorem. Use the factorization in part 2 of Theorem 2 and part 1 of that theorem to show that p(T) vanishes on a basis of V.

Example. Consider the matrix

A = [  5   2   2  1 ]
    [ 13  10   4  1 ]
    [  2   2   4  0 ]
    [ 15   6   6  7 ]

If we use Laplace's expansion by the first row to compute p(λ), the characteristic polynomial, here is how it starts:

p(λ) = det [ λ−5    2    2    1  ]
           [  13  λ+10   4    1  ]
           [   2    2   λ+4   0  ]
           [  15    6    6   λ+7 ]

     = (λ−5) det [ λ+10   4    1  ]  −  2 det [ 13   4    1  ]
                 [   2   λ+4   0  ]           [  2  λ+4   0  ]
                 [   6    6   λ+7 ]           [ 15   6   λ+7 ]

     + 2 det [ 13  λ+10   1  ]  −  det [ 13  λ+10   4  ]
             [  2    2    0  ]         [  2    2   λ+4 ]
             [ 15    6   λ+7 ]         [ 15    6    6  ]

Then

det [ λ+10   4    1  ]  =  (λ+10) det [ λ+4   0  ]  −  4 det [ 2   0  ]  +  det [ 2  λ+4 ]
    [   2   λ+4   0  ]                [  6   λ+7 ]           [ 6  λ+7 ]        [ 6   6  ]
    [   6    6   λ+7 ]

                        =  (λ+10)(λ+4)(λ+7) − 4 · 2(λ+7) + (12 − 6(λ+4))
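The Cayley-Hamilton Theorem is easy to check numerically on any concrete matrix. A minimal sketch in Python; the 3 × 3 matrix M below is my own illustrative choice (triangular, so its characteristic polynomial can be read off the diagonal), not the matrix A of the example:

```python
# M is upper triangular with eigenvalues 2, 2, 3, so
# p(lambda) = (lambda - 2)^2 (lambda - 3)
#           = lambda^3 - 7*lambda^2 + 16*lambda - 12.
M = [[2, 1, 0],
     [0, 2, 0],
     [0, 0, 3]]

def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

I = [[1 if i == j else 0 for j in range(3)] for i in range(3)]
M2 = matmul(M, M)
M3 = matmul(M2, M)

# Cayley-Hamilton: p(M) = M^3 - 7 M^2 + 16 M - 12 I must be the zero matrix.
pM = [[M3[i][j] - 7 * M2[i][j] + 16 * M[i][j] - 12 * I[i][j]
       for j in range(3)] for i in range(3)]
assert pM == [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
```

Exercise 10 below is exactly this computation for the 4 × 4 example matrix A.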
and after this there are 11 further determinants to compute. One could have expanded by the last column; it would have shortened things a bit. But as you can see, this method for computing determinants is also less than wonderful. The best method is probably to use some row reductions to create zero entries, then use Laplace's expansion once there are lots of zeros in the matrix. Anyway, thanks to the wonders of Maple I can get within seconds that

p(λ) = λ^4 + 16λ^3 + 72λ^2 − 432.

Also thanks to Maple we find more or less painlessly that

A^2 = [ 12  12  12   4 ]
      [ 72  72  36   4 ]
      [ 24  24  12   0 ]
      [ 96  60  60  40 ]

A^3 = [ 132   96   96   28 ]
      [ 564  528  312   28 ]
      [ 216  216    0    0 ]
      [ 780  528  528  244 ]

A^4 = [  816   672   672   160 ]
      [ 3840  3696  2400   160 ]
      [ 1728  1728   432     0 ]
      [ 5568  4128  4128  1456 ]

Exercise 10 Use these calculations to conclude that A^4 + 16A^3 + 72A^2 − 432I = 0.

Exercise 11 Why is this NOT a valid proof of the Cayley-Hamilton Theorem: In p(λ) = det(λI − T) set λ = T to get p(T) = det(T − T) = 0?

Exercise 12 Let

A = [  5   2   2  1 ]
    [ 13  10   4  1 ]
    [  2   2   4  0 ]
    [ 15   6   6  7 ]

as before. Show that det(A) = −432 and that

A^{-1} = (1/432)(A^3 + 16A^2 + 72A).

No calculations should be necessary.
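The trick behind Exercise 12 works for any invertible matrix whose characteristic polynomial is known: rearrange p(A) = 0 to isolate the identity. A small sketch in Python on a 2 × 2 matrix of my own choosing (not the A above), with exact arithmetic:

```python
from fractions import Fraction

# Illustrative instance of the Exercise 12 trick. For M below the
# characteristic polynomial is p(lambda) = lambda^2 - 7*lambda + 10, so
# Cayley-Hamilton gives M^2 - 7M + 10I = 0, i.e. M(7I - M) = 10I, i.e.
# M^{-1} = (1/10)(7I - M). No Gaussian elimination is needed.
M = [[4, 1],
     [2, 3]]
M_inv = [[Fraction(7 * (i == j) - M[i][j], 10) for j in range(2)]
         for i in range(2)]

# Check: M * M_inv is the identity matrix.
prod = [[sum(M[i][k] * M_inv[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
assert prod == [[1, 0], [0, 1]]
```

The same rearrangement, applied to A^4 + 16A^3 + 72A^2 − 432I = 0, is exactly the formula for A^{-1} in Exercise 12.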
Exercise 13 Recall that in p(λ) = det(λI − T), the constant coefficient is (−1)^n det(T). Prove: If T is invertible, then T^{-1} is a polynomial in T.