EE0 Linear Algebra: Tutorial 6, July-Dec 07-8. Covers secs. 4.x of GS.

1. State True or False with proper explanation:
(a) All vectors are eigenvectors of the identity matrix.
(b) Any matrix can be diagonalized.
(c) Eigenvalues must be nonzero scalars.
(d) A and B are said to be similar matrices if there exists an invertible matrix P such that P^{-1}AP = B. A and B always have the same eigenvalues.
(e) The sum of two eigenvectors of an operator T is always an eigenvector of T.

Solution:
(a) True. Every nonzero vector v satisfies Iv = 1*v. Equivalently, in the diagonalization S^{-1}AS = Lambda with A = I, S^{-1}IS = I is always diagonal (Lambda is just I); the only requirement is that S be invertible, so any basis consists of eigenvectors.
(b) False. Any matrix with distinct eigenvalues can be diagonalized, but a matrix with repeated eigenvalues may not have a full set of independent eigenvectors. For example, [0 1; 0 0] has the repeated eigenvalue 0 but only one independent eigenvector, so it cannot be diagonalized.
(c) False. Eigenvalues can be zero as well; it is eigenvectors that have to be nonzero. Having a zero eigenvalue implies that the matrix is non-invertible.
(d) True. If A and B are similar, there is some invertible matrix P such that P^{-1}AP = B. Thus P^{-1}A = BP^{-1}, i.e. AP = PB. If Av = lambda*v, we have B(P^{-1}v) = P^{-1}Av = lambda(P^{-1}v). Similarly, if Bv = lambda*v, then A(Pv) = PBv = lambda(Pv). Since P is invertible, Pv and P^{-1}v are nonzero whenever v is, so A and B have the same eigenvalues.
(e) False. For example, the vectors (1, 0)^t and (0, 1)^t are eigenvectors of the matrix

    [1 0]
    [0 2]

but their sum (1, 1)^t is not an eigenvector of the same matrix.

2. Let T be the linear operator on n x n real matrices defined by T(A) = A^t. Show that +1 and -1 are the only eigenvalues of T. Describe the eigenvectors corresponding to each eigenvalue of T.
Hint: Write the eigenvalue equation as T(A) = A^t = lambda*A and proceed.

Solution: If T(A) = A^t = lambda*A for some lambda and some nonzero matrix A, say A_ij != 0, we have A_ij = lambda*A_ji and A_ji = lambda*A_ij, and so A_ij = lambda^2 * A_ij. This means that lambda can only be 1 or -1. Both values are indeed eigenvalues, due to the existence of nonzero symmetric and skew-symmetric matrices.
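The claim in Problem 2 can be sanity-checked numerically by representing T(A) = A^t as an n^2 x n^2 permutation matrix K acting on the vectorized matrix, K @ vec(A) = vec(A^t). A minimal sketch (not part of the original sheet; the construction of K here is one standard choice, the so-called commutation matrix):

```python
import numpy as np

n = 3
# Build the n^2 x n^2 matrix K representing T(A) = A^T on row-major
# vectorized matrices: K @ A.flatten() == (A.T).flatten().
K = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        # slot (i, j) of A^T is filled from slot (j, i) of A
        K[i * n + j, j * n + i] = 1.0

# K is a symmetric involution, so its eigenvalues are +1 and -1 only.
eigvals = np.round(np.linalg.eigvals(K).real, 6)
assert sorted(set(eigvals)) == [-1.0, 1.0]

# Symmetric matrices are eigenvectors for +1, skew-symmetric for -1.
S = np.array([[1., 2., 3.], [2., 4., 5.], [3., 5., 6.]])     # symmetric
W = np.array([[0., 1., -2.], [-1., 0., 3.], [2., -3., 0.]])  # skew-symmetric
assert np.allclose(K @ S.flatten(), S.flatten())
assert np.allclose(K @ W.flatten(), -W.flatten())
```

For n = 3 the eigenvalue +1 appears with multiplicity 6 (the dimension of the symmetric matrices) and -1 with multiplicity 3 (the skew-symmetric ones), matching the description below.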
The set of nonzero symmetric matrices are the eigenvectors corresponding to eigenvalue 1, while the set of nonzero skew-symmetric matrices are the eigenvectors corresponding to eigenvalue -1.

3. Prove that the geometric multiplicity of an eigenvalue, gamma_A(lambda_i), cannot exceed its algebraic multiplicity, mu_A(lambda_i). From here, conclude (and prove) that 1 <= gamma_A(lambda_i) <= mu_A(lambda_i) <= n.

Solution: See http://www.ee.iitm.ac.in/uday/07b-ee0/multiplicity.pdf

4. Consider the following N x N tridiagonal matrix, with x on the diagonal and superdiagonal and -x on the subdiagonal:

    A = [  x   x   0   0  ...   0   0 ]
        [ -x   x   x   0  ...   0   0 ]
        [  0  -x   x   x  ...   0   0 ]
        [            ...              ]
        [  0   0   0  ...  -x   x   x ]
        [  0   0   0  ...   0  -x   x ]

For N = 1, 2, 3 the matrix A looks like

    [x],   [  x  x ],   [  x  x  0 ]
           [ -x  x ]    [ -x  x  x ]
                        [  0 -x  x ]

Show that the determinant of A is (F_N + F_{N-1}) x^N, where F_1 = 1, F_2 = 1 and F_N = F_{N-1} + F_{N-2} (take F_0 = 0, so the formula equals F_{N+1} x^N).
Hint: Use mathematical induction.

Solution: (Using mathematical induction.) It is easily verified that the result is true for N = 1, 2, 3. Let us assume that it is true up to dimension (N-1) x (N-1), where N > 4. If we prove that the result holds when the matrix dimension is N x N, then we are done. Carefully observe that, expanding along the first row, the determinant of the N x N matrix A is given by

    det(A) = x (det. of (N-1) x (N-1) matrix) - (-x) * x (det. of (N-2) x (N-2) matrix)
           = x ((F_{N-1} + F_{N-2}) x^{N-1}) + x^2 ((F_{N-2} + F_{N-3}) x^{N-2})
           = F_N x^N + F_{N-1} x^N
           = (F_N + F_{N-1}) x^N.

Hence proved.

Alternative proof: Partition A_N into blocks, with the first row (x, x, 0, ..., 0) and first column (x, -x, 0, ..., 0)^t surrounding the trailing (N-1) x (N-1) block A_{N-1}, and expand the determinant along the first row.
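The determinant formula of Problem 4 can be checked numerically before proving it. A minimal sketch, assuming the sign pattern with x on the diagonal and superdiagonal and -x on the subdiagonal:

```python
import numpy as np

def A(N, x):
    """N x N tridiagonal matrix: x on diagonal and superdiagonal, -x on subdiagonal."""
    M = np.zeros((N, N))
    for i in range(N):
        M[i, i] = x
        if i + 1 < N:
            M[i, i + 1] = x
            M[i + 1, i] = -x
    return M

def fib(k):
    """Fibonacci numbers with F_1 = F_2 = 1."""
    a, b = 1, 1
    for _ in range(k - 1):
        a, b = b, a + b
    return a

x = 2.0
for N in range(2, 9):
    expected = (fib(N) + fib(N - 1)) * x**N   # = F_{N+1} x^N
    assert np.isclose(np.linalg.det(A(N, x)), expected)
```

For instance N = 2 gives det [x x; -x x] = x^2 + x^2 = 2x^2 = (F_2 + F_1)x^2, matching the base case of the induction.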
OR (Using row transformations.) Adding a multiple of one row to another row leaves the determinant unchanged. Add Row 1 to Row 2 (clearing the -x below the first pivot), then add (1/2) of the new Row 2 to Row 3, and so on down the matrix, to reduce A to an upper triangular matrix:

    det(A) = det [ x   x      0      0      ...  0 ]
                 [ 0   2x     x      0      ...  0 ]
                 [ 0   0   (3/2)x    x      ...  0 ]
                 [ 0   0      0   (5/3)x    ...  0 ]
                 [ 0   0      0      0   (8/5)x  x ]
                 [                 ...             ]

It can be observed that the diagonal elements are of the form a_ii = (F_{i+1}/F_i) x:

    det(A) = det [ (F_2/F_1)x      x          0      ...       0       ]
                 [     0       (F_3/F_2)x     x      ...       0       ]
                 [     0           0      (F_4/F_3)x ...       0       ]
                 [                          ...                        ]
                 [     0           0          0      ...  (F_{N+1}/F_N)x ]

As the matrix is upper triangular, the determinant is the product of the diagonal elements, i.e.

    det(A) = (F_2/F_1)x * (F_3/F_2)x * ... * (F_N/F_{N-1})x * (F_{N+1}/F_N)x
           = F_{N+1} x^N = (F_N + F_{N-1}) x^N.

5. Let p(lambda) = prod_{i=1}^{n} (lambda_i - lambda) be the characteristic polynomial of the n x n matrix A. Derive the characteristic polynomial of A^2 - I, where I is an identity matrix of appropriate dimension.
Hint: Use properties of eigenvalues and the definition of a characteristic polynomial.

Solution: (Assuming A is diagonalizable.) Let Lambda be the diagonal matrix

    Lambda = [ lambda_1     0     ...    0     ]
             [    0     lambda_2  ...    0     ]
             [             ...               ]
             [    0         0     ... lambda_n ]

Then there exists an invertible matrix B such that A = B Lambda B^{-1}. Further, we get A^2 = B Lambda B^{-1} B Lambda B^{-1} = B Lambda^2 B^{-1}. Now, the characteristic polynomial of A^2 - I is given by

    det((A^2 - I) - lambda*I) = det(B Lambda^2 B^{-1} - (lambda + 1)I)
                              = det(B Lambda^2 B^{-1} - (lambda + 1) B B^{-1})
                              = det(B (Lambda^2 - (lambda + 1)I) B^{-1})
                              = det(B) det(Lambda^2 - (lambda + 1)I) det(B^{-1})
                              = det(Lambda^2 - (lambda + 1)I).

Note that Lambda^2 - (lambda + 1)I is a diagonal matrix. Hence, the characteristic polynomial is prod_{i=1}^{n} (lambda_i^2 - (lambda + 1)).

6. Prove that a linear transformation T on a finite-dimensional vector space is invertible iff zero is not an eigenvalue of T.
Hint: Use properties of eigenvalues.

Solution: T is invertible iff det(T) != 0. Since det(T) = product of the eigenvalues, det(T) != 0 iff all eigenvalues are nonzero.

7. (a) What is wrong with this proof that projection matrices have det P = 1?

    P = A(A^T A)^{-1} A^T,  so  |P| = |A| * (1 / (|A^T| |A|)) * |A^T| = 1.

Hint: Invertibility.

(b) Suppose the 4-by-4 matrix M has four equal rows all containing a, b, c, d. We know that det(M) = 0. Find det(I + M) by any method.

    det(I + M) = | 1+a   b    c    d  |
                 |  a   1+b   c    d  |
                 |  a    b   1+c   d  |
                 |  a    b    c   1+d |

Hint: Use properties of determinants.

Solution: (a) The proof is valid only if A is square and invertible, which is not always the case. If A is rectangular (or square but singular), |A| and A^{-1} do not exist, and in general (A^T A)^{-1} != A^{-1}(A^T)^{-1}; the determinant cannot be split up as claimed.

(b) Using the property that subtracting a multiple of one row from another row leaves the determinant unchanged, subtract Row 1 from Rows 2, 3 and 4:

    det(I + M) = | 1+a   b   c   d |
                 | -1    1   0   0 |
                 | -1    0   1   0 |
                 | -1    0   0   1 |
Using the property that adding a multiple of one column to another column also leaves the determinant unchanged, add Columns 2, 3 and 4 to Column 1:

    det(I + M) = | 1+a+b+c+d   b   c   d |
                 |     0       1   0   0 |
                 |     0       0   1   0 |
                 |     0       0   0   1 |
               = 1 + a + b + c + d.

8. Find the eigenvalues and eigenvectors for both of these Markov matrices A and A^inf. Explain why A^100 is close to A^inf:

    A = [ .6  .2 ]      A^inf = [ 1/3  1/3 ]
        [ .4  .8 ]              [ 2/3  2/3 ]

Hint: Use diagonalization.

Solution: The eigenvalues and eigenvectors of A are 0.4, 1 and [1; -1], [1/3; 2/3], respectively. The eigenvalues and eigenvectors of A^inf are 0, 1 and [1; -1], [1/3; 2/3], respectively. We can see that the eigenvectors of A are linearly independent, so A can be diagonalized as A = S Lambda S^{-1}, where

    S = [  1   1/3 ]   and   Lambda = [ 0.4  0 ]
        [ -1   2/3 ]                  [  0   1 ]

Then A^2 = S Lambda S^{-1} S Lambda S^{-1} = S Lambda^2 S^{-1}. Similarly, A^n = S Lambda^n S^{-1}, and

    A^100 = S Lambda^100 S^{-1} ~ S [ 0  0 ] S^{-1} = A^inf,
                                    [ 0  1 ]

where Lambda^100 = diag(0.4^100, 1) and 0.4^100 ~ 1.6069 x 10^{-40} ~ 0. So the eigenvalues with magnitude less than one lose significance at higher powers, and only the eigenvalue-1 part survives.

9. When a + b = c + d, show that (1, 1) is an eigenvector and find both eigenvalues:

    A = [ a  b ]
        [ c  d ]

Hint: Use the definition of an eigenvector, Ax = lambda*x, and substitute the given vector for x.

Solution: Let a + b = c + d = f. An eigenvector is a nonzero solution of Ax = lambda*x. Substituting x = (1, 1)^t,

    [ a  b ] [1]   [ a+b ]   [ f ]       [1]
    [ c  d ] [1] = [ c+d ] = [ f ]  =  f [1]

which satisfies the equation Ax = lambda*x with eigenvalue lambda_1 = f = a + b = c + d. The second eigenvalue follows from the trace: lambda_1 + lambda_2 = a + d, so lambda_2 = (a + d) - (a + b) = d - b = a - c.

10. EXTRA: Find u(t) that satisfies the differential equation du/dt = Pu, when P is a projection:

    du/dt = [ 1/2  1/2 ] u   with   u(0) = [ 5 ]
            [ 1/2  1/2 ]                   [ 3 ]

Here u(t) is a vector of time-varying functions, i.e., we can write u(t) = [v(t); w(t)]. You will find that a part of u increases exponentially while another part stays constant.
Hint: Find the eigenvalues and eigenvectors of P and substitute the given initial condition.

Solution: Solving det(P - lambda*I) = 0 gives the eigenvalues of P as lambda_1 = 1 and lambda_2 = 0 (the eigenvalues of any projection matrix are 1's and 0's). Solving Px = lambda*x for each of the eigenvalues gives the corresponding eigenvectors

    x_1 = [1]   and   x_2 = [ 1 ]
          [1]               [-1 ]

In the differential equation, each eigenpair produces a special solution u = e^{lambda*t} x. These are the pure exponential solutions to du/dt = Pu. Let them be u_1 = e^{lambda_1 t} x_1 and u_2 = e^{lambda_2 t} x_2. Any linear combination of u_1 and u_2 is also a solution to the differential equation. The complete solution is given by

    u(t) = c_1 e^{lambda_1 t} x_1 + c_2 e^{lambda_2 t} x_2.

We now find c_1 and c_2 using the initial condition u(0) = [5; 3], i.e.

    c_1 [1] + c_2 [ 1 ] = [5]
        [1]       [-1 ]   [3]

This gives c_1 = 4 and c_2 = 1. Therefore the complete solution is

    u(t) = 4 e^t [1] + [ 1 ]
                 [1]   [-1 ]

Here, the first part of u increases exponentially while the nullspace part (corresponding to eigenvalue 0) remains fixed.
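The solution of Problem 10 can be verified numerically. A minimal sketch, assuming the projection P = [1/2 1/2; 1/2 1/2] and the initial condition u(0) = (5, 3)^t:

```python
import numpy as np

P = np.array([[0.5, 0.5], [0.5, 0.5]])               # projection: P @ P == P
x1, x2 = np.array([1.0, 1.0]), np.array([1.0, -1.0]) # eigenvectors for 1 and 0

# A rank-1 2x2 projection has eigenvalues 1 and 0.
assert np.allclose(sorted(np.linalg.eigvals(P).real), [0.0, 1.0])

def u(t):
    # complete solution with c1 = 4, c2 = 1, so u(0) = (5, 3)
    return 4 * np.exp(t) * x1 + x2

def dudt(t):
    # derivative: the constant (nullspace) part drops out
    return 4 * np.exp(t) * x1

# u satisfies du/dt = P u at several sample times, and matches u(0).
for t in [0.0, 0.5, 1.3]:
    assert np.allclose(P @ u(t), dudt(t))
assert np.allclose(u(0.0), [5.0, 3.0])
```

The check works because P x_1 = x_1 and P x_2 = 0, so P u(t) = 4 e^t x_1, which is exactly du/dt.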