Practice final solutions. I did not include definitions, which you can find in Axler or in the course notes. These solutions are on the terse side, but would be acceptable in the final. However, if you are not sure about something, you will be much more likely to get partial credit if you write out more details!

Problem 1. No justification is needed. In this question, V is an n-dimensional vector space over C, T, S are linear operators from V to itself, and W is an m-dimensional inner product space over R. Mark true or false.

(a) Any spanning set for V contains a basis. (True.)

(b) V is isomorphic to its dual space V^*. (True; they have the same dimension.)

(c) Suppose W_1, W_2 are two subspaces of W with dim W_1 = dim W_2. Then there exists an isometry σ : W → W so that σ(W_1) = W_2. (True: take orthonormal bases for both, extend them to orthonormal bases of W, and map one to the other.)

(d) If TS is zero, then ST is zero. (False: take 2×2 matrices S, T with nonzero entries only in the (1,1) and (1,2) positions, respectively.)

(e) If TS is not injective, then ST is not injective. (True: they have the same determinant.)

(f) If TS is the identity operator, then ST is the identity operator. (True: in finite dimensions, if S is inverse to T on one side, it is so on both sides.)

(g) The characteristic polynomials of TS and ST are the same. (True; tricky.)

(h) Any linear transformation from W to itself has an eigenvector. (False: a rotation of R^2 has no eigenvector; by contrast, any self-adjoint transformation has one.)

(i) If x, y ∈ W satisfy ⟨x, y⟩ = 0, and x, y are both nonzero, then x and y are linearly independent. (True; by the same argument that shows orthonormal sets are linearly independent.)

(j) If T^{n+1} = 0, then also T^n = 0. (True; consider the chain image(T) ⊇ image(T^2) ⊇ ⋯; this was on an old homework.)
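As a quick numerical sanity check of (d) and (g) (not part of the required solutions; a sketch in numpy with one concrete choice of matrices):

```python
import numpy as np

# (d): S nonzero only in the (1,1) entry, T nonzero only in the (1,2) entry.
S = np.array([[1, 0],
              [0, 0]])
T = np.array([[0, 1],
              [0, 0]])
print(T @ S)  # TS = 0
print(S @ T)  # ST is nonzero: its (1,2) entry is 1

# (g): TS and ST have the same characteristic polynomial.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3))
Y = rng.standard_normal((3, 3))
print(np.poly(X @ Y))  # coefficients of det(tI - XY)
print(np.poly(Y @ X))  # the same, up to rounding error
```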
Problem 2. Let V be a finite-dimensional space, and U ⊆ V a subspace with U ≠ V. Suppose that x_1, ..., x_r is a linearly independent list in V, such that span(x_1, ..., x_r) + U = V. Prove that we can find x_{r+1}, ..., x_n ∈ U so that x_1, ..., x_n is a basis for V.

We can extend x_1, ..., x_r to a basis x_1, ..., x_r, y_{r+1}, ..., y_n. (Thus, n = dim V.) For each r+1 ≤ i ≤ n, since span(x_1, ..., x_r) + U = V, there exist w_i ∈ span(x_1, ..., x_r) and x_i ∈ U with

(1) y_i = w_i + x_i.

Then x_1, ..., x_n is a basis as desired. Indeed, it suffices to verify that it spans V, since n = dim V. We may write any v ∈ V as Σ_{i=1}^r α_i x_i + Σ_{i=r+1}^n β_i y_i, which by (1) we may rewrite as:

Σ_{i=1}^r α_i x_i + Σ_{i=r+1}^n β_i x_i + Σ_{i=r+1}^n β_i w_i.

The last term on the right belongs to span(x_1, ..., x_r), and so the entire expression belongs to span(x_1, ..., x_n). Thus, v ∈ span(x_1, ..., x_n), as required.

Prove there exists a basis (e_1, ..., e_n) for V so that e_i ∉ U for all i.

Choose a basis e_1, ..., e_t for U. Extend it to a basis e_1, ..., e_n for V. Then e_j ∉ U for j > t; also, e_j + e_n ∉ U for j ≤ t, since otherwise e_n = (e_j + e_n) - e_j would belong to U. Now consider e_1 + e_n, e_2 + e_n, ..., e_t + e_n, e_{t+1}, e_{t+2}, ..., e_n. It is still a basis for V, because it is linearly independent:

Σ_{i=1}^t α_i (e_i + e_n) + Σ_{i=t+1}^n α_i e_i = 0 ⟹ Σ_{i=1}^{n-1} α_i e_i + (α_n + α_1 + α_2 + ⋯ + α_t) e_n = 0,

which implies α_i = 0 for i ≤ n-1, and then that α_n = 0 as well.

Problem 3. Suppose V, W are finite-dimensional vector spaces. Let K be a subspace of V and I a subspace of W.

Explain why {T ∈ L(V, W) : null(T) = K and image(T) = I} is not a subspace of L(V, W).

It does not contain T = 0 (at least, if K is not all of V).

Compute (with proof) the dimension of {T ∈ L(V, W) : null(T) ⊇ K and image(T) ⊆ I}.

Let Q be the subspace described by these conditions. Take a basis (v_1, ..., v_k) for K and extend it to a basis (v_1, ..., v_k, y_1, ..., y_r) for V. Let Y = span(y_1, ..., y_r). Then V = K ⊕ Y (an internal direct sum). Consider the maps

Φ : Q → L(Y, I),   Ψ : L(Y, I) → Q,

where Φ restricts a linear map to Y; in the other direction, Ψ(S) extends a map by zero, i.e. Ψ(S)(k + y) = S(y) for k ∈ K, y ∈ Y. This is well-defined because the map K × Y → V, defined by (k, y) ↦ k + y, is an isomorphism, by definition of the internal direct sum. Then Φ, Ψ are inverse to one another, and so Q is isomorphic to L(Y, I). Thus, dim Q = dim Y · dim I = (dim V - dim K) · dim I.

Problem 4. Let V be a finite-dimensional vector space over C. Let T ∈ L(V, V). Define the adjoint map T̂ : V^* → V^*.

Suppose that there exists a basis of eigenvectors for T. Prove that there exists a basis of eigenvectors for T̂.

Suppose (v_i) is a basis of eigenvectors, i.e. T(v_i) = λ_i v_i. The matrix of T with respect to the bases (v_i), (v_i) is diagonal, with the λ_i on the diagonal. Thus, the matrix of T̂ with respect to the dual bases (v_i^*), (v_i^*) is the transpose of this, i.e. diagonal with the λ_i on the diagonal. Therefore, T̂(v_i^*) = λ_i v_i^*, so (v_i^*) gives a basis of T̂-eigenvectors.
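In matrix terms, this can be checked numerically (a sketch, assuming a generic and hence invertible random P; not part of the solution): if the columns of P form an eigenvector basis for T, the dual basis corresponds to the columns of (P^{-1})^t, and those columns are eigenvectors of the transpose of T.

```python
import numpy as np

rng = np.random.default_rng(1)
P = rng.standard_normal((3, 3))      # columns: an eigenvector basis (generic, so invertible)
D = np.diag([2.0, 3.0, 5.0])
T = P @ D @ np.linalg.inv(P)         # T has eigenvalues 2, 3, 5
Q = np.linalg.inv(P).T               # columns: the dual basis, as coordinate vectors
print(np.allclose(T.T @ Q, Q @ D))   # True: each column of Q is an eigenvector of T^t
```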
Suppose that image(T) ⊆ null(T). Prove that image(T̂) ⊆ null(T̂).

Note that image(T) ⊆ null(T) if and only if T^2 = 0. This is so if and only if the adjoint of T^2, namely T̂ T̂ = T̂^2, is zero (an operator is zero if and only if its adjoint is). And T̂^2 = 0 if and only if image(T̂) ⊆ null(T̂).

Problem 5. Let V be a finite-dimensional vector space over the complex numbers. Let T ∈ L(V, V). Let I be the identity transformation from V to V, and let T, T^2, ... be the successive powers of T. In this question we will consider the subspace Q of L(V, V) spanned by the (infinite) list I, T, T^2, ....

Compute Q in the case when V = C^2 and T = [2 0; 1 3]... no: T = [2 0; 0 3] (rows separated by semicolons).

Q is equal to the vector space of diagonal matrices: every power T^n is diagonal, and I and T already span the two-dimensional space of diagonal 2×2 matrices, since (1, 1) and (2, 3) are linearly independent.

Prove that there exists N ≥ 1 so that {I, T, T^2, ..., T^N} spans Q.

Q is finite-dimensional, because it is a subspace of the finite-dimensional space L(V, V). Therefore, it is spanned by a finite collection e_1, ..., e_r ∈ Q. Each e_i is a linear combination of (I, T, ..., T^{N_i}) for some N_i. Let N = max N_i. Then every e_i belongs to the span of (I, T, ..., T^N), and thus (I, T, ..., T^N) spans Q.

Prove that dim(Q) ≤ dim(V) always. (Hint: use the Cayley–Hamilton theorem.)

The Cayley–Hamilton theorem shows that there is a polynomial p of degree dim V, specifically the characteristic polynomial of T, so that p(T) = 0. In particular, T^{dim V} is a linear combination of I, T, ..., T^{dim V - 1}. Now,

T^n ∈ span(I, T, ..., T^{dim V - 1}) ⟹ T^{n+1} ∈ span(T, T^2, ..., T^{dim V}),

and the latter space is contained in span(I, T, ..., T^{dim V - 1}) by the above. By induction, we see T^n ∈ span(I, T, ..., T^{dim V - 1}) for every n. Thus, Q is spanned by dim V elements, so dim(Q) ≤ dim V.

Problem 6. Consider R^3 with the usual inner product (the dot product) and let M be a 3×3 symmetric matrix, thought of as a linear map R^3 → R^3. Let e_1 = (1, 1, 2), e_2 = (3, 1, 0), and let W be the span of e_1, e_2.

Find a vector e_3 perpendicular to W.

We apply the Gram–Schmidt process to the linearly independent list (e_1, e_2, (0, 0, 1)). (We check this list is linearly independent by computing the determinant, which is easy by expansion along the last row. It is also possible to solve the equations ⟨e_3, e_1⟩ = ⟨e_3, e_2⟩ = 0 directly to get the answer.)

First, we scale e_1 to length 1 to get f_1 = (1, 1, 2)/√6. Then we subtract from e_2 its projection onto f_1:

e_2 - ⟨e_2, f_1⟩ f_1 = (3, 1, 0) - (2/3, 2/3, 4/3) = (7/3, 1/3, -4/3),

and scale the result to length 1 to get f_2 = (7, 1, -4)/√66. Finally, we project (0, 0, 1) to the orthogonal complement of f_1, f_2:

(0, 0, 1) - (1, 1, 2)/3 - (-4/66)(7, 1, -4) = (1/11, -3/11, 1/11),

and scale to length 1 to get f_3 = (1, -3, 1)/√11. Thus, e_3 = (1, -3, 1) is a vector perpendicular to W.
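The arithmetic above is easy to get wrong, so here is a sanity check of the Gram–Schmidt computation (a sketch, not part of the solution):

```python
import numpy as np

e1 = np.array([1.0, 1.0, 2.0])
e2 = np.array([3.0, 1.0, 0.0])
v  = np.array([0.0, 0.0, 1.0])

f1 = e1 / np.linalg.norm(e1)             # (1,1,2)/sqrt(6)
g2 = e2 - (e2 @ f1) * f1                 # (7/3, 1/3, -4/3)
f2 = g2 / np.linalg.norm(g2)             # (7,1,-4)/sqrt(66)
g3 = v - (v @ f1) * f1 - (v @ f2) * f2   # (1/11, -3/11, 1/11)

e3 = np.array([1.0, -3.0, 1.0])
print(np.allclose(g3, e3 / 11))          # True
print(e3 @ e1, e3 @ e2)                  # 0.0 0.0: e3 is perpendicular to W
```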
Suppose you are given that M(w) ∈ W whenever w ∈ W, i.e., M preserves W. Prove that e_3 is an eigenvector of M.

For every w ∈ W, ⟨M(e_3), w⟩ = ⟨e_3, M(w)⟩ = 0, since M is symmetric, M(w) ∈ W, and e_3 is perpendicular to W. Thus, M(e_3) ∈ W^⊥. We have seen that dim W^⊥ = 3 - dim W = 1; since e_3 ∈ W^⊥, we must have W^⊥ = span(e_3). Since M(e_3) ∈ span(e_3), it follows that e_3 is an eigenvector.

Find the vector in W that is closest to (0, 0, 1).

It was discussed in class that, if f_1, f_2 is an orthonormal basis for W, the closest vector to v in W is w = ⟨v, f_1⟩ f_1 + ⟨v, f_2⟩ f_2. (Here "closest" means that ‖v - w‖ is minimized.) In our case, this equals

(1/3)(1, 1, 2) - (4/66)(7, 1, -4) = (-1/11, 3/11, 10/11),

so (-1/11, 3/11, 10/11) is the closest point of W to (0, 0, 1).

Problem 7. Let A = [3 2; 1 2].

Compute the determinant, the characteristic polynomial, and the eigenvalues of A.

The determinant is 4. The characteristic polynomial is (3 - λ)(2 - λ) - 2 = λ^2 - 5λ + 4 = (λ - 4)(λ - 1). So the eigenvalues are 4 and 1. The eigenvector v_1 corresponding to λ = 4 belongs to the kernel of A - 4I = [-1 2; 1 -2], e.g. v_1 = (2, 1); the eigenvector v_2 corresponding to λ = 1 belongs to the kernel of A - I = [2 2; 1 1], e.g. v_2 = (1, -1).

Compute A^100. (Please explain your procedure clearly; then you can get partial credit even if you make numerical errors.)

We have seen Av_1 = 4v_1 and Av_2 = v_2. Thus, A^100 v_1 = 4^100 v_1 and A^100 v_2 = v_2. In other words, with D = [2 1; 1 -1] (columns v_1, v_2), we have A^100 D = D [4^100 0; 0 1]. Thus,

A^100 = D [4^100 0; 0 1] D^{-1} = [(2·4^100 + 1)/3  (2·4^100 - 2)/3;  (4^100 - 1)/3  (4^100 + 2)/3].
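As a sanity check on this closed form (not part of the exam answer), one can redo the computation with exact integer arithmetic; this sketch uses plain Python integers, which do not overflow at 4^100:

```python
# Compute A^100 by repeated multiplication, exactly.
A = [[3, 2],
     [1, 2]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P = A
for _ in range(99):
    P = matmul(P, A)          # after the loop, P = A^100

t = 4 ** 100
expected = [[(2 * t + 1) // 3, (2 * t - 2) // 3],
            [(t - 1) // 3,     (t + 2) // 3]]
print(P == expected)          # True
```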
Let B be an invertible 3×3 complex matrix with the following property: for any integer n, positive or negative, all the matrix entries (B^n)_{ij} of B^n satisfy |(B^n)_{ij}| ≤ 10. Prove that all the eigenvalues of B have absolute value 1, and that B has a basis of eigenvectors.

Fact. Put on C^3 the standard inner product ⟨(z_1, z_2, z_3), (w_1, w_2, w_3)⟩ = Σ z_i w̄_i. If M is a 3×3 matrix all of whose entries are ≤ A in absolute value, and v is any vector in C^3, then ‖Mv‖ ≤ 3A‖v‖.

Proof of Fact. Each coordinate of Mv is of the form Σ_{j=1}^3 M_{ij} v_j, which is at most A(|v_1| + |v_2| + |v_3|) in absolute value; by the Cauchy–Schwarz inequality, this is at most √3 A ‖v‖. Therefore, ‖Mv‖^2 ≤ 3 · 3A^2 ‖v‖^2, i.e. ‖Mv‖ ≤ 3A‖v‖.

We now begin the proof. Let V_λ be any generalized eigenspace of B. Let N = B - λI, which we think of as a linear map N : V_λ → V_λ. We know that N^3 = 0. For v ∈ V_λ and any n ≥ 1:

(2) B^n v = (λI + N)^n v = (λ^n I + n λ^{n-1} N + (n(n-1)/2) λ^{n-2} N^2) v.

We have simply expanded out (λI + N)^n, using the fact that N^3 = 0.

There exists a nonzero v ∈ V_λ that belongs to the null-space of N: since λ is an eigenvalue, the null-space of B - λI is nonzero. Applying equation (2) to this v, we see simply: B^n v = λ^n v. If |λ| > 1 and we take n so large that |λ|^n > 30, this contradicts the Fact above: on the one hand, ‖B^n v‖ ≤ 30‖v‖; on the other hand, ‖λ^n v‖ > 30‖v‖. Therefore, |λ| ≤ 1. Applying the same reasoning to B^{-1}, whose eigenvalues are the λ^{-1}, shows that |λ^{-1}| ≤ 1. Putting these two together, |λ| = 1. This completes the proof that every eigenvalue has absolute value 1.

The triangle inequality in an inner product space implies that ‖v_1 + v_2 + v_3‖ ≥ ‖v_1‖ - ‖v_2‖ - ‖v_3‖. Therefore, since |λ| = 1,

‖(λ^n I + n λ^{n-1} N + (n(n-1)/2) λ^{n-2} N^2) v‖ ≥ (n(n-1)/2) ‖N^2 v‖ - n ‖Nv‖ - ‖v‖.

So, if we can find v ∈ V_λ so that N^2(v) ≠ 0, the norm of the right-hand side of equation (2) will grow without bound as n → ∞ (it is a quadratic function of n). This contradicts the Fact above, since the left-hand side of (2) has norm at most 30‖v‖. If N^2(v) = 0 for all v ∈ V_λ, but there exists v ∈ V_λ so that N(v) ≠ 0, the same reasoning still gives a contradiction (the right-hand side now grows linearly in n). Therefore, N = 0 on V_λ. Therefore, B(v) = λv whenever v ∈ V_λ, and any basis for V_λ is a basis of genuine eigenvectors. Doing this for every generalized eigenspace shows that there exists a basis of eigenvectors.
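To see concretely why N must vanish, here is a small illustration (a sketch, not part of the proof): for a 2×2 Jordan block with |λ| = 1, the largest entry of B^n is n, so the powers cannot stay bounded by 10.

```python
import numpy as np

lam = np.exp(1j * np.pi / 4)     # any lambda with |lambda| = 1
B = np.array([[lam, 1.0],
              [0.0, lam]])       # B = lam*I + N with N != 0, N^2 = 0
for n in [1, 10, 100]:
    Bn = np.linalg.matrix_power(B, n)
    print(n, np.abs(Bn).max())   # max entry is n * |lam|^(n-1) = n: unbounded
```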
Note: This problem was too hard and way too long. Sorry; I didn't realize when setting it how long a solution takes to write.

Problem 8. Suppose that V, W are finite-dimensional inner product spaces, and T : V → W a linear map. Define the singular values of T. [CORRECTED.] Suppose we are given a k-dimensional subspace Q ⊆ V so that ‖Tv‖ ≥ ‖v‖ for each v ∈ Q. Prove that at least k of the singular values of T, counted with multiplicity, are ≥ 1.

Let the singular values of T be σ_1 ≥ σ_2 ≥ ⋯. There exist orthonormal bases (e_i) for V and (f_i) for W so that T e_i = σ_i f_i. Suppose that the statement is not true, i.e. σ_k < 1. We will derive a contradiction. Let W' = span(e_k, ..., e_n), where n = dim V (we write W' to avoid confusion with the target space W). Then

dim(Q ∩ W') = dim(Q) + dim(W') - dim(Q + W') ≥ dim(Q) + dim(W') - dim(V) = k + (n - k + 1) - n = 1.

So there exists a nonzero v ∈ Q ∩ W'. Because v ∈ W', we can write v = Σ_{j≥k} α_j e_j for some scalars α_j; then

‖Tv‖^2 = ‖Σ_{j≥k} α_j σ_j f_j‖^2 = Σ_{j≥k} |α_j|^2 σ_j^2 < Σ_{j≥k} |α_j|^2 = ‖v‖^2.

We used the fact that σ_j < 1 for every j ≥ k (the singular values are decreasing and σ_k < 1), together with v ≠ 0. However, the conclusion ‖Tv‖ < ‖v‖ contradicts the fact that v ∈ Q.
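A numerical illustration of the statement (a sketch with randomly generated test data, not part of the proof): take Q to be the span of the right singular vectors whose singular values are ≥ 1; on that Q one indeed has ‖Tv‖ ≥ ‖v‖, and dim Q equals the number of singular values ≥ 1.

```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.standard_normal((5, 4))
U, sigma, Vt = np.linalg.svd(T)           # sigma is sorted in decreasing order
k = int((sigma >= 1).sum())               # number of singular values >= 1
Qbasis = Vt[:k].T                         # orthonormal basis of Q (as columns)
for _ in range(5):
    v = Qbasis @ rng.standard_normal(k)   # a random vector in Q
    assert np.linalg.norm(T @ v) >= np.linalg.norm(v) - 1e-12
print(k, "singular values are >= 1, and ||Tv|| >= ||v|| held on Q")
```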