Exam questions with full solutions
MH11 Linear Algebra II, May 2013

QUESTION 1

Let C be the set of complex numbers.

(i) Consider C as an R-vector space with the operations of addition of complex numbers and scalar multiplication by real numbers. Prove that the set S = {1 + i, 1 − i, √2} ⊂ C is linearly dependent.

(ii) Show that the same set S = {1 + i, 1 − i, √2} ⊂ C is linearly independent if C is considered as a Q-vector space, that is, the addition is the usual addition of complex numbers and the scalar multiplication is multiplication by rational numbers.

Solution

(i) We just need to explicitly construct a zero linear combination with real coefficients such that not all of the coefficients are zeroes:

1·(1 + i) + 1·(1 − i) + (−√2)·√2 = 2 − 2 = 0.

(i') An alternative solution: C is 2-dimensional as an R-vector space and hence any three elements are linearly dependent.

(i'') One more alternative solution: suppose that a(1 + i) + b(1 − i) + c·√2 = 0, where a, b, c ∈ R. It means that a + b + √2·c = 0 and a − b = 0. Applying row reduction, we can find that there exists a nonzero solution for a, b, c.

(ii) Assuming that a(1 + i) + b(1 − i) + c·√2 = 0, where a, b, c ∈ Q, we need to show that a = b = c = 0. First, we have

a(1 + i) + b(1 − i) + c·√2 = (a + b + c·√2) + (a − b)i = 0

and thus a + b + c·√2 = 0 and a − b = 0. Substituting a = b into the first equation, we get 2a + c·√2 = 0, or c = −√2·a. If a ≠ 0, then c would be irrational, which means a = b = 0 and therefore c = 0. :)
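The dependence relation of part (i) can be checked numerically. The following Python sketch (not part of the exam material) evaluates the combination in floating-point arithmetic:

```python
import math

# Numerical check of the dependence relation from part (i):
# 1*(1 + i) + 1*(1 - i) + (-sqrt(2))*sqrt(2) = 0
s = [complex(1, 1), complex(1, -1), complex(math.sqrt(2), 0)]
coeffs = [1.0, 1.0, -math.sqrt(2)]  # real coefficients, not all zero
total = sum(c * z for c, z in zip(coeffs, s))
print(abs(total))  # ~0.0, up to floating-point rounding
```

Note that the first two coefficients here are rational; it is only the third one, −√2, that is forced to be irrational, which is exactly why the same set becomes independent over Q in part (ii).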
QUESTION 2

Let M₂(R) be the R-vector space of 2 × 2 matrices with real entries and let W ⊂ M₂(R) be the subset of all matrices X satisfying both of the two conditions X = Xᵗ and tr X = 0 (symmetric traceless matrices).

(i) Show that W is a subspace of M₂(R).

(ii) Find a basis and the dimension of W.

(iii) Show that M₂(R) = W ⊕ W′, where W′ ⊂ M₂(R) is the space of matrices whose first column has both entries equal to zero. (15 marks)

Solution

First, let's verify that W is a subspace:

- Let 0 be the zero matrix. Then 0 ∈ W since 0ᵗ = 0 and tr 0 = 0.
- Let A, B ∈ W. Then (A + B)ᵗ = Aᵗ + Bᵗ = A + B and tr(A + B) = tr A + tr B = 0 + 0 = 0 and hence A + B ∈ W.
- Let A ∈ W and let k ∈ R. Then (kA)ᵗ = kAᵗ = kA and tr(kA) = k tr A = k·0 = 0 and hence kA ∈ W.

Now consider an arbitrary matrix X = [[a, b], [c, d]]. It is symmetric if and only if b = c and traceless if and only if a + d = 0. Therefore an arbitrary element of the space W is a matrix of the form A = [[a, b], [b, −a]]. Obviously, dim W = 2 (because there are two free parameters) and a possible basis is [[1, 0], [0, −1]], [[0, 1], [1, 0]].

An arbitrary element of the space W′ is B = [[0, x], [0, y]]. To verify a direct sum decomposition, we need to check two things. First, suppose that some matrix A = B belongs to W ∩ W′. Equating entries in the first column, we get a = b = 0 and thus A = B = 0. Second, we need to show that any matrix can be expressed as a sum of an element of W and an element of W′:

[[c₁₁, c₁₂], [c₂₁, c₂₂]] = [[c₁₁, c₂₁], [c₂₁, −c₁₁]] + [[0, c₁₂ − c₂₁], [0, c₂₂ + c₁₁]].

Alternatively, to verify that W + W′ = M₂(R), notice that dim W = dim W′ = 2. Now dim(W + W′) = dim W + dim W′ − dim(W ∩ W′) = 2 + 2 − 0 = 4 and hence W + W′ is the whole space because dim M₂(R) = 4. :)
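The explicit decomposition of an arbitrary 2 × 2 matrix into a symmetric traceless part plus a zero-first-column part can be sketched in Python (the sample matrix C is an arbitrary choice for illustration):

```python
# Split an arbitrary 2x2 matrix C into its W-part (symmetric, traceless,
# determined by the first column of C) and its W'-part (zero first column).
def split(C):
    (c11, c12), (c21, c22) = C
    w_part = [[c11, c21], [c21, -c11]]           # symmetric, trace 0
    wp_part = [[0, c12 - c21], [0, c22 + c11]]   # zero first column
    return w_part, wp_part

C = [[3, 7], [2, -5]]  # arbitrary test matrix
A, B = split(C)
total = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]
print(total == C)         # True: the two parts sum back to C
print(A[0][0] + A[1][1])  # 0: the W-part is traceless
```

The W-part depends only on the first column of C, which matches the uniqueness argument: two matrices in W with equal first columns are equal.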
QUESTION 3

Let V, W be vector spaces over C and let T : V → W be a linear transformation.

(i) Suppose that vectors v₁, ..., v_k ∈ V are linearly dependent. Prove or disprove: T(v₁), ..., T(v_k) are also always linearly dependent.

(ii) Suppose that vectors u₁, ..., u_l ∈ V are linearly independent. Prove or disprove: T(u₁), ..., T(u_l) are also always linearly independent.

Solution

(i) If the vectors v₁, ..., v_k ∈ V are linearly dependent, then there exists a linear dependence, that is, a nontrivial zero linear combination a₁v₁ + ⋯ + a_k v_k = 0, where a₁, ..., a_k ∈ C and not all of the coefficients a₁, ..., a_k are zeroes. Then we have

a₁T(v₁) + ⋯ + a_k T(v_k) = T(a₁v₁ + ⋯ + a_k v_k) = T(0) = 0

and hence the vectors T(v₁), ..., T(v_k) are also linearly dependent.

(ii) Here, the vectors T(u₁), ..., T(u_l) might be linearly dependent even if u₁, ..., u_l ∈ V are linearly independent. Here is a simple counterexample: V = W = C, T(z) = 0 for all z, l = 1, u₁ = 1. :)
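The solution's counterexample uses the zero map; another instance of the same phenomenon, sketched below in coordinates, is a projection in R², which sends the independent pair e₁, e₂ to a dependent pair:

```python
# Projection onto the x-axis: a nonzero linear map that still destroys
# linear independence (one image vector becomes zero).
def proj_x(v):
    return (v[0], 0.0)

e1, e2 = (1.0, 0.0), (0.0, 1.0)          # linearly independent in R^2
img1, img2 = proj_x(e1), proj_x(e2)
det = img1[0] * img2[1] - img1[1] * img2[0]  # zero iff the images are dependent
print(det)  # 0.0 -> the images are linearly dependent
```

For two vectors in R², vanishing of this 2 × 2 determinant is equivalent to linear dependence, which is why it serves as the check here.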
QUESTION 4

Let F be one of the fields Q, R, or C, let V be a vector space over F, and let f : V → F be a linear transformation. Further, assume that v ∉ N(f), where N(f) is the nullspace of f. Prove that V = span{v} ⊕ N(f).

Solution

As always, to verify a direct sum decomposition, we need to check two things.

First, in order to show that span{v} ∩ N(f) = {0}, suppose that u ∈ span{v} ∩ N(f). Since u ∈ span{v}, we conclude that u = av for some a ∈ F. Since u ∈ N(f), we have 0 = f(u) = f(av) = a·f(v). Besides, we know that v ∉ N(f), which means f(v) ≠ 0. Therefore a = 0 and u = 0.

Second, we need to show that an arbitrary element w of the space V can be expressed as a sum of an element of span{v} and an element of N(f). Let's try to write it down explicitly: w = av + (w − av), where av ∈ span{v} and we need w − av ∈ N(f). The latter means 0 = f(w − av) = f(w) − a·f(v), that is, a = f(w)/f(v), which makes sense since f(v) ≠ 0. Thus we get

w = (f(w)/f(v))·v + (w − (f(w)/f(v))·v),

where the first summand belongs to span{v} and the second one to N(f), which completes the proof. :)

Remark
It is essential that the space V might be infinite-dimensional and hence considering bases doesn't make sense.
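The explicit decomposition w = (f(w)/f(v))·v + (w − (f(w)/f(v))·v) can be sketched with exact rational arithmetic. The functional f (coordinate sum on Q³) and the vectors v, w below are arbitrary illustrative choices, not part of the question:

```python
from fractions import Fraction

def f(u):
    return sum(u)  # sample linear functional Q^3 -> Q (coordinate sum)

v = (Fraction(1), Fraction(1), Fraction(1))   # f(v) = 3 != 0, so v not in N(f)
w = (Fraction(4), Fraction(-1), Fraction(2))  # an arbitrary vector to decompose

a = Fraction(f(w), f(v))                      # a = f(w)/f(v)
span_part = tuple(a * x for x in v)           # lies in span{v}
null_part = tuple(wi - si for wi, si in zip(w, span_part))
print(f(null_part))                           # 0: the second part lies in N(f)
print(tuple(s + t for s, t in zip(span_part, null_part)) == w)  # True
```

Using Fraction keeps the check exact, mirroring that the proof works over any of the fields Q, R, C.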
QUESTION 5

Let M₂(Q) be the Q-vector space of 2 × 2 matrices with rational entries and consider the linear operator T : M₂(Q) → M₂(Q) given by

T(X) = 2Xᵗ + tr(X)·[[1, 0], [0, 1]].

Are the matrices [[0, 1], [1, 0]] and [[1, 1], [0, 0]] eigenvectors?

Solution

Let's check it according to the definition:

T([[0, 1], [1, 0]]) = 2·[[0, 1], [1, 0]] + 0·[[1, 0], [0, 1]] = [[0, 2], [2, 0]] = 2·[[0, 1], [1, 0]]

and hence [[0, 1], [1, 0]] is an eigenvector with the eigenvalue 2. Further,

T([[1, 1], [0, 0]]) = 2·[[1, 0], [1, 0]] + 1·[[1, 0], [0, 1]] = [[3, 0], [2, 1]] ≠ λ·[[1, 1], [0, 0]]

for any λ and hence [[1, 1], [0, 0]] is not an eigenvector. :)

Remark
Proving that T is a linear operator is not required. Finding all eigenvalues and eigenvectors is not required either.
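The definition-based check can be sketched mechanically, assuming the operator is T(X) = 2Xᵗ + tr(X)·I with I the 2 × 2 identity:

```python
# Apply T(X) = 2*X^t + tr(X)*I to a 2x2 matrix given as nested lists.
def T(X):
    t = X[0][0] + X[1][1]                         # tr(X)
    Xt = [[X[0][0], X[1][0]], [X[0][1], X[1][1]]]  # transpose
    return [[2 * Xt[i][j] + t * (1 if i == j else 0) for j in range(2)]
            for i in range(2)]

def is_multiple(Y, X):
    # True iff Y = lambda * X for some scalar lambda (X assumed nonzero)
    pairs = [(Y[i][j], X[i][j]) for i in range(2) for j in range(2)]
    lam = next((y / x for y, x in pairs if x != 0), 0)
    return all(abs(y - lam * x) < 1e-12 for y, x in pairs)

M1 = [[0, 1], [1, 0]]
M2 = [[1, 1], [0, 0]]
print(is_multiple(T(M1), M1))  # True: eigenvector, eigenvalue 2
print(is_multiple(T(M2), M2))  # False: not an eigenvector
```

The helper is_multiple is exactly the eigenvector condition T(X) = λX with λ read off from any nonzero entry of X.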
QUESTION 6

Let

A = [[−1, 0, 0], [0, 1/2, −√3/2], [0, √3/2, 1/2]].

Compute A^2013.

Solution

First, observe that A^2013 = ([[−1, 0, 0], [0, 1/2, −√3/2], [0, √3/2, 1/2]])^2013. Further, let T : R³ → R³ be the linear operator of the left multiplication with the matrix A. Notice that T acts as follows: it sends x₁ to −x₁ and rotates the plane (x₂, x₃) by the angle π/3 in the counter-clockwise direction. It means that T^6 is the identity transformation and hence T^2010 is also the identity (because 2010 is a multiple of 6). Thus T^2013 = T^3, which, according to its geometric meaning, does the following: it sends x₁ to −x₁ and rotates the plane (x₂, x₃) by the angle π. The rotation by the angle π is minus identity and hence the whole transformation T^3 is minus identity. Thus,

A^2013 = A^3 = [[−1, 0, 0], [0, −1, 0], [0, 0, −1]] = −I. :)
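The geometric argument can be confirmed by brute force, assuming A is the block matrix above (a sign flip in x₁ together with a rotation by π/3 in the (x₂, x₃)-plane):

```python
import math

# A: x1 -> -x1 combined with rotation by pi/3 in the (x2, x3)-plane.
c, s = math.cos(math.pi / 3), math.sin(math.pi / 3)
A = [[-1, 0, 0], [0, c, -s], [0, s, c]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

P = [[float(i == j) for j in range(3)] for i in range(3)]  # 3x3 identity
for _ in range(2013):
    P = matmul(P, A)        # naive computation of A^2013

ok = all(abs(P[i][j] - (-1.0 if i == j else 0.0)) < 1e-9
         for i in range(3) for j in range(3))
print(ok)  # True: A^2013 is minus the identity
```

Naive repeated multiplication is fine here; the point is only that the numerical result agrees with the order-6 periodicity argument.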
QUESTION 7

Let P₂(R) be the R-vector space of polynomials with real coefficients of degree at most 2. Prove that the formula

⟨f, g⟩ = ∫₋₁¹ x²·f(x)g(x) dx,  f, g ∈ P₂(R),

defines an inner product and find an orthonormal basis of the space P₂(R) with respect to it. (15 marks)

Solution

According to the lectures, to prove that ⟨f, g⟩ is an inner product, it's enough to check that the weight function x² is positive on [−1, 1] (possibly except for finitely many points). Since x² > 0 for all x ≠ 0, it is true.

To find an orthonormal basis, we'll apply the Gram-Schmidt orthogonalization to any basis, say {1, x, x²}. The computation is below.

w₁ = v₁ = 1,  ⟨w₁, w₁⟩ = ∫₋₁¹ x²·1·1 dx = ∫₋₁¹ x² dx = 2/3,

⟨v₂, w₁⟩ = ∫₋₁¹ x²·x·1 dx = ∫₋₁¹ x³ dx = 0,

w₂ = v₂ − (⟨v₂, w₁⟩/⟨w₁, w₁⟩)·w₁ = x − 0/(2/3)·1 = x,

⟨w₂, w₂⟩ = ∫₋₁¹ x²·x·x dx = ∫₋₁¹ x⁴ dx = 2/5,

⟨v₃, w₁⟩ = ∫₋₁¹ x²·x²·1 dx = 2/5,  ⟨v₃, w₂⟩ = ∫₋₁¹ x²·x²·x dx = 0,

w₃ = v₃ − (⟨v₃, w₂⟩/⟨w₂, w₂⟩)·w₂ − (⟨v₃, w₁⟩/⟨w₁, w₁⟩)·w₁ = x² − (2/5)/(2/3)·1 = x² − 3/5,

⟨w₃, w₃⟩ = ∫₋₁¹ x²·(x² − 3/5)² dx = ∫₋₁¹ (x⁶ − (6/5)x⁴ + (9/25)x²) dx = 2/7 − 12/25 + 6/25 = 8/175.

Thus {w₁, w₂, w₃} is an orthogonal basis. An orthonormal one is

e₁ = w₁/‖w₁‖ = √(3/2),  e₂ = √(5/2)·x,  e₃ = √(175/8)·(x² − 3/5). :)
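The Gram-Schmidt output can be checked with exact rational integration, assuming the inner product ⟨f, g⟩ = ∫₋₁¹ x²·f(x)g(x) dx computed above:

```python
from fractions import Fraction

# <p, q> = integral over [-1, 1] of x^2 * p(x) * q(x),
# for polynomials given as coefficient lists [a0, a1, a2, ...].
def inner(p, q):
    total = Fraction(0)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            n = i + j + 2            # power of x after the weight x^2
            if n % 2 == 0:           # odd powers integrate to 0 on [-1, 1]
                total += Fraction(2, n + 1) * a * b
    return total

w1 = [Fraction(1)]                               # 1
w2 = [Fraction(0), Fraction(1)]                  # x
w3 = [Fraction(-3, 5), Fraction(0), Fraction(1)]  # x^2 - 3/5
print(inner(w1, w2), inner(w1, w3), inner(w2, w3))  # 0 0 0
print(inner(w3, w3))                                 # 8/175
```

The pairwise inner products vanish and the squared norms come out as 2/3, 2/5, and 8/175, matching the hand computation exactly.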
QUESTION 8

Let V be an inner product space over R, let W ⊂ V be a subspace, and let W⊥ be the orthogonal complement of W. Further, let R be the set of all linear operators T : V → V such that T(W) ⊆ W⊥.

(i) Prove that R is an R-vector space (a subspace of the space of linear operators on V).

(ii) Let β = {v₁, ..., v_n} be an orthonormal basis of V such that {v₁, ..., v_k} is a basis of W and {v_{k+1}, ..., v_n} is a basis of W⊥. What can you say about the matrix of any T ∈ R with respect to the basis β?

Solution

To show that R is a subspace, we need to verify three conditions:

- The zero linear operator sends everything to the zero vector, which belongs to W⊥, and hence the zero operator belongs to R.
- Suppose that T₁, T₂ ∈ R. It means that T₁(w) ∈ W⊥ and T₂(w) ∈ W⊥ for all w ∈ W. In other words, we have ⟨u, T₁(w)⟩ = ⟨u, T₂(w)⟩ = 0 for all u, w ∈ W. But then we have ⟨u, (T₁ + T₂)(w)⟩ = ⟨u, T₁(w) + T₂(w)⟩ = ⟨u, T₁(w)⟩ + ⟨u, T₂(w)⟩ = 0 and hence T₁ + T₂ also belongs to R.
- Suppose that T ∈ R and k ∈ R is a scalar. As shown above, it means that ⟨u, T(w)⟩ = 0 for all u, w ∈ W. Then we have ⟨u, (kT)(w)⟩ = ⟨u, k·T(w)⟩ = k⟨u, T(w)⟩ = k·0 = 0 and hence kT also belongs to R.

The matrix of any element of R with respect to β has the block form

[[0 (k × k), X (k × (n − k))], [Y ((n − k) × k), Z ((n − k) × (n − k))]],

that is, the top left k × k block is zero, while the blocks X, Y, Z can be arbitrary. :)

Remark
In fact, the following more general fact is true: let U, V be any vector spaces and let U′ ⊂ U and V′ ⊂ V be subspaces. Then the set of linear transformations from U to V that map U′ into V′ is a subspace of L(U, V). It's not hard to prove from the definitions either.
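The block description of part (ii) can be sketched in coordinates: with respect to the orthonormal basis β, the condition T(W) ⊆ W⊥ says exactly that the images of e₁, ..., e_k have zero coordinates in the first k slots. The matrix below is a hypothetical example with a zero top-left block (n = 4, k = 2):

```python
# A matrix in the block form [[0, X], [Y, Z]] with a zero k x k top-left
# block; its first k columns represent the images of a basis of W.
n, k = 4, 2
M = [[0, 0, 5, 6],
     [0, 0, 7, 8],
     [1, 2, 3, 4],
     [5, 6, 7, 8]]

def apply(A, v):
    return [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]

results = []
for j in range(k):
    e = [float(i == j) for i in range(n)]  # basis vector e_{j+1} of W
    image = apply(M, e)
    results.append(image[:k])              # coordinates along W
print(results)  # [[0.0, 0.0], [0.0, 0.0]]: images lie in W-perp
```

Conversely, any choice of the blocks X, Y, Z gives an operator in R, so the zero top-left block is the complete answer.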
QUESTION 9

Consider the space M_{n×n}(R) of n × n matrices with real entries. Further, let ⟨X, Y⟩ = tr(XYᵗ). Show that ⟨X, Y⟩ is an inner product on M_{n×n}(R). With respect to this inner product, what is the orthogonal complement of the subspace of symmetric matrices? Justify your answer.

Solution

In order to show that ⟨X, Y⟩ = tr(XYᵗ) is an inner product, we need to verify the four axioms:

AI ⟨X + Y, Z⟩ = tr[(X + Y)Zᵗ] = tr(XZᵗ + YZᵗ) = tr(XZᵗ) + tr(YZᵗ) = ⟨X, Z⟩ + ⟨Y, Z⟩, checked.

AII ⟨aX, Y⟩ = tr[(aX)Yᵗ] = tr(aXYᵗ) = a·tr(XYᵗ) = a⟨X, Y⟩, checked.

AIII ⟨Y, X⟩ = tr(YXᵗ) = tr((YXᵗ)ᵗ) = tr(XYᵗ) = ⟨X, Y⟩, checked.

AIV ⟨X, X⟩ = tr(XXᵗ) = tr A = Σ_{i=1}^n a_ii, where A = XXᵗ. Notice that a_ii is the dot product of the i-th row of the matrix X and the i-th column of the matrix Xᵗ, which is also the i-th row of the matrix X. In other words, we get a_ii = Σ_{j=1}^n x_ij². Thus ⟨X, X⟩ = Σ_{i,j=1}^n x_ij², which is positive unless all entries are zero, which would mean that X is the zero matrix, checked.

The next question is to find the orthogonal complement of the subspace of the symmetric matrices. The guess is that it'll be the subspace of anti-symmetric matrices.

First, let's prove that M^asym_{n×n}(R) ⊆ [M^sym_{n×n}(R)]⊥. Let X ∈ M^asym_{n×n}(R) and let's show that ⟨X, Y⟩ = 0 whenever Y = Yᵗ. We have

⟨X, Y⟩ = tr(XYᵗ) = tr(−XᵗY) = −tr(YXᵗ) = −⟨Y, X⟩ = −⟨X, Y⟩

and hence ⟨X, Y⟩ = 0. Since ⟨X, Y⟩ = 0 for any Y ∈ M^sym_{n×n}(R), it precisely means that X ∈ [M^sym_{n×n}(R)]⊥ and therefore we indeed obtain M^asym_{n×n}(R) ⊆ [M^sym_{n×n}(R)]⊥.

In order to show that M^asym_{n×n}(R) = [M^sym_{n×n}(R)]⊥, it's sufficient to check that dim M^asym_{n×n}(R) = dim[M^sym_{n×n}(R)]⊥. This equality easily follows from the two direct sum decompositions we proved in the lectures and tutorials:

M_{n×n}(R) = M^sym_{n×n}(R) ⊕ M^asym_{n×n}(R) = M^sym_{n×n}(R) ⊕ [M^sym_{n×n}(R)]⊥. :)

Remark
It's easy to compute directly that

dim M^sym_{n×n}(R) = n(n + 1)/2,  dim M^asym_{n×n}(R) = n(n − 1)/2,

which was probably done in tutorials too.
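The orthogonality of symmetric and antisymmetric matrices under ⟨X, Y⟩ = tr(XYᵗ) can be checked on a concrete 3 × 3 pair (the matrices S and K below are arbitrary illustrative choices):

```python
# <X, Y> = tr(X Y^t) = sum over i, j of x_ij * y_ij  (the entrywise
# dot product), which is the identity used in axiom AIV.
def trace_inner(X, Y):
    n = len(X)
    return sum(X[i][j] * Y[i][j] for i in range(n) for j in range(n))

S = [[1, 2, 3], [2, 4, 5], [3, 5, 6]]      # symmetric: S = S^t
K = [[0, 7, -1], [-7, 0, 2], [1, -2, 0]]   # antisymmetric: K = -K^t
print(trace_inner(S, K))      # 0: S and K are orthogonal
print(trace_inner(S, S) > 0)  # True: positivity on a nonzero matrix
```

The entrywise formula makes the orthogonality transparent: each off-diagonal pair contributes s_ij·(k_ij + k_ji) = 0 and the diagonal of K is zero.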