1. (5 pts) From Ch. 1.10 in Apostol: Problems 1, 3, 5, 7, 9. Also, when appropriate, exhibit a basis for $S$.

Solution.

(1.10.1) Yes, $S$ is a subspace of $V_3$ with basis $\{(0,0,1), (0,1,0)\}$ and dimension 2. If $(x_1,y_1,z_1), (x_2,y_2,z_2) \in S$, then $x_1 = 0$ and $x_2 = 0$, so $x_1 + x_2 = 0$ and $(x_1,y_1,z_1) + (x_2,y_2,z_2) = (x_1+x_2,\, y_1+y_2,\, z_1+z_2) \in S$; thus $S$ is closed under vector addition. For any $a \in \mathbb{R}$ and $(x_1,y_1,z_1) \in S$ we have $x_1 = 0$, so $ax_1 = 0$ and $a(x_1,y_1,z_1) = (ax_1, ay_1, az_1) \in S$; thus $S$ is closed under scalar multiplication. Hence $S$ is a subspace of $V_3$. The set $\{(0,0,1), (0,1,0)\}$ is independent in $S$ because for any $a, b \in \mathbb{R}$, $a(0,0,1) + b(0,1,0) = (0,0,0)$ gives $(0,b,a) = (0,0,0)$, so $a = b = 0$. Also $L(\{(0,0,1), (0,1,0)\}) = S$ because for any $(x,y,z) \in S$ we have $x = 0$, so $(x,y,z) = (0,y,z) = y(0,1,0) + z(0,0,1)$. Thus $\{(0,0,1), (0,1,0)\}$ is a basis of $S$, and $S$ has dimension 2.

(1.10.3) Yes, $S$ is a subspace of $V_3$ with basis $\{(-1,0,1), (-1,1,0)\}$ and dimension 2. If $(x_1,y_1,z_1), (x_2,y_2,z_2) \in S$, then $x_1 + y_1 + z_1 = 0$ and $x_2 + y_2 + z_2 = 0$, so $(x_1+x_2) + (y_1+y_2) + (z_1+z_2) = (x_1+y_1+z_1) + (x_2+y_2+z_2) = 0$ and $(x_1,y_1,z_1) + (x_2,y_2,z_2) = (x_1+x_2,\, y_1+y_2,\, z_1+z_2) \in S$; thus $S$ is closed under vector addition. For any $a \in \mathbb{R}$ and $(x_1,y_1,z_1) \in S$ we have $x_1 + y_1 + z_1 = 0$, so $ax_1 + ay_1 + az_1 = 0$ and $a(x_1,y_1,z_1) = (ax_1, ay_1, az_1) \in S$; thus $S$ is closed under scalar multiplication. Hence $S$ is a subspace of $V_3$. The set $\{(-1,0,1), (-1,1,0)\}$ is independent in $S$ because for any $a, b \in \mathbb{R}$, $a(-1,0,1) + b(-1,1,0) = (0,0,0)$ gives $(-a-b,\, b,\, a) = (0,0,0)$, so $a = b = 0$. Also $L(\{(-1,0,1), (-1,1,0)\}) = S$ because for any $(x,y,z) \in S$ we have $x + y + z = 0$, so $(x,y,z) = (-y-z,\, y,\, z) = y(-1,1,0) + z(-1,0,1)$. Thus $\{(-1,0,1), (-1,1,0)\}$ is a basis of $S$, and $S$ has dimension 2.

(1.10.5) Yes, $S$ is a subspace of $V_3$ with basis $\{(1,1,1)\}$ and dimension 1.
If $(x_1,y_1,z_1), (x_2,y_2,z_2) \in S$, then $x_1 = y_1 = z_1$ and $x_2 = y_2 = z_2$, so $x_1 + x_2 = y_1 + y_2 = z_1 + z_2$ and $(x_1,y_1,z_1) + (x_2,y_2,z_2) = (x_1+x_2,\, y_1+y_2,\, z_1+z_2) \in S$; thus $S$ is closed under vector addition. For any $a \in \mathbb{R}$ and $(x_1,y_1,z_1) \in S$ we have $x_1 = y_1 = z_1$, so $ax_1 = ay_1 = az_1$ and $a(x_1,y_1,z_1) = (ax_1, ay_1, az_1) \in S$; thus $S$ is closed under scalar multiplication. Hence $S$ is a subspace of $V_3$. The set $\{(1,1,1)\}$ is independent in $S$ because for any $a \in \mathbb{R}$, $a(1,1,1) = (0,0,0)$ gives $(a,a,a) = (0,0,0)$, so $a = 0$. Also $L(\{(1,1,1)\}) = S$ because for any $(x,y,z) \in S$ we have $x = y = z$, so $(x,y,z) = (x,x,x) = x(1,1,1)$. Thus $\{(1,1,1)\}$ is a basis of $S$, and $S$ has dimension 1.

(1.10.7) No, $S$ is not a subspace of $V_3$: $(1,1,0), (1,-1,0) \in S$ but $(1,1,0) + (1,-1,0) = (2,0,0) \notin S$.

(1.10.9) Yes, $S$ is a subspace of $V_3$ with basis $\{(1,2,3)\}$ and dimension 1. If $(x_1,y_1,z_1), (x_2,y_2,z_2) \in S$, then $y_1 = 2x_1$, $z_1 = 3x_1$, $y_2 = 2x_2$, and $z_2 = 3x_2$, so $y_1 + y_2 = 2(x_1+x_2)$, $z_1 + z_2 = 3(x_1+x_2)$, and $(x_1,y_1,z_1) + (x_2,y_2,z_2) = (x_1+x_2,\, y_1+y_2,\, z_1+z_2) \in S$; thus $S$ is closed under vector addition.
For any $a \in \mathbb{R}$ and $(x_1,y_1,z_1) \in S$ we have $y_1 = 2x_1$ and $z_1 = 3x_1$, so $ay_1 = 2ax_1$, $az_1 = 3ax_1$, and $a(x_1,y_1,z_1) = (ax_1, ay_1, az_1) \in S$; thus $S$ is closed under scalar multiplication. Hence $S$ is a subspace of $V_3$. The set $\{(1,2,3)\}$ is independent in $S$ because for any $a \in \mathbb{R}$, $a(1,2,3) = (0,0,0)$ gives $(a, 2a, 3a) = (0,0,0)$, so $a = 0$. Also $L(\{(1,2,3)\}) = S$ because for any $(x,y,z) \in S$, $y = 2x$ and $z = 3x$, so $(x,y,z) = (x, 2x, 3x) = x(1,2,3)$. Thus $\{(1,2,3)\}$ is a basis of $S$, and $S$ has dimension 1.

2. (8 pts) From Ch. 1.10 in Apostol: Problem 24.

Solution.

(a) We adapt the proof of Theorem 1.7(a). Let $B = \{x_1, \dots, x_k\}$ be any independent set of elements in $S$. If $L(B) = S$, then $B$ is a basis for $S$. If not, then there is some element $y$ in $S$ which is not in $L(B)$. Adjoin this element to $B$ and consider the new set $B' = \{x_1, \dots, x_k, y\}$. If this set were dependent, there would be scalars $c_1, \dots, c_{k+1}$, not all zero, such that
\[\sum_{i=1}^{k} c_i x_i + c_{k+1} y = O.\]
We cannot have $c_{k+1} = 0$, for then $c_1 = \dots = c_k = 0$ by the linear independence of $x_1, \dots, x_k$, contrary to our assumption that not all of the scalars are zero. Thus $c_{k+1} \neq 0$, and we could solve this equation for $y$ and find that $y \in L(B)$, contradicting our assumption that $y$ is not in $L(B)$. Therefore the set $B'$ is independent and contains $k+1$ elements. If $L(B') = S$, then $B'$ is a basis for $S$. If $L(B') \neq S$, we can argue with $B'$ as we did with $B$, getting a new set $B''$ which contains $k+2$ elements and is independent. We can repeat the process until we arrive at a basis in a total of $\dim V - k$ steps or fewer, since otherwise we would eventually obtain an independent set with $\dim V + 1$ elements, contradicting Theorem 1.5. This basis is finite and has at most $\dim V$ elements, so $S$ is finite dimensional and $\dim S \le \dim V$.

(b) The "if" part holds trivially by substitution. The "only if" part follows from applying Theorem 1.7(b) to the basis for $S$ constructed in part (a):
the basis for $S$ is then a basis for $V$, so taking spans gives $S = V$.

(c) Apply Theorem 1.7(a).

(d) Consider the example $V = V_2$ and $S = \mathbb{R} \times \{0\} \subseteq V_2$. The set $\{(1,1), (1,-1)\}$ is a basis for $V$ which contains no basis for $S$: since $(1,1), (1,-1) \notin S$, the only subset of this basis contained in $S$ is the empty set, which is not a basis for $S$ because $S$ contains nonzero elements.

3. Let $V$ be a vector space and $U, W$ subspaces of $V$. Prove that if $U \cap W = \{0\}$ then $\dim L(U, W) = \dim U + \dim W$. (Do not appeal to a more general result for $\dim L(U, W)$.)

Solution. If $U$ or $W$ is infinite dimensional, suppose for the sake of contradiction that $L(U, W)$ is finite dimensional. Since $U$ and $W$ are subspaces of $L(U, W)$, part (a) of the previous problem implies that $U$ and $W$ are finite dimensional, a contradiction. Thus $L(U, W)$ is infinite dimensional, and the claimed equality holds with both sides infinite. It remains to consider the case where $U$ and $W$ are both finite dimensional. Let $S = \{x_1, \dots, x_n\}$ be a basis for $U$ and $T = \{y_1, \dots, y_m\}$ be a basis for $W$. A finite linear combination of elements of $U$ or $W$ is a finite linear combination of elements of $S \cup T$ because
we can rewrite each element of $U$ as a finite linear combination of elements of $S$, rewrite each element of $W$ as a finite linear combination of elements of $T$, and distribute to get a finite linear combination of elements of $S \cup T$. Thus $L(U, W) \subseteq L(S \cup T)$. Since $S \cup T \subseteq U \cup W$, it follows that $L(S \cup T) \subseteq L(U, W)$, and hence $L(U, W) = L(S \cup T)$ (i.e., $S \cup T$ spans $L(U, W)$). The set $S \cup T$ is linearly independent by the following argument. Suppose
\[\sum_{i=1}^{n} a_i x_i + \sum_{j=1}^{m} b_j y_j = 0.\]
Then
\[\sum_{i=1}^{n} a_i x_i = -\sum_{j=1}^{m} b_j y_j,\]
where the left side lies in $U$ and the right side lies in $W$; since $U \cap W = \{0\}$, we have
\[\sum_{i=1}^{n} a_i x_i = \sum_{j=1}^{m} b_j y_j = 0.\]
Since $S$ is linearly independent and $T$ is linearly independent, we get $a_i = 0$ for all $1 \le i \le n$ and $b_j = 0$ for all $1 \le j \le m$. Therefore $S \cup T$ is linearly independent. Since $S \cup T$ spans $L(U, W)$ and is linearly independent, $S \cup T$ is a basis of $L(U, W)$ and
\[\dim L(U, W) = |S \cup T| = |S| + |T| = \dim U + \dim W.\]

4. (4 pts) From Ch. 2.4 in Apostol: Problems 11–14. Determine whether $T$ is linear.

Solution.

(2.4.11) Yes, $T$ is linear.

Proof. In $V_2$, vector addition and scalar multiplication are defined in terms of rectangular coordinates, so it is necessary to describe what $T$ does to points given in rectangular coordinates. The point with polar coordinates $(r, \theta)$ is $(x, y) = (r\cos\theta, r\sin\theta)$ in rectangular coordinates, and $(r, \theta + \varphi)$ is
\[(r\cos(\theta+\varphi),\, r\sin(\theta+\varphi)) = (r(\cos\theta\cos\varphi - \sin\theta\sin\varphi),\, r(\sin\theta\cos\varphi + \cos\theta\sin\varphi)) = (x\cos\varphi - y\sin\varphi,\, y\cos\varphi + x\sin\varphi),\]
so $T(x, y) = (x\cos\varphi - y\sin\varphi,\, y\cos\varphi + x\sin\varphi)$. Suppose $(x_1,y_1), (x_2,y_2) \in V_2$. Then
\[T((x_1,y_1) + (x_2,y_2)) = T(x_1+x_2,\, y_1+y_2) = ((x_1+x_2)\cos\varphi - (y_1+y_2)\sin\varphi,\, (y_1+y_2)\cos\varphi + (x_1+x_2)\sin\varphi) = (x_1\cos\varphi - y_1\sin\varphi,\, y_1\cos\varphi + x_1\sin\varphi) + (x_2\cos\varphi - y_2\sin\varphi,\, y_2\cos\varphi + x_2\sin\varphi) = T(x_1,y_1) + T(x_2,y_2),\]
so $T$ preserves vector addition. Suppose $a \in \mathbb{R}$ and $(x_1,y_1) \in V_2$. We have
\[T(a(x_1,y_1)) = T(ax_1, ay_1) = (ax_1\cos\varphi - ay_1\sin\varphi,\, ay_1\cos\varphi + ax_1\sin\varphi) = a(x_1\cos\varphi - y_1\sin\varphi,\, y_1\cos\varphi + x_1\sin\varphi) = aT(x_1,y_1),\]
so $T$ preserves scalar multiplication. $\square$

(2.4.12) Yes, $T$ is linear.

Proof. $T$ maps a point with polar coordinates $(r, \theta)$ onto the point with polar coordinates $(r, \varphi - \theta)$, where $\varphi$ is the angle of elevation of the fixed line. In $V_2$, vector addition and scalar multiplication are defined in terms of rectangular coordinates, so it is necessary to describe what $T$ does to points given in rectangular coordinates. The point with polar coordinates $(r, \theta)$ is $(x, y) = (r\cos\theta, r\sin\theta)$ in rectangular coordinates, and $(r, \varphi - \theta)$ is
\[(r\cos(\varphi-\theta),\, r\sin(\varphi-\theta)) = (r(\cos\varphi\cos\theta + \sin\varphi\sin\theta),\, r(\sin\varphi\cos\theta - \cos\varphi\sin\theta)) = (r\cos\theta\cos\varphi + r\sin\theta\sin\varphi,\, r\cos\theta\sin\varphi - r\sin\theta\cos\varphi) = (x\cos\varphi + y\sin\varphi,\, x\sin\varphi - y\cos\varphi),\]
so $T(x, y) = (x\cos\varphi + y\sin\varphi,\, x\sin\varphi - y\cos\varphi)$. Suppose $(x_1,y_1), (x_2,y_2) \in V_2$. Then
\[T((x_1,y_1) + (x_2,y_2)) = T(x_1+x_2,\, y_1+y_2) = ((x_1+x_2)\cos\varphi + (y_1+y_2)\sin\varphi,\, (x_1+x_2)\sin\varphi - (y_1+y_2)\cos\varphi) = (x_1\cos\varphi + y_1\sin\varphi,\, x_1\sin\varphi - y_1\cos\varphi) + (x_2\cos\varphi + y_2\sin\varphi,\, x_2\sin\varphi - y_2\cos\varphi) = T(x_1,y_1) + T(x_2,y_2),\]
so $T$ preserves vector addition. Suppose $a \in \mathbb{R}$ and $(x_1,y_1) \in V_2$. We have
\[T(a(x_1,y_1)) = T(ax_1, ay_1) = (ax_1\cos\varphi + ay_1\sin\varphi,\, ax_1\sin\varphi - ay_1\cos\varphi) = a(x_1\cos\varphi + y_1\sin\varphi,\, x_1\sin\varphi - y_1\cos\varphi) = aT(x_1,y_1),\]
so $T$ preserves scalar multiplication. $\square$

(2.4.13) No, $T$ is not linear:
\[T((0,1)) + T((1,0)) = (1,1) + (1,1) = (2,2) \neq (1,1) = T(1,1) = T((0,1) + (1,0)),\]
so $T$ does not preserve vector addition.

(2.4.14) Yes, $T$ is linear.
Proof. In $V_2$, vector addition and scalar multiplication are defined in terms of rectangular coordinates, so it is necessary to describe what $T$ does to points given in rectangular coordinates. The point with polar coordinates $(r, \theta)$ is $(x, y) = (r\cos\theta, r\sin\theta)$ in rectangular coordinates, and $(2r, \theta)$ is $(2r\cos\theta, 2r\sin\theta) = (2x, 2y)$, so $T(x, y) = (2x, 2y)$. Suppose $(x_1,y_1), (x_2,y_2) \in V_2$. Then
\[T((x_1,y_1) + (x_2,y_2)) = T(x_1+x_2,\, y_1+y_2) = (2(x_1+x_2),\, 2(y_1+y_2)) = (2x_1+2x_2,\, 2y_1+2y_2) = (2x_1, 2y_1) + (2x_2, 2y_2) = T(x_1,y_1) + T(x_2,y_2),\]
so $T$ preserves vector addition. Suppose $a \in \mathbb{R}$ and $(x_1,y_1) \in V_2$. Then
\[T(a(x_1,y_1)) = T(ax_1, ay_1) = (2ax_1, 2ay_1) = a(2x_1, 2y_1) = aT(x_1,y_1),\]
so $T$ preserves scalar multiplication. $\square$
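Remark (informal check, not part of the graded solutions). The rectangular-coordinate formulas derived in 2.4.11, 2.4.12, and 2.4.14 can be sanity-checked numerically by testing additivity and homogeneity at random points. The sketch below is mine: the function names `rotate`, `reflect`, and `double` and the fixed angle `PHI` are illustrative choices, and a finite random test only gives evidence of linearity, never a proof.

```python
import math
import random

PHI = 0.7  # an arbitrary fixed angle phi (illustrative choice)

def rotate(p):
    # 2.4.11: T(x, y) = (x cos phi - y sin phi, y cos phi + x sin phi)
    x, y = p
    return (x * math.cos(PHI) - y * math.sin(PHI),
            y * math.cos(PHI) + x * math.sin(PHI))

def reflect(p):
    # 2.4.12: T(x, y) = (x cos phi + y sin phi, x sin phi - y cos phi)
    x, y = p
    return (x * math.cos(PHI) + y * math.sin(PHI),
            x * math.sin(PHI) - y * math.cos(PHI))

def double(p):
    # 2.4.14: T(x, y) = (2x, 2y)
    return (2 * p[0], 2 * p[1])

def is_linear(T, trials=200, tol=1e-9):
    """Test T(u + v) = T(u) + T(v) and T(a u) = a T(u) at random points,
    up to floating-point tolerance."""
    rng = random.Random(0)  # fixed seed for reproducibility
    for _ in range(trials):
        u = (rng.uniform(-5, 5), rng.uniform(-5, 5))
        v = (rng.uniform(-5, 5), rng.uniform(-5, 5))
        a = rng.uniform(-5, 5)
        # additivity
        lhs = T((u[0] + v[0], u[1] + v[1]))
        rhs = (T(u)[0] + T(v)[0], T(u)[1] + T(v)[1])
        if abs(lhs[0] - rhs[0]) > tol or abs(lhs[1] - rhs[1]) > tol:
            return False
        # homogeneity
        lhs = T((a * u[0], a * u[1]))
        rhs = (a * T(u)[0], a * T(u)[1])
        if abs(lhs[0] - rhs[0]) > tol or abs(lhs[1] - rhs[1]) > tol:
            return False
    return True
```

All three maps pass the check, consistent with the proofs above; the non-example 2.4.13 is omitted since its defining formula is given only pointwise in the solution.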