Midterm II Solutions

Problem 1. [8 points] (i) [4] Find the inverse of the matrix A.

To find the inverse we row-reduce the augmented matrix [A | I]. When the left block becomes the identity, the right block is the inverse: [A | I] → [I | A⁻¹]. Carrying out the row reduction yields A⁻¹.

(ii) [2] Possibly using the derivation of (i), find the determinant of A.

In (i) we row-reduced A to the identity matrix I. In the process, only one operation changes the determinant: the third step, which exchanges two rows and therefore flips the sign. The remaining operations add a multiple of one row to another and leave the determinant unchanged. Hence det A = −det I = −1.

(iii) [2] Can A be the matrix of the projection onto a plane V ⊂ ℝ³?

No. If A were the matrix of a projection onto a plane, then A would not be invertible: projections onto planes are not injective (several vectors have the same projection onto the plane), hence not invertible. Since (i) shows that A is invertible, A cannot be such a projection.
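The procedure of (i) and (ii), Gauss-Jordan elimination on [A | I] while tracking how each row operation affects the determinant, can be sketched in code. The exam's matrix is not reproduced above, so the 3×3 matrix below is a stand-in, and `invert_and_det` is a hypothetical helper name; this is a minimal illustration, not the exam's actual computation.

```python
from fractions import Fraction

def invert_and_det(A):
    """Gauss-Jordan elimination on the augmented matrix [A | I].

    Returns (A_inverse, det A) for an invertible square A.
    The determinant is tracked along the way: a row swap flips its
    sign, scaling the pivot row divides out the pivot (so we multiply
    det by the pivot first), and row additions leave it unchanged."""
    n = len(A)
    # Build [A | I] with exact rational arithmetic.
    M = [[Fraction(A[i][j]) for j in range(n)]
         + [Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    det = Fraction(1)
    for col in range(n):
        # Find a row with a nonzero pivot; a swap flips the sign of det.
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        if pivot != col:
            M[col], M[pivot] = M[pivot], M[col]
            det = -det
        p = M[col][col]
        det *= p
        # Scale the pivot row so the pivot becomes 1.
        M[col] = [x / p for x in M[col]]
        # Clear the pivot column in all other rows (det unchanged).
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    inverse = [row[n:] for row in M]
    return inverse, det

# Stand-in matrix, chosen only for illustration.
A = [[2, 1, 0], [1, 2, 1], [0, 1, 2]]
Ainv, detA = invert_and_det(A)
print(detA)  # 4
```

Using exact `Fraction` arithmetic avoids the round-off that floating point would introduce in the pivots.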
Problem 2. [7 points] Consider the line L spanned by the vector v in ℝ⁴.

(i) [4] Find the matrix of the orthogonal projection onto L.

Let A be the 4×1 matrix whose only column is v. The matrix of the projection equals A(AᵀA)⁻¹Aᵀ. Here AᵀA = vᵀv = ‖v‖² is a scalar, so the matrix of the projection onto L becomes

Proj_L = (1/‖v‖²) AAᵀ = vvᵀ/‖v‖².

(ii) [3] Find the matrix of the orthogonal projection onto L⊥.

The projections onto L and L⊥ add up to the identity. Therefore the matrix of the projection onto L⊥ is

Proj_{L⊥} = I − Proj_L.
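Both formulas above, Proj_L = vvᵀ/‖v‖² and Proj_{L⊥} = I − Proj_L, can be checked in a short sketch. The direction vector below is hypothetical (the exam's vector is not shown above), and `line_projection_matrix` is an illustrative helper name.

```python
from fractions import Fraction

def line_projection_matrix(v):
    """Matrix of the orthogonal projection onto the line spanned by v.

    For the single-column matrix A = v, the general formula
    A(AᵀA)⁻¹Aᵀ reduces to v vᵀ / (vᵀ v), since AᵀA is the scalar vᵀv."""
    vtv = sum(Fraction(x) * x for x in v)
    return [[Fraction(vi) * vj / vtv for vj in v] for vi in v]

# Hypothetical direction vector in R^4, for illustration only.
v = [1, 1, 1, 1]
P = line_projection_matrix(v)

# Projection onto the orthogonal complement: the two projections sum to I.
n = len(v)
Q = [[(1 if i == j else 0) - P[i][j] for j in range(n)] for i in range(n)]
```

As a sanity check, P should fix v (Pv = v), Q should kill it (Qv = 0), and P should be idempotent (P² = P).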
Problem 3. [12 points] Consider the subspace V ⊂ ℝ⁴ spanned by the vectors v₁ and v₂.

(i) [4] Give a basis of V⊥.

Let A be the 4×2 matrix with columns v₁ and v₂. We have V = C(A), hence V⊥ = C(A)⊥ = N(Aᵀ). We row-reduce Aᵀ to rref(Aᵀ). In the reduced system the first two variables x, y are pivot variables, the last two variables z, w being free. Solving for x and y in terms of z and w writes every vector (x, y, z, w) of N(Aᵀ) as z·u₁ + w·u₂ for two fixed vectors u₁, u₂. Therefore these two vectors form a basis of V⊥.

(ii) [4] Find an orthonormal basis for the subspace V.

We apply Gram-Schmidt to the basis v₁, v₂ given in the problem. We normalize the first vector:

w₁ = v₁/‖v₁‖.

To find the second basis vector, we subtract from v₂ its projection onto the line spanned by w₁:

y₂ = v₂ − (v₂ · w₁) w₁.
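The Gram-Schmidt steps of (ii) can be sketched as follows. The two spanning vectors are stand-ins (the exam's v₁, v₂ are not shown above), and `gram_schmidt` is a hypothetical helper name.

```python
from math import sqrt

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors.

    Exactly as in part (ii): from each vector subtract its projections
    onto the already-built orthonormal vectors, y = v - sum (v . w_i) w_i,
    then normalize y."""
    basis = []
    for v in vectors:
        y = list(v)
        for w in basis:
            c = sum(a * b for a, b in zip(v, w))  # coefficient v . w_i
            y = [a - c * b for a, b in zip(y, w)]
        norm = sqrt(sum(a * a for a in y))
        basis.append([a / norm for a in y])
    return basis

# Hypothetical spanning vectors for a plane V in R^4.
w1, w2 = gram_schmidt([[1.0, 1.0, 1.0, 1.0], [1.0, 2.0, 3.0, 4.0]])
```

The output vectors should be mutually orthogonal and of unit length, up to floating-point error.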
Finally, we normalize y₂ to get

w₂ = y₂/‖y₂‖.

The orthonormal basis is w₁, w₂.

(iii) [4] Find the projection of the vector x onto the subspace V.

We use the orthonormal basis w₁, w₂ we found in (ii). Then

Proj_V(x) = (x · w₁) w₁ + (x · w₂) w₂.

Computing the two dot products x · w₁ and x · w₂ and forming this combination gives the projection of x onto V.
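The formula Proj_V(x) = (x · w₁)w₁ + (x · w₂)w₂ can be sketched directly. The orthonormal basis and the vector x below are hypothetical stand-ins, since the exam's data is not shown above, and `project_onto_subspace` is an illustrative name.

```python
def project_onto_subspace(x, onb):
    """Proj_V(x) = sum of (x . w_i) w_i over an orthonormal basis {w_i} of V."""
    out = [0.0] * len(x)
    for w in onb:
        c = sum(a * b for a, b in zip(x, w))  # dot product x . w_i
        out = [o + c * b for o, b in zip(out, w)]
    return out

# Hypothetical orthonormal basis of a plane in R^4 (check: unit, orthogonal).
w1 = [0.5, 0.5, 0.5, 0.5]
w2 = [0.5, -0.5, 0.5, -0.5]
x = [1.0, 2.0, 3.0, 4.0]
p = project_onto_subspace(x, [w1, w2])
print(p)  # [2.0, 3.0, 2.0, 3.0]
```

A good check is that the residual x − Proj_V(x) is orthogonal to every basis vector of V.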
Problem 4. [8 points] True or false:

T F — For a square n×n matrix A, dim N(A) + dim C(A) = n.
T: this is the rank-nullity theorem.

T F — For a square n×n matrix A, dim N(A) + dim C(Aᵀ) = n.
T: C(Aᵀ) = N(A)⊥ and dim N(A) + dim N(A)⊥ = n. In fact, for any vector subspace V ⊂ ℝⁿ, we have dim V + dim V⊥ = n.

T F — An orthogonal n×n matrix is invertible.
T: in fact, A⁻¹ = Aᵀ.

T F — For any matrix A, AᵀA is invertible.
F: take, for example, A = 0; then AᵀA = 0 is not invertible. The statement we made in class about AᵀA being invertible applied to matrices A whose columns were a basis of a subspace, i.e. linearly independent.

T F — If det A = 2 and det B = 3, then det(A + B) = 5.
F: determinants are not additive.

T F — The product of two invertible matrices is invertible.
T: if det A ≠ 0 and det B ≠ 0, then det(AB) = det A · det B ≠ 0, which means AB is invertible.

T F — If A is an orthogonal matrix corresponding to a linear transformation T: ℝ² → ℝ², and R is a region in ℝ², then the area of T(R) equals the area of R.
T: we claim det A = ±1, so that area T(R) = |det A| · area(R) = area(R). Indeed, if AAᵀ = I then det A · det Aᵀ = 1. Since det A = det Aᵀ, it follows that (det A)² = 1, hence det A = ±1.

T F — Exchanging two columns of a square matrix changes the sign of the determinant.
T: the determinant of a matrix equals the determinant of its transpose. Exchanging two rows of the transpose changes the sign of the determinant, and the rows of the transpose are the columns of the original matrix.
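The claims about orthogonal matrices above (AᵀA = I, invertibility with A⁻¹ = Aᵀ, det A = ±1) can be verified on a concrete example. The 90° rotation matrix below is chosen only for illustration, and the helper names are not from the exam.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(M):
    return [list(row) for row in zip(*M)]

def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

# Rotation by 90 degrees: a 2x2 orthogonal matrix.
Q = [[0, -1], [1, 0]]
I = [[1, 0], [0, 1]]

assert matmul(transpose(Q), Q) == I  # QᵀQ = I: Q is orthogonal
assert matmul(Q, transpose(Q)) == I  # so Q is invertible with Q⁻¹ = Qᵀ
assert det2(Q) in (1, -1)            # (det Q)² = 1, so areas are preserved
```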
Extra Credit [5 points] Assume that A is a 2×2 matrix such that Aⁿ = 0 for some n. Show that A² = 0.

We may assume A ≠ 0, the case A = 0 being clear. Let t = Trace(A). Since Aⁿ = 0, we get det(Aⁿ) = 0, hence (det A)ⁿ = 0, hence det A = 0. From the homework (the 2×2 case of the Cayley-Hamilton theorem), we know

A² − tA + (det A)·I = 0, so A² = tA.

We compute

A² = tA,
A³ = tA² = t(tA) = t²A,
A⁴ = tA³ = t(t²A) = t³A, ...

In general, we find Aⁿ = tⁿ⁻¹A. Since Aⁿ = 0, we get tⁿ⁻¹A = 0, hence t = 0 (since we assumed A ≠ 0). Note then that A² = tA = 0.
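The argument can be checked on a concrete nonzero nilpotent matrix. The matrix below is a standard example, not one from the exam text; the point is that its trace and determinant both vanish, so Cayley-Hamilton forces A² = 0.

```python
def matmul2(A, B):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# A nonzero nilpotent 2x2 matrix (illustrative example).
A = [[0, 1], [0, 0]]

t = A[0][0] + A[1][1]                      # trace
d = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # determinant
A2 = matmul2(A, A)

# Cayley-Hamilton for 2x2 matrices: A^2 - t*A + d*I = 0.
# Here t = 0 and d = 0, so A^2 = t*A = 0, matching the proof above.
ch = [[A2[i][j] - t * A[i][j] + d * (1 if i == j else 0) for j in range(2)]
      for i in range(2)]
```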