Math 220 - Linear Algebra (Summer 2018)

Solutions to Homework #7

Exercise 6.1.20

(a) TRUE. $u \cdot v - v \cdot u = 0$ is equivalent to $u \cdot v = v \cdot u$. The latter identity is true due to the commutative property of the inner product (Theorem 1(a) on page 333). Therefore, the original identity is also true.

(b) FALSE. The claimed identity $\|cv\| = c\|v\|$ is false for any $c < 0$ and $v \neq 0$. If $v \neq 0$, then, given $c \neq 0$, $cv \neq 0$. By definition, $\|v\| \geq 0$ for any $v$. Moreover, it is also easy to see from the definition that the norm is strictly greater than zero for any non-zero vector. Therefore $\|v\| > 0$ and $\|cv\| > 0$. Given the above, the identity $\|cv\| = c\|v\|$ cannot hold when $c < 0$, since its left-hand side is positive, but its right-hand side is negative. (The correct identity is $\|cv\| = |c| \, \|v\|$.)

(c) TRUE. This clearly follows from the definition of an orthogonal complement on page 336: "The set of all vectors $z$ that are orthogonal to $W$ is called the orthogonal complement of $W$ and is denoted by $W^{\perp}$."

(d) TRUE. This clearly follows from the Pythagorean Theorem (Theorem 2 on page 336): "Two vectors $u$ and $v$ are orthogonal if and only if $\|u + v\|^2 = \|u\|^2 + \|v\|^2$."
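As a quick numeric sanity check of (a), (b), and (d) (not part of the assigned solution; the specific vectors and the use of NumPy are illustrative choices, not anything from the textbook), one can verify these identities on concrete vectors:

    import numpy as np

    # Arbitrary example vectors in R^3.
    u = np.array([1.0, 2.0, -1.0])
    v = np.array([3.0, 0.0, 2.0])

    # (a) The inner product is commutative: u.v = v.u.
    assert np.isclose(u @ v, v @ u)

    # (b) For c < 0, ||cv|| = |c| ||v||, not c ||v||.
    c = -2.0
    assert np.isclose(np.linalg.norm(c * v), abs(c) * np.linalg.norm(v))
    assert not np.isclose(np.linalg.norm(c * v), c * np.linalg.norm(v))

    # (d) Pythagorean Theorem for orthogonal vectors:
    # ||u + v||^2 = ||u||^2 + ||v||^2.
    u0 = np.array([1.0, 0.0, 0.0])
    v0 = np.array([0.0, 2.0, 0.0])  # u0 . v0 = 0
    assert np.isclose(np.linalg.norm(u0 + v0) ** 2,
                      np.linalg.norm(u0) ** 2 + np.linalg.norm(v0) ** 2)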
Exercise 6.2.23

(a) TRUE. For example, the set $\left\{ \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \begin{bmatrix} 1 \\ 0 \end{bmatrix} \right\}$ is linearly independent, since $c_1 \begin{bmatrix} 1 \\ 1 \end{bmatrix} + c_2 \begin{bmatrix} 1 \\ 0 \end{bmatrix} = 0$ forces $c_1 = c_2 = 0$, but not orthogonal, since $\begin{bmatrix} 1 \\ 1 \end{bmatrix} \cdot \begin{bmatrix} 1 \\ 0 \end{bmatrix} = 1 + 0 = 1 \neq 0$.

(c) FALSE. The book says: "When the vectors in an orthogonal set of nonzero vectors are normalized to have unit length, the new vectors will still be orthogonal, and hence the new set will be an orthonormal set." This can easily be shown as follows. Let $S = \{v_1, \dots, v_n\}$ be an orthogonal set of non-zero vectors, i.e.

$v_i \neq 0$ for any $1 \leq i \leq n$, (1)

and

$v_i \cdot v_j = 0$ for any $1 \leq i \leq n$, $1 \leq j \leq n$ such that $i \neq j$. (2)

Let also

$c_i = \|v_i\|$ for $1 \leq i \leq n$, (3)

i.e. $c_1, \dots, c_n$ are the norms of the vectors from $S$. Normalizing a set means normalizing each vector in this set by scaling it by its own norm (so that each vector becomes a unit vector). Therefore, we normalize $S$ by constructing a new set $S' = \{w_1, \dots, w_n\}$, where

$w_i = \dfrac{v_i}{\|v_i\|}$ for all $1 \leq i \leq n$, (4)

i.e. the original vectors scaled by their norms.
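To make the construction in (4) concrete, here is a small numeric illustration on a hypothetical orthogonal set in $\mathbb{R}^2$ (the general symbolic orthogonality check follows in (5)-(6) below):

    import numpy as np

    # A hypothetical orthogonal set of non-zero vectors in R^2.
    v1 = np.array([1.0, 1.0])
    v2 = np.array([1.0, -1.0])
    assert np.isclose(v1 @ v2, 0.0)  # orthogonal, as in (2)

    # Normalize as in (4): w_i = v_i / ||v_i||.
    w1 = v1 / np.linalg.norm(v1)
    w2 = v2 / np.linalg.norm(v2)

    # Each new vector has unit length, and orthogonality is preserved.
    assert np.isclose(np.linalg.norm(w1), 1.0)
    assert np.isclose(np.linalg.norm(w2), 1.0)
    assert np.isclose(w1 @ w2, 0.0)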
Now, to check that $S'$ is also orthogonal, we need to show that

$w_i \cdot w_j = 0$ for any $1 \leq i \leq n$, $1 \leq j \leq n$ such that $i \neq j$. (5)

This can be done using (2) and (4). Let $w_i, w_j \in S'$ with $i \neq j$. Then

$w_i \cdot w_j \overset{(4)}{=} \dfrac{v_i}{\|v_i\|} \cdot \dfrac{v_j}{\|v_j\|} = \dfrac{v_i \cdot v_j}{\|v_i\| \, \|v_j\|} \overset{(2)}{=} \dfrac{0}{\|v_i\| \, \|v_j\|} = 0.$ (6)

(d) FALSE. This is not always true. Consider the counterexample

$A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{bmatrix}.$

The columns of $A$ are orthonormal (orthogonal and of unit length); however, $A$ is not an orthogonal matrix. By definition, in order for $A$ to be an orthogonal matrix, its columns should form an orthonormal basis for $\mathbb{R}^n$ ($n = 3$ in our example). Clearly, the columns of $A$ cannot form a basis for $\mathbb{R}^3$, since there are just 2 of them, and any basis of $\mathbb{R}^3$ must have 3 vectors. Note that $A$ is not square. However, it is always true that any square matrix with orthonormal columns is orthogonal. Also note that the book gives a different definition of an orthogonal matrix on page 346 (which explicitly requires a matrix to be square): "An orthogonal matrix is a square invertible matrix $U$ such that $U^{-1} = U^T$." This definition is equivalent to the one provided in class (an orthogonal matrix is a matrix whose columns form an orthonormal basis for $\mathbb{R}^n$) due to Theorem 6 on page 345: "An $m \times n$ matrix $U$ has orthonormal columns if and only if $U^T U = I$."

(e) FALSE. The distance from $y$ to $L$ is $\|y - \hat{y}\|$, not $\|\hat{y}\|$ (of course, the two values may be equal in some cases, but this is not true in general). See Example 4 on page 343; it considers specific vectors in $\mathbb{R}^2$, but the argument made there is quite general and applicable to this question. The distance from $y$ to $L$ is the length of the perpendicular line segment from $y$ to the orthogonal projection $\hat{y}$, and this length equals the length of $y - \hat{y}$. Thus the distance is $\|y - \hat{y}\|$. Figure 3 on page 343 gives a good illustration.
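The following sketch checks both claims numerically; the matrix $A$ is the counterexample from (d), while the line $L$ and the vector $y$ in the second part are arbitrary illustrative choices:

    import numpy as np

    # (d): a 3 x 2 matrix with orthonormal columns.
    A = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])
    # Theorem 6: orthonormal columns <=> A^T A = I (the 2 x 2 identity)...
    assert np.allclose(A.T @ A, np.eye(2))
    # ...but A is not square, hence not orthogonal; indeed A A^T != I_3.
    assert not np.allclose(A @ A.T, np.eye(3))

    # (e): the distance from y to L = span{u} is ||y - y_hat||, not ||y_hat||.
    u = np.array([1.0, 0.0])          # direction of the line L
    y = np.array([3.0, 4.0])
    y_hat = (y @ u) / (u @ u) * u     # orthogonal projection of y onto L
    dist = np.linalg.norm(y - y_hat)  # the distance from y to L (here 4.0)
    assert not np.isclose(dist, np.linalg.norm(y_hat))  # ||y_hat|| = 3.0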
Exercise 6.3.24

(a) The set $\{w_1, \dots, w_p, v_1, \dots, v_q\}$ is orthogonal if the inner product of any two distinct vectors in it is 0, i.e. all of the following should hold:

$w_i \cdot w_j = 0$ for any $1 \leq i \leq p$, $1 \leq j \leq p$ such that $i \neq j$, (1)

$v_i \cdot v_j = 0$ for any $1 \leq i \leq q$, $1 \leq j \leq q$ such that $i \neq j$, (2)

$w_i \cdot v_j = 0$ for any $1 \leq i \leq p$, $1 \leq j \leq q$. (3)

(1) and (2) follow from the facts that $\{w_1, \dots, w_p\}$ and $\{v_1, \dots, v_q\}$ are orthogonal sets (orthogonal bases for $W$ and $W^{\perp}$, respectively). (3) follows from the definition of an orthogonal complement (page 336), since each $w_i$ is in $W$ and each $v_j$ is in $W^{\perp}$. Since (1)-(3) are true, $\{w_1, \dots, w_p, v_1, \dots, v_q\}$ is orthogonal.
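For a concrete feel for (a), here is a small numeric check on hypothetical orthogonal bases of a plane $W$ in $\mathbb{R}^3$ and of its orthogonal complement (so $p = 2$, $q = 1$); the specific vectors are illustrative choices:

    import numpy as np

    w1 = np.array([1.0, 1.0, 0.0])   # orthogonal basis of W
    w2 = np.array([1.0, -1.0, 0.0])
    v1 = np.array([0.0, 0.0, 1.0])   # orthogonal basis of W_perp

    vectors = [w1, w2, v1]
    # All pairwise inner products vanish, verifying (1)-(3) at once.
    for i in range(len(vectors)):
        for j in range(i + 1, len(vectors)):
            assert np.isclose(vectors[i] @ vectors[j], 0.0)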
(b) By the Orthogonal Decomposition Theorem (Theorem 8 on page 350), any vector $y \in \mathbb{R}^n$ can be written uniquely in the form

$y = \hat{y} + z$ (1)

where $\hat{y}$ is in $W$ and $z$ is in $W^{\perp}$. Since $\hat{y}$ is in $W$ and $\{w_1, \dots, w_p\}$ is a basis for $W$, we can represent $\hat{y}$ as a linear combination of $w_1, \dots, w_p$, i.e.

$\hat{y} = a_1 w_1 + \dots + a_p w_p$ (2)

where $a_1, \dots, a_p$ are some scalars. Similarly, since $z$ is in $W^{\perp}$ and $\{v_1, \dots, v_q\}$ is a basis for $W^{\perp}$, we can represent $z$ as a linear combination of $v_1, \dots, v_q$, i.e.

$z = b_1 v_1 + \dots + b_q v_q$ (3)

where $b_1, \dots, b_q$ are some scalars. Now, plugging (2) and (3) into (1), we get

$y = \hat{y} + z = a_1 w_1 + \dots + a_p w_p + b_1 v_1 + \dots + b_q v_q.$ (4)

This means that $y$ can be represented as a linear combination of $w_1, \dots, w_p, v_1, \dots, v_q$. Since $y$ is an arbitrary vector from $\mathbb{R}^n$, by definition, the set $\{w_1, \dots, w_p, v_1, \dots, v_q\}$ spans $\mathbb{R}^n$.

(c) We showed in (a) that $\{w_1, \dots, w_p, v_1, \dots, v_q\}$ is orthogonal. By Theorem 4 on page 340, any orthogonal set of non-zero vectors is linearly independent (and basis vectors are necessarily non-zero). Also, in (b) we showed that $\{w_1, \dots, w_p, v_1, \dots, v_q\}$ spans $\mathbb{R}^n$. It then follows from the above two facts (by definition) that $\{w_1, \dots, w_p, v_1, \dots, v_q\}$ is a basis for $\mathbb{R}^n$. The size of this set is $p + q$ or, alternatively, $\underbrace{\dim W}_{p} + \underbrace{\dim W^{\perp}}_{q}$. Since this set is a basis for $\mathbb{R}^n$, its size equals $n$, the dimension of $\mathbb{R}^n$, i.e.

$\dim W + \dim W^{\perp} = n.$
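Continuing the hypothetical bases from the sketch after (a), the decomposition (1)-(4) can be verified numerically; the coefficients below are the standard projection formulas for an orthogonal basis, and the vector $y$ is an arbitrary choice:

    import numpy as np

    w1 = np.array([1.0, 1.0, 0.0])   # basis for W (p = 2)
    w2 = np.array([1.0, -1.0, 0.0])
    v1 = np.array([0.0, 0.0, 1.0])   # basis for W_perp (q = 1)
    y = np.array([2.0, 3.0, 5.0])    # an arbitrary vector in R^3

    # Coefficients a_i, b_j as in (2)-(3): (y . u) / (u . u) for each
    # orthogonal basis vector u.
    a1, a2 = (y @ w1) / (w1 @ w1), (y @ w2) / (w2 @ w2)
    b1 = (y @ v1) / (v1 @ v1)

    y_hat = a1 * w1 + a2 * w2         # component in W
    z = b1 * v1                       # component in W_perp
    assert np.allclose(y, y_hat + z)  # the decomposition (1)/(4)

    # (c): p + q = dim W + dim W_perp = n = 3.
    assert 2 + 1 == y.size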