MTH 2310, FALL 2011

SECTION 6.2: ORTHOGONAL SETS

Homework Problems: 1, 5, 9, 13, 17, 21, 23¹, 27, 29, 35

1. Introduction

We have previously discussed the benefits of having a set of vectors that is linearly independent or that spans a certain vector space. It turns out that having vectors that are mutually orthogonal also has certain benefits, which we discuss in this section.

2. Orthogonal Sets

Definition 1. A set of vectors $\{\mathbf{u}_1, \dots, \mathbf{u}_p\}$ in $\mathbb{R}^n$ is an orthogonal set if each pair of vectors from the set is orthogonal; that is, if $\mathbf{u}_i \cdot \mathbf{u}_j = 0$ whenever $i \neq j$.

Exercise 2. Show that the set
$$\left\{ \begin{bmatrix} 3 \\ 1 \\ 1 \end{bmatrix},\ \begin{bmatrix} -1 \\ 2 \\ 1 \end{bmatrix},\ \begin{bmatrix} -1/2 \\ -2 \\ 7/2 \end{bmatrix} \right\}$$
is an orthogonal set. (That is, dot each vector with the other two and confirm that each inner product gives you 0.)
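The pairwise dot products in Exercise 2 can be checked numerically. The following sketch (using NumPy; not part of the original notes) computes all three inner products:

```python
import numpy as np

# The three vectors from Exercise 2
u1 = np.array([3.0, 1.0, 1.0])
u2 = np.array([-1.0, 2.0, 1.0])
u3 = np.array([-0.5, -2.0, 3.5])

# An orthogonal set: every distinct pair has dot product zero
print(u1 @ u2)  # 0.0
print(u1 @ u3)  # 0.0
print(u2 @ u3)  # 0.0
```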
Theorem 2.1. If $S = \{\mathbf{u}_1, \dots, \mathbf{u}_p\}$ is an orthogonal set of nonzero vectors in $\mathbb{R}^n$, then $S$ is linearly independent and hence is a basis for the subspace spanned by $S$.

Proof. To prove this, we must show that if
$$(1) \qquad \mathbf{0} = c_1\mathbf{u}_1 + \cdots + c_p\mathbf{u}_p$$
for some scalars $c_1, \dots, c_p$, then each $c_i$ must equal zero. Since all we know is that the vectors are mutually orthogonal, let's start by taking equation (1) and dotting both sides with the vector $\mathbf{u}_1$.

Definition 3. An orthogonal basis is a basis that is also an orthogonal set.

It turns out that an orthogonal basis is much nicer than other bases, as the next theorem illustrates. Basically, the idea is that a difficult part of changing bases is calculating what the new weights for your vectors will be, and this calculation is greatly simplified if the vectors are orthogonal.

Theorem 2.2. Let $\{\mathbf{u}_1, \dots, \mathbf{u}_p\}$ be an orthogonal basis for a vector space $W$. For each $\mathbf{y}$ in $W$, the weights in the linear combination $\mathbf{y} = c_1\mathbf{u}_1 + \cdots + c_p\mathbf{u}_p$ are given by
$$c_j = \frac{\mathbf{y} \cdot \mathbf{u}_j}{\mathbf{u}_j \cdot \mathbf{u}_j} \qquad (j = 1, \dots, p).$$

Proof. This proof is similar to the one above: start by dotting both sides of $\mathbf{y} = c_1\mathbf{u}_1 + \cdots + c_p\mathbf{u}_p$ with $\mathbf{u}_1$, use this to get the expression for $c_1$, then explain why the expression for $c_j$ holds for any $j$.
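The key step that both proofs leave to the reader can be written out as follows (a sketch of the computation, not in the original notes). Dotting equation (1) with $\mathbf{u}_1$ gives

```latex
\mathbf{0} \cdot \mathbf{u}_1
  = (c_1\mathbf{u}_1 + \cdots + c_p\mathbf{u}_p) \cdot \mathbf{u}_1
  = c_1(\mathbf{u}_1 \cdot \mathbf{u}_1) + c_2(\mathbf{u}_2 \cdot \mathbf{u}_1)
    + \cdots + c_p(\mathbf{u}_p \cdot \mathbf{u}_1)
  = c_1(\mathbf{u}_1 \cdot \mathbf{u}_1),
```

since $\mathbf{u}_j \cdot \mathbf{u}_1 = 0$ for $j \neq 1$. Because $\mathbf{u}_1 \neq \mathbf{0}$, we have $\mathbf{u}_1 \cdot \mathbf{u}_1 \neq 0$, so $c_1 = 0$; repeating this with each $\mathbf{u}_j$ shows every $c_j = 0$, and the same computation applied to $\mathbf{y} = c_1\mathbf{u}_1 + \cdots + c_p\mathbf{u}_p$ yields the weight formula in Theorem 2.2.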
Example 4. The set
$$\left\{ \begin{bmatrix} 3 \\ 1 \\ 1 \end{bmatrix},\ \begin{bmatrix} -1 \\ 2 \\ 1 \end{bmatrix},\ \begin{bmatrix} -1/2 \\ -2 \\ 7/2 \end{bmatrix} \right\}$$
is an orthogonal basis for $\mathbb{R}^3$. Use Theorem 2.2 to express the vector $\mathbf{y} = \begin{bmatrix} 6 \\ 1 \\ -8 \end{bmatrix}$ as a linear combination of that basis.

3. Orthogonal Projection

In terms of the standard basis, it is easy to decompose a vector into its constituent parts. Given a different basis, though, it can be difficult to see how any other vector is written in terms of the basis. Geometrically, the big idea is this:
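The weight formula from Theorem 2.2 makes Example 4 a direct computation; the following sketch (NumPy, not in the original notes) carries it out:

```python
import numpy as np

# Orthogonal basis from Example 4 and the target vector y
u1 = np.array([3.0, 1.0, 1.0])
u2 = np.array([-1.0, 2.0, 1.0])
u3 = np.array([-0.5, -2.0, 3.5])
y = np.array([6.0, 1.0, -8.0])

# Theorem 2.2: c_j = (y . u_j) / (u_j . u_j); no linear system to solve
c = [float((y @ u) / (u @ u)) for u in (u1, u2, u3)]
print(c)  # [1.0, -2.0, -2.0], i.e. y = u1 - 2*u2 - 2*u3
```

Note that each weight is found independently with two dot products; with a non-orthogonal basis we would instead have to solve a 3×3 linear system.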
In general, we can state this problem in the following way: given a nonzero vector $\mathbf{u}$ in $\mathbb{R}^n$, how can we decompose any other vector $\mathbf{y}$ into the sum of two vectors, one of which is a scalar multiple of $\mathbf{u}$ and the other of which is orthogonal to $\mathbf{u}$? We do the following:

Definition 5. Let $\mathbf{u}$ be a nonzero vector and $\mathbf{y}$ another vector. Then
$$\hat{\mathbf{y}} = \frac{\mathbf{y} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}}\,\mathbf{u}$$
is the orthogonal projection of $\mathbf{y}$ onto $\mathbf{u}$, and $\mathbf{z} = \mathbf{y} - \hat{\mathbf{y}}$ is the component of $\mathbf{y}$ orthogonal to $\mathbf{u}$.

Notice that if the vector $\mathbf{u}$ were replaced by any nonzero scalar multiple $c\mathbf{u}$, you would still get the same result for $\hat{\mathbf{y}}$. (You may want to check this on your own if you do not believe me.) The important thing is not the vector $\mathbf{u}$ itself, but the line spanned by $\mathbf{u}$.

Definition 6. Let $\mathbf{u}$ be a nonzero vector and $\mathbf{y}$ another vector, and let $L = \operatorname{span}\{\mathbf{u}\}$ be the line through $\mathbf{u}$ and the origin. Then
$$\hat{\mathbf{y}} = \operatorname{proj}_L \mathbf{y} = \frac{\mathbf{y} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}}\,\mathbf{u}$$
is the orthogonal projection of $\mathbf{y}$ onto $L$.
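The claim that scaling $\mathbf{u}$ does not change the projection is a one-line check (written out here; left to the reader in the original notes). For any scalar $c \neq 0$,

```latex
\frac{\mathbf{y} \cdot (c\mathbf{u})}{(c\mathbf{u}) \cdot (c\mathbf{u})}\,(c\mathbf{u})
  = \frac{c\,(\mathbf{y} \cdot \mathbf{u})}{c^2\,(\mathbf{u} \cdot \mathbf{u})}\,c\,\mathbf{u}
  = \frac{\mathbf{y} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}}\,\mathbf{u}
  = \hat{\mathbf{y}},
```

so $\hat{\mathbf{y}}$ depends only on the line $L = \operatorname{span}\{\mathbf{u}\}$, which is what justifies the notation $\operatorname{proj}_L \mathbf{y}$ in Definition 6.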
Exercise 7. Let $\mathbf{y} = \begin{bmatrix} 7 \\ 6 \end{bmatrix}$ and $\mathbf{u} = \begin{bmatrix} 4 \\ 2 \end{bmatrix}$.

(a) Use the definitions above to find $\hat{\mathbf{y}}$, the orthogonal projection of $\mathbf{y}$ onto $\mathbf{u}$.
(b) Find the component of $\mathbf{y}$ orthogonal to $\mathbf{u}$ (that is, $\mathbf{z} = \mathbf{y} - \hat{\mathbf{y}}$).
(c) Check that $\hat{\mathbf{y}}$ and $\mathbf{z}$ are orthogonal.

Exercise 8. Given $\mathbf{u} \neq \mathbf{0}$ in $\mathbb{R}^n$, let $L = \operatorname{Span}\{\mathbf{u}\}$. Show that the mapping $\mathbf{x} \mapsto \operatorname{proj}_L \mathbf{x}$ is a linear transformation.

Another interpretation of the orthogonal projection is as follows. Since $\hat{\mathbf{y}}$ and $\mathbf{z}$ are perpendicular, we can think of $\hat{\mathbf{y}}$ as the point on the line $L$ that is closest to the point $\mathbf{y}$. This idea forms the foundation for many applications of linear algebra: often you cannot find an exact solution to a problem, and you are searching for the closest thing to a solution; in other words, you are looking for a projection of your ideal solution. We discuss this more in Section 6.5.
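Exercise 7 can be checked numerically; the sketch below (NumPy, not part of the original notes) implements Definition 5 and runs all three parts:

```python
import numpy as np

def proj(y, u):
    """Orthogonal projection of y onto the line spanned by a nonzero u."""
    return (y @ u) / (u @ u) * u

y = np.array([7.0, 6.0])
u = np.array([4.0, 2.0])

y_hat = proj(y, u)   # (a) projection of y onto u
z = y - y_hat        # (b) component of y orthogonal to u
print(y_hat)         # [8. 4.]
print(z)             # [-1.  2.]
print(y_hat @ z)     # 0.0  -- (c) y_hat and z are orthogonal
```

Here $(\mathbf{y} \cdot \mathbf{u})/(\mathbf{u} \cdot \mathbf{u}) = 40/20 = 2$, so $\hat{\mathbf{y}} = 2\mathbf{u}$.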
4. Orthonormal Sets

If you have two vectors that are orthogonal, any scalar multiples of the two vectors will still be orthogonal. (Why is this true?) Thus if we have a set of orthogonal vectors and we scale each one to be a unit vector, we obtain a set of orthogonal unit vectors, which will make us happy.

Definition 9. An orthogonal set of unit vectors is called an orthonormal set. If the set spans a vector space we call it an orthonormal basis, since the set will be linearly independent. (Why?)

Example 10. Prove that $\left\{ \begin{bmatrix} 2/\sqrt{5} \\ 1/\sqrt{5} \end{bmatrix},\ \begin{bmatrix} -1/\sqrt{5} \\ 2/\sqrt{5} \end{bmatrix} \right\}$ is an orthonormal basis.

Theorem 4.1. A matrix $U$ has orthonormal columns if and only if $U^T U = I$.

Proof. We prove the case where $U$ has three columns; the general case follows easily. Let $U = [\mathbf{u}_1\ \mathbf{u}_2\ \mathbf{u}_3]$.
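Theorem 4.1 gives a quick computational test for orthonormal columns. The sketch below (NumPy, not in the original notes) applies it to the matrix whose columns are the vectors from Example 10:

```python
import numpy as np

s = np.sqrt(5.0)
# Columns are the two orthonormal vectors from Example 10
U = np.array([[2/s, -1/s],
              [1/s,  2/s]])

# Theorem 4.1: orthonormal columns  <=>  U^T U = I
print(np.allclose(U.T @ U, np.eye(2)))  # True
```

The $(i, j)$ entry of $U^T U$ is exactly $\mathbf{u}_i \cdot \mathbf{u}_j$, so the diagonal entries being 1 says each column is a unit vector, and the off-diagonal entries being 0 says the columns are mutually orthogonal.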
Theorem 4.2. Let $U$ be an $m \times n$ matrix with orthonormal columns and let $\mathbf{x}$ and $\mathbf{y}$ be in $\mathbb{R}^n$. Then

(a) $\|U\mathbf{x}\| = \|\mathbf{x}\|$
(b) $(U\mathbf{x}) \cdot (U\mathbf{y}) = \mathbf{x} \cdot \mathbf{y}$
(c) $(U\mathbf{x}) \cdot (U\mathbf{y}) = 0$ if and only if $\mathbf{x} \cdot \mathbf{y} = 0$

Proof. We prove only part (b); the others follow from it. We prove the case where $U$ has three columns. Let $U = [\mathbf{u}_1\ \mathbf{u}_2\ \mathbf{u}_3]$, $\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}$, and $\mathbf{y} = \begin{bmatrix} y_1 \\ y_2 \\ y_3 \end{bmatrix}$.

Notes
¹ T, T, F, F, F
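Properties (a) and (b) of Theorem 4.2 can be illustrated numerically. The sketch below (NumPy; the particular matrix and vectors are an illustrative choice, not from the notes) uses a $3 \times 2$ matrix with orthonormal columns, so $m > n$ as the theorem allows:

```python
import numpy as np

r = 1/np.sqrt(2.0)
# A 3x2 matrix with orthonormal columns (illustrative choice)
U = np.array([[1.0, 0.0],
              [0.0,   r],
              [0.0,   r]])

x = np.array([3.0, -1.0])
y = np.array([2.0,  5.0])

# (a) multiplication by U preserves lengths
print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))  # True
# (b) multiplication by U preserves dot products
print(np.isclose((U @ x) @ (U @ y), x @ y))                  # True
```

In words: a matrix with orthonormal columns maps $\mathbb{R}^n$ into $\mathbb{R}^m$ without distorting lengths or angles, which is why part (c) follows immediately from part (b).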