1. Orthogonal Complements and Projections

In this section we discuss orthogonal complements and orthogonal projections. The orthogonal complement of a subspace $S$ is the complement that is orthogonal to $S$, and the orthogonal projection onto $S$ is then the projection onto $S$ with respect to its orthogonal complement.

Consider any set of vectors $E \subseteq \mathbb{R}^n$. What happens if we look at the set of all vectors orthogonal to every vector in $E$, denoted $E^\perp$ (pronounced "$E$ perp")?

Definition 1.1. For $E \subseteq \mathbb{R}^n$, let
$$E^\perp = \{\vec x \in \mathbb{R}^n : \vec x \cdot \vec y = 0 \text{ for all } \vec y \in E\}.$$
When $S$ is a subspace, we call $S^\perp$ the orthogonal complement of $S$.

For example,
$$\{(1,2,1)\}^\perp = \{\vec x \in \mathbb{R}^3 : x_1 + 2x_2 + x_3 = 0\} = \operatorname{span}\{(-2,1,0),\,(-1,0,1)\}$$
and
$$\{(1,2,1),(1,0,1)\}^\perp = \{\vec x \in \mathbb{R}^3 : x_1 + 2x_2 + x_3 = 0 \text{ and } x_1 + x_3 = 0\} = \operatorname{span}\{(-1,0,1)\}.$$

Notice that $E^\perp$ is a subspace in these two examples, which is not a coincidence.

Proposition 1.2. For any $E \subseteq \mathbb{R}^n$, $E^\perp$ is a subspace.

Proof. We show that $E^\perp$ satisfies the three properties of being a subspace.

(a) We have $\vec 0 \cdot \vec y = 0$ for all $\vec y \in E$. Thus $\vec 0 \in E^\perp$.

(b) Suppose $\vec x_1, \vec x_2 \in E^\perp$. Then for all $\vec y \in E$,
$$(\vec x_1 + \vec x_2) \cdot \vec y = \vec x_1 \cdot \vec y + \vec x_2 \cdot \vec y = 0 + 0 = 0$$
since $\vec x_1, \vec x_2 \in E^\perp$. Therefore $\vec x_1 + \vec x_2 \in E^\perp$, and $E^\perp$ is closed under addition.

(c) Suppose $\vec x \in E^\perp$ and $c$ is a scalar. Then for all $\vec y \in E$,
$$(c\vec x) \cdot \vec y = c(\vec x \cdot \vec y) = c(0) = 0$$
since $\vec x \in E^\perp$. Therefore $c\vec x \in E^\perp$, and $E^\perp$ is closed under scalar multiplication.

However, if $E$ is a nonzero subspace, the definition asks us to solve an infinite set of equations to find $E^\perp$. How can we possibly find $E^\perp$ for a nonzero subspace? We need to realize that there is a lot of redundancy in these equations: for example, if $\vec x \perp \vec y$, then $\vec x \perp c\vec y$ for all scalars $c$. In fact, we can reduce the set of equations to just being orthogonal to a basis for the subspace.

Proposition 1.3. For any $\{\vec v_1, \dots, \vec v_k\} \subseteq \mathbb{R}^n$, $(\operatorname{span}\{\vec v_1, \dots, \vec v_k\})^\perp = \{\vec v_1, \dots, \vec v_k\}^\perp$.
Proof. Let $S = \operatorname{span}\{\vec v_1, \dots, \vec v_k\}$. First, if $\vec x \in S^\perp$, then $\vec x \cdot \vec v_1 = \dots = \vec x \cdot \vec v_k = 0$ because $\vec v_1, \dots, \vec v_k \in S$. Thus $\vec x \in \{\vec v_1, \dots, \vec v_k\}^\perp$, and $S^\perp \subseteq \{\vec v_1, \dots, \vec v_k\}^\perp$.

On the other hand, if $\vec x \in \{\vec v_1, \dots, \vec v_k\}^\perp$, then $\vec x \cdot \vec v_1 = \dots = \vec x \cdot \vec v_k = 0$. Now for any $\vec y \in S$ we can write $\vec y = c_1 \vec v_1 + \dots + c_k \vec v_k$, so
$$\vec x \cdot \vec y = \vec x \cdot (c_1 \vec v_1 + \dots + c_k \vec v_k) = c_1(\vec x \cdot \vec v_1) + \dots + c_k(\vec x \cdot \vec v_k) = c_1(0) + \dots + c_k(0) = 0.$$
Thus $\vec x \in S^\perp$, and $\{\vec v_1, \dots, \vec v_k\}^\perp \subseteq S^\perp$. This shows $S^\perp = \{\vec v_1, \dots, \vec v_k\}^\perp$.

We will use Proposition 1.3 shortly to describe how to find the orthogonal complement of a subspace more systematically. Recall that the column space of a matrix $A$, denoted $\operatorname{col}(A)$, is the span of its columns. It now becomes useful to also look at the span of the rows of a matrix. Let $\operatorname{row}(A)$ denote the span of the rows of the matrix $A$. For example,
$$\operatorname{row}\left(\begin{bmatrix} 1 & 2 & 1 \\ 0 & 0 & 1 \end{bmatrix}\right) = \operatorname{span}\{(1,2,1),\,(0,0,1)\}.$$
Finally, let the transpose $A^{\mathsf T}$ of a matrix $A$ denote $A$ with its rows and columns interchanged. For example,
$$\begin{bmatrix} 1 & 0 \\ 2 & 1 \end{bmatrix}^{\mathsf T} = \begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix}.$$
Notice that $\operatorname{col}(A) = \operatorname{row}(A^{\mathsf T})$ and that $(A^{\mathsf T})^{\mathsf T} = A$. Now we can formalize how to find the orthogonal complement of a subspace.

Theorem 1.4. For any matrix $A$, $\operatorname{row}(A)^\perp = \ker(A)$ and $\operatorname{col}(A)^\perp = \ker(A^{\mathsf T})$.

Proof. Let $\vec v_1, \dots, \vec v_m$ denote the rows of the $m \times n$ matrix $A$. Then $\operatorname{row}(A) = \operatorname{span}\{\vec v_1, \dots, \vec v_m\}$, and
$$\operatorname{row}(A)^\perp = (\operatorname{span}\{\vec v_1, \dots, \vec v_m\})^\perp = \{\vec v_1, \dots, \vec v_m\}^\perp = \{\vec x \in \mathbb{R}^n : \vec x \cdot \vec v_1 = 0, \dots, \vec x \cdot \vec v_m = 0\}.$$
Now notice that $\vec x \cdot \vec v_1 = 0, \dots, \vec x \cdot \vec v_m = 0$ are exactly the equations we solve when we are finding $\ker(A)$. For example, if $\vec v_1 = (1,2,1)$, then
$$\vec x \cdot \vec v_1 = 0 \iff x_1 + 2x_2 + x_3 = 0,$$
which is exactly the equation row 1 represents in finding $\ker(A)$. Thus
$$\ker(A) = \{\vec x \in \mathbb{R}^n : \vec x \cdot \vec v_1 = 0, \dots, \vec x \cdot \vec v_m = 0\}$$
as well. Hence $\operatorname{row}(A)^\perp = \ker(A)$. Finally, notice that $\operatorname{col}(A)^\perp = \operatorname{row}(A^{\mathsf T})^\perp = \ker(A^{\mathsf T})$.

Examples: Find bases for the following orthogonal complements.

(a) $\operatorname{span}\{(1,2,4)\}$
(b) $\operatorname{span}\{(1,0,1),\,(0,1,2)\}$

Solutions:

(a) $\operatorname{span}\{(1,2,4)\}^\perp = \operatorname{row}\left(\begin{bmatrix} 1 & 2 & 4 \end{bmatrix}\right)^\perp = \ker\left(\begin{bmatrix} 1 & 2 & 4 \end{bmatrix}\right) = \operatorname{span}\{(-2,1,0),\,(-4,0,1)\}.$

(b) $\operatorname{span}\{(1,0,1),(0,1,2)\}^\perp = \operatorname{row}\left(\begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 2 \end{bmatrix}\right)^\perp = \ker\left(\begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 2 \end{bmatrix}\right) = \operatorname{span}\{(-1,-2,1)\}.$

In fact, as you may have guessed from the name, $S^\perp$ has a relationship to $S$ that we have already seen, namely that of a subspace complement. As opposed to just any subspace complement, the orthogonal complement of $S$ is unique. For example, $\operatorname{span}\{(0,1)\}$, $\operatorname{span}\{(5,1)\}$, and $\operatorname{span}\{(2,1)\}$ are all subspace complements of $\operatorname{span}\{(1,0)\}$, but only $\operatorname{span}\{(0,1)\}$ is the orthogonal complement of $\operatorname{span}\{(1,0)\}$.

Theorem 1.5. For any subspace $S \subseteq \mathbb{R}^n$, $S$ and $S^\perp$ are complements in $\mathbb{R}^n$.

Proof. First we show that $S \cap S^\perp = \{\vec 0\}$. Suppose $\vec x \in S \cap S^\perp$. Then since $\vec x \in S$ and $\vec x \in S^\perp$, $\vec x$ must be orthogonal to itself:
$$\vec x \cdot \vec x = 0 \implies \vec x = \vec 0$$
by the definiteness of the dot product. Therefore $S \cap S^\perp = \{\vec 0\}$.

Next we show that $S + S^\perp = \mathbb{R}^n$ using dimension. Let $\{\vec v_1, \dots, \vec v_k\}$ be a basis for $S$, so that $\dim(S) = k$. Then
$$S^\perp = (\operatorname{span}\{\vec v_1, \dots, \vec v_k\})^\perp = \{\vec v_1, \dots, \vec v_k\}^\perp = \{\vec x \in \mathbb{R}^n : \vec x \cdot \vec v_1 = 0, \dots, \vec x \cdot \vec v_k = 0\}.$$
So $S^\perp$ is the set of solutions to a homogeneous system of $k$ equations in $n$ variables, which means there are at most $k$ leading variables, and so at least $n - k$ free variables. Hence $\dim(S^\perp) \geq n - k$. Then
$$\dim(S + S^\perp) = \dim(S) + \dim(S^\perp) - \dim(S \cap S^\perp) = k + \dim(S^\perp) - 0 \geq k + (n - k) = n.$$
But $S + S^\perp \subseteq \mathbb{R}^n$, so $\dim(S + S^\perp) \leq n$ as well. This forces $\dim(S + S^\perp) = n$, and thus $S + S^\perp = \mathbb{R}^n$.

Now that we know $S$ and $S^\perp$ are complements, we can discuss the orthogonal projection $\pi_{S S^\perp}$ onto $S$ with respect to $S^\perp$. For shorthand, let $\pi_S := \pi_{S S^\perp}$. In order to describe a formula for the orthogonal projection, we will use the following fact, to be proven in the next section:

Theorem 1.6. Every subspace of $\mathbb{R}^n$ has an orthogonal basis.

Proof. See next section.

Proposition 1.7. Let $S \subseteq \mathbb{R}^n$ be a subspace and let $\{\vec u_1, \dots, \vec u_k\}$ be an orthogonal basis for $S$. Then
$$\pi_S(\vec x) = \frac{\vec x \cdot \vec u_1}{\|\vec u_1\|^2}\,\vec u_1 + \dots + \frac{\vec x \cdot \vec u_k}{\|\vec u_k\|^2}\,\vec u_k.$$

Proof. Let $\{\vec v_1, \dots, \vec v_{n-k}\}$ be an orthogonal basis for $S^\perp$. As you will see in the homework, this makes $\{\vec u_1, \dots, \vec u_k, \vec v_1, \dots, \vec v_{n-k}\}$ an orthogonal basis for $\mathbb{R}^n$. This means that any $\vec x \in \mathbb{R}^n$ lies in $\operatorname{span}\{\vec u_1, \dots, \vec u_k, \vec v_1, \dots, \vec v_{n-k}\}$. Hence, by the coordinate formula for orthogonal bases from the previous section,
$$\vec x = \left(\frac{\vec x \cdot \vec u_1}{\|\vec u_1\|^2}\,\vec u_1 + \dots + \frac{\vec x \cdot \vec u_k}{\|\vec u_k\|^2}\,\vec u_k\right) + \left(\frac{\vec x \cdot \vec v_1}{\|\vec v_1\|^2}\,\vec v_1 + \dots + \frac{\vec x \cdot \vec v_{n-k}}{\|\vec v_{n-k}\|^2}\,\vec v_{n-k}\right).$$
Since
$$\frac{\vec x \cdot \vec u_1}{\|\vec u_1\|^2}\,\vec u_1 + \dots + \frac{\vec x \cdot \vec u_k}{\|\vec u_k\|^2}\,\vec u_k \in S$$
and
$$\frac{\vec x \cdot \vec v_1}{\|\vec v_1\|^2}\,\vec v_1 + \dots + \frac{\vec x \cdot \vec v_{n-k}}{\|\vec v_{n-k}\|^2}\,\vec v_{n-k} \in S^\perp,$$
we can conclude that
$$\pi_S(\vec x) = \pi_{S S^\perp}(\vec x) = \frac{\vec x \cdot \vec u_1}{\|\vec u_1\|^2}\,\vec u_1 + \dots + \frac{\vec x \cdot \vec u_k}{\|\vec u_k\|^2}\,\vec u_k.$$

Exercises:

(1) Find bases for the following orthogonal complements.

(a) $\operatorname{span}\{(2,4,7)\}$

(b) $\operatorname{span}\{(4,2,2,2)\}$

(c) $\operatorname{span}\{(5,7)\}$
(d) $\operatorname{span}\{(5,4,0,0)\}$

(e) $\operatorname{span}\{(1,0,0,4),\,(0,1,2,5)\}$

(2) Find formulae for $\pi_S(\vec x)$ for each of the following subspaces $S$.

(a) $S = \operatorname{span}\{(1,2)\}$

(b) $S = \operatorname{span}\{(1,2,1)\}$

(c) $S = \operatorname{span}\{(1,2,2)\}$

(d) $S = \operatorname{span}\{(1,0,4),\,(6,5,2)\}$

(e) $S = \operatorname{span}\{(1,0,0,2),\,(0,1,0,0)\}$

Problems:

1. (2) Suppose $S \subseteq \mathbb{R}^n$ is a subspace, $\{\vec u_1, \dots, \vec u_k\}$ is an orthogonal basis for $S$, and $\{\vec v_1, \dots, \vec v_{n-k}\}$ is an orthogonal basis for $S^\perp$. Show that $\{\vec u_1, \dots, \vec u_k, \vec v_1, \dots, \vec v_{n-k}\}$ is an orthogonal basis for $\mathbb{R}^n$.

2. (2) Fix $\vec x \in \mathbb{R}^n$ and a subspace $S \subseteq \mathbb{R}^n$. Show that $\pi_S(\vec x)$ is the unique vector in $\mathbb{R}^n$ so that (i) $\pi_S(\vec x) \in S$ and (ii) $\vec x - \pi_S(\vec x) \in S^\perp$.

3. (1) Let $S$ be a subspace of $\mathbb{R}^n$. Show that for all $\vec x, \vec y \in \mathbb{R}^n$,
$$\vec x \cdot \vec y = \pi_S(\vec x) \cdot \pi_S(\vec y) + \pi_{S^\perp}(\vec x) \cdot \pi_{S^\perp}(\vec y).$$

4. (1) Show that $(S^\perp)^\perp = S$ for any subspace $S$ of $\mathbb{R}^n$.

5. (1) (a) Show that if $A \sim C$ ($A$ and $C$ differ by EROs), then $\operatorname{row}(A) = \operatorname{row}(C)$.
(b) Show that the nonzero rows in $\operatorname{RREF}(A)$ form a basis for $\operatorname{row}(A)$.

6. (1) Show that for any matrix $A$, $\operatorname{rank}(A) = \operatorname{rank}(A^{\mathsf T})$.

7. (2) Prove or give a counterexample: For any matrix $A$, $\operatorname{nullity}(A) = \operatorname{nullity}(A^{\mathsf T})$.

8. (2) Show that if the rows of an $m \times n$ matrix $A$ are linearly independent, then $\operatorname{rank}(A) = m$ and $\operatorname{nullity}(A) = n - m$.
9. (1) Suppose $\{\vec a_1, \dots, \vec a_n\}$ is a basis for $\mathbb{R}^n$ and $b_1, \dots, b_n$ are scalars. Show that there exists a unique $\vec x \in \mathbb{R}^n$ so that $\vec x \cdot \vec a_1 = b_1, \dots, \vec x \cdot \vec a_n = b_n$.

10. (1) Suppose $\{\vec u_1, \dots, \vec u_k\}$ is orthonormal in $\mathbb{R}^n$ and let $S = \operatorname{span}\{\vec u_1, \dots, \vec u_k\}$. For $\vec x \in \mathbb{R}^n$, show that $\vec x \in S$ if and only if $\|\vec x\|^2 = (\vec x \cdot \vec u_1)^2 + \dots + (\vec x \cdot \vec u_k)^2$.

11. (4) Fix $\vec x \in \mathbb{R}^n$ and a subspace $S \subseteq \mathbb{R}^n$. Show that the minimum of $\|\vec x - \vec y\|^2$ over all $\vec y \in S$ is achieved uniquely when $\vec y = \pi_S(\vec x)$.

12. (4) Suppose $P : \mathbb{R}^n \to \mathbb{R}^n$ is a projection map which satisfies $\|P(\vec x)\| \leq \|\vec x\|$ for all $\vec x \in \mathbb{R}^n$. Show that $P$ is an orthogonal projection.
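Answers to computations like the examples and exercises above can be checked numerically. The following is a minimal NumPy sketch (not part of the original notes; the helper names `complement_basis` and `project` are my own, and the test vector $\vec x$ is arbitrary). It finds a basis for $S^\perp$ as $\ker(A)$ in the spirit of Theorem 1.4, applies the projection formula of Proposition 1.7, and verifies the decomposition $\vec x = \pi_S(\vec x) + \pi_{S^\perp}(\vec x)$ for the subspace $S = \operatorname{span}\{(1,2,4)\}$ from example (a).

```python
import numpy as np

def complement_basis(A, tol=1e-10):
    """Columns form an orthonormal basis of ker(A) = row(A)-perp."""
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T  # right singular vectors past the rank span ker(A)

def project(basis, x):
    """pi_S(x) = sum_i (x . u_i / ||u_i||^2) u_i over an orthogonal basis.

    QR first orthonormalizes the given basis columns, since the formula
    of Proposition 1.7 requires an orthogonal basis; then each ||u_i|| = 1.
    """
    U, _ = np.linalg.qr(basis)
    return sum((x @ u) * u for u in U.T)

# S = span{(1,2,4)}, as in example (a); its rows span S, so S-perp = ker(A).
A = np.array([[1.0, 2.0, 4.0]])
N = complement_basis(A)           # 3x2 matrix: a basis for the plane S-perp
x = np.array([3.0, -1.0, 2.0])
p = project(A.T, x)               # pi_S(x)
q = project(N, x)                 # pi_{S-perp}(x)
print(np.allclose(p + q, x))      # True: x = pi_S(x) + pi_{S-perp}(x)
print(np.allclose(A @ q, 0))      # True: pi_{S-perp}(x) is orthogonal to S
```

Note that the result of `project` does not depend on which orthogonal basis the QR step produces, which is exactly the uniqueness of $\pi_S(\vec x)$ from Problem 2.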