1. Orthogonal Complements and Projections

In this section we discuss orthogonal complements and orthogonal projections. The orthogonal complement of a subspace $S$ is the complement that is orthogonal to $S$, and the orthogonal projection onto $S$ is the projection onto $S$ with respect to its orthogonal complement.

Consider any set of vectors $E \subseteq \mathbb{R}^n$. What happens if we look at the set of all vectors orthogonal to every vector in $E$, denoted $E^\perp$ (pronounced "$E$ perp")?

Definition 1.1. For $E \subseteq \mathbb{R}^n$, let
\[
E^\perp = \{\vec{x} \in \mathbb{R}^n : \vec{x} \cdot \vec{y} = 0 \text{ for all } \vec{y} \in E\}.
\]
When $S$ is a subspace, we call $S^\perp$ the orthogonal complement of $S$.

For example,
\[
\left\{ \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} \right\}^{\perp}
= \left\{ \vec{x} \in \mathbb{R}^3 : \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} \cdot \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} = 0 \right\}
= \left\{ \vec{x} : x_1 + 2x_2 + 3x_3 = 0 \right\}
= \operatorname{span}\left\{ \begin{bmatrix} -2 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} -3 \\ 0 \\ 1 \end{bmatrix} \right\}
\]
and
\[
\left\{ \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} \right\}^{\perp}
= \left\{ \vec{x} : x_1 + 2x_2 + 3x_3 = 0, \; x_2 + x_3 = 0 \right\}
= \operatorname{span}\left\{ \begin{bmatrix} -1 \\ -1 \\ 1 \end{bmatrix} \right\}.
\]
Notice that $E^\perp$ is a subspace in these two examples, which is not a coincidence.

Proposition 1.2. For any $E \subseteq \mathbb{R}^n$, $E^\perp$ is a subspace.

Proof. We show that $E^\perp$ satisfies the three properties of being a subspace.
(a) We have $\vec{0} \cdot \vec{y} = 0$ for all $\vec{y} \in E$, so $\vec{0} \in E^\perp$.
(b) Suppose $\vec{x}_1, \vec{x}_2 \in E^\perp$. Then for all $\vec{y} \in E$,
\[
(\vec{x}_1 + \vec{x}_2) \cdot \vec{y} = \vec{x}_1 \cdot \vec{y} + \vec{x}_2 \cdot \vec{y} = 0 + 0 = 0
\]
since $\vec{x}_1, \vec{x}_2 \in E^\perp$. Therefore $\vec{x}_1 + \vec{x}_2 \in E^\perp$, and $E^\perp$ is closed under addition.
(c) Suppose $\vec{x} \in E^\perp$ and $c$ is a scalar. Then for all $\vec{y} \in E$,
\[
(c\vec{x}) \cdot \vec{y} = c(\vec{x} \cdot \vec{y}) = c(0) = 0
\]
since $\vec{x} \in E^\perp$. Therefore $c\vec{x} \in E^\perp$, and $E^\perp$ is closed under scalar multiplication. $\square$

However, if $E$ is a nonzero subspace, then according to the definition we must solve an infinite set of equations to find $E^\perp$. How can we possibly find $E^\perp$ for a nonzero subspace? We need to realize that there is a lot of redundancy in these equations: for example, if $\vec{x} \perp \vec{y}$, then $\vec{x} \perp c\vec{y}$ for all scalars $c$. In fact, we can reduce the set of equations to just being orthogonal to a basis for $S$.
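In practice this reduction means $E^\perp$ is computable as a matrix null space: stack the vectors of $E$ as rows and solve the homogeneous system. Here is a minimal numerical sketch of the first example above, using NumPy and SciPy's `null_space` (the setup is our own illustration, not part of the notes):

```python
# Compute E-perp for E = {(1, 2, 3)}: it is the null space of the
# 1 x 3 matrix whose single row is that vector.
import numpy as np
from scipy.linalg import null_space

V = np.array([[1.0, 2.0, 3.0]])  # one row per vector in E
perp = null_space(V)             # columns: an orthonormal basis of E-perp

print(perp.shape)                # (3, 2): E-perp is a plane in R^3
print(np.allclose(V @ perp, 0))  # True: every column is orthogonal to (1, 2, 3)
```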

Proposition 1.3. For any $\{\vec{v}_1, \ldots, \vec{v}_k\} \subseteq \mathbb{R}^n$,
\[
(\operatorname{span}\{\vec{v}_1, \ldots, \vec{v}_k\})^\perp = \{\vec{v}_1, \ldots, \vec{v}_k\}^\perp.
\]

Proof. Let $S = \operatorname{span}\{\vec{v}_1, \ldots, \vec{v}_k\}$. First, if $\vec{x} \in S^\perp$, then $\vec{x} \cdot \vec{v}_1 = \cdots = \vec{x} \cdot \vec{v}_k = 0$ because $\vec{v}_1, \ldots, \vec{v}_k \in S$. Thus $\vec{x} \in \{\vec{v}_1, \ldots, \vec{v}_k\}^\perp$, and $S^\perp \subseteq \{\vec{v}_1, \ldots, \vec{v}_k\}^\perp$. On the other hand, if $\vec{x} \in \{\vec{v}_1, \ldots, \vec{v}_k\}^\perp$, then $\vec{x} \cdot \vec{v}_1 = \cdots = \vec{x} \cdot \vec{v}_k = 0$. Now for any $\vec{y} \in S$ we can write $\vec{y} = c_1 \vec{v}_1 + \cdots + c_k \vec{v}_k$, so
\[
\vec{x} \cdot \vec{y} = \vec{x} \cdot (c_1 \vec{v}_1 + \cdots + c_k \vec{v}_k) = c_1(\vec{x} \cdot \vec{v}_1) + \cdots + c_k(\vec{x} \cdot \vec{v}_k) = c_1(0) + \cdots + c_k(0) = 0.
\]
Thus $\vec{x} \in S^\perp$, and $\{\vec{v}_1, \ldots, \vec{v}_k\}^\perp \subseteq S^\perp$. This shows $S^\perp = \{\vec{v}_1, \ldots, \vec{v}_k\}^\perp$. $\square$

We will use Proposition 1.3 shortly to describe how to find the orthogonal complement of a subspace more systematically. Recall that the column space of a matrix $A$, denoted $\operatorname{col}(A)$, is the span of its columns. It now becomes useful to also look at the span of the rows of a matrix; let $\operatorname{row}(A)$ denote the span of the rows of $A$. For example,
\[
\operatorname{row}\left( \begin{bmatrix} 1 & 2 & 3 \\ 0 & 1 & 1 \end{bmatrix} \right) = \operatorname{span}\left\{ \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} \right\}.
\]
Finally, let the transpose $A^T$ of a matrix $A$ denote $A$ with its rows and columns interchanged. For example,
\[
\begin{bmatrix} 1 & 2 & 3 \\ 0 & 1 & 1 \end{bmatrix}^T = \begin{bmatrix} 1 & 0 \\ 2 & 1 \\ 3 & 1 \end{bmatrix}.
\]
Notice that $\operatorname{col}(A) = \operatorname{row}(A^T)$ and that $(A^T)^T = A$. Now we can formalize how to find the orthogonal complement of a subspace.

Theorem 1.4. For any matrix $A$, $\operatorname{row}(A)^\perp = \ker(A)$ and $\operatorname{col}(A)^\perp = \ker(A^T)$.

Proof. Let $\vec{v}_1, \ldots, \vec{v}_m$ denote the rows of the $m \times n$ matrix $A$. Then $\operatorname{row}(A) = \operatorname{span}\{\vec{v}_1, \ldots, \vec{v}_m\}$ and
\[
\operatorname{row}(A)^\perp = (\operatorname{span}\{\vec{v}_1, \ldots, \vec{v}_m\})^\perp = \{\vec{v}_1, \ldots, \vec{v}_m\}^\perp = \{\vec{x} \in \mathbb{R}^n : \vec{x} \cdot \vec{v}_1 = 0, \ldots, \vec{x} \cdot \vec{v}_m = 0\}.
\]
Now notice that $\vec{x} \cdot \vec{v}_1 = 0, \ldots, \vec{x} \cdot \vec{v}_m = 0$ are exactly the same equations we solve when finding $\ker(A)$. For example, if $\vec{v}_1 = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$, then
\[
\vec{x} \cdot \vec{v}_1 = 0 \iff \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} \cdot \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} = 0 \iff x_1 + 2x_2 + 3x_3 = 0,
\]
which is exactly the equation row 1 represents in finding $\ker(A)$. Thus
\[
\ker(A) = \{\vec{x} \in \mathbb{R}^n : \vec{x} \cdot \vec{v}_1 = 0, \ldots, \vec{x} \cdot \vec{v}_m = 0\}
\]
as well. Hence $\operatorname{row}(A)^\perp = \ker(A)$. Finally, notice that $\operatorname{col}(A)^\perp = \operatorname{row}(A^T)^\perp = \ker(A^T)$. $\square$
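Theorem 1.4 is also easy to check numerically. A small sketch (the matrix is our own example, chosen with rank 2 so that both kernels are nontrivial):

```python
# Check row(A)-perp = ker(A) and col(A)-perp = ker(A^T) on a 3 x 3
# matrix of rank 2 (its third row is the sum of the first two).
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 4.0]])

ker_A = null_space(A)      # basis of ker(A)
ker_At = null_space(A.T)   # basis of ker(A^T)

print(np.allclose(A @ ker_A, 0))     # True: ker(A) is orthogonal to every row of A
print(np.allclose(A.T @ ker_At, 0))  # True: ker(A^T) is orthogonal to every column of A
```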

Examples 1: Find bases for the following orthogonal complements.

(a) $\operatorname{span}\left\{ \begin{bmatrix} 1 \\ 2 \\ 3 \\ 4 \end{bmatrix} \right\}^{\perp}$

(b) $\operatorname{span}\left\{ \begin{bmatrix} 1 \\ 0 \\ 2 \\ 3 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 2 \\ 4 \end{bmatrix} \right\}^{\perp}$

Solutions: (a)
\[
\operatorname{span}\left\{ \begin{bmatrix} 1 \\ 2 \\ 3 \\ 4 \end{bmatrix} \right\}^{\perp}
= \operatorname{row}\left( \begin{bmatrix} 1 & 2 & 3 & 4 \end{bmatrix} \right)^{\perp}
= \ker\left( \begin{bmatrix} 1 & 2 & 3 & 4 \end{bmatrix} \right)
= \operatorname{span}\left\{ \begin{bmatrix} -2 \\ 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} -3 \\ 0 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} -4 \\ 0 \\ 0 \\ 1 \end{bmatrix} \right\}.
\]

(b)
\[
\operatorname{span}\left\{ \begin{bmatrix} 1 \\ 0 \\ 2 \\ 3 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 2 \\ 4 \end{bmatrix} \right\}^{\perp}
= \operatorname{row}\left( \begin{bmatrix} 1 & 0 & 2 & 3 \\ 0 & 1 & 2 & 4 \end{bmatrix} \right)^{\perp}
= \ker\left( \begin{bmatrix} 1 & 0 & 2 & 3 \\ 0 & 1 & 2 & 4 \end{bmatrix} \right)
= \operatorname{span}\left\{ \begin{bmatrix} -2 \\ -2 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} -3 \\ -4 \\ 0 \\ 1 \end{bmatrix} \right\}.
\]

In fact, as you may have guessed from the name, $S^\perp$ has a relationship to $S$ that we have already seen, namely that of a subspace complement. As opposed to just any subspace complement, the orthogonal complement of $S$ is unique. For example, $\operatorname{span}\left\{ \begin{bmatrix} 0 \\ 1 \end{bmatrix} \right\}$, $\operatorname{span}\left\{ \begin{bmatrix} 5 \\ 1 \end{bmatrix} \right\}$, and $\operatorname{span}\left\{ \begin{bmatrix} 2 \\ 1 \end{bmatrix} \right\}$ are all subspace complements of $\operatorname{span}\left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix} \right\}$, but only $\operatorname{span}\left\{ \begin{bmatrix} 0 \\ 1 \end{bmatrix} \right\}$ is the orthogonal complement of $\operatorname{span}\left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix} \right\}$.
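Example (a) also gives a quick numerical preview of the dimension count that drives the proof of the theorem below. A sketch (our own check; SciPy's `null_space` returns an orthonormal basis of the kernel):

```python
# Example (a) revisited: S = span{(1, 2, 3, 4)} in R^4.
import numpy as np
from scipy.linalg import null_space

v = np.array([[1.0, 2.0, 3.0, 4.0]])
S_perp = null_space(v)                 # 4 x 3: an orthonormal basis of S-perp

B = np.array([[-2.0, -3.0, -4.0],      # the basis found by hand above,
              [ 1.0,  0.0,  0.0],      # written as columns
              [ 0.0,  1.0,  0.0],
              [ 0.0,  0.0,  1.0]])

# Same span: stacking both bases side by side still gives rank 3.
print(np.linalg.matrix_rank(np.hstack([S_perp, B])))  # 3
print(1 + S_perp.shape[1])             # dim(S) + dim(S-perp) = 4 = n
```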

Theorem 1.5. For any subspace $S \subseteq \mathbb{R}^n$, $S$ and $S^\perp$ are complements in $\mathbb{R}^n$.

Proof. First we show that $S \cap S^\perp = \{\vec{0}\}$. Suppose $\vec{x} \in S \cap S^\perp$. Then since $\vec{x} \in S$ and $\vec{x} \in S^\perp$, $\vec{x}$ must be orthogonal to itself:
\[
\vec{x} \cdot \vec{x} = 0 \implies \vec{x} = \vec{0}
\]
by the definiteness of the dot product. Therefore $S \cap S^\perp = \{\vec{0}\}$.

Next we show that $S + S^\perp = \mathbb{R}^n$ using dimension. Let $\{\vec{v}_1, \ldots, \vec{v}_k\}$ be a basis for $S$, so that $\dim(S) = k$. Then
\[
S^\perp = (\operatorname{span}\{\vec{v}_1, \ldots, \vec{v}_k\})^\perp = \{\vec{v}_1, \ldots, \vec{v}_k\}^\perp = \{\vec{x} \in \mathbb{R}^n : \vec{x} \cdot \vec{v}_1 = 0, \ldots, \vec{x} \cdot \vec{v}_k = 0\}.
\]
So $S^\perp$ is the set of solutions to a homogeneous system of $k$ equations in $n$ variables, which means there are at most $k$ leading variables and so at least $n - k$ free variables. Hence $\dim(S^\perp) \geq n - k$. Then
\[
\dim(S + S^\perp) = \dim(S) + \dim(S^\perp) - \dim(S \cap S^\perp) = k + \dim(S^\perp) - 0 \geq k + (n - k) = n.
\]
But $S + S^\perp \subseteq \mathbb{R}^n$, so $\dim(S + S^\perp) \leq n$ as well. This forces $\dim(S + S^\perp) = n$, and thus $S + S^\perp = \mathbb{R}^n$. $\square$

Now that we know $S$ and $S^\perp$ are complements, we can discuss the orthogonal projection $\pi_{S,S^\perp}$ onto $S$ with respect to $S^\perp$. For shorthand, let $\pi_S := \pi_{S,S^\perp}$. In order to describe a formula for the orthogonal projection, we will use the following fact, to be proven in the next section:

Theorem 1.6. Every subspace of $\mathbb{R}^n$ has an orthogonal basis.

Proof. See next section. $\square$

Proposition 1.7. Let $S \subseteq \mathbb{R}^n$ be a subspace and let $\{\vec{u}_1, \ldots, \vec{u}_k\}$ be an orthogonal basis for $S$. Then
\[
\pi_S(\vec{x}) = \frac{\vec{x} \cdot \vec{u}_1}{\|\vec{u}_1\|^2}\, \vec{u}_1 + \cdots + \frac{\vec{x} \cdot \vec{u}_k}{\|\vec{u}_k\|^2}\, \vec{u}_k.
\]

Proof. Let $\{\vec{v}_1, \ldots, \vec{v}_{n-k}\}$ be an orthogonal basis for $S^\perp$. As you will see in the homework, this makes $\{\vec{u}_1, \ldots, \vec{u}_k, \vec{v}_1, \ldots, \vec{v}_{n-k}\}$ an orthogonal basis for $\mathbb{R}^n$. This means that any $\vec{x} \in \mathbb{R}^n$ lies in $\operatorname{span}\{\vec{u}_1, \ldots, \vec{u}_k, \vec{v}_1, \ldots, \vec{v}_{n-k}\}$. Hence, by the expansion formula for orthogonal bases from the previous section,
\[
\vec{x} = \left( \frac{\vec{x} \cdot \vec{u}_1}{\|\vec{u}_1\|^2}\, \vec{u}_1 + \cdots + \frac{\vec{x} \cdot \vec{u}_k}{\|\vec{u}_k\|^2}\, \vec{u}_k \right) + \left( \frac{\vec{x} \cdot \vec{v}_1}{\|\vec{v}_1\|^2}\, \vec{v}_1 + \cdots + \frac{\vec{x} \cdot \vec{v}_{n-k}}{\|\vec{v}_{n-k}\|^2}\, \vec{v}_{n-k} \right).
\]
Since
\[
\frac{\vec{x} \cdot \vec{u}_1}{\|\vec{u}_1\|^2}\, \vec{u}_1 + \cdots + \frac{\vec{x} \cdot \vec{u}_k}{\|\vec{u}_k\|^2}\, \vec{u}_k \in S
\quad \text{and} \quad
\frac{\vec{x} \cdot \vec{v}_1}{\|\vec{v}_1\|^2}\, \vec{v}_1 + \cdots + \frac{\vec{x} \cdot \vec{v}_{n-k}}{\|\vec{v}_{n-k}\|^2}\, \vec{v}_{n-k} \in S^\perp,
\]
we can conclude that
\[
\pi_S(\vec{x}) = \pi_{S,S^\perp}(\vec{x}) = \frac{\vec{x} \cdot \vec{u}_1}{\|\vec{u}_1\|^2}\, \vec{u}_1 + \cdots + \frac{\vec{x} \cdot \vec{u}_k}{\|\vec{u}_k\|^2}\, \vec{u}_k. \qquad \square
\]
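Proposition 1.7 translates directly into code. A minimal sketch of the formula (the orthogonal basis and the helper name `proj` are our own illustration):

```python
# Orthogonal projection onto S = span{u1, u2} via Proposition 1.7.
# The columns of U must be a pairwise orthogonal basis of S.
import numpy as np

def proj(x, U):
    return sum((x @ u) / (u @ u) * u for u in U.T)

u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 1.0])        # note u1 . u2 = 0
U = np.column_stack([u1, u2])

x = np.array([3.0, 1.0, 2.0])
p = proj(x, U)

print(p)                               # pi_S(x)
print(np.allclose((x - p) @ U, 0))     # True: x - pi_S(x) lies in S-perp
```

The final check is exactly the characterization in Problem 2 below: $\pi_S(\vec{x})$ lies in $S$ while $\vec{x} - \pi_S(\vec{x})$ lies in $S^\perp$.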

Exercises:

(1) Find bases for the following orthogonal complements.
(a) $\operatorname{span}\left\{ \begin{bmatrix} 2 \\ 4 \\ 7 \end{bmatrix} \right\}^{\perp}$
(b) $\operatorname{span}\left\{ \begin{bmatrix} 4 \\ 2 \\ 2 \end{bmatrix} \right\}^{\perp}$
(c) $\operatorname{span}\left\{ \begin{bmatrix} 2 \\ 5 \\ 7 \end{bmatrix} \right\}^{\perp}$
(d) $\operatorname{span}\left\{ \begin{bmatrix} 6 \\ 5 \\ 4 \end{bmatrix} \right\}^{\perp}$
(e) $\operatorname{span}\left\{ \begin{bmatrix} 0 \\ 0 \\ 0 \\ 2 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 4 \\ 5 \end{bmatrix} \right\}^{\perp}$

(2) Find formulae for $\pi_S(\vec{x})$ for each of the following subspaces $S$.
(a) $S = \operatorname{span}\left\{ \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} \right\}$
(b) $S = \operatorname{span}\left\{ \begin{bmatrix} 2 \\ 1 \\ 1 \end{bmatrix} \right\}$
(c) $S = \operatorname{span}\left\{ \begin{bmatrix} 1 \\ 2 \\ 2 \end{bmatrix} \right\}$
(d) $S = \operatorname{span}\left\{ \begin{bmatrix} 2 \\ 0 \\ 4 \end{bmatrix}, \begin{bmatrix} 6 \\ 5 \\ -3 \end{bmatrix} \right\}$
(e) $S = \operatorname{span}\left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \\ 2 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 2 \\ 0 \end{bmatrix} \right\}$

Problems:

1. (2) Suppose $S \subseteq \mathbb{R}^n$ is a subspace, $\{\vec{u}_1, \ldots, \vec{u}_k\}$ is an orthogonal basis for $S$, and $\{\vec{v}_1, \ldots, \vec{v}_{n-k}\}$ is an orthogonal basis for $S^\perp$. Show that $\{\vec{u}_1, \ldots, \vec{u}_k, \vec{v}_1, \ldots, \vec{v}_{n-k}\}$ is an orthogonal basis for $\mathbb{R}^n$.

2. (2) Fix $\vec{x} \in \mathbb{R}^n$ and a subspace $S \subseteq \mathbb{R}^n$. Show that $\pi_S(\vec{x})$ is the unique vector in $\mathbb{R}^n$ such that (i) $\pi_S(\vec{x}) \in S$ and (ii) $\vec{x} - \pi_S(\vec{x}) \in S^\perp$.

3. Let $S$ be a subspace of $\mathbb{R}^n$. Show that for all $\vec{x}, \vec{y} \in \mathbb{R}^n$,
\[
\vec{x} \cdot \vec{y} = \pi_S(\vec{x}) \cdot \pi_S(\vec{y}) + \pi_{S^\perp}(\vec{x}) \cdot \pi_{S^\perp}(\vec{y}).
\]
(See the numerical spot-check following this list.)

4. Show that $(S^\perp)^\perp = S$ for any subspace $S$ of $\mathbb{R}^n$.

5. (a) Show that if $A \sim C$ ($A$ and $C$ differ by EROs), then $\operatorname{row}(A) = \operatorname{row}(C)$. (b) Show that the nonzero rows in $\operatorname{RREF}(A)$ form a basis for $\operatorname{row}(A)$.

6. Show that for any matrix $A$, $\operatorname{rank}(A) = \operatorname{rank}(A^T)$.

7. (2) Prove or give a counterexample: for any matrix $A$, $\operatorname{nullity}(A) = \operatorname{nullity}(A^T)$.

8. (2) Show that if the rows of an $m \times n$ matrix $A$ are linearly independent, then $\operatorname{rank}(A) = m$ and $\operatorname{nullity}(A) = n - m$.

9. Suppose $\{\vec{a}_1, \ldots, \vec{a}_n\}$ is a basis for $\mathbb{R}^n$ and $b_1, \ldots, b_n$ are scalars. Show that there exists a unique $\vec{x} \in \mathbb{R}^n$ such that $\vec{x} \cdot \vec{a}_1 = b_1, \ldots, \vec{x} \cdot \vec{a}_n = b_n$.

10. Suppose $\{\vec{u}_1, \ldots, \vec{u}_k\}$ is orthonormal in $\mathbb{R}^n$, and let $S = \operatorname{span}\{\vec{u}_1, \ldots, \vec{u}_k\}$. Then for $\vec{x} \in \mathbb{R}^n$, show that $\vec{x} \in S$ if and only if
\[
\|\vec{x}\|^2 = (\vec{x} \cdot \vec{u}_1)^2 + \cdots + (\vec{x} \cdot \vec{u}_k)^2.
\]

11. (4) Fix $\vec{x} \in \mathbb{R}^n$ and a subspace $S \subseteq \mathbb{R}^n$. Show that the minimum of $\|\vec{x} - \vec{y}\|^2$ over all $\vec{y} \in S$ is achieved uniquely when $\vec{y} = \pi_S(\vec{x})$.

12. (4) Suppose $P : \mathbb{R}^n \to \mathbb{R}^n$ is a projection map which satisfies $\|P(\vec{x})\| \leq \|\vec{x}\|$ for all $\vec{x} \in \mathbb{R}^n$. Show that $P$ is an orthogonal projection.
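As promised after Problem 3, here is a quick numerical spot-check of that identity with random vectors (the `proj` helper repeats the Proposition 1.7 sketch from earlier; the basis is our own example):

```python
# Spot-check: x . y = pi_S(x) . pi_S(y) + pi_{S-perp}(x) . pi_{S-perp}(y).
import numpy as np

def proj(x, U):
    # Proposition 1.7, assuming the columns of U are pairwise orthogonal
    return sum((x @ u) / (u @ u) * u for u in U.T)

U = np.column_stack([[1.0, 1.0, 0.0],
                     [1.0, -1.0, 1.0]])  # orthogonal basis of S

rng = np.random.default_rng(0)
x, y = rng.standard_normal(3), rng.standard_normal(3)

px, py = proj(x, U), proj(y, U)   # pi_S(x), pi_S(y)
qx, qy = x - px, y - py           # pi_{S-perp}, since x = pi_S(x) + pi_{S-perp}(x)

print(np.allclose(x @ y, px @ py + qx @ qy))  # True
```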