Announcements Monday, November 19

Announcements Monday, November 19

You should already have the link to view your graded midterm online.
Course grades will be curved at the end of the semester. The percentage of A's, B's, and C's to be awarded depends on many factors and will not be determined until all grades are in. Individual exam grades are not curved.
Send regrade requests by tomorrow.
WeBWorK 6.6, 7.1, 7.2 are due the Wednesday after Thanksgiving.
No more quizzes!
My office is Skiles 244, and Rabinoffice hours are Mondays 12-1pm and Wednesdays 1-3pm (but not this Wednesday).

Section 7.2 Orthogonal Complements

Orthogonal Complements: Definition

Definition. Let W be a subspace of R^n. Its orthogonal complement is
    W^⊥ = { v in R^n : v · w = 0 for all w in W },
read "W perp."

(Notation: W^⊥ is the orthogonal complement; A^T is the transpose.)

Pictures:
- The orthogonal complement of a line in R^2 is the perpendicular line. [interactive]
- The orthogonal complement of a line in R^3 is the perpendicular plane. [interactive]
- The orthogonal complement of a plane in R^3 is the perpendicular line. [interactive]

Poll

Poll. Let W be a 2-plane in R^4. How would you describe W^⊥?
A. The zero space {0}.
B. A line in R^4.
C. A plane in R^4.
D. A 3-dimensional space in R^4.
E. All of R^4.

For example, if W is the xy-plane, then W^⊥ is the zw-plane:
    (x, y, 0, 0) · (0, 0, z, w) = 0.

Orthogonal Complements: Basic Properties

Let W be a subspace of R^n. Facts:
1. W^⊥ is also a subspace of R^n.
2. (W^⊥)^⊥ = W.
3. dim W + dim W^⊥ = n.
4. If W = Span{v_1, v_2, ..., v_m}, then
       W^⊥ = all vectors orthogonal to each of v_1, v_2, ..., v_m
           = { x in R^n : x · v_i = 0 for all i = 1, 2, ..., m }
           = Nul [ v_1^T ; v_2^T ; ... ; v_m^T ]   (the matrix whose rows are v_1^T, ..., v_m^T).

Let's check 1.
Is 0 in W^⊥? Yes: 0 · w = 0 for any w in W.
Suppose x, y are in W^⊥, so x · w = 0 and y · w = 0 for all w in W. Then (x + y) · w = x · w + y · w = 0 + 0 = 0 for all w in W, so x + y is also in W^⊥.
Suppose x is in W^⊥, so x · w = 0 for all w in W. If c is a scalar, then (cx) · w = c(x · w) = c(0) = 0 for any w in W, so cx is in W^⊥.

Orthogonal Complements: Computation

Problem: if W = Span{ (1, 1, -1), (1, 1, 1) }, compute W^⊥.

By property 4, we have to find the null space of the matrix whose rows are (1, 1, -1) and (1, 1, 1), which we did before:

    W^⊥ = Nul [ 1  1  -1 ; 1  1  1 ] = Span{ (-1, 1, 0) }.

[interactive]

Span{v_1, v_2, ..., v_m}^⊥ = Nul [ v_1^T ; v_2^T ; ... ; v_m^T ]
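
A quick numerical sanity check of this computation. This is a minimal sketch, assuming NumPy and SciPy are available; scipy.linalg.null_space returns an orthonormal basis of the null space, one basis vector per column.

    import numpy as np
    from scipy.linalg import null_space

    # Rows of A are the spanning vectors of W, so W-perp = Nul A.
    A = np.array([[1, 1, -1],
                  [1, 1,  1]])
    N = null_space(A)
    print(N)                     # a single column, a unit vector proportional to (-1, 1, 0)

    # Every null space vector is orthogonal to the rows of A,
    # and dim W + dim W-perp = 3.
    print(np.allclose(A @ N, 0))                                 # True
    print(np.linalg.matrix_rank(A) + N.shape[1] == A.shape[1])   # True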

Orthogonal Complements: Row Space, Column Space, Null Space

Definition. The row space of an m x n matrix A is the span of the rows of A. It is denoted Row A. Equivalently, it is the column space of A^T:
    Row A = Col A^T.
It is a subspace of R^n.

We showed before that if A has rows v_1^T, v_2^T, ..., v_m^T, then
    Span{v_1, v_2, ..., v_m}^⊥ = Nul A.
Hence we have shown:

Fact: (Row A)^⊥ = Nul A.

Replacing A by A^T, and remembering Row A^T = Col A:

Fact: (Col A)^⊥ = Nul A^T.

Using property 2 and taking the orthogonal complements of both sides, we get:

Fact: (Nul A)^⊥ = Row A and Col A = (Nul A^T)^⊥.
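
These facts are easy to spot-check numerically. A minimal sketch, assuming NumPy and SciPy; the matrix A below is an arbitrary low-rank test case, not one of the lecture's examples.

    import numpy as np
    from scipy.linalg import null_space

    rng = np.random.default_rng(0)
    B = rng.integers(-3, 4, size=(4, 2))
    C = rng.integers(-3, 4, size=(2, 5))
    A = (B @ C).astype(float)    # a 4 x 5 matrix of rank at most 2

    N = null_space(A)            # basis for Nul A
    M = null_space(A.T)          # basis for Nul A^T

    # (Row A)-perp = Nul A: vectors in Nul A are orthogonal to every row of A.
    print(np.allclose(A @ N, 0))       # True
    # (Col A)-perp = Nul A^T: vectors in Nul A^T are orthogonal to every column of A.
    print(np.allclose(A.T @ M, 0))     # True
    # Dimension bookkeeping: dim Row A + dim Nul A = number of columns.
    print(np.linalg.matrix_rank(A) + N.shape[1] == A.shape[1])   # True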

Orthogonal Complements: Reference Sheet

Orthogonal Complements of Most of the Subspaces We've Seen

For any vectors v_1, v_2, ..., v_m:
    Span{v_1, v_2, ..., v_m}^⊥ = Nul [ v_1^T ; v_2^T ; ... ; v_m^T ]

For any matrix A:
    Row A = Col A^T
    (Row A)^⊥ = Nul A        Row A = (Nul A)^⊥
    (Col A)^⊥ = Nul A^T      Col A = (Nul A^T)^⊥

For any other subspace W, first find a basis v_1, ..., v_m, then use the above trick to compute W^⊥ = Span{v_1, ..., v_m}^⊥.

Section 7.3 Orthogonal Projections

Best Approximation

Suppose you measure a data point x which you know for theoretical reasons must lie on a subspace W. Due to measurement error, though, the measured x is not actually in W.

[figure: the point x, its closest point y on W, and the subspace W]

Best approximation: y is the closest point to x on W.

How do you know that y is the closest point? The vector from y to x is orthogonal to W: it is in the orthogonal complement W^⊥.

Orthogonal Decomposition

Theorem. Every vector x in R^n can be written as
    x = x_W + x_{W^⊥}
for unique vectors x_W in W and x_{W^⊥} in W^⊥.

The equation x = x_W + x_{W^⊥} is called the orthogonal decomposition of x (with respect to W).
The vector x_W is the orthogonal projection of x onto W.
The vector x_W is the closest vector to x on W.

[figure: x decomposed as x_W in W plus x_{W^⊥} perpendicular to W]
[interactive 1] [interactive 2]

Orthogonal Decomposition: Justification

Theorem. Every vector x in R^n can be written as x = x_W + x_{W^⊥} for unique vectors x_W in W and x_{W^⊥} in W^⊥.

Why?

Uniqueness: suppose x = x_W + x_{W^⊥} = x'_W + x'_{W^⊥} for x_W, x'_W in W and x_{W^⊥}, x'_{W^⊥} in W^⊥. Rewrite:
    x_W - x'_W = x'_{W^⊥} - x_{W^⊥}.
The left side is in W, and the right side is in W^⊥, so they are both in W ∩ W^⊥. But the only vector that is perpendicular to itself is the zero vector! Hence
    0 = x_W - x'_W, so x_W = x'_W,   and   0 = x'_{W^⊥} - x_{W^⊥}, so x_{W^⊥} = x'_{W^⊥}.

Existence: We will compute the orthogonal decomposition later using orthogonal projections.

Orthogonal Decomposition: Example

Let W be the xy-plane in R^3. Then W^⊥ is the z-axis.

    x = (2, 1, 3)   =>   x_W = (2, 1, 0),   x_{W^⊥} = (0, 0, 3).
    x = (a, b, c)   =>   x_W = (a, b, 0),   x_{W^⊥} = (0, 0, c).

This is just decomposing a vector into a horizontal component (in the xy-plane) and a vertical component (on the z-axis).

[figure: x = x_W + x_{W^⊥} over the plane W]
[interactive]
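
In code this decomposition is literally just splitting coordinates. A tiny sketch, assuming NumPy; the sample vector is the one from the example above.

    import numpy as np

    x = np.array([2.0, 1.0, 3.0])
    x_W = np.array([x[0], x[1], 0.0])        # component in the xy-plane
    x_Wperp = np.array([0.0, 0.0, x[2]])     # component along the z-axis

    print(np.allclose(x, x_W + x_Wperp))     # True: x = x_W + x_{W-perp}
    print(float(np.dot(x_W, x_Wperp)))       # 0.0: the two pieces are orthogonal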

Orthogonal Decomposition: Computation?

Problem: Given x and W, how do you compute the decomposition x = x_W + x_{W^⊥}?

Observation: It is enough to compute x_W, because x_{W^⊥} = x - x_W.

The A^T A Trick

Theorem (The A^T A Trick). Let W be a subspace of R^n, let v_1, v_2, ..., v_m be a spanning set for W (e.g., a basis), and let
    A = [ v_1  v_2  ...  v_m ]   (the matrix with columns v_1, ..., v_m).
Then for any x in R^n, the matrix equation
    A^T A v = A^T x   (in the unknown vector v)
is consistent, and x_W = Av for any solution v.

Recipe for Computing x = x_W + x_{W^⊥}:
1. Write W as the column space of a matrix A.
2. Find a solution v of A^T A v = A^T x (by row reducing).
3. Then x_W = Av and x_{W^⊥} = x - x_W.
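
The recipe translates almost line for line into NumPy. This is a sketch under the assumption that W is handed to us as Col A; the function name project_onto is chosen here for illustration, not from the course.

    import numpy as np

    def project_onto(A, x):
        """Return (x_W, x_Wperp) for W = Col A, via the normal equations A^T A v = A^T x.

        np.linalg.lstsq returns a least-squares solution of Av = x, which is exactly
        a solution of the (always consistent) normal equations, even when the columns
        of A are linearly dependent.
        """
        v, *_ = np.linalg.lstsq(A, x, rcond=None)
        x_W = A @ v
        return x_W, x - x_W

    # Usage: project (2, 1, 3) onto the xy-plane, written as Col A.
    A = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])
    x = np.array([2.0, 1.0, 3.0])
    x_W, x_Wperp = project_onto(A, x)
    print(x_W, x_Wperp)     # [2. 1. 0.] [0. 0. 3.]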

The A^T A Trick: Example

Problem: Compute the orthogonal projection of a vector x = (x_1, x_2, x_3) in R^3 onto the xy-plane.

First we need a basis for the xy-plane: let's choose e_1 = (1, 0, 0) and e_2 = (0, 1, 0), so
    A = [ 1  0 ; 0  1 ; 0  0 ].
Then
    A^T A = [ 1  0 ; 0  1 ] = I_2   and   A^T x = [ 1  0  0 ; 0  1  0 ] (x_1, x_2, x_3) = (x_1, x_2).

So A^T A v = v and A^T x = (x_1, x_2), and the only solution of A^T A v = A^T x is v = (x_1, x_2). Therefore,
    x_W = Av = [ 1  0 ; 0  1 ; 0  0 ] (x_1, x_2) = (x_1, x_2, 0).

The A^T A Trick: Another Example

Problem: Let
    x = (1, 2, 3),   W = { (x_1, x_2, x_3) in R^3 : x_1 - x_2 + x_3 = 0 }.
Compute the distance from x to W.

The distance from x to W is ||x_{W^⊥}||, so we need to compute the orthogonal projection. First we need a basis for W = Nul ( 1  -1  1 ). This matrix is in RREF, so the parametric form of the solution set is
    x_1 = x_2 - x_3,   x_2 = x_2,   x_3 = x_3,
and the parametric vector form is
    (x_1, x_2, x_3) = x_2 (1, 1, 0) + x_3 (-1, 0, 1).
Hence we can take the basis (1, 1, 0), (-1, 0, 1), so
    A = [ 1  -1 ; 1  0 ; 0  1 ].

The A^T A Trick: Another Example, Continued

Problem: Let
    x = (1, 2, 3),   W = { (x_1, x_2, x_3) in R^3 : x_1 - x_2 + x_3 = 0 }.
Compute the distance from x to W.

We compute
    A^T A = [ 2  -1 ; -1  2 ]   and   A^T x = (3, 2).
To solve A^T A v = A^T x we form an augmented matrix and row reduce:
    [ 2  -1 | 3 ; -1  2 | 2 ]  --RREF-->  [ 1  0 | 8/3 ; 0  1 | 7/3 ],
so v = (1/3)(8, 7). Then
    x_W = Av = (1/3)(1, 8, 7)   and   x_{W^⊥} = x - x_W = (1/3)(2, -2, 2).
The distance is
    ||x_{W^⊥}|| = (1/3) sqrt(4 + 4 + 4) ≈ 1.155.

[interactive]
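
The same numbers can be reproduced numerically. A sketch assuming NumPy; the arrays below just encode A and x from this example.

    import numpy as np

    A = np.array([[1.0, -1.0],
                  [1.0,  0.0],
                  [0.0,  1.0]])    # columns: the basis (1,1,0), (-1,0,1) of W
    x = np.array([1.0, 2.0, 3.0])

    v = np.linalg.solve(A.T @ A, A.T @ x)   # [8/3, 7/3]
    x_W = A @ v                             # [1/3, 8/3, 7/3]
    x_Wperp = x - x_W                       # [2/3, -2/3, 2/3]

    print(v, x_W, x_Wperp)
    print(np.linalg.norm(x_Wperp))          # 1.1547..., the distance from x to W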

The A^T A Trick: Proof

Theorem (The A^T A Trick). Let W be a subspace of R^n, let v_1, v_2, ..., v_m be a spanning set for W (e.g., a basis), and let A = [ v_1  v_2  ...  v_m ]. Then for any x in R^n, the matrix equation A^T A v = A^T x (in the unknown vector v) is consistent, and x_W = Av for any solution v.

Proof: Let x = x_W + x_{W^⊥}. Then x_{W^⊥} is in W^⊥ = (Col A)^⊥ = Nul(A^T), so A^T x_{W^⊥} = 0. Hence
    A^T x = A^T (x_W + x_{W^⊥}) = A^T x_W + A^T x_{W^⊥} = A^T x_W.
Since x_W is in W = Span{v_1, v_2, ..., v_m}, we can write
    x_W = c_1 v_1 + c_2 v_2 + ... + c_m v_m.
If v = (c_1, c_2, ..., c_m), then Av = x_W, so
    A^T x = A^T x_W = A^T A v.

Orthogonal Projection onto a Line

Problem: Let L = Span{u} be a line in R^n and let x be a vector in R^n. Compute x_L.

We have to solve u^T u v = u^T x, where u is regarded as an n x 1 matrix. But u^T u = u · u and u^T x = u · x are scalars, so
    v = (u · x) / (u · u),   hence   x_L = uv = ((u · x) / (u · u)) u.

Projection onto a Line:
    x_L = ((u · x) / (u · u)) u   and   x_{L^⊥} = x - x_L.

[figure: x, the line L = Span{u}, and the projection x_L = ((u · x)/(u · u)) u]
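
Since u is a single column, no matrix solve is needed at all. A minimal sketch assuming NumPy; proj_line is a name chosen here for illustration.

    import numpy as np

    def proj_line(u, x):
        """Orthogonal projection of x onto the line L = Span{u} in R^n."""
        return (np.dot(u, x) / np.dot(u, u)) * u

    # Usage: projecting onto the z-axis just keeps the z-coordinate.
    print(proj_line(np.array([0.0, 0.0, 1.0]), np.array([2.0, 1.0, 3.0])))   # [0. 0. 3.]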

Orthogonal Projection onto a Line: Example

Problem: Compute the orthogonal projection of x = (-6, 4) onto the line L spanned by u = (3, 2), and find the distance from x to L.

    x_L = ((x · u) / (u · u)) u = ((-18 + 8) / (9 + 4)) (3, 2) = -(10/13) (3, 2),
    x_{L^⊥} = x - x_L = (1/13)(-48, 72).

The distance from x to L is
    ||x_{L^⊥}|| = (1/13) sqrt(48^2 + 72^2) ≈ 6.656.

[figure: x = (-6, 4), u = (3, 2), the line L, and x_L = -(10/13)(3, 2)]
[interactive]
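
Plugging the example into the formula numerically (a self-contained sketch assuming NumPy):

    import numpy as np

    u = np.array([3.0, 2.0])
    x = np.array([-6.0, 4.0])

    x_L = (np.dot(u, x) / np.dot(u, u)) * u   # -(10/13) * (3, 2)
    x_Lperp = x - x_L                          # (1/13) * (-48, 72)

    print(x_L, x_Lperp)
    print(np.linalg.norm(x_Lperp))             # 6.6564..., the distance from x to L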

Summary

Let W be a subspace of R^n.
- The orthogonal complement W^⊥ is the set of all vectors orthogonal to everything in W.
- We have (W^⊥)^⊥ = W and dim W + dim W^⊥ = n.
- Row A = Col A^T, (Row A)^⊥ = Nul A, Row A = (Nul A)^⊥, (Col A)^⊥ = Nul A^T, Col A = (Nul A^T)^⊥.
- Orthogonal decomposition: any vector x in R^n can be written in a unique way as x = x_W + x_{W^⊥} for x_W in W and x_{W^⊥} in W^⊥.
- The vector x_W is the orthogonal projection of x onto W. It is the closest point to x in W: it is the best approximation.
- The distance from x to W is ||x_{W^⊥}||.
- If W = Col A, then to compute x_W, solve the equation A^T A v = A^T x; then x_W = Av.
- If W = L = Span{u} is a line, then x_L = ((u · x) / (u · u)) u.