Announcements Monday, November 20

Announcements Monday, November 20

You already have your midterms! Course grades will be curved at the end of the semester. The percentage of A's, B's, and C's to be awarded depends on many factors and will not be determined until all grades are in. Individual exam grades are not curved. WeBWorK 6.1, 6.2, 6.3 is due the Wednesday after Thanksgiving. Reading day: the math session is 1-3pm on December 6 in Clough 144 and 152; I'll be there for part of it. My office is Skiles 244. Rabinoffice hours are Monday, 1-3pm and Tuesday, 9-11am.

Section 6.2/6.3 Orthogonal Projections

Best Approximation

Suppose you measure a data point x which you know for theoretical reasons must lie on a subspace W. Due to measurement error, though, the measured x is not actually in W. Best approximation: y is the closest point to x on W. How do you know that y is the closest point? The vector from y to x is orthogonal to W: it is in the orthogonal complement W^⊥.

[figure: x, its best approximation y, and the subspace W]

Orthogonal Decomposition

Recall: if W is a subspace of R^n, its orthogonal complement is
W^⊥ = { v in R^n : v is perpendicular to every vector in W }.

Theorem. Every vector x in R^n can be written as x = x_W + x_{W^⊥} for unique vectors x_W in W and x_{W^⊥} in W^⊥.

The equation x = x_W + x_{W^⊥} is called the orthogonal decomposition of x (with respect to W). The vector x_W is the closest vector to x on W.

[figure: x, x_W, x_{W^⊥}, and W] [interactive 1] [interactive 2]

Orthogonal Decomposition: Justification

Theorem. Every vector x in R^n can be written as x = x_W + x_{W^⊥} for unique vectors x_W in W and x_{W^⊥} in W^⊥.

Why? Uniqueness: suppose x = x_W + x_{W^⊥} = x'_W + x'_{W^⊥} for x_W, x'_W in W and x_{W^⊥}, x'_{W^⊥} in W^⊥. Rewrite: x_W - x'_W = x'_{W^⊥} - x_{W^⊥}. The left side is in W and the right side is in W^⊥, so they are both in W ∩ W^⊥. But the only vector that is perpendicular to itself is the zero vector! Hence

0 = x_W - x'_W, so x_W = x'_W, and 0 = x'_{W^⊥} - x_{W^⊥}, so x_{W^⊥} = x'_{W^⊥}.

Existence: we will compute the orthogonal decomposition later using orthogonal projections.

Orthogonal Decomposition: Example

Let W be the xy-plane in R^3. Then W^⊥ is the z-axis.

x = (1, 2, 3)  =>  x_W = (1, 2, 0),  x_{W^⊥} = (0, 0, 3).
x = (a, b, c)  =>  x_W = (a, b, 0),  x_{W^⊥} = (0, 0, c).

This is just decomposing a vector into a horizontal component (in the xy-plane) and a vertical component (on the z-axis).

[figure: x, x_W, x_{W^⊥}, and W] [interactive]

Orthogonal Decomposition: Computation?

Problem: given x and W, how do you compute the decomposition x = x_W + x_{W^⊥}?

Observation: it is enough to compute x_W, because x_{W^⊥} = x - x_W.

First we need to discuss orthogonal sets.

Definition. A set of nonzero vectors is orthogonal if each pair of vectors in it is orthogonal. It is orthonormal if, in addition, each vector is a unit vector.

Lemma. An orthogonal set of vectors is linearly independent. Hence it is a basis for its span.

Why? Suppose {u_1, u_2, ..., u_m} is orthogonal. We need to show that the equation c_1 u_1 + c_2 u_2 + ... + c_m u_m = 0 has only the trivial solution c_1 = c_2 = ... = c_m = 0. Taking the dot product of both sides with u_1:

0 = u_1 · (c_1 u_1 + c_2 u_2 + ... + c_m u_m) = c_1 (u_1 · u_1) + 0 + 0 + ... + 0.

Since u_1 is nonzero, u_1 · u_1 is nonzero, hence c_1 = 0. Similarly for the other c_i.
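As a quick sanity check (not part of the slides), a few lines of NumPy confirm that a set is orthogonal and hence independent; the vectors below are the ones from the next slide's example.

import numpy as np

# Candidate orthogonal set, one vector per row (the example from the next slide).
U = np.array([[1.0,  1.0,  1.0],
              [1.0, -2.0,  1.0],
              [1.0,  0.0, -1.0]])

# All pairwise dot products at once: U @ U.T is diagonal exactly when the set is orthogonal.
gram = U @ U.T
print(np.allclose(gram, np.diag(np.diag(gram))))   # True: every pair is orthogonal

# Orthogonal nonzero vectors are independent, so the rank equals the number of vectors.
print(np.linalg.matrix_rank(U) == U.shape[0])      # True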

Orthogonal Sets: Examples

Example: B = { (1, 1, 1), (1, -2, 1), (1, 0, -1) } is an orthogonal set. Check:
(1, 1, 1) · (1, -2, 1) = 0,  (1, 1, 1) · (1, 0, -1) = 0,  (1, -2, 1) · (1, 0, -1) = 0.

Example: B = {e_1, e_2, e_3} is an orthogonal set. Check:
e_1 · e_2 = 0,  e_1 · e_3 = 0,  e_2 · e_3 = 0.

Example: Let x = (a, b) be a nonzero vector, and let y = (-b, a). Then {x, y} is an orthogonal set: x · y = -ab + ab = 0.

Orthogonal Projections

Definition. Let W be a subspace of R^n, and let {u_1, u_2, ..., u_m} be an orthogonal basis for W. The orthogonal projection of a vector x onto W is

proj_W(x) := sum_{i=1}^m (x · u_i)/(u_i · u_i) u_i = (x · u_1)/(u_1 · u_1) u_1 + (x · u_2)/(u_2 · u_2) u_2 + ... + (x · u_m)/(u_m · u_m) u_m.

This is a vector in W because it is in Span{u_1, u_2, ..., u_m}.

Theorem. Let W be a subspace of R^n, and let x be a vector in R^n. Then x_W = proj_W(x) and x_{W^⊥} = x - proj_W(x). In particular, proj_W(x) is the closest point to x in W.

Why? Let y = proj_W(x). We need to show that x - y is in W^⊥, in other words, that u_i · (x - y) = 0 for each i. Let's do u_1:

u_1 · (x - y) = u_1 · x - sum_{i=1}^m (x · u_i)/(u_i · u_i) (u_1 · u_i) = u_1 · x - (x · u_1)/(u_1 · u_1) (u_1 · u_1) - 0 - ... - 0 = 0.
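A minimal NumPy sketch of this formula (my own helper, not from the slides): proj_W(x) is the sum of the one-dimensional projections onto the vectors of an orthogonal basis of W.

import numpy as np

def proj(x, basis):
    """Project x onto W = Span(basis), assuming the basis vectors are orthogonal and nonzero."""
    x = np.asarray(x, dtype=float)
    return sum((x @ u) / (u @ u) * u for u in np.asarray(basis, dtype=float))

# The orthogonal decomposition x = x_W + x_{W^perp}, illustrated with the xy-plane example:
x = np.array([1.0, 2.0, 3.0])
basis = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]   # orthogonal basis of the xy-plane
x_W = proj(x, basis)
x_Wperp = x - x_W
print(x_W, x_Wperp)   # [1. 2. 0.] [0. 0. 3.]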

Orthogonal Projection onto a Line

The formula for orthogonal projections is simple when W is a line. Let L = Span{u} be a line in R^n, and let x be in R^n. The orthogonal projection of x onto L is the point

proj_L(x) = (x · u)/(u · u) u.

[figure: x, u, and y = proj_L(x) on L] [interactive]

Example: Compute the orthogonal projection of x = (-6, 4) onto the line L spanned by u = (3, 2).

y = proj_L(x) = (x · u)/(u · u) u = (-18 + 8)/(9 + 4) (3, 2) = -10/13 (3, 2).

[figure: L, (3, 2), and -10/13 (3, 2)]
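The same computation in NumPy, as a quick check of the line example (numbers taken from the slide):

import numpy as np

x = np.array([-6.0, 4.0])
u = np.array([3.0, 2.0])
y = (x @ u) / (u @ u) * u      # proj_L(x) = (x . u)/(u . u) u
print(y)                       # [-2.3077 -1.5385], i.e. -10/13 * (3, 2)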

Orthogonal Projection onto a Plane: Easy Example

What is the projection of x = (1, 2, 3) onto the xy-plane?

Answer: The xy-plane is W = Span{e_1, e_2}, and {e_1, e_2} is an orthogonal basis.

x_W = proj_W((1, 2, 3)) = (x · e_1)/(e_1 · e_1) e_1 + (x · e_2)/(e_2 · e_2) e_2 = 1/1 e_1 + 2/1 e_2 = (1, 2, 0).

So this is the same projection as before.

[figure: x, x_W, x_{W^⊥}, and W] [interactive]

Orthogonal Projections: More Complicated Example

What is the projection of x = (-1.1, 1.4, 1.45) onto W = Span{u_1, u_2}, where u_1 = (1, 0, 0) and u_2 = (0, 1.1, -0.2)?

Answer: The basis is orthogonal, so

x_W = proj_W(x) = (x · u_1)/(u_1 · u_1) u_1 + (x · u_2)/(u_2 · u_2) u_2
    = (-1.1)(1)/1^2 u_1 + [ (1.4)(1.1) + (1.45)(-0.2) ] / [ 1.1^2 + (-0.2)^2 ] u_2
    = -1.1 u_1 + u_2.

This turns out to be equal to u_2 - 1.1 u_1.

[figure: x, x_W, x_{W^⊥}, u_1, u_2, and W] [interactive]
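A numerical check of this example with NumPy, using the numbers as reconstructed above (the signs in the transcription are garbled, so treat them as an assumption):

import numpy as np

x  = np.array([-1.1, 1.4, 1.45])
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.1, -0.2])
x_W = (x @ u1)/(u1 @ u1)*u1 + (x @ u2)/(u2 @ u2)*u2
print(np.allclose(x_W, u2 - 1.1*u1))   # True: the projection equals u2 - 1.1 u1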

Orthogonal Projections: Properties

First we restate the property we've been using all along. Let x be a vector and let x = x_W + x_{W^⊥} be its orthogonal decomposition with respect to a subspace W. The following vectors are the same: x_W, proj_W(x), and the closest vector to x on W.

We can think of orthogonal projection as a transformation proj_W : R^n -> R^n.

Theorem. Let W be a subspace of R^n.
1. proj_W is a linear transformation.
2. For every x in W, we have proj_W(x) = x.
3. For every x in W^⊥, we have proj_W(x) = 0.
4. The range of proj_W is W and the null space of proj_W is W^⊥.

Poll

Let W be a subspace of R^n, and let A be the matrix for proj_W. What is/are the eigenvalue(s) of A?

A. 0   B. 1   C. -1   D. 0, 1   E. 1, -1   F. 0, -1   G. -1, 0, 1

The 1-eigenspace is W and the 0-eigenspace is W^⊥. We have dim W + dim W^⊥ = n, so that gives n linearly independent eigenvectors already. So the answer is D.
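A quick numerical check of the poll answer, using the subspace W = Span{(1, 0, -1), (1, 1, 1)} that appears on the next slide. Building the matrix from outer products u u^T / (u · u) is just a convenient way to assemble the one-dimensional projections; it is not required by the slide.

import numpy as np

u1 = np.array([1.0, 0.0, -1.0])
u2 = np.array([1.0, 1.0, 1.0])                                   # orthogonal basis for W
A = np.outer(u1, u1)/(u1 @ u1) + np.outer(u2, u2)/(u2 @ u2)      # matrix of proj_W
print(np.round(np.linalg.eigvalsh(A), 6))                        # [0. 1. 1.]: eigenvalues 0 and 1, answer D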

Orthogonal Projections: Matrices

What is the matrix for proj_W : R^3 -> R^3, where W = Span{ (1, 0, -1), (1, 1, 1) }?

Answer: Recall how to compute the matrix for a linear transformation:
A = [ proj_W(e_1)  proj_W(e_2)  proj_W(e_3) ].

We compute, with u_1 = (1, 0, -1) and u_2 = (1, 1, 1):

proj_W(e_1) = (e_1 · u_1)/(u_1 · u_1) u_1 + (e_1 · u_2)/(u_2 · u_2) u_2 = 1/2 (1, 0, -1) + 1/3 (1, 1, 1) = (5/6, 1/3, -1/6)
proj_W(e_2) = (e_2 · u_1)/(u_1 · u_1) u_1 + (e_2 · u_2)/(u_2 · u_2) u_2 = 0 + 1/3 (1, 1, 1) = (1/3, 1/3, 1/3)
proj_W(e_3) = (e_3 · u_1)/(u_1 · u_1) u_1 + (e_3 · u_2)/(u_2 · u_2) u_2 = -1/2 (1, 0, -1) + 1/3 (1, 1, 1) = (-1/6, 1/3, 5/6)

Therefore
A = [  5/6  1/3  -1/6 ]
    [  1/3  1/3   1/3 ]
    [ -1/6  1/3   5/6 ].
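A short NumPy sketch mirroring the column-by-column computation on this slide (the proj_W helper is my own, assuming the orthogonal basis above):

import numpy as np

u1 = np.array([1.0, 0.0, -1.0])
u2 = np.array([1.0, 1.0, 1.0])

def proj_W(x):
    # Projection onto W using the orthogonal basis {u1, u2}.
    return (x @ u1)/(u1 @ u1)*u1 + (x @ u2)/(u2 @ u2)*u2

# Columns of A are proj_W applied to the standard basis vectors.
A = np.column_stack([proj_W(e) for e in np.eye(3)])
print(A * 6)   # [[ 5.  2. -1.] [ 2.  2.  2.] [-1.  2.  5.]], i.e. the matrix of sixths above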

Orthogonal Projections: Matrix Facts

Let W be an m-dimensional subspace of R^n, let proj_W : R^n -> W be the projection, and let A be the matrix for proj_W.

Fact 1: A is diagonalizable with eigenvalues 0 and 1; it is similar to the diagonal matrix with m ones and n - m zeros on the diagonal.

Why? Let v_1, v_2, ..., v_m be a basis for W, and let v_{m+1}, v_{m+2}, ..., v_n be a basis for W^⊥. These are (linearly independent) eigenvectors with eigenvalues 1 and 0, respectively, and they form a basis for R^n because there are n of them.

Example: If W is a plane in R^3, then A is similar to the matrix of projection onto the xy-plane:
[ 1 0 0 ]
[ 0 1 0 ]
[ 0 0 0 ].

Fact 2: A^2 = A. Why? Projecting twice is the same as projecting once: proj_W ∘ proj_W = proj_W, so A · A = A.
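Both facts are easy to confirm numerically for the matrix A computed above; this is a check under the same assumptions, not a proof.

import numpy as np

u1, u2 = np.array([1.0, 0.0, -1.0]), np.array([1.0, 1.0, 1.0])
A = np.outer(u1, u1)/(u1 @ u1) + np.outer(u2, u2)/(u2 @ u2)

print(np.allclose(A @ A, A))                        # Fact 2: A^2 = A (projecting twice = projecting once)
print(np.round(np.linalg.eigvalsh(A), 6))           # Fact 1: eigenvalues [0, 1, 1], so A ~ diag(1, 1, 0)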

Coordinates with respect to Orthogonal Bases

Let W be a subspace with orthogonal basis B = {u_1, u_2, ..., u_m}. For x in W we have proj_W(x) = x, so

x = proj_W(x) = sum_{i=1}^m (x · u_i)/(u_i · u_i) u_i = (x · u_1)/(u_1 · u_1) u_1 + (x · u_2)/(u_2 · u_2) u_2 + ... + (x · u_m)/(u_m · u_m) u_m.

This makes it easy to compute the B-coordinates of x.

Corollary. Let W be a subspace with orthogonal basis B = {u_1, u_2, ..., u_m}. Then

[x]_B = ( (x · u_1)/(u_1 · u_1), (x · u_2)/(u_2 · u_2), ..., (x · u_m)/(u_m · u_m) ).

[interactive]

Coordinates with respect to Orthogonal Bases: Example

Problem: Find the B-coordinates of x = (0, 3), where B = { (1, 2), (-4, 2) }.

Old way: row reduce.
[ 1  -4 | 0 ]   rref   [ 1  0 | 6/5  ]
[ 2   2 | 3 ]  ----->  [ 0  1 | 3/10 ],
so [x]_B = (6/5, 3/10).

New way: note that B is an orthogonal basis, so

[x]_B = ( (x · u_1)/(u_1 · u_1), (x · u_2)/(u_2 · u_2) ) = ( (3)(2)/(1^2 + 2^2), (3)(2)/((-4)^2 + 2^2) ) = (6/5, 6/20) = (6/5, 3/10).

[figure: x = 6/5 u_1 + 3/10 u_2] [interactive]
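For an orthogonal basis the coordinates really are just two dot-product quotients; here is a short NumPy check of the "new way" (numbers from the slide):

import numpy as np

x  = np.array([0.0, 3.0])
u1 = np.array([1.0, 2.0])
u2 = np.array([-4.0, 2.0])                            # B = {u1, u2} is orthogonal: u1 @ u2 == 0
coords = np.array([(x @ u1)/(u1 @ u1), (x @ u2)/(u2 @ u2)])
print(coords)                                         # [1.2 0.3], i.e. [x]_B = (6/5, 3/10)
print(np.allclose(coords[0]*u1 + coords[1]*u2, x))    # True: these coordinates reconstruct x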

Orthogonal Projections: Distance to a Subspace

What is the distance from e_1 to W = Span{ (1, 0, -1), (1, 1, 1) }?

Answer: The closest point on W to e_1 is proj_W(e_1) = (5/6, 1/3, -1/6). The distance from e_1 to this point is

dist(e_1, proj_W(e_1)) = || (e_1)_{W^⊥} || = || (1, 0, 0) - (5/6, 1/3, -1/6) || = || (1/6, -1/3, 1/6) || = sqrt( (1/6)^2 + (-1/3)^2 + (1/6)^2 ) = 1/sqrt(6).

[figure: e_1, its projection onto W, and the distance 1/sqrt(6)] [interactive]
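The same distance, verified numerically under the setup above (same orthogonal basis for W):

import numpy as np

u1, u2 = np.array([1.0, 0.0, -1.0]), np.array([1.0, 1.0, 1.0])
e1 = np.array([1.0, 0.0, 0.0])
closest = (e1 @ u1)/(u1 @ u1)*u1 + (e1 @ u2)/(u2 @ u2)*u2   # proj_W(e1), the closest point on W
dist = np.linalg.norm(e1 - closest)                          # length of the perpendicular component
print(dist, 1/np.sqrt(6))                                    # both ~0.4082: the distance is 1/sqrt(6)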

Summary

Let W be a subspace of R^n. Any vector x in R^n can be written in a unique way as x = x_W + x_{W^⊥} for x_W in W and x_{W^⊥} in W^⊥. This is called its orthogonal decomposition.

The vector x_W is the closest point to x in W: it is the best approximation. The distance from x to W is || x_{W^⊥} ||.

If you have an orthogonal basis {u_1, u_2, ..., u_m} for W, then
x_W = proj_W(x) = sum_{i=1}^m (x · u_i)/(u_i · u_i) u_i = (x · u_1)/(u_1 · u_1) u_1 + ... + (x · u_m)/(u_m · u_m) u_m.
Hence x_{W^⊥} = x - proj_W(x).

If you have an orthogonal basis B = {u_1, u_2, ..., u_m} for W, then
[x]_B = ( (x · u_1)/(u_1 · u_1), (x · u_2)/(u_2 · u_2), ..., (x · u_m)/(u_m · u_m) ).

We can think of proj_W : R^n -> R^n as a linear transformation. Its null space is W^⊥, and its range is W. The matrix A for proj_W is diagonalizable with eigenvalues 0 and 1. It is idempotent: A^2 = A.