Announcements Monday, November 26


Announcements Monday, November 26. Please fill out your CIOS survey! WeBWorK 6.6, 7.1, 7.2 are due on Wednesday. No quiz on Friday! But this is the only recitation on chapter 7. My office is Skiles 244 and Rabinoffice hours are: Mondays, 12-1pm; Wednesdays, 1-3pm.

Orthogonal Projections Review. Recall: Let W be a subspace of R^n. The orthogonal complement W^⊥ is the set of vectors orthogonal to everything in W. The orthogonal decomposition of a vector x with respect to W is the unique way of writing x = x_W + x_{W⊥} for x_W in W and x_{W⊥} in W^⊥. The vector x_W is the orthogonal projection of x onto W; it is the closest vector to x in W. To compute x_W, write W as Col A and solve A^T A v = A^T x; then x_W = Av. [Figure: x decomposed as x_W in W plus x_{W⊥} in W^⊥.]
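Here is a minimal numeric sketch of this recipe in Python with numpy (the matrix A and vector x are invented for illustration, not from the slides):

    import numpy as np

    # Sketch: project x onto W = Col(A) by solving the normal equations
    # A^T A v = A^T x, then x_W = A v. A and x are example values.
    A = np.array([[1., -1.],
                  [1.,  0.],
                  [0.,  1.]])   # columns span the plane x1 - x2 + x3 = 0
    x = np.array([1., 2., 3.])
    v = np.linalg.solve(A.T @ A, A.T @ x)  # normal equations
    x_W = A @ v                            # orthogonal projection onto W
    x_perp = x - x_W                       # component in W-perp
    print(x_W @ x_perp)                    # approximately 0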

Projection as a Transformation. Change in Perspective: let us consider orthogonal projection as a transformation. Definition: Let W be a subspace of R^n. Define a transformation T : R^n → R^n by T(x) = x_W. This transformation is also called orthogonal projection with respect to W. Theorem: Let W be a subspace of R^n and let T : R^n → R^n be the orthogonal projection with respect to W. Then: 1. T is a linear transformation. 2. For every x in R^n, T(x) is the closest vector to x in W. 3. For every x in W, we have T(x) = x. 4. For every x in W^⊥, we have T(x) = 0. 5. T ∘ T = T. 6. The range of T is W and the null space of T is W^⊥.

Projection Matrix: Method 1. Let W be a subspace of R^n and let T : R^n → R^n be the orthogonal projection with respect to W. Since T is a linear transformation, it has a matrix. How do you compute it? The same as any other linear transformation: compute T(e_1), T(e_2), ..., T(e_n).
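Method 1 in code (a sketch, using the plane example worked out on the next slides, where W = Col A): the i-th column of the matrix is T(e_i), found by solving A^T A v = A^T e_i.

    import numpy as np

    # Sketch of Method 1: build the projection matrix column by column.
    A = np.array([[1., -1.], [1., 0.], [0., 1.]])  # W = Col(A)
    n = A.shape[0]
    cols = []
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0
        v = np.linalg.solve(A.T @ A, A.T @ e)  # A^T A v = A^T e_i
        cols.append(A @ v)                     # T(e_i) = A v
    B = np.column_stack(cols)
    print(np.round(3 * B))                     # [[2,1,-1],[1,2,1],[-1,1,2]]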

Projection Matrix: Example. Problem: Let L = Span{(3, 2)} and let T : R^2 → R^2 be the orthogonal projection onto L. Compute the matrix A for T. It's easy to compute orthogonal projection onto a line. With u = (3, 2): T(e_1) = (e_1)_L = ((u · e_1)/(u · u)) u = (3/13)(3, 2) and T(e_2) = (e_2)_L = ((u · e_2)/(u · u)) u = (2/13)(3, 2). Hence A = (1/13) [9 6; 6 4] (rows separated by semicolons).
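A quick numeric check of this example (a sketch, not part of the slides):

    import numpy as np

    # Project each standard basis vector onto L = Span{u}
    # via ((u . e_i)/(u . u)) u, then assemble the columns.
    u = np.array([3., 2.])
    Te1 = (u @ np.array([1., 0.])) / (u @ u) * u   # (3/13)(3, 2)
    Te2 = (u @ np.array([0., 1.])) / (u @ u) * u   # (2/13)(3, 2)
    A = np.column_stack([Te1, Te2])
    print(13 * A)                                  # [[9, 6], [6, 4]]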

Projection Matrix: Another Example. Problem: Let W = {(x_1, x_2, x_3) in R^3 : x_1 - x_2 + x_3 = 0} and let T : R^3 → R^3 be orthogonal projection onto W. Compute the matrix B for T. In the slides for the last lecture we computed W = Col A for A = [1 -1; 1 0; 0 1]. To compute T(e_i) we have to solve the matrix equation A^T A v = A^T e_i. We have A^T A = [2 -1; -1 2] and A^T e_i = the i-th column of A^T = [1 1 0; -1 0 1].

Projection Matrix: Another Example, Continued. Problem: Let W = {(x_1, x_2, x_3) in R^3 : x_1 - x_2 + x_3 = 0} and let T : R^3 → R^3 be orthogonal projection onto W. Compute the matrix B for T. Row reduce each augmented system A^T A v = A^T e_i: [2 -1 | 1; -1 2 | -1] has RREF [1 0 | 1/3; 0 1 | -1/3], so T(e_1) = A(1/3, -1/3) = (1/3)(2, 1, -1). [2 -1 | 1; -1 2 | 0] has RREF [1 0 | 2/3; 0 1 | 1/3], so T(e_2) = A(2/3, 1/3) = (1/3)(1, 2, 1). [2 -1 | 0; -1 2 | 1] has RREF [1 0 | 1/3; 0 1 | 2/3], so T(e_3) = A(1/3, 2/3) = (1/3)(-1, 1, 2). Hence B = (1/3) [2 1 -1; 1 2 1; -1 1 2].
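A sanity check on the result (a sketch): B should fix vectors in W and send the normal vector to zero.

    import numpy as np

    B = np.array([[2., 1., -1.], [1., 2., 1.], [-1., 1., 2.]]) / 3.
    print(B @ np.array([1., 1., 0.]))   # in W: comes back unchanged
    print(B @ np.array([1., -1., 1.]))  # normal to W: sent to (0, 0, 0)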

Projection Matrix: Method 2. Theorem: Let {v_1, v_2, ..., v_m} be a linearly independent set in R^n, and let A = [v_1 v_2 ··· v_m]. Then the m × m matrix A^T A is invertible. Proof: We'll show Nul(A^T A) = {0}. Suppose A^T A v = 0. Then Av is in Nul(A^T) = Col(A)^⊥. But Av is in Col(A) as well, so Av = 0, and hence v = 0 because the columns of A are linearly independent.

Projection Matrix: Method 2, Continued. Theorem: Let {v_1, v_2, ..., v_m} be a linearly independent set in R^n, and let A = [v_1 v_2 ··· v_m]. Then the m × m matrix A^T A is invertible. Let W be a subspace of R^n and let T : R^n → R^n be the orthogonal projection with respect to W. Let {v_1, v_2, ..., v_m} be a basis for W and let A be the matrix with columns v_1, v_2, ..., v_m. To compute T(x) = x_W you solve A^T A v = A^T x; then x_W = Av. Since A^T A is invertible, v = (A^T A)^{-1} A^T x, so T(x) = Av = [A (A^T A)^{-1} A^T] x. If the columns of A are a basis for W, then the matrix for T is A (A^T A)^{-1} A^T.
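Method 2 as a one-liner (a sketch; the function name projection_matrix is made up for the example):

    import numpy as np

    # P = A (A^T A)^{-1} A^T, valid when the columns of A are a basis for W.
    def projection_matrix(A):
        return A @ np.linalg.inv(A.T @ A) @ A.T

    A = np.array([[1., -1.], [1., 0.], [0., 1.]])  # basis of the plane example
    print(np.round(3 * projection_matrix(A)))      # [[2,1,-1],[1,2,1],[-1,1,2]]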

Projection Matrix: Example. Problem: Let L = Span{(3, 2)} and let T : R^2 → R^2 be the orthogonal projection onto L. Compute the matrix A for T. The set {(3, 2)} is a basis for L, so with u = (3, 2): A = u (u^T u)^{-1} u^T = (1/(u · u)) u u^T = (1/13) (3; 2)(3 2) = (1/13) [9 6; 6 4]. Matrix of Projection onto a Line: If L = Span{u} is a line in R^n, then the matrix for projection onto L is (1/(u · u)) u u^T.
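The boxed formula in code (a sketch):

    import numpy as np

    # Matrix of projection onto the line Span{u}: (1/(u.u)) u u^T.
    u = np.array([3., 2.])
    A = np.outer(u, u) / (u @ u)
    print(13 * A)                  # [[9., 6.], [6., 4.]]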

Projection Matrix: Another Example. Problem: Let W = {(x_1, x_2, x_3) in R^3 : x_1 - x_2 + x_3 = 0} and let T : R^3 → R^3 be orthogonal projection onto W. Compute the matrix B for T. In the slides for the last lecture we computed W = Col A for A = [1 -1; 1 0; 0 1]. The columns are linearly independent, so they form a basis for W. Hence B = A (A^T A)^{-1} A^T = A [2 -1; -1 2]^{-1} A^T = A ((1/3) [2 1; 1 2]) A^T = (1/3) [2 1 -1; 1 2 1; -1 1 2].

Poll: Let W be a subspace of R^n and let A be the matrix for proj_W. What is/are the eigenvalue(s) of A? A. 0. B. 1. C. -1. D. 0, 1. E. -1, 1. F. 0, -1. G. -1, 0, 1. The 1-eigenspace is W. The 0-eigenspace is W^⊥. We have dim W + dim W^⊥ = n, so that gives n linearly independent eigenvectors already. So the answer is D.

Projection Matrix Facts. Theorem: Let W be an m-dimensional subspace of R^n, let T : R^n → W be the projection, and let A be the matrix for T. Then: 1. Col A = W, which is the 1-eigenspace. 2. Nul A = W^⊥, which is the 0-eigenspace. 3. A^2 = A. 4. A is similar to the diagonal matrix with m ones and n - m zeros on the diagonal. Proof of 4: Let v_1, v_2, ..., v_m be a basis for W, and let v_{m+1}, v_{m+2}, ..., v_n be a basis for W^⊥. These are (linearly independent) eigenvectors with eigenvalues 1 and 0, respectively, and they form a basis for R^n because there are n of them. Example: If W is a plane in R^3, then A is similar to the matrix of projection onto the xy-plane: [1 0 0; 0 1 0; 0 0 0].
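A numeric check of these facts for the plane example (a sketch):

    import numpy as np

    B = np.array([[2., 1., -1.], [1., 2., 1.], [-1., 1., 2.]]) / 3.
    print(np.allclose(B @ B, B))               # fact 3: A^2 = A
    print(np.round(np.linalg.eigvalsh(B), 6))  # eigenvalues 0, 1, 1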

A Projection Matrix is Diagonalizable. Let W be an m-dimensional subspace of R^n, let T : R^n → R^n be the orthogonal projection onto W, and let A be the matrix for T. Here's how to diagonalize A: Find a basis {v_1, v_2, ..., v_m} for W. Find a basis {v_{m+1}, v_{m+2}, ..., v_n} for W^⊥. Then A = C D C^{-1}, where C = [v_1 v_2 ··· v_n] and D is the diagonal matrix with m ones followed by n - m zeros on the diagonal. Remark: If you already have a basis for W, then it's faster to compute A (A^T A)^{-1} A^T.

A Projection Matrix is Diagonalizable: Example. Problem: Let W = {(x_1, x_2, x_3) in R^3 : x_1 - x_2 + x_3 = 0} and let T : R^3 → R^3 be orthogonal projection onto W. Compute the matrix B for T. As we have seen several times, a basis for W is {(1, 1, 0), (-1, 0, 1)}. By definition, W is the orthogonal complement of the line spanned by (1, -1, 1), so W^⊥ = Span{(1, -1, 1)}. Hence B = [1 -1 1; 1 0 -1; 0 1 1] [1 0 0; 0 1 0; 0 0 0] [1 -1 1; 1 0 -1; 0 1 1]^{-1} = (1/3) [2 1 -1; 1 2 1; -1 1 2].
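The same computation as a sketch in code: C holds a basis of W and then of W^⊥ as columns, D has matching ones and zeros.

    import numpy as np

    C = np.array([[1., -1., 1.],
                  [1.,  0., -1.],
                  [0.,  1., 1.]])   # columns: (1,1,0), (-1,0,1), (1,-1,1)
    D = np.diag([1., 1., 0.])       # ones for W, zero for W-perp
    B = C @ D @ np.linalg.inv(C)
    print(np.round(3 * B))          # [[2,1,-1],[1,2,1],[-1,1,2]]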

Reflections. Let W be a subspace of R^n and let x be a vector in R^n. Definition: The reflection of x over W is the vector ref_W(x) = x - 2x_{W⊥}. In other words, to find ref_W(x) one starts at x, then moves to x - x_{W⊥} = x_W, then continues in the same direction one more time, to end on the opposite side of W. Since x_{W⊥} = x - x_W, we have ref_W(x) = x - 2(x - x_W) = 2x_W - x. If T is the orthogonal projection, then ref_W(x) = 2T(x) - x. [Figure: x, its projection x_W on W, and ref_W(x) on the opposite side of W.]
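A numeric sketch of this formula (B is the projection matrix for the plane example; x is a made-up vector):

    import numpy as np

    # Reflect x over W using ref_W(x) = 2*proj_W(x) - x.
    B = np.array([[2., 1., -1.], [1., 2., 1.], [-1., 1., 2.]]) / 3.
    x = np.array([1., 2., 3.])
    ref_x = 2 * (B @ x) - x
    print(ref_x)                                     # (-1/3, 10/3, 5/3)
    print(np.linalg.norm(ref_x), np.linalg.norm(x))  # reflection preserves length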

Reflections: Properties. Theorem: Let W be an m-dimensional subspace of R^n, and let A be the matrix for ref_W. Then: 1. ref_W ∘ ref_W is the identity transformation and A^2 is the identity matrix. 2. ref_W and A are invertible; they are their own inverses. 3. The 1-eigenspace of A is W and the (-1)-eigenspace of A is W^⊥. 4. A is similar to the diagonal matrix with m ones and n - m negative ones on the diagonal. 5. If B is the matrix for the orthogonal projection onto W, then A = 2B - I_n. Example: Let W = {(x_1, x_2, x_3) in R^3 : x_1 - x_2 + x_3 = 0}. The matrix for ref_W is A = 2 · (1/3) [2 1 -1; 1 2 1; -1 1 2] - I_3 = (1/3) [1 2 -2; 2 1 2; -2 2 1].
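A sketch verifying these properties for the example:

    import numpy as np

    # Build the reflection matrix as A = 2B - I and check the theorem.
    B = np.array([[2., 1., -1.], [1., 2., 1.], [-1., 1., 2.]]) / 3.
    A = 2 * B - np.eye(3)
    print(np.round(3 * A))                     # [[1,2,-2],[2,1,2],[-2,2,1]]
    print(np.allclose(A @ A, np.eye(3)))       # A^2 = I
    print(np.round(np.linalg.eigvalsh(A), 6))  # eigenvalues -1, 1, 1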

Summary. Today we considered orthogonal projection as a transformation. Orthogonal projection is a linear transformation. We gave three methods to compute its matrix (four if you count the special case when W is a line). The matrix for projection onto W has eigenvalues 1 and 0, with eigenspaces W and W^⊥. A projection matrix is diagonalizable. Reflection is twice the projection minus the identity.