Announcements Monday, November 26


Announcements Monday, November 26 Please fill out your CIOS survey! WeBWorK 6.6, 7.1, 7.2 are due on Wednesday. No quiz on Friday! But this is the only recitation on chapter 7. My office is Skiles 244 and Rabinoffice hours are: Mondays, 12-1pm; Wednesdays, 1-3pm. (But not this Wednesday.)

Orthogonal Projections Review. Recall: Let $W$ be a subspace of $\mathbf{R}^n$. The orthogonal complement $W^\perp$ is the set of vectors orthogonal to everything in $W$. The orthogonal decomposition of a vector $x$ with respect to $W$ is the unique way of writing $x = x_W + x_{W^\perp}$ for $x_W$ in $W$ and $x_{W^\perp}$ in $W^\perp$. The vector $x_W$ is the orthogonal projection of $x$ onto $W$. It is the closest vector to $x$ in $W$. To compute $x_W$, write $W$ as $\operatorname{Col} A$ and solve $A^T A v = A^T x$; then $x_W = Av$. [Figure: $x$ decomposed as $x_W$ in $W$ plus $x_{W^\perp}$ perpendicular to $W$.]
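To make the recipe concrete, here is a minimal NumPy sketch (the particular $A$ and $x$ are illustrative choices; the columns of $A$ happen to span the plane used in the examples below):

```python
import numpy as np

# W = Col(A) for a matrix A whose columns span W (illustrative choice).
A = np.array([[1.0, -1.0],
              [1.0,  0.0],
              [0.0,  1.0]])   # columns span the plane x1 - x2 + x3 = 0
x = np.array([1.0, 2.0, 3.0])

# Solve the normal equations A^T A v = A^T x, then x_W = A v.
v = np.linalg.solve(A.T @ A, A.T @ x)
x_W = A @ v
x_Wperp = x - x_W

print(x_W)            # orthogonal projection of x onto W
print(A.T @ x_Wperp)  # ~0: x - x_W is orthogonal to every column of A
```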

Projection as a Transformation. Change in perspective: let us consider orthogonal projection as a transformation. Definition: Let $W$ be a subspace of $\mathbf{R}^n$. Define a transformation $T\colon \mathbf{R}^n \to \mathbf{R}^n$ by $T(x) = x_W$. This transformation is also called orthogonal projection with respect to $W$. Theorem: Let $W$ be a subspace of $\mathbf{R}^n$ and let $T\colon \mathbf{R}^n \to \mathbf{R}^n$ be the orthogonal projection with respect to $W$. Then:
1. $T$ is a linear transformation.
2. For every $x$ in $\mathbf{R}^n$, $T(x)$ is the closest vector to $x$ in $W$.
3. For every $x$ in $W$, we have $T(x) = x$.
4. For every $x$ in $W^\perp$, we have $T(x) = 0$.
5. $T \circ T = T$.
6. The range of $T$ is $W$ and the null space of $T$ is $W^\perp$.

Method 1. Let $W$ be a subspace of $\mathbf{R}^n$ and let $T\colon \mathbf{R}^n \to \mathbf{R}^n$ be the orthogonal projection with respect to $W$. Since $T$ is a linear transformation, it has a matrix. How do you compute it? The same as for any other linear transformation: compute $T(e_1), T(e_2), \dots, T(e_n)$; these are the columns of the matrix.

Example. Problem: Let $L = \operatorname{Span}\left\{\begin{pmatrix} 3 \\ 2 \end{pmatrix}\right\}$ and let $T\colon \mathbf{R}^2 \to \mathbf{R}^2$ be the orthogonal projection onto $L$. Compute the matrix $A$ for $T$.
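A sketch of Method 1 for this problem in NumPy (the per-vector formula $\operatorname{proj}_L(x) = \frac{u \cdot x}{u \cdot u}\,u$ used here is the vector form of the $\frac{1}{u \cdot u}\,uu^T$ matrix stated later in these slides):

```python
import numpy as np

u = np.array([3.0, 2.0])  # direction vector for L = Span{(3, 2)}

def proj_L(x):
    """Orthogonal projection of x onto the line spanned by u."""
    return (u @ x) / (u @ u) * u

# Method 1: the columns of the matrix for T are T(e1), T(e2).
e1, e2 = np.eye(2)
A = np.column_stack([proj_L(e1), proj_L(e2)])
print(13 * A)  # A = (1/13) * [[9, 6], [6, 4]]
```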

Another Example. Problem: Let $W = \left\{\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} \text{ in } \mathbf{R}^3 : x_1 - x_2 + x_3 = 0\right\}$ and let $T\colon \mathbf{R}^3 \to \mathbf{R}^3$ be orthogonal projection onto $W$. Compute the matrix $B$ for $T$.

Method 2. Theorem: Let $\{v_1, v_2, \dots, v_m\}$ be a linearly independent set in $\mathbf{R}^n$, and let $A$ be the matrix with columns $v_1, v_2, \dots, v_m$. Then the $m \times m$ matrix $A^T A$ is invertible. Proof: If $A^T A v = 0$, then $\|Av\|^2 = v^T A^T A v = 0$, so $Av = 0$; since the columns of $A$ are linearly independent, this forces $v = 0$. Hence $A^T A$ has trivial null space, so it is invertible.
Now let $W$ be a subspace of $\mathbf{R}^n$ and let $T\colon \mathbf{R}^n \to \mathbf{R}^n$ be the orthogonal projection with respect to $W$. Let $\{v_1, v_2, \dots, v_m\}$ be a basis for $W$ and let $A$ be the matrix with columns $v_1, v_2, \dots, v_m$. To compute $T(x) = x_W$, you solve $A^T A v = A^T x$; then $x_W = Av$. Thus $v = (A^T A)^{-1} A^T x$, so $T(x) = Av = \left[A (A^T A)^{-1} A^T\right] x$. If the columns of $A$ are a basis for $W$, then the matrix for $T$ is $A (A^T A)^{-1} A^T$.

Example. Problem: Let $L = \operatorname{Span}\left\{\begin{pmatrix} 3 \\ 2 \end{pmatrix}\right\}$ and let $T\colon \mathbf{R}^2 \to \mathbf{R}^2$ be the orthogonal projection onto $L$. Compute the matrix $A$ for $T$. Matrix of Projection onto a Line: If $L = \operatorname{Span}\{u\}$ is a line in $\mathbf{R}^n$, then the matrix for projection onto $L$ is $\frac{1}{u \cdot u}\,uu^T$.

Another Example. Problem: Let $W = \left\{\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} \text{ in } \mathbf{R}^3 : x_1 - x_2 + x_3 = 0\right\}$ and let $T\colon \mathbf{R}^3 \to \mathbf{R}^3$ be orthogonal projection onto $W$. Compute the matrix $B$ for $T$.
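A sketch of Method 2 for this plane, assuming the basis $\{(1,1,0),\,(-1,0,1)\}$ for $W$ (any basis for $W$ gives the same projection matrix):

```python
import numpy as np

# Columns form one possible basis for W = {x : x1 - x2 + x3 = 0}.
A = np.array([[1.0, -1.0],
              [1.0,  0.0],
              [0.0,  1.0]])

# Method 2: the projection matrix is A (A^T A)^{-1} A^T.
B = A @ np.linalg.inv(A.T @ A) @ A.T
print(np.round(3 * B))  # B = (1/3) * [[2, 1, -1], [1, 2, 1], [-1, 1, 2]]
```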

Poll

Facts. Theorem: Let $W$ be an $m$-dimensional subspace of $\mathbf{R}^n$, let $T\colon \mathbf{R}^n \to \mathbf{R}^n$ be the orthogonal projection onto $W$, and let $A$ be the matrix for $T$. Then:
1. $\operatorname{Col} A = W$, which is the 1-eigenspace.
2. $\operatorname{Nul} A = W^\perp$, which is the 0-eigenspace.
3. $A^2 = A$.
4. $A$ is similar to the diagonal matrix with $m$ ones and $n - m$ zeros on the diagonal.
Example: If $W$ is a plane in $\mathbf{R}^3$, then $A$ is similar to the matrix for projection onto the $xy$-plane: $\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}$.
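These facts are easy to spot-check numerically; a short sketch using the matrix $B$ from the Method 2 sketch above:

```python
import numpy as np

B = np.array([[ 2.0, 1.0, -1.0],
              [ 1.0, 2.0,  1.0],
              [-1.0, 1.0,  2.0]]) / 3.0  # projection onto x1 - x2 + x3 = 0

print(np.allclose(B @ B, B))              # True: B^2 = B
print(np.round(np.linalg.eigvals(B), 6))  # eigenvalues 1, 1, 0 (in some order)
```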

Method 3: Computing the Projection Matrix by Diagonalizing. Let $W$ be an $m$-dimensional subspace of $\mathbf{R}^n$, let $T\colon \mathbf{R}^n \to \mathbf{R}^n$ be the orthogonal projection onto $W$, and let $A$ be the matrix for $T$. Here's how to compute $A$: Find a basis $\{v_1, v_2, \dots, v_m\}$ for $W$. Find a basis $\{v_{m+1}, v_{m+2}, \dots, v_n\}$ for $W^\perp$. Then
$$A = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix}\, \operatorname{diag}(\underbrace{1, \dots, 1}_{m}, \underbrace{0, \dots, 0}_{n-m})\, \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix}^{-1},$$
where the diagonal matrix has $m$ ones followed by $n - m$ zeros on the diagonal. Remark: If you already have a basis for $W$, then it's faster to compute $A (A^T A)^{-1} A^T$ (Method 2).

Example. Problem: Let $W = \left\{\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} \text{ in } \mathbf{R}^3 : x_1 - x_2 + x_3 = 0\right\}$ and let $T\colon \mathbf{R}^3 \to \mathbf{R}^3$ be orthogonal projection onto $W$. Compute the matrix $B$ for $T$.
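A sketch of Method 3 for this plane, assuming the basis $\{(1,1,0),\,(-1,0,1)\}$ for $W$ and the normal vector $(1,-1,1)$ as a basis for $W^\perp$:

```python
import numpy as np

# Columns: a basis for W followed by a basis for W^perp (the plane's normal).
C = np.array([[1.0, -1.0,  1.0],
              [1.0,  0.0, -1.0],
              [0.0,  1.0,  1.0]])
D = np.diag([1.0, 1.0, 0.0])  # m = 2 ones, n - m = 1 zero

B = C @ D @ np.linalg.inv(C)
print(np.round(3 * B))        # same B = (1/3) * [[2, 1, -1], [1, 2, 1], [-1, 1, 2]]
```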

Reflections. Let $W$ be a subspace of $\mathbf{R}^n$ and let $x$ be a vector in $\mathbf{R}^n$. Definition: The reflection of $x$ over $W$ is the vector $\operatorname{ref}_W(x) = x - 2x_{W^\perp}$. In other words, to find $\operatorname{ref}_W(x)$ one starts at $x$, then moves to $x - x_{W^\perp} = x_W$, then continues in the same direction one more time, to end on the opposite side of $W$. [Figure: $x$, its projection $x_W$ on $W$, and $\operatorname{ref}_W(x)$ on the opposite side of $W$.] Since $x_{W^\perp} = x - x_W$, we have $\operatorname{ref}_W(x) = x - 2(x - x_W) = 2x_W - x$. If $T$ is the orthogonal projection, then $\operatorname{ref}_W(x) = 2T(x) - x$.

Reflections: Properties. Theorem: Let $W$ be an $m$-dimensional subspace of $\mathbf{R}^n$, and let $A$ be the matrix for $\operatorname{ref}_W$. Then:
1. $\operatorname{ref}_W \circ \operatorname{ref}_W$ is the identity transformation and $A^2$ is the identity matrix.
2. $\operatorname{ref}_W$ and $A$ are invertible; they are their own inverses.
3. The 1-eigenspace of $A$ is $W$ and the $(-1)$-eigenspace of $A$ is $W^\perp$.
4. $A$ is similar to the diagonal matrix with $m$ ones and $n - m$ negative ones on the diagonal.
5. If $B$ is the matrix for the orthogonal projection onto $W$, then $A = 2B - I_n$.
Example: Let $W = \left\{\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} \text{ in } \mathbf{R}^3 : x_1 - x_2 + x_3 = 0\right\}$. The matrix for $\operatorname{ref}_W$ is
$$A = 2 \cdot \frac{1}{3}\begin{pmatrix} 2 & 1 & -1 \\ 1 & 2 & 1 \\ -1 & 1 & 2 \end{pmatrix} - I_3 = \frac{1}{3}\begin{pmatrix} 1 & 2 & -2 \\ 2 & 1 & 2 \\ -2 & 2 & 1 \end{pmatrix}.$$
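A quick numerical check of these properties, using the matrices from the example above:

```python
import numpy as np

B = np.array([[ 2.0, 1.0, -1.0],
              [ 1.0, 2.0,  1.0],
              [-1.0, 1.0,  2.0]]) / 3.0  # projection onto W
A = 2 * B - np.eye(3)                    # reflection over W

print(np.allclose(A @ A, np.eye(3)))     # True: A is its own inverse
print(np.round(np.linalg.eigvals(A)))    # eigenvalues 1, 1, -1 (in some order)
```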

Summary. Today we considered orthogonal projection as a transformation. Orthogonal projection is a linear transformation. We gave three methods to compute its matrix (four if you count the special case when $W$ is a line). The matrix for projection onto $W$ has eigenvalues 1 and 0, with eigenspaces $W$ and $W^\perp$. A projection matrix is diagonalizable. Reflection is twice the projection minus the identity.