Chapter 6. Orthogonality


6.4 The Projection Matrix

Note. In Section 6.1 (Projections), we projected a vector $\vec{b} \in \mathbb{R}^n$ onto a subspace $W$ of $\mathbb{R}^n$. We did so by finding a basis for $W$ and a basis for the perp space $W^\perp$. We then found the coordinate vector of $\vec{b}$ with respect to these two bases combined, and from this the projection of $\vec{b}$ onto $W$ could be found. In Figure 6.8, a projection of $\vec{a} + \vec{b}$ onto a subspace $W$ is given schematically (we can interpret this as a situation where we are in $\mathbb{R}^3$ and $W$ is a two-dimensional subspace). With the projection onto $W$ represented as the function $T$, this figure suggests that $T(\vec{a} + \vec{b}) = T(\vec{a}) + T(\vec{b})$. Figure 6.9 suggests that $T(r\vec{a}) = rT(\vec{a})$. If these properties do in fact hold, then projection is a linear transformation, and so (by Theorem 3.10) there exists a matrix $P$ such that the projection of $\vec{b}$ onto $W$ is given by $P\vec{b}$.
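These two linearity properties are easy to check numerically in the simplest case, projection onto a line, using the standard formula $\mathrm{proj}_{\vec{a}}\,\vec{b} = \left(\frac{\vec{a}\cdot\vec{b}}{\vec{a}\cdot\vec{a}}\right)\vec{a}$ for projecting onto the line spanned by $\vec{a}$. A minimal sketch in Python/NumPy (the particular vectors and scalar below are arbitrary choices, not from the text):

```python
import numpy as np

def proj_onto_line(a, b):
    """Project b onto the line spanned by a: ((a . b) / (a . a)) a."""
    return (a @ b) / (a @ a) * a

a_dir = np.array([1.0, 2.0, 2.0])   # spanning vector for the line W
u = np.array([3.0, -1.0, 4.0])
v = np.array([0.5, 2.0, -3.0])
r = 2.5

# T(u + v) = T(u) + T(v) and T(r u) = r T(u), up to rounding
print(np.allclose(proj_onto_line(a_dir, u + v),
                  proj_onto_line(a_dir, u) + proj_onto_line(a_dir, v)))  # True
print(np.allclose(proj_onto_line(a_dir, r * u),
                  r * proj_onto_line(a_dir, u)))                         # True
```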

Note. In this section, we show that the projection of any $\vec{b}$ onto $W$ is of the form $P\vec{b}$, where $P$ is a matrix independent of $\vec{b}$. This will then show that projection onto $W$ is a linear transformation. We first need a preliminary theorem on rank.

Theorem 6.10. The Rank of $(A^T)A$.
Let $A$ be an $m \times n$ matrix of rank $r$. Then the $n \times n$ symmetric matrix $A^TA$ also has rank $r$.

Note. Let $W = \mathrm{sp}(\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_k)$ be a subspace of $\mathbb{R}^n$ where $\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_k$ are linearly independent (so $\{\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_k\}$ is a basis for $W$, though maybe neither an orthogonal nor an orthonormal basis for $W$). Let $\vec{b} \in \mathbb{R}^n$ and let $\vec{p} = \vec{b}_W$ be the projection of $\vec{b}$ onto $W$. Then $\vec{b} = \vec{p} + (\vec{b} - \vec{p}) = \vec{b}_W + \vec{b}_{W^\perp}$. By Theorem 6.1(4), $\vec{p}$ is the unique vector such that:

1. the vector $\vec{p}$ lies in the subspace $W$, and

2. the vector $\vec{b} - \vec{p}$ is perpendicular to every vector in $W$ (that is, $\vec{b} - \vec{p} \in W^\perp$).
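Theorem 6.10 can be spot-checked numerically. A minimal sketch in Python/NumPy (the matrix below, of rank 2, is an arbitrary choice, not an example from the text):

```python
import numpy as np

# An arbitrary 4x3 matrix of rank 2 (third column = first column + second column)
A = np.array([[1.0, 0.0, 1.0],
              [2.0, 1.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

print(np.linalg.matrix_rank(A))        # 2
print(np.linalg.matrix_rank(A.T @ A))  # 2, as Theorem 6.10 predicts
```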

Note. Let $A$ be the $n \times k$ matrix with columns $\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_k$. Then the column space of $A$ is $W$. Since $\vec{p} \in W$, then (by the Column Space Criterion of Section 1.6, see page 92) there is a column vector $\vec{r} \in \mathbb{R}^k$ such that $\vec{p} = A\vec{r}$. Since $\vec{b} - \vec{p} = \vec{b} - A\vec{r}$ is perpendicular to each vector in $W$, then $(A\vec{x}) \cdot (\vec{b} - A\vec{r}) = 0$ for any $A\vec{x} \in W$ (where $\vec{x} \in \mathbb{R}^k$ is any column vector). So
$$(A\vec{x}) \cdot (\vec{b} - A\vec{r}) = (A\vec{x})^T(\vec{b} - A\vec{r}) = \vec{x}^T A^T(\vec{b} - A\vec{r}) = \vec{x}^T(A^T\vec{b} - A^TA\vec{r}) = [0].$$
But this means that $\vec{x} \cdot (A^T\vec{b} - A^TA\vec{r}) = 0$ for any $\vec{x} \in \mathbb{R}^k$. Since $A^T\vec{b} - A^TA\vec{r} \in \mathbb{R}^k$, this implies that $A^T\vec{b} - A^TA\vec{r} = \vec{0} \in \mathbb{R}^k$ (this follows from Exercise 41(c) of Section 1.3). Now the $k \times k$ matrix $A^TA$ is invertible: since the columns of $A$ are linearly independent, $\mathrm{rank}(A) = k$ (by Theorem 1.12); by Theorem 6.10 ("The Rank of $(A^T)A$"), $\mathrm{rank}(A^TA) = k$; and so, by Theorem 2.6 ("An Invertibility Criterion"), $A^TA$ is invertible. So $A^T\vec{b} - A^TA\vec{r} = \vec{0}$ implies that $\vec{r} = (A^TA)^{-1}A^T\vec{b}$. Since $\vec{p} = A\vec{r} = \vec{b}_W$, then
$$\vec{b}_W = A(A^TA)^{-1}A^T\vec{b}.$$
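The derivation above translates directly into code. A minimal sketch in Python/NumPy (the basis vectors and $\vec{b}$ are arbitrary choices; the sketch solves the system $A^TA\vec{r} = A^T\vec{b}$ rather than forming $(A^TA)^{-1}$ explicitly, which is numerically preferable but otherwise the same computation):

```python
import numpy as np

# Columns of A: a linearly independent (but not orthonormal) basis for W in R^3
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

# Solve A^T A r = A^T b for r; then p = A r is the projection b_W
r = np.linalg.solve(A.T @ A, A.T @ b)
p = A @ r

# b - p must be perpendicular to every column of A, hence to all of W
print(np.allclose(A.T @ (b - p), 0))  # True
print(p)                              # the projection of b onto W
```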

Note. We summarize the argument above as:

Projection $\vec{b}_W$ of $\vec{b}$ onto the Subspace $W$.
Let $W = \mathrm{sp}(\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_k)$ be a $k$-dimensional subspace of $\mathbb{R}^n$ and let matrix $A$ have as its columns the vectors $\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_k$. The projection of $\vec{b}$ onto $W$ is
$$\vec{b}_W = A(A^TA)^{-1}A^T\vec{b}.$$

Definition. Let $W = \mathrm{sp}(\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_k)$ be a $k$-dimensional subspace of $\mathbb{R}^n$ and let matrix $A$ have as its columns the vectors $\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_k$. The matrix $P = A(A^TA)^{-1}A^T$ is the projection matrix for the subspace $W$.

Example 2. Use the definition to find the projection matrix for the subspace of $\mathbb{R}^3$ consisting of the $x_2x_3$-plane (which is spanned by $\hat{\jmath} = [0, 1, 0]$ and $\hat{k} = [0, 0, 1]$).

Solution. We have $A = [\hat{\jmath} \;\; \hat{k}] = \begin{bmatrix} 0 & 0 \\ 1 & 0 \\ 0 & 1 \end{bmatrix}$ and $A^T = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$. Then

$$A^TA = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = (A^TA)^{-1}, \text{ so } P = A(A^TA)^{-1}A^T = AA^T = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}.$$
Notice that $P\begin{bmatrix} b_1 \\ b_2 \\ b_3 \end{bmatrix} = \begin{bmatrix} 0 \\ b_2 \\ b_3 \end{bmatrix}$, as expected.

Example. Page 368 number 6.

Note. We see that the projection matrix $P$ is computed in terms of the matrix $A$, which depends on the choice of basis for $W$. However, $P$ itself is unique regardless of that choice, as the next theorem claims.
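Example 2, and the uniqueness claim just made, can both be verified numerically. A minimal sketch in Python/NumPy (the second basis for the $x_2x_3$-plane is an arbitrary non-orthonormal choice, not from the text):

```python
import numpy as np

def projection_matrix(A):
    """P = A (A^T A)^{-1} A^T, for A with linearly independent columns."""
    return A @ np.linalg.inv(A.T @ A) @ A.T

# Basis {j-hat, k-hat} for the x2x3-plane, as in Example 2
A = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])
print(projection_matrix(A))
# [[0. 0. 0.]
#  [0. 1. 0.]
#  [0. 0. 1.]]

# A different (non-orthonormal) basis for the same plane yields the same P
B = np.array([[0.0, 0.0],
              [1.0, 2.0],
              [1.0, 3.0]])
print(np.allclose(projection_matrix(A), projection_matrix(B)))  # True
```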

Theorem 6.11. Projection Matrix.
Let $W$ be a subspace of $\mathbb{R}^n$. There is a unique $n \times n$ matrix $P$ such that, for each column vector $\vec{b} \in \mathbb{R}^n$, the vector $P\vec{b}$ is the projection of $\vec{b}$ onto $W$. The projection matrix can be found by selecting any basis $\{\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_k\}$ for $W$ and computing $P = A(A^TA)^{-1}A^T$, where $A$ is the $n \times k$ matrix having column vectors $\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_k$.

Theorem 6.12. Characterization of Projection Matrices.
The projection matrix $P$ for a subspace $W$ of $\mathbb{R}^n$ is both idempotent (that is, $P^2 = P$) and symmetric (that is, $P = P^T$). Conversely, every $n \times n$ matrix that is both idempotent and symmetric is a projection matrix (specifically, it is the projection matrix for its column space).

Note. Since Theorem 6.12 says that (to paraphrase) $P$ is a projection matrix if and only if $P$ is idempotent and symmetric, we could take this as the definition of a projection matrix. There may be settings out there where you see this as the definition. Unfortunately, such a definition loses sight of all the underlying geometric properties which we have used here.
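Both directions of Theorem 6.12 lend themselves to a quick numerical illustration. A minimal sketch in Python/NumPy (the plane in the first check and the rank-one matrix in the second are arbitrary choices):

```python
import numpy as np

# Forward direction: P built from a basis is idempotent and symmetric
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(P @ P, P))  # True: P^2 = P (idempotent)
print(np.allclose(P, P.T))    # True: P = P^T (symmetric)

# Converse direction: Q below is idempotent and symmetric, and it equals the
# projection matrix for its column space, the line in R^2 spanned by [1, 1]
Q = np.array([[0.5, 0.5],
              [0.5, 0.5]])
a = np.array([[1.0], [1.0]])
print(np.allclose(Q, a @ np.linalg.inv(a.T @ a) @ a.T))  # True
```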

Note. If we have an orthonormal basis for $W$, say $\{\vec{a}_1, \vec{a}_2, \ldots, \vec{a}_k\}$, then Fraleigh and Beauregard claim that $A^TA = I$ where $I$ is the $k \times k$ identity. This follows from the fact that the $(i,j)$-entry of $A^TA$ is the $i$th row vector of $A^T$ dotted with the $j$th column vector of $A$. But this is simply $\vec{a}_i \cdot \vec{a}_j$. Since the basis is orthonormal,
$$\vec{a}_i \cdot \vec{a}_j = \begin{cases} 0 & \text{if } i \neq j \\ 1 & \text{if } i = j. \end{cases}$$
So $A^TA = I$. Hence $P = A(A^TA)^{-1}A^T = AIA^T = AA^T$. So the computation of $P$ is simplified when we have an orthonormal basis for $W$.

Example. Page 369 number 27.

Revised: 6/10/2017
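As a final numerical check (a sketch of mine, not part of the original notes): starting from a non-orthonormal basis, one can produce an orthonormal basis for the same subspace with NumPy's QR factorization and confirm that $AA^T$ computed from the orthonormal basis agrees with $A(A^TA)^{-1}A^T$ computed from the original one.

```python
import numpy as np

# Arbitrary (non-orthonormal) basis for a plane W in R^4
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0],
              [1.0, 2.0]])

# Q has orthonormal columns spanning the same subspace W
Q, _ = np.linalg.qr(A)

P_general = A @ np.linalg.inv(A.T @ A) @ A.T
P_orthonormal = Q @ Q.T   # (A^T A)^{-1} = I when the columns are orthonormal

print(np.allclose(P_general, P_orthonormal))  # True
```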