Math 2331 Linear Algebra


6.3 Orthogonal Projections

Math 2331 Linear Algebra

Jiwen He
Department of Mathematics, University of Houston
jiwenhe@math.uh.edu
math.uh.edu/~jiwenhe/math2331

6.3 Orthogonal Projections

Orthogonal Projection: Review
Orthogonal Projection: Examples
The Orthogonal Decomposition Theorem
The Orthogonal Decomposition: Example
Geometric Interpretation of Orthogonal Projections
The Best Approximation Theorem
The Best Approximation Theorem: Example
New View of Matrix Multiplication
Orthogonal Projection: Theorem

Orthogonal Projection: Review

ŷ = (y·u / u·u) u is the orthogonal projection of y onto Span{u}.

Suppose {u1, ..., up} is an orthogonal basis for W in R^n. For each y in W,

  y = (y·u1 / u1·u1) u1 + ... + (y·up / up·up) up
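The review formula above can be checked numerically. This is a minimal sketch with hypothetical vectors (u1, u2, y are made-up data, not from the slides): W is the xy-plane in R^3, so the projection of y should simply zero out the last coordinate.

```python
import numpy as np

# Hypothetical data: an orthogonal basis {u1, u2} for a plane W in R^3.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])   # u1·u2 = 0, so the basis is orthogonal
y = np.array([2.0, 3.0, 5.0])

# Orthogonal projection of y onto W, one term per basis vector:
#   yhat = (y·u1 / u1·u1) u1 + (y·u2 / u2·u2) u2
yhat = sum((y @ u) / (u @ u) * u for u in (u1, u2))
print(yhat)  # the component of y that lies in W
```

Since W here is the xy-plane, `yhat` comes out to (2, 3, 0), the "shadow" of y on that plane.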

Orthogonal Projection: Example

Example: Suppose {u1, u2, u3} is an orthogonal basis for R^3 and let W = Span{u1, u2}. Write y in R^3 as the sum of a vector ŷ in W and a vector z in W⊥.

Orthogonal Projection: Example (cont.)

Solution: Write

  y = (y·u1 / u1·u1) u1 + (y·u2 / u2·u2) u2 + (y·u3 / u3·u3) u3,

where

  ŷ = (y·u1 / u1·u1) u1 + (y·u2 / u2·u2) u2   and   z = (y·u3 / u3·u3) u3.

To show that z is orthogonal to every vector in W, it suffices to show that z is orthogonal to the vectors in {u1, u2}. Since u3·u1 = 0 and u3·u2 = 0,

  z·u1 = (y·u3 / u3·u3)(u3·u1) = 0   and   z·u2 = (y·u3 / u3·u3)(u3·u2) = 0.

The Orthogonal Decomposition Theorem

Theorem (8): Let W be a subspace of R^n. Then each y in R^n can be written uniquely in the form

  y = ŷ + z,

where ŷ is in W and z is in W⊥. In fact, if {u1, ..., up} is any orthogonal basis of W, then

  ŷ = (y·u1 / u1·u1) u1 + ... + (y·up / up·up) up   and   z = y - ŷ.

The vector ŷ is called the orthogonal projection of y onto W.
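A small numerical sketch of Theorem 8, using hypothetical vectors (the orthogonal pair u1, u2 and the vector y are made-up data): compute ŷ and z = y - ŷ, then confirm z is orthogonal to both basis vectors of W.

```python
import numpy as np

# Hypothetical data: u1·u2 = -4 + 5 - 1 = 0, so {u1, u2} is orthogonal.
u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])
y = np.array([1.0, 2.0, 3.0])

# Orthogonal decomposition y = yhat + z with yhat in W, z in W-perp.
yhat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - yhat

# z is orthogonal to every basis vector of W, hence to all of W.
print(z @ u1, z @ u2)  # both (numerically) zero
```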

The Orthogonal Decomposition Theorem (cont.)

[Figure: the orthogonal projection ŷ of y onto W, with z = y - ŷ perpendicular to W.]

The Orthogonal Decomposition: Example

Example: Let u1 and u2 be orthogonal vectors in R^3, so that {u1, u2} is an orthogonal basis for W = Span{u1, u2}, and let y be in R^3. Write y as the sum of a vector in W and a vector orthogonal to W.

Solution:

  proj_W y = ŷ = (y·u1 / u1·u1) u1 + (y·u2 / u2·u2) u2,   z = y - ŷ,

so y = ŷ + z with ŷ in W and z in W⊥.

Geometric Interpretation of Orthogonal Projections

[Figure: proj_W y built one term at a time, as the sum of the projections of y onto the lines spanned by the orthogonal basis vectors u1, ..., up.]

The Best Approximation Theorem

Theorem (9, The Best Approximation Theorem): Let W be a subspace of R^n, y any vector in R^n, and ŷ the orthogonal projection of y onto W. Then ŷ is the point in W closest to y, in the sense that

  ||y - ŷ|| < ||y - v||

for all v in W distinct from ŷ.

The Best Approximation Theorem (cont.)

Outline of Proof: Let v be in W, distinct from ŷ. Then v - ŷ is also in W (why?). Since z = y - ŷ is orthogonal to W, y - ŷ is orthogonal to ŷ - v. From

  y - v = (y - ŷ) + (ŷ - v),

the Pythagorean Theorem gives

  ||y - v||^2 = ||y - ŷ||^2 + ||ŷ - v||^2.

Since v ≠ ŷ, we have ||ŷ - v||^2 > 0, so ||y - v||^2 > ||y - ŷ||^2. Hence ||y - ŷ|| < ||y - v||.
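The inequality in the theorem can be stress-tested numerically. This sketch uses hypothetical data (the same kind of orthogonal pair u1, u2 and vector y as in the earlier examples; all values are made up) and compares ||y - ŷ|| against ||y - v|| for many random v in W.

```python
import numpy as np

# Hypothetical orthogonal basis for W and a vector y not in W.
u1 = np.array([2.0, 5.0, -1.0])
u2 = np.array([-2.0, 1.0, 1.0])
y = np.array([1.0, 2.0, 3.0])

yhat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2

# No point of W gets closer to y than the orthogonal projection yhat.
rng = np.random.default_rng(0)
for _ in range(1000):
    c1, c2 = rng.normal(size=2)
    v = c1 * u1 + c2 * u2          # an arbitrary vector in W
    assert np.linalg.norm(y - yhat) <= np.linalg.norm(y - v) + 1e-12
```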

The Best Approximation Theorem: Example

Example: Find the closest point to y in Span{u1, u2}, where u1 and u2 are orthogonal vectors in R^3.

Solution: By the Best Approximation Theorem, the closest point is the orthogonal projection

  ŷ = (y·u1 / u1·u1) u1 + (y·u2 / u2·u2) u2.

New View of Matrix Multiplication

Part of Theorem 10 below is based on another way to view matrix multiplication, where A is m x p and B is p x n:

  AB = [ col1 A  col2 A  ...  colp A ] [ row1 B ; row2 B ; ... ; rowp B ]
     = (col1 A)(row1 B) + ... + (colp A)(rowp B)
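The column-row expansion above says AB is a sum of p rank-one outer products. A quick check on arbitrary matrices (the sizes and random seed here are illustrative choices, not from the slides):

```python
import numpy as np

# Arbitrary m x p and p x n matrices.
rng = np.random.default_rng(1)
A = rng.integers(-5, 5, size=(3, 2)).astype(float)   # m = 3, p = 2
B = rng.integers(-5, 5, size=(2, 4)).astype(float)   # p = 2, n = 4

# Sum of outer products (col_k A)(row_k B) over k = 1, ..., p.
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))

# The column-row expansion agrees with the ordinary product.
assert np.allclose(outer_sum, A @ B)
```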

New View of Matrix Multiplication: Example

Example:

  [ 1  3 ] [ 2  1 ]   [ 1 ]           [ 3 ]
  [ 5  6 ] [ 4  2 ] = [ 5 ] [ 2  1 ] + [ 6 ] [ 4  2 ]

                    = [  2   1 ]   [ 12   6 ]   [ 14   7 ]
                      [ 10   5 ] + [ 24  12 ] = [ 34  17 ]

Orthogonal Matrix

So if U = [ u1 u2 ... up ], then

  U^T = [ u1^T ; u2^T ; ... ; up^T ],

so

  UU^T = u1 u1^T + u2 u2^T + ... + up up^T.

Hence

  (UU^T) y = (u1 u1^T + u2 u2^T + ... + up up^T) y
           = (u1 u1^T) y + (u2 u2^T) y + ... + (up up^T) y
           = u1 (u1^T y) + u2 (u2^T y) + ... + up (up^T y)
           = (y·u1) u1 + (y·u2) u2 + ... + (y·up) up.

Orthogonal Projection: Theorem

Theorem (10): If {u1, ..., up} is an orthonormal basis for a subspace W of R^n, then

  proj_W y = (y·u1) u1 + ... + (y·up) up.

If U = [ u1 u2 ... up ], then

  proj_W y = UU^T y   for all y in R^n.

Outline of Proof:

  proj_W y = (y·u1 / u1·u1) u1 + ... + (y·up / up·up) up
           = (y·u1) u1 + ... + (y·up) up      (since ui·ui = 1)
           = UU^T y.
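A numerical sketch of Theorem 10. The orthonormal basis here is obtained by applying QR factorization to an arbitrary full-rank matrix (the sizes and random seed are illustrative assumptions, not from the slides); the matrix form UU^T y is then compared with the term-by-term sum.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(5, 2))      # two (almost surely) independent columns spanning W
U, _ = np.linalg.qr(A)           # columns of U: an orthonormal basis for W = Col A
y = rng.normal(size=5)

# Matrix form of the projection: proj_W y = U U^T y.
proj = U @ (U.T @ y)

# Same answer built term by term: (y·u1) u1 + (y·u2) u2.
terms = sum((y @ U[:, k]) * U[:, k] for k in range(U.shape[1]))
assert np.allclose(proj, terms)
```

Projecting twice changes nothing (`U @ (U.T @ proj)` equals `proj`), which is the matrix way of saying ŷ is already in W.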