
Orthogonal Projection Hung-yi Lee

Reference Textbook: Chapter 7.3, 7.4

Outline
What is Orthogonal Complement
What is Orthogonal Projection
How to do Orthogonal Projection
Application of Orthogonal Projection

Orthogonal Complement
The orthogonal complement of a nonempty vector set S is denoted S⊥ (read "S perp").
S⊥ is the set of vectors that are orthogonal to every vector in S:
S⊥ = {v : v · u = 0 for all u in S}
Examples: if S = R^n then S⊥ = {0}; if S = {0} then S⊥ = R^n.

Orthogonal Complement
Example: in R^3, let
W = {[w1 w2 0]^T : w1, w2 in R},  V = {[0 0 v3]^T : v3 in R}.
Is V = W⊥?
V ⊆ W⊥: for all v in V and w in W, v · w = 0.
W⊥ ⊆ V: since e1, e2 are in W, every z = [z1 z2 z3]^T in W⊥ must have z1 = z2 = 0.
So V = W⊥.

Properties of Orthogonal Complement
Is S⊥ always a subspace? Yes, for any nonempty vector set S.
For any nonempty vector set S, (Span S)⊥ = S⊥.
Let W be a subspace and B a basis of W. Then B⊥ = W⊥.
What is S ∩ S⊥? At most the zero vector.

Properties of Orthogonal Complement
Example: let W = Span{u1, u2}, where u1 = [1 1 1 4]^T and u2 = [1 1 1 2]^T, so {u1, u2} is a basis for W.
v is in W⊥ if and only if u1 · v = u2 · v = 0,
i.e., v = [x1 x2 x3 x4]^T satisfies Av = 0, where A is the matrix whose rows are u1^T and u2^T.
Hence W⊥ = {solutions of Ax = 0} = Null A.

Properties of Orthogonal Complement
For any matrix A: (Row A)⊥ = Null A.
v is in (Row A)⊥ ⟺ w · v = 0 for all w in Span{rows of A} ⟺ Av = 0.
Also (Col A)⊥ = Null A^T, since (Col A)⊥ = (Row A^T)⊥ = Null A^T.
For any subspace W of R^n: dim W + dim W⊥ = n (rank plus nullity).
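To see dim W + dim W⊥ = n concretely, here is a small numpy sketch (the matrix is illustrative, not from the slides): it takes W = Row A, reads dim W off the rank, and gets a basis of W⊥ = Null A from the SVD.

```python
import numpy as np

# Illustrative matrix, not from the slides: take W = Row A, a subspace of R^4.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 3.]])
n = A.shape[1]

dim_W = np.linalg.matrix_rank(A)        # dim W = dim Row A = rank A

# Basis of W-perp = Null A: right singular vectors whose singular
# values are (numerically) zero span the null space of A.
_, s, Vt = np.linalg.svd(A)
s_full = np.concatenate([s, np.zeros(n - len(s))])
null_basis = Vt[s_full < 1e-10]         # rows form a basis of Null A

print(np.allclose(A @ null_basis.T, 0))     # True: they lie in (Row A)-perp
print(dim_W + null_basis.shape[0] == n)     # True: dim W + dim W-perp = n
```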

Unique Decomposition
For any subspace W of R^n, dim W + dim W⊥ = n.
If w1, w2, ..., wk is a basis of W and z1, z2, ..., z_{n-k} is a basis of W⊥, then together they form a basis for R^n.
For every vector u, u = w + z with w in W and z in W⊥, and this decomposition is unique.

Outline: What is Orthogonal Complement / What is Orthogonal Projection / How to do Orthogonal Projection / Application of Orthogonal Projection

Orthogonal Projection
Write u = w + z (unique), with w in W and z in W⊥.
w is the orthogonal projection of u on W.
Orthogonal Projection Operator: the function U_W that sends u to w = U_W(u). Is it linear?

Orthogonal Projection
Among the vectors in the subspace W, w = U_W(u) is the one closest to u.
z = u − w is always orthogonal to every vector in W.

Closest Vector Property
Among all vectors in the subspace W, the vector closest to u is the orthogonal projection w = U_W(u) of u on W.
Proof: let w' be any vector in W with w' ≠ w. Then w − w' is in W, so (u − w) · (w − w') = 0. Hence
‖u − w'‖^2 = ‖(u − w) + (w − w')‖^2 = ‖u − w‖^2 + ‖w − w'‖^2 > ‖u − w‖^2.
The distance from a vector u to a subspace W is the distance between u and the orthogonal projection of u on W.

Orthogonal Projection Matrix
The orthogonal projection operator is linear, so it has a standard matrix, the orthogonal projection matrix P_W:
w = U_W(u) = P_W u, and likewise U_W(v) = P_W v for any vector v.

Outline: What is Orthogonal Complement / What is Orthogonal Projection / How to do Orthogonal Projection / Application of Orthogonal Projection

Orthogonal Projection on a Line
Orthogonal projection of a vector onto a line L:
v: any vector
u: any nonzero vector on L
w: orthogonal projection of v onto L, w = cu
z: z = v − w, with z · u = 0
(v − w) · u = (v − cu) · u = v · u − c‖u‖^2 = 0, so c = (v · u)/‖u‖^2 and
w = cu = ((v · u)/‖u‖^2) u.
Distance from the tip of v to L: ‖z‖ = ‖v − w‖ = ‖v − ((v · u)/‖u‖^2) u‖.

Orthogonal Projection
Example: L is the line y = (1/2)x, v = [4 1]^T, u = [2 1]^T.
c = (v · u)/‖u‖^2 = 9/5, so w = cu = [18/5 9/5]^T and z = v − w = [2/5 −4/5]^T, which is orthogonal to u.
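A minimal numpy check of this example, using the formula c = (v · u)/‖u‖^2 from the previous slide:

```python
import numpy as np

# Project v onto the line L spanned by u (the numbers are from the example above).
v = np.array([4., 1.])
u = np.array([2., 1.])               # nonzero vector on L: y = (1/2)x

c = (v @ u) / (u @ u)                # c = (v . u) / ||u||^2 = 9/5
w = c * u                            # orthogonal projection of v onto L
z = v - w                            # component of v orthogonal to L

print(w)                             # [3.6 1.8]
print(z @ u)                         # 0.0, so z is orthogonal to u
print(np.linalg.norm(z))             # distance from the tip of v to L
```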

Orthogonal Projection Matrix
Let C be an n × k matrix whose columns form a basis for a subspace W. Then
P_W = C (C^T C)^{-1} C^T  (an n × n matrix).
Proof: let u be in R^n and w = U_W(u). Since W = Col C, w = Cv for some v in R^k, and u − w is in W⊥ = (Col C)⊥ = Null C^T, so
0 = C^T(u − w) = C^T u − C^T w = C^T u − C^T C v,  i.e.  C^T u = C^T C v.
Since C^T C is invertible, v = (C^T C)^{-1} C^T u and w = C (C^T C)^{-1} C^T u.

Orthogonal Projection Matrix
Claim: if C has linearly independent columns, then C^T C is invertible.
Proof: it suffices to show that C^T C has linearly independent columns. Suppose C^T C b = 0 for some b. Then
b^T C^T C b = (Cb)^T (Cb) = (Cb) · (Cb) = ‖Cb‖^2 = 0,
so Cb = 0, and b = 0 since C has linearly independent columns. Thus the square matrix C^T C is invertible.

Orthogonal Projection Matrix
Example: let W be the 2-dimensional subspace of R^3 with equation x1 − x2 + 2x3 = 0.
W has a basis {[1 1 0]^T, [−2 0 1]^T}, so take C = [1 −2; 1 0; 0 1] (columns are the basis vectors).
P_W = C (C^T C)^{-1} C^T = (1/6) [5 1 −2; 1 5 2; −2 2 2].
For example, P_W [1 3 4]^T = [0 4 2]^T.
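A short numpy sketch of this example (using the basis reconstructed above; any basis of W gives the same P_W):

```python
import numpy as np

# Columns of C form a basis of the plane x1 - x2 + 2*x3 = 0.
C = np.array([[1., -2.],
              [1.,  0.],
              [0.,  1.]])

P_W = C @ np.linalg.inv(C.T @ C) @ C.T
print(np.round(6 * P_W))             # [[ 5.  1. -2.] [ 1.  5.  2.] [-2.  2.  2.]]

# Sanity checks: an orthogonal projection matrix is symmetric and idempotent,
# and it fixes vectors that are already in W.
print(np.allclose(P_W, P_W.T), np.allclose(P_W @ P_W, P_W))
print(P_W @ np.array([1., 1., 0.]))  # [1. 1. 0.]
```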

Outline: What is Orthogonal Complement / What is Orthogonal Projection / How to do Orthogonal Projection / Application of Orthogonal Projection

Solution of an Inconsistent System of Linear Equations
Suppose Ax = b is an inconsistent system of linear equations, i.e., b is not in the column space of A.
Instead, find the vector z minimizing ‖Az − b‖.
Az is then the orthogonal projection of b on W = Col A: Az = P_W b.
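A hedged numpy sketch of this idea (the system below is illustrative, not from the slides): np.linalg.lstsq finds the z minimizing ‖Az − b‖, and the normal equations A^T A z = A^T b give the same answer.

```python
import numpy as np

# Illustrative inconsistent system: 3 equations, 2 unknowns, b not in Col A.
A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])
b = np.array([6., 0., 0.])

z, *_ = np.linalg.lstsq(A, b, rcond=None)      # minimizes ||A z - b||
z_normal = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations A^T A z = A^T b

print(np.allclose(z, z_normal))                # True
print(A @ z)                                   # = P_W b, the projection of b on Col A
```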

Least Squares Approximation
Goal: prediction from data pairs (x1, y1), (x2, y2), ..., (xi, yi), ...
e.g. (today's stock price, tomorrow's stock price), (today's PM2.5, tomorrow's PM2.5).
Find the least-squares line y = a0 + a1 x that best fits the data (regression).

Least Squares Approximation
Fit y = a0 + a1 x. For each data point (xi, yi), the predicted value is a0 + a1 xi.
Error vector: e with components ei = yi − (a0 + a1 xi).
Find a0 and a1 minimizing E = ‖e‖^2.

Least Squares Approximation
Error vector: e = y − (a0 v1 + a1 v2), where y = [y1 ... yn]^T, v1 = [1 ... 1]^T, and v2 = [x1 ... xn]^T.
Find a0 and a1 minimizing
E = ‖y − (a0 v1 + a1 v2)‖^2 = ‖y − Ca‖^2, where C = [v1 v2] and a = [a0 a1]^T.

Least Squares Approximation
Find a minimizing E = ‖y − Ca‖^2.
B = {v1, v2} is linearly independent, and Ca ranges over W = Span B = Col C.
By the closest vector property, the minimizing Ca is the orthogonal projection of y on W:
find a such that Ca = P_W y, i.e., C^T C a = C^T y and a = (C^T C)^{-1} C^T y.
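Putting the pieces together, a small numpy sketch of the least-squares line (the data points are made up for illustration):

```python
import numpy as np

# Fit y = a0 + a1*x by solving the normal equations C^T C a = C^T y.
x = np.array([1., 2., 3., 4.])
y = np.array([1.9, 4.1, 5.9, 8.2])

C = np.column_stack([np.ones_like(x), x])    # columns: v1 = all ones, v2 = x
a = np.linalg.solve(C.T @ C, C.T @ y)        # a = (C^T C)^{-1} C^T y

print(a)                                     # approximately [a0, a1] = [-0.15, 2.07]
print(C @ a)                                 # P_W y: the fitted values on the line
```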

Example 1
The least-squares line is y = 0.056 + 0.745x.
Prediction: if the rough weight is 2.65, the estimated finished weight is 0.056 + 0.745(2.65) = 2.030.

Least Squares Approximation
Best quadratic fit: use y = a0 + a1 x + a2 x^2 to fit the data points (x1, y1), (x2, y2), ..., (xn, yn).
Error vector:
e = [ y1 − (a0 + a1 x1 + a2 x1^2), y2 − (a0 + a1 x2 + a2 x2^2), ..., yn − (a0 + a1 xn + a2 xn^2) ]^T
Find a0, a1 and a2 minimizing E = ‖e‖^2.

Example of a fit of the form y = a0 + a1 x + a2 x^2: y = 101.00 + 29.77x − 16.11x^2.
The best-fitting polynomial of any desired maximum degree may be found with the same method.
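The same recipe in numpy for the quadratic case: add an x^2 column to C. The data below is illustrative, not the data behind the coefficients quoted above.

```python
import numpy as np

# Best quadratic fit y = a0 + a1*x + a2*x^2 via the normal equations.
x = np.array([0., 0.5, 1., 1.5, 2.])
y = np.array([100., 112., 116., 112., 99.])

C = np.column_stack([np.ones_like(x), x, x**2])
a = np.linalg.solve(C.T @ C, C.T @ y)
print(a)                                 # [a0, a1, a2]

# np.polyfit returns the same coefficients (highest degree first).
print(np.polyfit(x, y, 2)[::-1])
```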

Multivariable Least Squares Approximation
Fit y_A = a0 + a1 x_A + a2 x_B: the best-fitting plane for data with two input variables x_A and x_B.
Reference: http://www.palass.org/publications/newsletter/palaeomath-101/palaeomathpart-4-regression-iv
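A minimal numpy sketch of the multivariable case (the data is made up; the linked article has the real worked example):

```python
import numpy as np

# Fit the plane y_A = a0 + a1*x_A + a2*x_B by least squares.
x_A = np.array([1., 2., 3., 4., 5.])
x_B = np.array([2., 1., 4., 3., 5.])
y_A = np.array([5.1, 6.0, 10.9, 11.2, 15.8])

C = np.column_stack([np.ones_like(x_A), x_A, x_B])
a, *_ = np.linalg.lstsq(C, y_A, rcond=None)
print(a)                                  # [a0, a1, a2] of the best-fitting plane
```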