
Lecture 13: Orthogonal projections and least squares (Section 3.2-3.3) Thang Huynh, UC San Diego 2/9/2018

Orthogonal projection onto subspaces

Theorem. Let $W$ be a subspace of $\mathbb{R}^n$. Then, each $x$ in $\mathbb{R}^n$ can be uniquely written as
$$x = \underbrace{x_{\parallel}}_{\text{in } W} + \underbrace{x_{\perp}}_{\text{in } W^{\perp}}.$$

$x_{\parallel}$ is the orthogonal projection of $x$ onto $W$.

$x_{\parallel}$ is the point in $W$ closest to $x$.

If $v_1, \dots, v_m$ is an orthogonal basis of $W$, then
$$x_{\parallel} = \frac{x \cdot v_1}{v_1 \cdot v_1}\, v_1 + \cdots + \frac{x \cdot v_m}{v_m \cdot v_m}\, v_m.$$

Once $x_{\parallel}$ is determined, $x_{\perp} = x - x_{\parallel}$.

Orthogonal projection onto subspaces

Example. Let $W = \operatorname{span}\left\{ \begin{pmatrix} 3 \\ 0 \\ 1 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} \right\}$ and $x = \begin{pmatrix} 0 \\ 3 \\ 10 \end{pmatrix}$. Find the orthogonal projection of $x$ onto $W$. Write $x$ as a vector in $W$ plus a vector orthogonal to $W$.
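As a quick numerical check (an aside, not part of the original slides), here is a minimal NumPy sketch of the projection formula from the theorem applied to this example:

```python
import numpy as np

# Orthogonal basis of W and the vector to project (from the example above).
v1 = np.array([3.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 0.0])
x = np.array([0.0, 3.0, 10.0])

# Projection formula for an orthogonal basis:
# x_par = (x.v1 / v1.v1) v1 + (x.v2 / v2.v2) v2
x_par = (x @ v1) / (v1 @ v1) * v1 + (x @ v2) / (v2 @ v2) * v2
x_perp = x - x_par

print(x_par)   # [3. 3. 1.]
print(x_perp)  # [-3.  0.  9.]
# x_perp is orthogonal to W, as the theorem promises:
print(x_perp @ v1, x_perp @ v2)  # 0.0 0.0
```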

Orthogonal projection onto subspaces

Definition. Let $v_1, \dots, v_m$ be an orthogonal basis of $W$, a subspace of $\mathbb{R}^n$. The projection map $\pi_W : \mathbb{R}^n \to \mathbb{R}^n$, given by
$$\pi_W(x) = \frac{x \cdot v_1}{v_1 \cdot v_1}\, v_1 + \cdots + \frac{x \cdot v_m}{v_m \cdot v_m}\, v_m,$$
is linear (why?). The matrix $P$ representing $\pi_W$ with respect to the standard basis is the corresponding projection matrix.

Example. Find the projection matrix $P$ which corresponds to orthogonal projection onto $W = \operatorname{span}\left\{ \begin{pmatrix} 3 \\ 0 \\ 1 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} \right\}$ in $\mathbb{R}^3$. Then find the orthogonal projection of $x = \begin{pmatrix} 0 \\ 3 \\ 10 \end{pmatrix}$ onto $W$.
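A short NumPy sketch (again illustrative, not from the slides) assembling $P$ for this example as a sum of rank-one projectors $v v^T / (v \cdot v)$, which works because the basis is orthogonal:

```python
import numpy as np

v1 = np.array([3.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 0.0])

# For an orthogonal basis, P is the sum of rank-one projectors v v^T / (v.v).
P = np.outer(v1, v1) / (v1 @ v1) + np.outer(v2, v2) / (v2 @ v2)
print(P)
# [[0.9 0.  0.3]
#  [0.  1.  0. ]
#  [0.3 0.  0.1]]

x = np.array([0.0, 3.0, 10.0])
print(P @ x)  # [3. 3. 1.] -- the same projection as in the previous example
```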

Least squares

In practice, $Ax \approx b$.

Definition. $\hat{x}$ is a least squares solution of the system $Ax = b$ if $\hat{x}$ is such that $\|A\hat{x} - b\|$ is as small as possible.

If $Ax = b$ is consistent, i.e. $b$ is in $C(A)$, then a least squares solution $\hat{x}$ is just an ordinary solution.

Interesting case: $Ax = b$ is inconsistent, i.e. $b$ is NOT in $C(A)$.

What should we do to find $\hat{x}$? Replace $b$ with its projection $\hat{b}$ onto $C(A)$ and solve $Ax = \hat{b}$.

The normal equations

Theorem. $\hat{x}$ is a least squares solution of $Ax = b$ if and only if $A^T A \hat{x} = A^T b$ (the normal equations).

Proof. $\|A\hat{x} - b\|$ is as small as possible exactly when $A\hat{x}$ is the point of $C(A)$ closest to $b$, that is, when $A\hat{x} = \hat{b}$, the orthogonal projection of $b$ onto $C(A)$. This happens exactly when $b - A\hat{x}$ is orthogonal to $C(A)$, i.e. orthogonal to every column of $A$, which is the statement $A^T(b - A\hat{x}) = 0$, i.e. $A^T A \hat{x} = A^T b$.

The normal equations

Example. Find the least squares solution to $Ax = b$, where
$$A = \begin{pmatrix} 1 & 1 \\ 1 & -1 \\ 0 & 0 \end{pmatrix}, \qquad b = \begin{pmatrix} 2 \\ -1 \\ 1 \end{pmatrix}.$$

Solution.
$$A^T A = \begin{pmatrix} 1 & 1 & 0 \\ 1 & -1 & 0 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 1 & -1 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}$$
and
$$A^T b = \begin{pmatrix} 1 & 1 & 0 \\ 1 & -1 & 0 \end{pmatrix} \begin{pmatrix} 2 \\ -1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 3 \end{pmatrix}.$$

The normal equations

The normal equation $A^T A \hat{x} = A^T b$ is
$$\begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} \hat{x} = \begin{pmatrix} 1 \\ 3 \end{pmatrix}.$$
Solving it, we obtain $\hat{x} = \begin{pmatrix} 1/2 \\ 3/2 \end{pmatrix}$.
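For comparison, a NumPy sketch of this computation, using the matrices as written above; `np.linalg.lstsq` is NumPy's built-in least squares solver and agrees with the normal equations here:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, -1.0],
              [0.0, 0.0]])
b = np.array([2.0, -1.0, 1.0])

# Solve the normal equations A^T A x = A^T b directly ...
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)  # [0.5 1.5]

# ... or use the built-in least squares solver, which gives the same answer.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_lstsq)  # [0.5 1.5]
```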

The normal equations

Example. Find the least squares solution to $Ax = b$, where
$$A = \begin{pmatrix} 4 & 0 \\ 0 & 2 \\ 1 & 1 \end{pmatrix}, \qquad b = \begin{pmatrix} 2 \\ 0 \\ 11 \end{pmatrix}.$$
What is the projection of $b$ onto $C(A)$?

Solution.
$$A^T A = \begin{pmatrix} 4 & 0 & 1 \\ 0 & 2 & 1 \end{pmatrix} \begin{pmatrix} 4 & 0 \\ 0 & 2 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 17 & 1 \\ 1 & 5 \end{pmatrix}$$
and
$$A^T b = \begin{pmatrix} 4 & 0 & 1 \\ 0 & 2 & 1 \end{pmatrix} \begin{pmatrix} 2 \\ 0 \\ 11 \end{pmatrix} = \begin{pmatrix} 19 \\ 11 \end{pmatrix}.$$

The normal equations

The normal equation $A^T A \hat{x} = A^T b$ is
$$\begin{pmatrix} 17 & 1 \\ 1 & 5 \end{pmatrix} \hat{x} = \begin{pmatrix} 19 \\ 11 \end{pmatrix}.$$
Solving it, we obtain $\hat{x} = \begin{pmatrix} 1 \\ 2 \end{pmatrix}$.

The projection of $b$ onto $C(A)$ is
$$A\hat{x} = \begin{pmatrix} 4 & 0 \\ 0 & 2 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 1 \\ 2 \end{pmatrix} = \begin{pmatrix} 4 \\ 4 \\ 3 \end{pmatrix}.$$

Why is $A\hat{x}$ the projection of $b$ onto $C(A)$?

The projection $\hat{b}$ of $b$ onto $C(A)$ is $\hat{b} = A\hat{x}$, with $\hat{x}$ such that $A^T A \hat{x} = A^T b$.

If $A$ has full column rank (columns of $A$ linearly independent), then $A^T A$ is invertible and
$$\hat{b} = A(A^T A)^{-1} A^T b.$$
The projection matrix for projecting onto $C(A)$ is $P = A(A^T A)^{-1} A^T$.
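A minimal NumPy illustration of this formula on the previous example (an aside, not from the slides). Forming the explicit inverse is fine at this scale; in practice one would solve the normal equations or use a QR factorization instead:

```python
import numpy as np

A = np.array([[4.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
b = np.array([2.0, 0.0, 11.0])

# Projection matrix onto C(A) when A has full column rank.
P = A @ np.linalg.inv(A.T @ A) @ A.T

print(P @ b)  # [4. 4. 3.] -- matches A @ x_hat from the example
# P is a projection: applying it twice changes nothing.
print(np.allclose(P @ P, P))  # True
```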

Application: least squares lines

Example. Find $\beta_1, \beta_2$ such that the line $y = \beta_1 + \beta_2 x$ best fits the data points $(2, 1), (5, 2), (7, 3), (8, 3)$.

Solution. The equations $y_i = \beta_1 + \beta_2 x_i$ in matrix form:
$$\underbrace{\begin{pmatrix} 1 & x_1 \\ 1 & x_2 \\ 1 & x_3 \\ 1 & x_4 \end{pmatrix}}_{\text{design matrix } X} \begin{pmatrix} \beta_1 \\ \beta_2 \end{pmatrix} = \underbrace{\begin{pmatrix} y_1 \\ y_2 \\ y_3 \\ y_4 \end{pmatrix}}_{\text{observation vector } y}.$$

Application: least squares lines

We need to find a least squares solution to
$$\begin{pmatrix} 1 & 2 \\ 1 & 5 \\ 1 & 7 \\ 1 & 8 \end{pmatrix} \begin{pmatrix} \beta_1 \\ \beta_2 \end{pmatrix} = \begin{pmatrix} 1 \\ 2 \\ 3 \\ 3 \end{pmatrix}.$$
Then
$$X^T X = \begin{pmatrix} 1 & 1 & 1 & 1 \\ 2 & 5 & 7 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 1 & 5 \\ 1 & 7 \\ 1 & 8 \end{pmatrix} = \begin{pmatrix} 4 & 22 \\ 22 & 142 \end{pmatrix},$$
$$X^T y = \begin{pmatrix} 1 & 1 & 1 & 1 \\ 2 & 5 & 7 & 8 \end{pmatrix} \begin{pmatrix} 1 \\ 2 \\ 3 \\ 3 \end{pmatrix} = \begin{pmatrix} 9 \\ 57 \end{pmatrix},$$
and solving $X^T X \beta = X^T y$ gives
$$\begin{pmatrix} \beta_1 \\ \beta_2 \end{pmatrix} = \begin{pmatrix} 2/7 \\ 5/14 \end{pmatrix}.$$
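The same fit in a few lines of NumPy (an illustrative sketch of the computation above):

```python
import numpy as np

# Data points from the example.
x = np.array([2.0, 5.0, 7.0, 8.0])
y = np.array([1.0, 2.0, 3.0, 3.0])

# Design matrix: a column of ones (intercept) and the x-values (slope).
X = np.column_stack([np.ones_like(x), x])

# Solve the normal equations X^T X beta = X^T y.
beta = np.linalg.solve(X.T @ X, X.T @ y)
print(beta)      # [0.28571429 0.35714286] = [2/7, 5/14]
print(X @ beta)  # fitted y-values on the least squares line
```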