Least-Squares

A famous application of linear algebra is the case study provided by the 1974 update of the North American Datum (pp. 373-374), a geodetic survey that keeps accurate measurements of distances and elevations at 268,000 points along the topography of this continent. The data collected provided the input for the solution of a massive system of linear equations involving 900,000 variables and twice as many equations! What's more, the system determined by the measured data was inconsistent! In fact, this is not surprising: data from real-world measurements are prone to many kinds of errors and may lead to systems of equations that are formally unsolvable because of those errors. What is needed, then, is a method to deal with such obstacles. This is the main idea behind the method of least squares.

When a system of equations $Ax = b$ with $m \times n$ coefficient matrix $A$ is inconsistent, it is because the vector $b$ does not lie in the column space of $A$. It is often the case in practice that the system is inconsistent because the data for the system, its coefficient matrix $A$ and vector of constants $b$, are inaccurate due to measurement error; that is, the system is almost consistent.

This suggests that the two vectors $Ax$ and $b$ should be nearby in $\mathbb{R}^m$, and that any attempt to solve the unsolvable system $Ax = b$ should instead look for a vector $\hat{x}$ so that $A\hat{x}$ is as close to $b$ as possible. That is, we want to locate the vector $\hat{x}$ for which $\|b - A\hat{x}\| \le \|b - Ax\|$ for every $x \in \mathbb{R}^n$. We can rephrase this condition by saying that we want to find the value $\hat{x}$ of the vector $x$ that minimizes the quantity $\|b - y\|$, where $y = Ax$. But minimizing $\|b - y\|$ is identical to minimizing $\|b - y\|^2 = (b_1 - y_1)^2 + \cdots + (b_m - y_m)^2$. Because solving this new problem involves minimizing a quantity that is a sum of squares, the solution vector $\hat{x}$ is called a least-squares solution to the problem given by the inconsistent system $Ax = b$. To find the least-squares solution to $Ax = b$, we project $b$ onto $\operatorname{Col} A$ to obtain the vector $\hat{b} = \operatorname{proj}_{\operatorname{Col} A} b$, the closest vector in $\operatorname{Col} A$ to $b$. Since $\hat{b}$ lies in $\operatorname{Col} A$, we can then find an $\hat{x}$ that satisfies $A\hat{x} = \hat{b}$. This $\hat{x}$ satisfies our least-squares condition.
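As a concrete illustration, here is a minimal Python/NumPy sketch; the 3 x 2 matrix A and the vector b below are made-up values chosen for illustration, not data from the text. The routine np.linalg.lstsq returns a vector minimizing $\|b - Ax\|$, and multiplying that vector by A recovers the projection of b onto Col A.

import numpy as np

# An inconsistent 3 x 2 system: b does not lie in Col A, so Ax = b has no exact solution.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# x_hat minimizes ||b - A x|| over all x in R^2.
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)

b_hat = A @ x_hat                     # projection of b onto Col A (closest vector in Col A to b)
distance = np.linalg.norm(b - b_hat)  # the minimized distance ||b - A x_hat||
print(x_hat, b_hat, distance)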

Notice that if $Ax = b$ is consistent, then any vector $\hat{x}$ that makes $A\hat{x}$ as close to $b$ as possible will be a regular solution to the consistent system $Ax = b$, i.e., the distance $\|b - A\hat{x}\|$ will be zero. When the system $Ax = b$ is inconsistent, the value of $\|b - A\hat{x}\|$ is positive; we call it the least-squares error associated with the system. The Orthogonal Decomposition Theorem says that since $\hat{b}$ lies in $\operatorname{Col} A$, then $b - A\hat{x} = b - \hat{b}$ is a vector orthogonal to $\operatorname{Col} A$. Therefore, $b - A\hat{x}$ is orthogonal to each of the column vectors $a_j$ ($j = 1, 2, \ldots, n$) of the matrix $A$. We can express this relationship by writing $a_j^T (b - A\hat{x}) = 0$. But the vectors $a_j^T$ are the rows of $A^T$, so taken together, these orthogonality conditions can be written in the form $A^T (b - A\hat{x}) = 0$, or more simply, $A^T A \hat{x} = A^T b$. Conversely, any solution to the system of equations (*) $A^T A \hat{x} = A^T b$ will yield a vector $x = \hat{x}$ for which $b - A\hat{x}$ is orthogonal to each of the column vectors $a_j$ of the matrix $A$, and hence will be a least-squares solution to $Ax = b$.
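The normal equations (*) can be formed and solved directly. Below is a short sketch in the same NumPy setting (A and b are again illustrative assumptions); the final line checks the orthogonality condition $A^T(b - A\hat{x}) = 0$ that was just derived.

import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Solve the normal equations (A^T A) x_hat = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# The residual b - A x_hat is orthogonal to every column of A,
# so A^T (b - A x_hat) should be the zero vector (up to round-off).
print(x_hat)
print(A.T @ (b - A @ x_hat))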

For this reason, we call the system of equations determined by (*) the normal equations for $Ax = b$. Our discussion proves the

Theorem. The set of least-squares solutions to $Ax = b$ is precisely the solution set of the system of normal equations $(A^T A)\hat{x} = A^T b$. // (See Example 2, p. 412.)

In many cases, the least-squares problem $Ax = b$ comes with a matrix $A$ having linearly independent columns, so that $A$ has a QR decomposition. We can exploit this property to compute the least-squares solution more quickly, as we now show.

Theorem. If $A$ is an $m \times n$ matrix with linearly independent columns and $A = QR$ is any QR decomposition of $A$, then the vector $\hat{x} = R^{-1} Q^T b$ is the only least-squares solution to the system with matrix equation $Ax = b$. That is, the least-squares solution to $Ax = b$ is the solution to the system $Rx = Q^T b$.

Proof. Recall that the columns of $Q$ form an orthonormal basis for $\operatorname{Col} A$, so the projection of $b$ onto $\operatorname{Col} A$ can be computed as $\hat{b} = QQ^T b$. Then $\hat{b} = QQ^T b = Q(RR^{-1})Q^T b = (QR)(R^{-1} Q^T b) = A(R^{-1} Q^T b)$ shows that the vector $\hat{x} = R^{-1} Q^T b$ satisfies $A\hat{x} = \hat{b}$. That is, $\hat{x} = R^{-1} Q^T b$ is a least-squares solution for $Ax = b$. Conversely, if $\hat{x}$ is any least-squares solution for $Ax = b$, then it satisfies the normal equations $(A^T A)\hat{x} = A^T b$. But since the columns of $Q$ are orthonormal, $Q^T Q = I$, and so $A^T A = (QR)^T QR = R^T Q^T Q R = R^T R$. Therefore, the normal equations take the form $R^T R \hat{x} = (QR)^T b = R^T Q^T b$. Since $R$ is invertible, so is $R^T$, so multiplication first by the inverse of $R^T$, then by the inverse of $R$, yields $\hat{x} = R^{-1} Q^T b$, showing that there is only one least-squares solution to $Ax = b$. //
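Here is a sketch of the QR route, once more with the illustrative A and b used above. np.linalg.qr returns the reduced factorization A = QR with orthonormal columns in Q and upper triangular R, and the least-squares solution is obtained by solving $Rx = Q^T b$.

import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Reduced QR factorization: Q is 3 x 2 with orthonormal columns, R is 2 x 2 upper triangular.
Q, R = np.linalg.qr(A)

# Least-squares solution: solve R x = Q^T b (equivalently x_hat = R^{-1} Q^T b).
x_hat = np.linalg.solve(R, Q.T @ b)
print(x_hat)  # agrees with the solution of the normal equations

In practice this route is also preferred numerically, since it avoids forming $A^T A$, whose condition number is the square of that of $A$.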