
Linear Algebra with Computer Science Application

April 8, 2018

1 Least Squares Problems

1.1 Least Squares Problems

What do you do when $Ax = b$ has no solution? Inconsistent systems arise often in applications. When a solution is demanded and none exists, the best one can do is to find an $x$ that makes $Ax$ as close as possible to $b$. Think of $Ax$ as an approximation to $b$. The smaller the distance between $b$ and $Ax$, given by $\|b - Ax\|$, the better the approximation. The general least-squares problem is to find an $x$ that makes $\|b - Ax\|$ as small as possible. The adjective "least-squares" arises from the fact that $\|b - Ax\|$ is the square root of a sum of squares.

Definition 1. If $A$ is $m \times n$ and $b$ is in $\mathbb{R}^m$, a least-squares solution of $Ax = b$ is a vector $\hat{x}$ in $\mathbb{R}^n$ such that
$$\|b - A\hat{x}\| \leq \|b - Ax\| \quad \text{for all } x \in \mathbb{R}^n.$$

The most important aspect of the least-squares problem is that no matter what $x$ we select, the vector $Ax$ will necessarily be in the column space $\operatorname{Col} A$. So we seek an $x$ that makes $Ax$ the closest point in $\operatorname{Col} A$ to $b$. Of course, if $b$ happens to be in $\operatorname{Col} A$, then $b$ is $Ax$ for some $x$, and such an $x$ is a least-squares solution.

1.2 Solution of the general least-squares problem

Given $A$ and $b$ as above, apply the Best Approximation Theorem to the subspace $\operatorname{Col} A$ and let
$$\hat{b} = \operatorname{proj}_{\operatorname{Col} A} b.$$
Because $\hat{b}$ is in the column space of $A$, the equation $Ax = \hat{b}$ is consistent, and there is an $\hat{x}$ in $\mathbb{R}^n$ such that $A\hat{x} = \hat{b}$.

By construction, $b - \hat{b} = b - A\hat{x}$ is orthogonal to $\operatorname{Col} A$, which is exactly the property that makes $A\hat{x}$ the best approximation to $b$ in $\operatorname{Col} A$, and hence makes $\hat{x}$ a least-squares solution. We can write this orthogonality column by column: for each column $a_j$ of $A$,
$$a_j \cdot (b - A\hat{x}) = 0, \qquad \text{that is,} \qquad a_j^T (b - A\hat{x}) = 0.$$
Since the $a_j^T$ are the rows of $A^T$, these equations combine into
$$A^T (b - A\hat{x}) = 0,$$
and therefore, to find $\hat{x}$, we solve
$$A^T A \hat{x} = A^T b.$$
This matrix equation represents a system of equations called the normal equations for $Ax = b$.

Theorem 1. The set of least-squares solutions of $Ax = b$ coincides with the nonempty set of solutions of the normal equations $A^T A x = A^T b$.
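In code, the normal equations translate directly into a few lines of NumPy. The following is a minimal sketch, not part of the original notes; the function name is illustrative, and it assumes the columns of $A$ are linearly independent so that $A^T A$ is invertible (see Theorem 2 below). NumPy's built-in np.linalg.lstsq minimizes $\|b - Ax\|$ directly and also handles the rank-deficient case.

import numpy as np

def least_squares_normal_eq(A, b):
    # Solve the normal equations A^T A x = A^T b.
    # Assumes the columns of A are linearly independent, so A^T A is invertible.
    AtA = A.T @ A
    Atb = A.T @ b
    return np.linalg.solve(AtA, Atb)

# NumPy's built-in routine minimizes ||b - Ax|| directly and is more robust:
# x_hat, residuals, rank, sing_vals = np.linalg.lstsq(A, b, rcond=None)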

1.3 Example

Find a least-squares solution of the inconsistent system $Ax = b$ for
$$A = \begin{bmatrix} 4 & 0 \\ 0 & 2 \\ 1 & 1 \end{bmatrix}, \qquad b = \begin{bmatrix} 2 \\ 0 \\ 11 \end{bmatrix}.$$

Solution. To use the normal equations we compute
$$A^T A = \begin{bmatrix} 4 & 0 & 1 \\ 0 & 2 & 1 \end{bmatrix} \begin{bmatrix} 4 & 0 \\ 0 & 2 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 17 & 1 \\ 1 & 5 \end{bmatrix},$$
$$A^T b = \begin{bmatrix} 4 & 0 & 1 \\ 0 & 2 & 1 \end{bmatrix} \begin{bmatrix} 2 \\ 0 \\ 11 \end{bmatrix} = \begin{bmatrix} 19 \\ 11 \end{bmatrix}.$$
Then the equation $A^T A \hat{x} = A^T b$ becomes
$$\begin{bmatrix} 17 & 1 \\ 1 & 5 \end{bmatrix} \hat{x} = \begin{bmatrix} 19 \\ 11 \end{bmatrix}.$$
Row operations can be used to solve this system, but since $A^T A$ is invertible and $2 \times 2$, it is probably faster to compute
$$(A^T A)^{-1} = \frac{1}{84} \begin{bmatrix} 5 & -1 \\ -1 & 17 \end{bmatrix}$$
and then to solve $A^T A \hat{x} = A^T b$ as
$$\hat{x} = (A^T A)^{-1} A^T b = \frac{1}{84} \begin{bmatrix} 5 & -1 \\ -1 & 17 \end{bmatrix} \begin{bmatrix} 19 \\ 11 \end{bmatrix} = \frac{1}{84} \begin{bmatrix} 84 \\ 168 \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \end{bmatrix}.$$
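The arithmetic in this example is easy to check numerically. Here is a short NumPy sketch, not part of the original notes, that reproduces $A^T A$, $A^T b$, and $\hat{x}$ for the matrices above.

import numpy as np

A = np.array([[4.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
b = np.array([2.0, 0.0, 11.0])

AtA = A.T @ A                      # [[17, 1], [1, 5]]
Atb = A.T @ b                      # [19, 11]
x_hat = np.linalg.solve(AtA, Atb)
print(x_hat)                       # [1. 2.]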

1.4 Example

Find a least-squares solution of $Ax = b$ for
$$A = \begin{bmatrix} 1 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 \\ 1 & 0 & 1 & 0 \\ 1 & 0 & 1 & 0 \\ 1 & 0 & 0 & 1 \\ 1 & 0 & 0 & 1 \end{bmatrix}, \qquad b = \begin{bmatrix} -3 \\ -1 \\ 0 \\ 2 \\ 5 \\ 1 \end{bmatrix}.$$

Solution. Compute
$$A^T A = \begin{bmatrix} 6 & 2 & 2 & 2 \\ 2 & 2 & 0 & 0 \\ 2 & 0 & 2 & 0 \\ 2 & 0 & 0 & 2 \end{bmatrix}, \qquad A^T b = \begin{bmatrix} 4 \\ -4 \\ 2 \\ 6 \end{bmatrix}.$$
The augmented matrix for $A^T A x = A^T b$ is
$$\begin{bmatrix} 6 & 2 & 2 & 2 & 4 \\ 2 & 2 & 0 & 0 & -4 \\ 2 & 0 & 2 & 0 & 2 \\ 2 & 0 & 0 & 2 & 6 \end{bmatrix} \sim \begin{bmatrix} 1 & 0 & 0 & 1 & 3 \\ 0 & 1 & 0 & -1 & -5 \\ 0 & 0 & 1 & -1 & -2 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix}.$$
The general solution is $x_1 = 3 - x_4$, $x_2 = -5 + x_4$, $x_3 = -2 + x_4$, with $x_4$ free. So the general least-squares solution of $Ax = b$ has the form
$$\hat{x} = \begin{bmatrix} 3 \\ -5 \\ -2 \\ 0 \end{bmatrix} + x_4 \begin{bmatrix} -1 \\ 1 \\ 1 \\ 1 \end{bmatrix}.$$
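In this example $A^T A$ is singular, so there are infinitely many least-squares solutions. The sketch below is not part of the original notes; it uses np.linalg.lstsq, which still returns one least-squares solution in the rank-deficient case, and checks that every member of the family above gives the same residual norm $\|b - A\hat{x}\|$.

import numpy as np

A = np.array([[1, 1, 0, 0],
              [1, 1, 0, 0],
              [1, 0, 1, 0],
              [1, 0, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1]], dtype=float)
b = np.array([-3, -1, 0, 2, 5, 1], dtype=float)

# A^T A is singular, so np.linalg.solve would fail on the normal equations;
# lstsq still returns a least-squares solution (the one of minimum norm).
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_lstsq)                                 # one particular least-squares solution

# Every member of the family x_hat = (3, -5, -2, 0) + x4 * (-1, 1, 1, 1)
# gives the same residual norm, because (-1, 1, 1, 1) is in Nul A.
x_particular = np.array([3.0, -5.0, -2.0, 0.0])
direction = np.array([-1.0, 1.0, 1.0, 1.0])
for x4 in (0.0, 1.0, 2.5):
    x_hat = x_particular + x4 * direction
    print(x4, np.linalg.norm(b - A @ x_hat))   # same value each time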

The next theorem gives useful criteria for determining when there is only one least-squares solution of $Ax = b$. (Of course, the orthogonal projection $\hat{b}$ is always unique.)

1.5 Least-squares solutions

Theorem 2. Let $A$ be an $m \times n$ matrix. The following statements are logically equivalent:

(i) The equation $Ax = b$ has a unique least-squares solution for each $b$ in $\mathbb{R}^m$.
(ii) The columns of $A$ are linearly independent.
(iii) The matrix $A^T A$ is invertible.

When these statements are true, the least-squares solution $\hat{x}$ is given by
$$\hat{x} = (A^T A)^{-1} A^T b.$$

When a least-squares solution $\hat{x}$ is used to produce $A\hat{x}$ as an approximation to $b$, the distance from $b$ to $A\hat{x}$ is called the least-squares error of this approximation.

1.6 Example

Given $A$ and $b$ as in the first example, determine the least-squares error in the least-squares solution of $Ax = b$.

Solution. We had
$$b = \begin{bmatrix} 2 \\ 0 \\ 11 \end{bmatrix} \qquad \text{and} \qquad A\hat{x} = \begin{bmatrix} 4 \\ 4 \\ 3 \end{bmatrix},$$
so the error is given by
$$b - A\hat{x} = \begin{bmatrix} 2 \\ 0 \\ 11 \end{bmatrix} - \begin{bmatrix} 4 \\ 4 \\ 3 \end{bmatrix} = \begin{bmatrix} -2 \\ -4 \\ 8 \end{bmatrix}$$
and
$$\|b - A\hat{x}\| = \sqrt{4 + 16 + 64} = \sqrt{84}.$$
The least-squares error is $\sqrt{84}$. For any $x$ in $\mathbb{R}^2$, the distance between $b$ and the vector $Ax$ is at least $\sqrt{84}$. (The notes include a figure at this point, not reproduced in this transcription, illustrating the distance $\sqrt{84}$ from $b$ to $A\hat{x}$ in $\operatorname{Col} A$; note that the least-squares solution $\hat{x}$ itself does not appear in the figure.)
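The error computation can be confirmed numerically; a short NumPy sketch, not part of the original notes:

import numpy as np

A = np.array([[4.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
b = np.array([2.0, 0.0, 11.0])
x_hat = np.array([1.0, 2.0])

residual = b - A @ x_hat               # [-2, -4, 8]
error = np.linalg.norm(residual)       # sqrt(84), about 9.17
print(error, np.sqrt(84))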

1.7 Alternative calculations of least-squares solutions

The next example shows how to find a least-squares solution of $Ax = b$ when the columns of $A$ are orthogonal. Such matrices often appear in linear regression problems, discussed in the next section.

1.8 Example

Find a least-squares solution of $Ax = b$ for
$$A = \begin{bmatrix} 1 & -6 \\ 1 & -2 \\ 1 & 1 \\ 1 & 7 \end{bmatrix}, \qquad b = \begin{bmatrix} -1 \\ 2 \\ 1 \\ 6 \end{bmatrix}.$$

Solution. Because the columns $a_1$ and $a_2$ of $A$ are orthogonal, the orthogonal projection of $b$ onto $\operatorname{Col} A$ is given by
$$\hat{b} = \frac{b \cdot a_1}{a_1 \cdot a_1}\, a_1 + \frac{b \cdot a_2}{a_2 \cdot a_2}\, a_2 = 2 a_1 + \frac{1}{2} a_2,$$
$$\hat{b} = \begin{bmatrix} 2 \\ 2 \\ 2 \\ 2 \end{bmatrix} + \begin{bmatrix} -3 \\ -1 \\ 1/2 \\ 7/2 \end{bmatrix} = \begin{bmatrix} -1 \\ 1 \\ 5/2 \\ 11/2 \end{bmatrix}.$$
Now that $\hat{b}$ is known, we can solve $A\hat{x} = \hat{b}$. But this is trivial, since we already know what weights to place on the columns of $A$ to produce $\hat{b}$. It is clear from the equation above that
$$\hat{x} = \begin{bmatrix} 2 \\ 1/2 \end{bmatrix}.$$
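When the columns of $A$ are orthogonal, the weights in the projection formula are just ratios of dot products. A NumPy sketch of this example, not part of the original notes (the names w1, w2 stand for the two weights):

import numpy as np

A = np.array([[1.0, -6.0],
              [1.0, -2.0],
              [1.0,  1.0],
              [1.0,  7.0]])
b = np.array([-1.0, 2.0, 1.0, 6.0])

a1, a2 = A[:, 0], A[:, 1]
w1 = (b @ a1) / (a1 @ a1)      # 8 / 4  = 2
w2 = (b @ a2) / (a2 @ a2)      # 45 / 90 = 0.5
b_hat = w1 * a1 + w2 * a2      # [-1, 1, 2.5, 5.5]
x_hat = np.array([w1, w2])     # [2, 0.5]
print(x_hat, b_hat)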

1.9 Numerical notes

In some cases, the normal equations for a least-squares problem can be ill-conditioned; that is, small errors in the calculation of the entries of $A^T A$ can sometimes cause relatively large errors in the solution $\hat{x}$. If the columns of $A$ are linearly independent, the least-squares solution can often be computed more reliably through a QR factorization of $A$.

1.10 Least-squares and QR factorization

Theorem 3. Given an $m \times n$ matrix $A$ with linearly independent columns, let $A = QR$ be a QR factorization of $A$. Then, for each $b$ in $\mathbb{R}^m$, the equation $Ax = b$ has a unique least-squares solution, given by
$$\hat{x} = R^{-1} Q^T b.$$

1.11 Example

Find the least-squares solution of $Ax = b$ for
$$A = \begin{bmatrix} 1 & 3 & 5 \\ 1 & 1 & 0 \\ 1 & 1 & 2 \\ 1 & 3 & 3 \end{bmatrix}, \qquad b = \begin{bmatrix} 3 \\ 5 \\ 7 \\ -3 \end{bmatrix}.$$

Solution. A QR factorization of $A$ can be obtained as
$$Q = \frac{1}{2}\begin{bmatrix} 1 & 1 & 1 \\ 1 & -1 & -1 \\ 1 & -1 & 1 \\ 1 & 1 & -1 \end{bmatrix}, \qquad R = \begin{bmatrix} 2 & 4 & 5 \\ 0 & 2 & 3 \\ 0 & 0 & 2 \end{bmatrix}.$$
Then
$$Q^T b = \begin{bmatrix} 6 \\ -6 \\ 4 \end{bmatrix},$$
and solving $R\hat{x} = Q^T b$ by back substitution gives
$$\hat{x} = \begin{bmatrix} 10 \\ -6 \\ 2 \end{bmatrix}.$$
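For completeness, here is a NumPy sketch of the QR route, not part of the original notes. np.linalg.qr may choose different signs for the columns of $Q$ and rows of $R$ than the hand computation above, but $\hat{x} = R^{-1} Q^T b$ comes out the same either way.

import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 2.0],
              [1.0, 3.0, 3.0]])
b = np.array([3.0, 5.0, 7.0, -3.0])

Q, R = np.linalg.qr(A)               # reduced QR: Q is 4x3, R is 3x3 upper triangular
x_hat = np.linalg.solve(R, Q.T @ b)  # solves R x = Q^T b (back substitution in effect)
print(x_hat)                         # [10. -6.  2.]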