
5.4 LEAST SQUARES AND DATA FITTING

ANOTHER CHARACTERIZATION OF ORTHOGONAL COMPLEMENTS

Consider a subspace $V = \text{im}(A)$ of $\mathbb{R}^n$, where $A = [\,\vec{v}_1\ \vec{v}_2\ \dots\ \vec{v}_m\,]$. Then,

$$V^\perp = \{\vec{x} \text{ in } \mathbb{R}^n : \vec{v} \cdot \vec{x} = 0 \text{ for all } \vec{v} \text{ in } V\} = \{\vec{x} \text{ in } \mathbb{R}^n : \vec{v}_i \cdot \vec{x} = 0, \text{ for } i = 1, \dots, m\} = \{\vec{x} \text{ in } \mathbb{R}^n : \vec{v}_i^T \vec{x} = 0, \text{ for } i = 1, \dots, m\}.$$

In other words, $V^\perp$ is the kernel of the matrix

$$A^T = \begin{bmatrix} \vec{v}_1^T \\ \vec{v}_2^T \\ \vdots \\ \vec{v}_m^T \end{bmatrix}.$$
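Computationally, this characterization reduces finding $V^\perp$ to a null-space computation: stack the spanning vectors as columns of $A$ and compute the null space of $A^T$. A minimal sketch in Python with NumPy (the spanning vectors are an illustrative choice, not from the text):

    import numpy as np

    # Columns of A span V; by the characterization above, V-perp = ker(A^T).
    A = np.array([[1., 1.],
                  [2., 0.],
                  [3., 1.]])          # illustrative: V = im(A) is a plane in R^3

    # Null space of A^T via the SVD: rows of Vt beyond the rank span ker(A^T).
    U, s, Vt = np.linalg.svd(A.T)
    rank = np.sum(s > 1e-10)
    perp_basis = Vt[rank:].T          # columns form a basis of V-perp

    print(np.allclose(A.T @ perp_basis, 0))   # True: each basis vector is in ker(A^T)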

Fact 5.4.1 For any matrix $A$, $(\text{im}\,A)^\perp = \ker(A^T)$.

Example: Consider the line

$$V = \text{im}\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}.$$

Then $V^\perp = \ker[\,1\ 2\ 3\,]$ is the plane with equation $x_1 + 2x_2 + 3x_3 = 0$. (See Figure 1.)

Fact 5.4.2 Consider a subspace $V$ of $\mathbb{R}^n$. Then,

a. $\dim(V) + \dim(V^\perp) = n$
b. $(V^\perp)^\perp = V$
c. $V \cap V^\perp = \{\vec{0}\}$

Proof

a. Let $T(\vec{x}) = \text{proj}_V \vec{x}$ be the orthogonal projection onto $V$. Note that $\text{im}(T) = V$ and $\ker(T) = V^\perp$. Fact 3.3.9 tells us that

$$n = \dim(\ker T) + \dim(\text{im}\,T) = \dim(V^\perp) + \dim(V).$$

b. First observe that $V \subseteq (V^\perp)^\perp$, since a vector in $V$ is orthogonal to every vector in $V^\perp$ (by definition of $V^\perp$). Furthermore, the dimensions of the two spaces are equal, by part (a):

$$\dim((V^\perp)^\perp) = n - \dim(V^\perp) = n - (n - \dim(V)) = \dim(V).$$

It follows that the two spaces are equal. (See Exercise 3.3.41.)

c. If $\vec{x}$ is in $V$ and in $V^\perp$, then $\vec{x}$ is orthogonal to itself; that is, $\vec{x} \cdot \vec{x} = \|\vec{x}\|^2 = 0$, and thus $\vec{x} = \vec{0}$.
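The dimension formula of Fact 5.4.2(a) can be confirmed numerically for the same illustrative subspace used above; a sketch:

    import numpy as np

    A = np.array([[1., 1.],
                  [2., 0.],
                  [3., 1.]])              # illustrative: V = im(A) inside R^3
    n = A.shape[0]

    dim_V = np.linalg.matrix_rank(A)       # dim(V) = rank of A
    s = np.linalg.svd(A.T, compute_uv=False)
    dim_Vperp = n - np.sum(s > 1e-10)      # nullity of A^T = dim(V-perp)

    print(dim_V + dim_Vperp == n)          # True, as Fact 5.4.2(a) predicts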

Fact 5.4.3

a. If $A$ is an $m \times n$ matrix, then $\ker(A) = \ker(A^T A)$.

b. If $A$ is an $m \times n$ matrix with $\ker(A) = \{\vec{0}\}$, then $A^T A$ is invertible.

Proof

a. Clearly, the kernel of $A$ is contained in the kernel of $A^T A$. Conversely, consider a vector $\vec{x}$ in the kernel of $A^T A$, so that $A^T A \vec{x} = \vec{0}$. Then $A\vec{x}$ is in the image of $A$ and in the kernel of $A^T$. Since $\ker(A^T)$ is the orthogonal complement of $\text{im}(A)$ by Fact 5.4.1, the vector $A\vec{x}$ is $\vec{0}$ by Fact 5.4.2(c); that is, $\vec{x}$ is in the kernel of $A$.

b. Note that $A^T A$ is an $n \times n$ matrix. By part (a), $\ker(A^T A) = \{\vec{0}\}$, and $A^T A$ is therefore invertible. (See Summary 3.3.11.)
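A quick numerical check of Fact 5.4.3 (a sketch; the matrix is an illustrative choice):

    import numpy as np

    A = np.array([[1., 1.],
                  [1., 2.],
                  [1., 3.]])       # illustrative: independent columns, so ker(A) = {0}

    AtA = A.T @ A
    print(np.linalg.matrix_rank(A) == A.shape[1])     # True: ker(A) = {0}
    print(np.linalg.matrix_rank(AtA) == A.shape[1])   # True: ker(A^T A) = {0}, part (a)
    print(abs(np.linalg.det(AtA)) > 1e-10)            # True: A^T A invertible, part (b)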

An Alternative Characterization of Orthogonal Projections

Fact 5.4.4 Consider a vector $\vec{x}$ in $\mathbb{R}^n$ and a subspace $V$ of $\mathbb{R}^n$. Then the orthogonal projection $\text{proj}_V \vec{x}$ is the vector in $V$ closest to $\vec{x}$, in that

$$\|\vec{x} - \text{proj}_V \vec{x}\| < \|\vec{x} - \vec{v}\|,$$

for all $\vec{v}$ in $V$ different from $\text{proj}_V \vec{x}$.

Least-Squares Approximations

Definition 5.4.5 Least-squares solution

Consider a linear system $A\vec{x} = \vec{b}$, where $A$ is an $m \times n$ matrix. A vector $\vec{x}^*$ in $\mathbb{R}^n$ is called a least-squares solution of this system if

$$\|\vec{b} - A\vec{x}^*\| \le \|\vec{b} - A\vec{x}\| \quad \text{for all } \vec{x} \text{ in } \mathbb{R}^n.$$
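Fact 5.4.4 can be illustrated numerically: no vector in $V$ comes closer to $\vec{x}$ than the projection. In this sketch the projection is obtained from a least-squares solve (anticipating the discussion below); the subspace and test vector are illustrative:

    import numpy as np

    A = np.array([[1., 1.],
                  [1., 2.],
                  [1., 3.]])               # illustrative: V = im(A)
    x = np.array([0., 0., 6.])

    # proj_V x = A c*, where c* is a least-squares solution of A c = x.
    c_star, *_ = np.linalg.lstsq(A, x, rcond=None)
    proj = A @ c_star

    # No randomly chosen vector in V is closer to x than the projection.
    rng = np.random.default_rng(0)
    for _ in range(1000):
        v = A @ rng.normal(size=2)         # a random vector in V
        assert np.linalg.norm(x - proj) <= np.linalg.norm(x - v) + 1e-12
    print("proj_V x =", proj)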

The vector $\vec{x}^*$ is a least-squares solution of the system $A\vec{x} = \vec{b}$

$\iff$ (Def. 5.4.5) $\|\vec{b} - A\vec{x}^*\| \le \|\vec{b} - A\vec{x}\|$ for all $\vec{x}$ in $\mathbb{R}^n$

$\iff$ (Fact 5.4.4) $A\vec{x}^* = \text{proj}_V \vec{b}$, where $V = \text{im}(A)$

$\iff$ (Facts 5.1.6 and 5.4.1) $\vec{b} - A\vec{x}^*$ is in $V^\perp = (\text{im}\,A)^\perp = \ker(A^T)$

$\iff$ $A^T(\vec{b} - A\vec{x}^*) = \vec{0}$

$\iff$ $A^T A \vec{x}^* = A^T \vec{b}$

Fact 5.4.6 The normal equation

The least-squares solutions of the system $A\vec{x} = \vec{b}$ are the exact solutions of the (consistent) system

$$A^T A \vec{x} = A^T \vec{b}.$$

The system $A^T A \vec{x} = A^T \vec{b}$ is called the normal equation of $A\vec{x} = \vec{b}$.

Fact 5.4.7 If $\ker(A) = \{\vec{0}\}$, then the linear system $A\vec{x} = \vec{b}$ has the unique least-squares solution

$$\vec{x}^* = (A^T A)^{-1} A^T \vec{b}.$$
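In floating-point practice one typically solves the normal equation (or calls a QR/SVD-based least-squares routine) rather than inverting $A^T A$ explicitly; the two routes agree whenever $\ker(A) = \{\vec{0}\}$. A minimal sketch with an illustrative $A$ and $\vec{b}$:

    import numpy as np

    A = np.array([[1., 0.],
                  [0., 1.],
                  [1., 1.]])               # illustrative, ker(A) = {0}
    b = np.array([1., 2., 0.])

    # Route 1: solve the normal equation A^T A x = A^T b (Fact 5.4.6).
    x_normal = np.linalg.solve(A.T @ A, A.T @ b)

    # Route 2: a library least-squares solver (more numerically stable).
    x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

    print(np.allclose(x_normal, x_lstsq))              # True
    print(np.allclose(A.T @ (b - A @ x_normal), 0))    # True: residual in ker(A^T)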

Example 1 Use Fact 5.4.7 to find the least-squares solution $\vec{x}^*$ of the system $A\vec{x} = \vec{b}$, where

$$A = \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \end{bmatrix} \quad \text{and} \quad \vec{b} = \begin{bmatrix} 0 \\ 0 \\ 6 \end{bmatrix}.$$

What is the geometric relationship between $A\vec{x}^*$ and $\vec{b}$?

Solution We compute

$$\vec{x}^* = (A^T A)^{-1} A^T \vec{b} = \begin{bmatrix} -4 \\ 3 \end{bmatrix} \quad \text{and} \quad A\vec{x}^* = \begin{bmatrix} -1 \\ 2 \\ 5 \end{bmatrix}.$$

Recall that $A\vec{x}^*$ is the orthogonal projection of $\vec{b}$ onto the image of $A$.
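Example 1's numbers are quick to reproduce (a sketch using the same $A$ and $\vec{b}$):

    import numpy as np

    A = np.array([[1., 1.],
                  [1., 2.],
                  [1., 3.]])
    b = np.array([0., 0., 6.])

    x_star = np.linalg.inv(A.T @ A) @ A.T @ b   # Fact 5.4.7
    print(x_star)         # [-4.  3.]
    print(A @ x_star)     # [-1.  2.  5.], the projection of b onto im(A)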

Fact 5.4.8 The matrix of an orthogonal projection

Consider a subspace $V$ of $\mathbb{R}^n$ with basis $\vec{v}_1, \vec{v}_2, \dots, \vec{v}_m$. Let

$$A = [\,\vec{v}_1\ \vec{v}_2\ \dots\ \vec{v}_m\,].$$

Then the matrix of the orthogonal projection onto $V$ is

$$A(A^T A)^{-1} A^T.$$

This means we are not required to find an orthonormal basis of $V$ here. If the vectors $\vec{v}_i$ happen to be orthonormal, then $A^T A = I_m$ and the formula simplifies to $A A^T$. (See Fact 5.3.10.)

Example 2 Find the matrix of the orthogonal projection onto the subspace of $\mathbb{R}^4$ spanned by the vectors

$$\begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix} \quad \text{and} \quad \begin{bmatrix} 1 \\ 2 \\ 3 \\ 4 \end{bmatrix}.$$

Solution Let

$$A = \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \\ 1 & 4 \end{bmatrix}$$

and compute

$$A(A^T A)^{-1} A^T = \frac{1}{10}\begin{bmatrix} 7 & 4 & 1 & -2 \\ 4 & 3 & 2 & 1 \\ 1 & 2 & 3 & 4 \\ -2 & 1 & 4 & 7 \end{bmatrix}.$$
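A sketch reproducing Example 2, with two sanity checks: an orthogonal projection matrix is idempotent and symmetric.

    import numpy as np

    A = np.array([[1., 1.],
                  [1., 2.],
                  [1., 3.],
                  [1., 4.]])

    P = A @ np.linalg.inv(A.T @ A) @ A.T   # Fact 5.4.8
    print(np.round(10 * P))                # the integer matrix above (times 1/10)
    print(np.allclose(P @ P, P))           # True: P is idempotent
    print(np.allclose(P, P.T))             # True: P is symmetric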

Data Fitting

Scientists are often interested in fitting a function of a certain type to data they have gathered. The functions considered could be linear, polynomial, rational, trigonometric, or exponential. The equations we have to solve as we fit data are frequently linear. (See Exercises 29 and 30 of Section 1.1, and Exercises 30 through 33 of Section 1.2.)

Example 3 Find a cubic polynomial whose graph passes through the points (1, 3), (-1, 13), (2, 1), (-2, 33).

Solution We are looking for a function

$$f(t) = c_0 + c_1 t + c_2 t^2 + c_3 t^3$$

such that $f(1) = 3$, $f(-1) = 13$, $f(2) = 1$, $f(-2) = 33$; that is, we have to solve the linear system

$$\begin{aligned} c_0 + c_1 + c_2 + c_3 &= 3 \\ c_0 - c_1 + c_2 - c_3 &= 13 \\ c_0 + 2c_1 + 4c_2 + 8c_3 &= 1 \\ c_0 - 2c_1 + 4c_2 - 8c_3 &= 33 \end{aligned}$$

This linear system has the unique solution

$$\begin{bmatrix} c_0 \\ c_1 \\ c_2 \\ c_3 \end{bmatrix} = \begin{bmatrix} 5 \\ -4 \\ 3 \\ -1 \end{bmatrix}.$$

Thus, the cubic polynomial whose graph passes through the four given data points is

$$f(t) = 5 - 4t + 3t^2 - t^3,$$

as shown in Figure 6.
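With four points and four coefficients the system is square and invertible, so this is plain interpolation rather than least squares. A sketch:

    import numpy as np

    t = np.array([1., -1., 2., -2.])
    y = np.array([3., 13., 1., 33.])

    M = np.vander(t, 4, increasing=True)   # columns 1, t, t^2, t^3
    c = np.linalg.solve(M, y)              # square invertible system: exact solve
    print(c)                               # [ 5. -4.  3. -1.]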

Example 4 Fit a quadratic function to the four data points $(a_1, b_1) = (-1, 8)$, $(a_2, b_2) = (0, 8)$, $(a_3, b_3) = (1, 4)$, and $(a_4, b_4) = (2, 16)$.

Solution We are looking for a function $f(t) = c_0 + c_1 t + c_2 t^2$ such that

$$f(a_1) = b_1,\quad f(a_2) = b_2,\quad f(a_3) = b_3,\quad f(a_4) = b_4;$$

that is,

$$\begin{aligned} c_0 - c_1 + c_2 &= 8 \\ c_0 \phantom{{}+c_1+c_2} &= 8 \\ c_0 + c_1 + c_2 &= 4 \\ c_0 + 2c_1 + 4c_2 &= 16 \end{aligned}$$

or

$$A\begin{bmatrix} c_0 \\ c_1 \\ c_2 \end{bmatrix} = \vec{b}, \quad \text{where } A = \begin{bmatrix} 1 & -1 & 1 \\ 1 & 0 & 0 \\ 1 & 1 & 1 \\ 1 & 2 & 4 \end{bmatrix} \text{ and } \vec{b} = \begin{bmatrix} 8 \\ 8 \\ 4 \\ 16 \end{bmatrix}.$$

We have four equations, corresponding to the four data points, but only three unknowns, the three coefficients of a quadratic polynomial. Check that this system is indeed inconsistent. The least-squares solution is

$$\vec{c}^* = \begin{bmatrix} c_0 \\ c_1 \\ c_2 \end{bmatrix} = (A^T A)^{-1} A^T \vec{b} = \begin{bmatrix} 5 \\ -1 \\ 3 \end{bmatrix}.$$

The least-squares approximation is $f^*(t) = 5 - t + 3t^2$, as shown in Figure 7. This quadratic function $f^*(t)$ fits the data points best, in that the vector

$$A\vec{c}^* = \begin{bmatrix} f^*(a_1) \\ f^*(a_2) \\ f^*(a_3) \\ f^*(a_4) \end{bmatrix}$$

is as close as possible to

$$\vec{b} = \begin{bmatrix} b_1 \\ b_2 \\ b_3 \\ b_4 \end{bmatrix}.$$

This means that

$$\|\vec{b} - A\vec{c}^*\|^2 = (b_1 - f^*(a_1))^2 + (b_2 - f^*(a_2))^2 + (b_3 - f^*(a_3))^2 + (b_4 - f^*(a_4))^2$$

is minimal: the sum of the squares of the vertical distances between graph and data points is minimal. (See Figure 8.)
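A sketch reproducing Example 4; np.linalg.lstsq also reports the minimal value of $\|\vec{b} - A\vec{c}\|^2$, the quantity just discussed:

    import numpy as np

    a = np.array([-1., 0., 1., 2.])
    b = np.array([8., 8., 4., 16.])

    M = np.vander(a, 3, increasing=True)   # columns 1, t, t^2
    c, res, rank, _ = np.linalg.lstsq(M, b, rcond=None)
    print(c)      # [ 5. -1.  3.]  ->  f*(t) = 5 - t + 3t^2
    print(res)    # [20.] : the minimal sum of squared vertical distances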

Example 5 Find the linear function $c_0 + c_1 t$ that best fits the data points $(a_1, b_1), (a_2, b_2), \dots, (a_n, b_n)$, using least squares. Assume that $a_1 \neq a_2$.

Solution We attempt to solve the system

$$\begin{aligned} c_0 + c_1 a_1 &= b_1 \\ c_0 + c_1 a_2 &= b_2 \\ &\ \,\vdots \\ c_0 + c_1 a_n &= b_n \end{aligned}$$

or

$$\begin{bmatrix} 1 & a_1 \\ 1 & a_2 \\ \vdots & \vdots \\ 1 & a_n \end{bmatrix} \begin{bmatrix} c_0 \\ c_1 \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix},$$

or

$$A\begin{bmatrix} c_0 \\ c_1 \end{bmatrix} = \vec{b}.$$

Note that rank(a) = 2, since a 1 a 2. least-squares solution is [ 1... 1 a 1... a n c 0 c 1 = (AT A) 1 A T b = ] 1 a 1.. 1 a n 1 [ 1... 1 a 1... a n The ] b 1. b n = [ n Σ i a i Σ i a i Σ i a 2 i ] 1 [ Σ i b i Σ i a i b i (where i refers to the sum for i = 1,...,n) ] We have found that C 0 = ( i a2 i )( i b i) ( i a i)( i a ib i ) n( i a2 i ) ( i a i) 2, C 1 = n( i a ib i ) ( i a i)( i b i) n( i a2 i ) ( i a i) 2. There formulas are well known to statisticians. There is no need to memorize them.

Example 6 In the accompanying table, we list the scores of five students on the three exams given in a class.

    Exam 1 (h)   Exam 2 (m)   Exam 3 (f)
        76           48           43
        92           92           90
        68           82           64
        86           68           69
        54           70           50

Find the function of the form $f = c_0 + c_1 h + c_2 m$ that best fits these data, using least squares. What score $f$ does your formula predict for Marlisa, another student, whose scores on the first two exams were $h = 92$ and $m = 72$?

Solution We attempt to solve the system

$$\begin{aligned} c_0 + 76c_1 + 48c_2 &= 43 \\ c_0 + 92c_1 + 92c_2 &= 90 \\ c_0 + 68c_1 + 82c_2 &= 64 \\ c_0 + 86c_1 + 68c_2 &= 69 \\ c_0 + 54c_1 + 70c_2 &= 50. \end{aligned}$$

The least-squares solution is

$$\begin{bmatrix} c_0 \\ c_1 \\ c_2 \end{bmatrix} = (A^T A)^{-1} A^T \vec{b} \approx \begin{bmatrix} -42.4 \\ 0.639 \\ 0.799 \end{bmatrix}.$$

The function that gives the best fit is approximately

$$f = -42.4 + 0.639h + 0.799m.$$

For Marlisa, the formula predicts the score

$$f = -42.4 + 0.639 \cdot 92 + 0.799 \cdot 72 \approx 74.$$
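A sketch reproducing Example 6 and the prediction for Marlisa (the design matrix has a column of ones for $c_0$ and columns for $h$ and $m$):

    import numpy as np

    h = np.array([76., 92., 68., 86., 54.])   # first-exam scores
    m = np.array([48., 92., 82., 68., 70.])   # second-exam scores
    f = np.array([43., 90., 64., 69., 50.])   # third-exam scores

    A = np.column_stack([np.ones(len(h)), h, m])
    c, *_ = np.linalg.lstsq(A, f, rcond=None)
    print(np.round(c, 3))                     # approx [-42.4   0.639  0.799]

    # Prediction for Marlisa: h = 92, m = 72.
    print(np.round(c @ [1., 92., 72.]))       # approx 74.0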