
Math 57, Spring 18
Projections and Least Squares Solutions

Recall that given an inner product space $V$ with subspace $W$ and orthogonal basis $B = \{v_1, v_2, \dots, v_k\}$ for $W$, the orthogonal projection of $V$ onto $W$ is defined by
\[
\operatorname{Proj}_W(u) = \frac{\langle u, v_1\rangle}{\langle v_1, v_1\rangle}\, v_1 + \frac{\langle u, v_2\rangle}{\langle v_2, v_2\rangle}\, v_2 + \cdots + \frac{\langle u, v_k\rangle}{\langle v_k, v_k\rangle}\, v_k.
\]
This function has the following properties:
(a) It is a linear operator on $V$.
(b) It is a projection operator: $P^2 = P$.
(c) Given a vector $v$, the vector $v - \operatorname{Proj}_W(v)$ is orthogonal to every vector in $W$.

In this set of notes, I address the following question: Suppose $T : U \to V$ is linear but not onto. If $b$ is not in the range of $T$, what can we say about $T(u) = b$ other than that it is inconsistent? That is, what is the best we can do in a situation like this?

Definition 1. A vector $u$ is called a least squares solution to $T(x) = b$ if $\|T(u) - b\|$ is a minimum over all of $U$. That is, $\|T(u) - b\| \le \|T(x) - b\|$ for all $x \in U$.

One way to find a least squares solution is to find an expression for $\|T(x) - b\|$, or more likely $\|T(x) - b\|^2$, and use calculus or some other approach to find a minimum. For example, suppose that $T : \mathbb{R}^{2\times 2} \to \mathbb{R}^{2\times 2}$ is defined by
\[
T\begin{pmatrix} a & b \\ c & d \end{pmatrix} = \begin{pmatrix} a+b & a+b+c \\ a+b+c & a+b+c+d \end{pmatrix}
\]
and suppose we want the least squares solution to $T(A) = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$. An inner product on $\mathbb{R}^{2\times 2}$ is $\langle A, B\rangle = \operatorname{tr}(B^t A)$. We want
\[
\left\langle \begin{pmatrix} a+b & a+b+c \\ a+b+c & a+b+c+d \end{pmatrix} - \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix},\ \begin{pmatrix} a+b & a+b+c \\ a+b+c & a+b+c+d \end{pmatrix} - \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \right\rangle
\]
as small as possible. This calls for minimizing the expression
\[
(a+b-1)^2 + (a+b+c-2)^2 + (a+b+c-3)^2 + (a+b+c+d-4)^2
\]
over all $a, b, c, d \in \mathbb{R}$. As we might have learned in Calc II, one way to do this: get a system of equations in $a, b, c, d$ by taking partials with respect to $a, b, c, d$ and setting them equal to $0$. For us, the system is
\[
\begin{aligned}
4a + 4b + 3c + d &= 10 \\
4a + 4b + 3c + d &= 10 \\
3a + 3b + 3c + d &= 9 \\
a + b + c + d &= 4.
\end{aligned}
\]
The solution to this system is $a = 1 - b$, $c = 3/2$, $d = 3/2$. That is, there are infinitely many least squares solutions, with one of them being $A = \begin{pmatrix} 1 & 0 \\ 3/2 & 3/2 \end{pmatrix}$, with $T(A) = \begin{pmatrix} 1 & 5/2 \\ 5/2 & 4 \end{pmatrix}$.

Now let's apply the theory of projections to this problem.
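As a numerical sanity check, here is a minimal NumPy sketch of this computation (assuming NumPy is available; it is not part of the worked example above). It relies on the fact that the trace inner product on $2\times 2$ matrices is the dot product of the stacked entries, so $T$ can be replaced by a $4\times 4$ matrix acting on $(a, b, c, d)$ and handed to a standard least squares solver.

```python
import numpy as np

# Stack the entries of the 2x2 matrices row by row as 4-vectors. The trace
# inner product tr(B^t A) then becomes the ordinary dot product, so an
# ordinary least squares solve applies.
M = np.array([[1, 1, 0, 0],    # a+b
              [1, 1, 1, 0],    # a+b+c
              [1, 1, 1, 0],    # a+b+c
              [1, 1, 1, 1]])   # a+b+c+d
target = np.array([1, 2, 3, 4])  # entries of [[1, 2], [3, 4]]

# lstsq returns the minimum-norm least squares solution (a, b, c, d).
coeffs, *_ = np.linalg.lstsq(M, target, rcond=None)
print(coeffs)       # [0.5 0.5 1.5 1.5], one of the infinitely many solutions
print(M @ coeffs)   # [1.  2.5 2.5 4. ], i.e. T(A) = [[1, 5/2], [5/2, 4]]

# The solution found by hand, (a, b, c, d) = (1, 0, 3/2, 3/2), achieves the
# same residual, so it is also a least squares solution.
by_hand = np.array([1, 0, 1.5, 1.5])
print(np.linalg.norm(M @ coeffs - target), np.linalg.norm(M @ by_hand - target))
```

The solver happens to return the minimum-norm solution; any $(a, b, c, d)$ with $a + b = 1$ and $c = d = 3/2$ gives the same residual.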

Theorem 1. A vector $u$ is a least squares solution to $T(x) = b$ if and only if $u$ is a solution to $T(x) = \operatorname{Proj}_W(b)$, where $W$ is the range of $T$.

Proof: Let $w = \operatorname{Proj}_W(b)$ and let $x$ be any vector in $U$. We have
\[
\begin{aligned}
\|T(x) - b\|^2 &= \langle T(x) - b,\ T(x) - b\rangle = \langle (T(x) - w) - (b - w),\ (T(x) - w) - (b - w)\rangle \\
&= \langle T(x) - w,\ T(x) - w\rangle - \langle T(x) - w,\ b - w\rangle - \langle b - w,\ T(x) - w\rangle + \langle b - w,\ b - w\rangle.
\end{aligned}
\]
However, $\langle T(x) - w,\ b - w\rangle = 0$ for all $x \in U$. This is because $T(x) - w$ is in $W$, the range of $T$, but $b - w = b - \operatorname{Proj}_W(b)$ is orthogonal to every vector in $W$. Consequently,
\[
\|T(x) - b\|^2 = \|T(x) - w\|^2 + \|b - w\|^2.
\]
Since $\|b - w\|^2$ is a constant, independent of $x$, to minimize this quantity we must minimize $\|T(x) - w\|^2$, which happens if and only if $x$ is a solution to $T(x) = w$.

Back to our example. An alternative to calculus for solving our problem is to project $u = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ onto the range of $T$ and solve the resulting equation. The range of $T$ has basis
\[
\{v_1, v_2, v_3\} = \left\{ \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix},\ \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix},\ \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} \right\}.
\]
By a happy coincidence, this is an orthogonal basis. We have
\[
\operatorname{Proj}_W(u) = \frac{\langle u, v_1\rangle}{\langle v_1, v_1\rangle} v_1 + \frac{\langle u, v_2\rangle}{\langle v_2, v_2\rangle} v_2 + \frac{\langle u, v_3\rangle}{\langle v_3, v_3\rangle} v_3
= \frac{1}{1}\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} + \frac{5}{2}\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} + \frac{4}{1}\begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}
= \begin{pmatrix} 1 & 5/2 \\ 5/2 & 4 \end{pmatrix},
\]
so we solve
\[
\begin{pmatrix} a+b & a+b+c \\ a+b+c & a+b+c+d \end{pmatrix} = \begin{pmatrix} 1 & 5/2 \\ 5/2 & 4 \end{pmatrix}.
\]
This leads again to $a + b = 1$, $c = 3/2$, $d = 3/2$.
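The projection step is easy to reproduce numerically as well. The following NumPy sketch (again an added check, not part of the notes' own calculation) computes $\operatorname{Proj}_W(u)$ with the trace inner product and the orthogonal basis $\{v_1, v_2, v_3\}$ above.

```python
import numpy as np

def ip(A, B):
    """Trace inner product <A, B> = tr(B^t A) on 2x2 matrices."""
    return np.trace(B.T @ A)

u  = np.array([[1.0, 2.0], [3.0, 4.0]])
v1 = np.array([[1.0, 0.0], [0.0, 0.0]])
v2 = np.array([[0.0, 1.0], [1.0, 0.0]])
v3 = np.array([[0.0, 0.0], [0.0, 1.0]])

# Orthogonal projection of u onto W = span{v1, v2, v3}, the range of T.
proj = sum(ip(u, v) / ip(v, v) * v for v in (v1, v2, v3))
print(proj)   # [[1, 2.5], [2.5, 4]], matching the hand computation
```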

Least Squares Solutions to Systems of Equations

We can say much more in the case of solving (inconsistent) systems of equations $Ax = b$, where we use the usual dot product for our inner product. By Theorem 1, a least squares solution will be a solution to $Ax = w$, where $w$ is the projection of $b$ onto the column space of $A$, since the column space of $A$ is the range of the transformation $T(x) = Ax$. Here is an example: Suppose we want the least squares solution to
\[
\begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 3 \\ 6 \\ 3 \end{pmatrix}.
\]
Note that the system is inconsistent. What we seek are $x$ and $y$ so that $Ax$ is as close as possible to $b$. One way is with calculus: Minimize
\[
\left\langle \begin{pmatrix} x+2y-3 \\ 3x+4y-6 \\ 5x+6y-3 \end{pmatrix},\ \begin{pmatrix} x+2y-3 \\ 3x+4y-6 \\ 5x+6y-3 \end{pmatrix} \right\rangle
= (x+2y-3)^2 + (3x+4y-6)^2 + (5x+6y-3)^2.
\]
Using partial derivatives, we get the system of equations
\[
\begin{aligned}
35x + 44y &= 36 \\
44x + 56y &= 48,
\end{aligned}
\]
which has solution $x = -4$, $y = 4$.

Alternatively, we project onto the column space of $A$. For that, we find an orthogonal basis, say
\[
\left\{ \begin{pmatrix} 1 \\ 3 \\ 5 \end{pmatrix},\ \begin{pmatrix} 13 \\ 4 \\ -5 \end{pmatrix} \right\},
\]
for the column space of $A$. The projection of $\begin{pmatrix} 3 \\ 6 \\ 3 \end{pmatrix}$ onto the column space is $\begin{pmatrix} 4 \\ 4 \\ 4 \end{pmatrix}$. Now solve the new system
\[
\begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 4 \\ 4 \\ 4 \end{pmatrix},
\]
a consistent system with solution $\begin{pmatrix} -4 \\ 4 \end{pmatrix}$.

But in this setting, things are nicer than in general.

Theorem 2. If $A$ is an $m \times n$ matrix, then the set of least squares solutions to $Ax = b$ is the same as the set of solutions to $A^tAx = A^tb$. In particular, this last equation is always consistent.

Proof: Let $W$ be the column space of $A$. Then $W^\perp = \operatorname{null}(A^t)$. This is because $A^tv$ is a column vector whose entries are dot products of $v$ with the rows of $A^t$, which are the columns of $A$. So if $A^tv = 0$ then $v$ must be orthogonal to each of the columns of $A$, meaning $v \in W^\perp$. Similarly, if $v \in W^\perp$ then $v$ is orthogonal to every column of $A$, and so $A^tv = 0$ and $v \in \operatorname{null}(A^t)$.

Suppose that $u$ is a least squares solution to $Ax = b$. Then $Au = \operatorname{Proj}_W(b)$, so $b - Au = b - \operatorname{Proj}_W(b)$, a vector orthogonal to every vector in $W$. That is, $b - Au \in W^\perp$. By the comment above, this means $b - Au$ is in the null space of $A^t$, so $A^t(b - Au) = 0$. This rearranges to $A^tAu = A^tb$.

Similarly, if $A^tAu = A^tb$, then $A^t(b - Au) = 0$, so $b - Au \in \operatorname{null}(A^t) = W^\perp$. If we write $Au - b = v \in W^\perp$, then $Au = v + b$. Since Proj is a projection, $\operatorname{Proj}_W(w) = w$ for every $w \in W$. This means $\operatorname{Proj}_W(Au) = Au$. We have
\[
Au = \operatorname{Proj}_W(Au) = \operatorname{Proj}_W(v + b) = \operatorname{Proj}_W(v) + \operatorname{Proj}_W(b).
\]
But $v$ is orthogonal to all vectors in $W$, so $\operatorname{Proj}_W(v) = 0$, giving $Au = \operatorname{Proj}_W(b)$.
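Theorem 2 turns the example above into a single matrix solve. Here is a minimal NumPy sketch of the normal equations for that example (an added check, assuming NumPy):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
b = np.array([3.0, 6.0, 3.0])

# Normal equations A^t A x = A^t b from Theorem 2.
x = np.linalg.solve(A.T @ A, A.T @ b)
print(A.T @ A, A.T @ b)   # [[35 44] [44 56]] and [36 48]
print(x)                  # [-4.  4.]
print(A @ x)              # [4. 4. 4.], the projection of b onto the column space
print(A.T @ (b - A @ x))  # ~[0. 0.]: the residual is orthogonal to the columns of A
```

The last line checks the geometric content of the proof: the residual $b - Au$ lies in $\operatorname{null}(A^t) = W^\perp$.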

Back to our previous example, we can now find the least squares solution to
\[
\begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 3 \\ 6 \\ 3 \end{pmatrix}
\]
by multiplying both sides by $A^t$. Following this approach,
\[
\begin{pmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}
= \begin{pmatrix} 1 & 3 & 5 \\ 2 & 4 & 6 \end{pmatrix} \begin{pmatrix} 3 \\ 6 \\ 3 \end{pmatrix},
\quad\text{or}\quad
\begin{pmatrix} 35 & 44 \\ 44 & 56 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 36 \\ 48 \end{pmatrix},
\]
a system that we solved earlier.

In general, if $A$ is an $m \times n$ matrix of rank $n$, then $A^tA$ will be an invertible matrix. More generally, the rank of $A^tA$ is the same as the rank of $A$. When $A^tA$ is invertible, we can find the least squares solution to $Ax = b$ via the formula $x = (A^tA)^{-1}A^tb$. Moreover, since we are really solving $Ax = \operatorname{Proj}_W(b)$, we end up with a formula for the projection, $\operatorname{Proj}_W(b) = A(A^tA)^{-1}A^tb$. More generally,

Theorem 3. Let $W$ be a subspace of $\mathbb{R}^n$ with basis $\{w_1, w_2, \dots, w_k\}$. Then the standard matrix for the projection of $\mathbb{R}^n$ onto $W$ is $A(A^tA)^{-1}A^t$, where $A$ has the vectors $w_i$ as columns.

As an example, on Homework 9, I asked for the standard matrix for the orthogonal projection of $\mathbb{R}^3$ onto the plane $x + y - 2z = 0$. Here are two solutions, one not making use of Theorem 3, the other using Theorem 3. For the homework, you should have used the first approach.

The first method is to let $W$ be the space defined by the plane, to find an orthogonal basis for the plane, and to use the formula for a projection in terms of an orthogonal basis to get the answer. In the solutions, I found an orthogonal basis by inspection, but let's say we did not do that. Writing $x = -y + 2z$, and writing things horizontally to save vertical space, a basis is $\{(-1, 1, 0), (2, 0, 1)\}$. Orthogonalizing this,
\[
b_2 = (2, 0, 1) - \frac{\langle (2, 0, 1), (-1, 1, 0)\rangle}{\langle (-1, 1, 0), (-1, 1, 0)\rangle}\,(-1, 1, 0) = (2, 0, 1) + (-1, 1, 0) = (1, 1, 1).
\]
This is nice: the natural basis for the plane has been transformed into the basis I used in the homework solution. The rest of the solution is now the same:
\[
T(x, y, z) = \frac{\langle (x, y, z), (-1, 1, 0)\rangle}{\langle (-1, 1, 0), (-1, 1, 0)\rangle}\,(-1, 1, 0) + \frac{\langle (x, y, z), (1, 1, 1)\rangle}{\langle (1, 1, 1), (1, 1, 1)\rangle}\,(1, 1, 1)
= \frac{-x + y}{2}\,(-1, 1, 0) + \frac{x + y + z}{3}\,(1, 1, 1),
\]
with matrix
\[
\begin{pmatrix} 1/2 & -1/2 & 0 \\ -1/2 & 1/2 & 0 \\ 0 & 0 & 0 \end{pmatrix}
+ \begin{pmatrix} 1/3 & 1/3 & 1/3 \\ 1/3 & 1/3 & 1/3 \\ 1/3 & 1/3 & 1/3 \end{pmatrix}
= \begin{pmatrix} 5/6 & -1/6 & 1/3 \\ -1/6 & 5/6 & 1/3 \\ 1/3 & 1/3 & 1/3 \end{pmatrix}.
\]
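Theorem 3 makes this computation mechanical (the hand calculation with the formula follows below). As an added NumPy sketch, build $A$ from the plane's basis, form $A(A^tA)^{-1}A^t$, and confirm it agrees with the matrix just found:

```python
import numpy as np

# Columns are the basis (-1, 1, 0), (2, 0, 1) of the plane x + y - 2z = 0.
A = np.array([[-1.0, 2.0],
              [ 1.0, 0.0],
              [ 0.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T
print(P * 6)                   # entries 5, -1, 2, ... i.e. P has entries 5/6, -1/6, 1/3, ...
print(np.allclose(P @ P, P))   # True: P is a projection
print(P @ np.array([1.0, 1.0, -2.0]))   # ~[0 0 0]: the normal to the plane is sent to zero
```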

The calculation making use of Theorem 3 would go like this: Start with the same basis as before, $\{(-1, 1, 0), (2, 0, 1)\}$, and form a matrix $A$ with these basis vectors as its columns. Then the matrix of the projection is $A(A^tA)^{-1}A^t$. In this case,
\[
A = \begin{pmatrix} -1 & 2 \\ 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad
A^tA = \begin{pmatrix} 2 & -2 \\ -2 & 5 \end{pmatrix}.
\]
The matrix of the transformation is
\[
A(A^tA)^{-1}A^t
= \begin{pmatrix} -1 & 2 \\ 1 & 0 \\ 0 & 1 \end{pmatrix}
\cdot \frac{1}{6}\begin{pmatrix} 5 & 2 \\ 2 & 2 \end{pmatrix}
\cdot \begin{pmatrix} -1 & 1 & 0 \\ 2 & 0 & 1 \end{pmatrix}
= \begin{pmatrix} 5/6 & -1/6 & 1/3 \\ -1/6 & 5/6 & 1/3 \\ 1/3 & 1/3 & 1/3 \end{pmatrix},
\]
as before.

Some special cases that come up frequently: linear and quadratic fits to data. Suppose we have a bunch of points $(x_1, y_1), (x_2, y_2), \dots, (x_k, y_k)$, and we want the best linear approximation, $y = mx + b$, to these points. One can quibble about what best means here, but the most common thing to do is to find a least squares solution: for any given point, we have
\[
\begin{pmatrix} x_i & 1 \end{pmatrix} \begin{pmatrix} m \\ b \end{pmatrix} = y_i,
\]
so the problem can be cast as solving
\[
\begin{pmatrix} x_1 & 1 \\ x_2 & 1 \\ \vdots & \vdots \\ x_k & 1 \end{pmatrix}
\begin{pmatrix} m \\ b \end{pmatrix}
= \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_k \end{pmatrix}.
\]
If we wish for a quadratic fit instead, $y = ax^2 + bx + c$, then the least squares problem is
\[
\begin{pmatrix} x_1^2 & x_1 & 1 \\ x_2^2 & x_2 & 1 \\ \vdots & \vdots & \vdots \\ x_k^2 & x_k & 1 \end{pmatrix}
\begin{pmatrix} a \\ b \\ c \end{pmatrix}
= \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_k \end{pmatrix}.
\]
Note the Vandermonde-like nature of the matrix involved. This extends to higher degree polynomials.
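To make the fitting setup concrete, here is a short NumPy sketch with a small made-up data set (the data and the variable names are purely illustrative): build the Vandermonde-like matrix by hand and solve the least squares problem; NumPy's built-in polyfit does the same thing.

```python
import numpy as np

# Made-up sample data, for illustration only.
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.1, 1.9, 4.2, 8.8, 17.0])

# Linear fit y = m*x + b: rows [x_i, 1].
M1 = np.column_stack([xs, np.ones_like(xs)])
(m, b), *_ = np.linalg.lstsq(M1, ys, rcond=None)
print(m, b)

# Quadratic fit y = a*x^2 + b*x + c: Vandermonde-like rows [x_i^2, x_i, 1].
M2 = np.column_stack([xs**2, xs, np.ones_like(xs)])
coeffs, *_ = np.linalg.lstsq(M2, ys, rcond=None)
print(coeffs)                  # least squares (a, b, c)
print(np.polyfit(xs, ys, 2))   # same coefficients from NumPy's built-in routine
```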

Let's do one final example, the ill-conceived example from class: $T : P_2 \to P_2$ defined by $T(p(x)) = p(1) + \frac{p''(1)}{2}(x - 1)^2$, with the problem being to find the least squares solution to $T(p(x)) = x^2 + x + 1$ with inner product $\langle p(x), q(x)\rangle = \int_0^1 p(x)q(x)\,dx$. I quickly gave up on both approaches to the problem, a calculus-based approach and the approach using projections. Here are both worked out.

For the calculus-based approach, we seek to minimize $\langle T(p(x)) - (x^2 + x + 1),\ T(p(x)) - (x^2 + x + 1)\rangle$. Letting $p(x) = ax^2 + bx + c$,
\[
T(p(x)) = a + b + c + a(x - 1)^2 = ax^2 - 2ax + 2a + b + c.
\]
Working with the square of the norm, we want the minimum of
\[
\int_0^1 \left( (a - 1)x^2 - (2a + 1)x + 2a + b + c - 1 \right)^2 dx.
\]
In class I mentioned that it is easiest to differentiate under the integral sign, giving a partial derivative with respect to $b$ of
\[
\int_0^1 2\left( (a - 1)x^2 - (2a + 1)x + 2a + b + c - 1 \right) dx
= 2\left( \frac{a - 1}{3} - \frac{2a + 1}{2} + 2a + b + c - 1 \right).
\]
Setting this equal to $0$ gives $\frac{4}{3}a + b + c = \frac{11}{6}$. We get the same equation from the partial with respect to $c$. I balked at the partial with respect to $a$, which requires the integral
\[
\int_0^1 2\left( (a - 1)x^2 - (2a + 1)x + 2a + b + c - 1 \right)(x^2 - 2x + 2)\, dx.
\]
I used Maple to get $\frac{28}{15}a + \frac{4}{3}b + \frac{4}{3}c = \frac{137}{60}$. Multiplying the first equation by $-4/3$ and adding to the second gives $a = -\frac{29}{16}$, $b + c = \frac{17}{4}$. That is, there are infinitely many least squares solutions, with one of them (taking $b = 0$) being $p(x) = -\frac{29}{16}x^2 + \frac{17}{4}$.

Now let's use the projection method. We project $x^2 + x + 1$ onto the range of $T$ and solve a new equation. The range of $T$ is spanned by $1$ and $(x - 1)^2$. If we orthogonalize this, one basis is $\left\{ 1,\ (x - 1)^2 - \frac{1}{3} \right\} = \{b_1, b_2\}$, and the projection of $x^2 + x + 1$ onto the range will be
\[
\frac{\langle x^2 + x + 1, b_1\rangle}{\langle b_1, b_1\rangle} b_1 + \frac{\langle x^2 + x + 1, b_2\rangle}{\langle b_2, b_2\rangle} b_2
= \frac{11}{6} + \frac{-29/180}{4/45}\left( (x - 1)^2 - \frac{1}{3} \right)
= -\frac{29}{16}(x - 1)^2 + \frac{39}{16}.
\]
Now we solve $T(p(x)) =$ this new vector, or
\[
a(x - 1)^2 + a + b + c = -\frac{29}{16}(x - 1)^2 + \frac{39}{16}.
\]
We must have $a = -\frac{29}{16}$ and $b + c = \frac{17}{4}$, as before.
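The arithmetic in this projection is easy to double-check symbolically. The sketch below is an added check using SymPy (assuming it is available) with the $\int_0^1$ inner product used above.

```python
import sympy as sp

x = sp.symbols('x')

def ip(p, q):
    # The inner product used in the example: integral of p*q over [0, 1].
    return sp.integrate(p * q, (x, 0, 1))

target = x**2 + x + 1
b1 = sp.Integer(1)
b2 = (x - 1)**2 - sp.Rational(1, 3)   # Gram-Schmidt applied to {1, (x - 1)^2}

proj = ip(target, b1) / ip(b1, b1) * b1 + ip(target, b2) / ip(b2, b2) * b2
print(sp.expand(proj))   # -29*x**2/16 + 29*x/8 + 5/8
print(sp.simplify(proj - (sp.Rational(39, 16) - sp.Rational(29, 16) * (x - 1)**2)))  # 0

# Matching T(p) = (a + b + c) + a*(x - 1)^2 against the projection gives
# a = -29/16 and a + b + c = 39/16, so b + c = 39/16 + 29/16 = 17/4.
print(sp.Rational(39, 16) + sp.Rational(29, 16))   # 17/4
```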
