The Fundamental Theorem of Linear Algebra

Nicholas Hoell

Contents

1  Prelude: Orthogonal Complements
2  The Fundamental Theorem of Linear Algebra
   2.1  The Diagram
3  Exercises

These notes are going to present a striking result, fundamental to understanding how the concepts we've seen all semester relate to one another. But, equally important, we're going to see a remarkable picture which illustrates the theorem.

1  Prelude: Orthogonal Complements

Before we get to the main theorem, a bit on the terminology it uses. We'll begin with a natural definition.

Definition 1. Let $S \subseteq \mathbb{R}^n$ be a subspace. The set $S^\perp \subseteq \mathbb{R}^n$ is defined to be the collection of all vectors $v \in \mathbb{R}^n$ orthogonal to $S$, and is called the orthogonal complement of $S$ in $\mathbb{R}^n$. In other words,

$$S^\perp = \{\, u \in \mathbb{R}^n \mid u \cdot v = 0 \ \text{for all } v \in S \,\}.$$

You should be able to prove that $S^\perp$ is a subspace of $\mathbb{R}^n$ (see the exercises).

The purpose of these notes is to establish a version of the Fundamental Theorem of Linear Algebra. The result can be thought of as a type of representation theorem: it tells us something about how vectors are structured, by describing the canonical subspaces of a matrix $A$ in which they live. To understand this we first consider the following representation theorem.

Theorem 1. Let $v \in \mathbb{R}^n$ and let $S \subseteq \mathbb{R}^n$ be a subspace. Then there are vectors $s \in S$, $s^\perp \in S^\perp$ such that

$$v = s + s^\perp.$$

In other words, vectors can be expressed in terms of pieces living in orthogonal spaces. The above theorem is sometimes expressed in the notation $\mathbb{R}^n = S \oplus S^\perp$, where $\oplus$ is called the "direct sum" of $S$ and $S^\perp$.

Proof. The proof needs a small fact which you will see if you take MAT224. Here's the fact: not only does every subspace $S$ of $\mathbb{R}^n$ have a basis, it actually has an orthonormal basis, namely a basis $B$ consisting of vectors $s_j$ where $\|s_j\| = 1$ holds for all $j$ and $s_i \cdot s_j = 0$ when $i \neq j$. The hard part of this statement is actually showing that subspaces have bases (which we've done). From there the idea is to use an algorithm to turn a given basis into one in which all the vectors are made orthogonal and of unit length. But it's a technical detail I don't want to describe here; I only want to use the result. So let's agree $S$ has an orthonormal basis $\{s_1, \dots, s_{\dim S}\}$ as described. Set

$$s = (v \cdot s_1)s_1 + \cdots + (v \cdot s_{\dim S})s_{\dim S},$$

which must clearly be an element of $S$. Then define $s^\perp = v - s$. It's obvious that $v = s + s^\perp$, and you can verify that $s^\perp \in S^\perp$, giving the result.

Where this is going: We'll soon see that for an $m \times n$ matrix $A$ we have $(\ker A)^\perp = \mathrm{col}(A^T)$. If this is true, then that means (using the notation above) that $\mathbb{R}^n = \ker(A) \oplus \mathrm{col}(A^T)$. In this case a vector $x \in \mathbb{R}^n$ can be written as $x = p + v_h$ where $A v_h = 0$ and $p \in \mathrm{row}(A)$, since $\mathrm{row}(A) = \mathrm{col}(A^T)$. In other words, if $b \in \mathrm{col}(A)$ we can solve $Ax = b$, and since $x \in \mathbb{R}^n = \ker(A) \oplus \mathrm{col}(A^T)$ we can write $x$ as

$$x = p + v_h.$$

The space of possible $v_h$'s is $\dim(\ker(A))$-dimensional, so the nullity describes the lack of unique solvability for the linear system $Ax = b$.

2  The Fundamental Theorem of Linear Algebra

We can now get on with proving the main theorem in this course, the capstone to our understanding of what it means to solve systems of linear equations.

Proposition 1. Suppose $A$ is an $m \times n$ matrix. Then $\mathrm{col}(A^T)^\perp = \ker A$.
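Before proving Proposition 1, here is a small numerical sketch of the construction in the proof of Theorem 1. The subspace, its orthonormal basis, and the vector $v$ are hand-picked illustrations of mine, not examples from the notes; everything is plain Python so the arithmetic stays exact.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def scale(c, u):
    return [c * a for a in u]

# Orthonormal basis for the plane S = {x in R^3 : x3 = 0}.
basis = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]

v = [3.0, -2.0, 5.0]

# s = (v . s1) s1 + ... + (v . s_dimS) s_dimS, exactly as in the proof.
s = [0.0, 0.0, 0.0]
for sj in basis:
    s = [a + c for a, c in zip(s, scale(dot(v, sj), sj))]

s_perp = [a - c for a, c in zip(v, s)]   # s_perp = v - s

print(s)        # [3.0, -2.0, 0.0] -- the piece in S
print(s_perp)   # [0.0, 0.0, 5.0]  -- the piece in S-perp
print(all(dot(s_perp, sj) == 0.0 for sj in basis))   # True: s_perp is orthogonal to S
```

The loop is the formula $s = (v \cdot s_1)s_1 + \cdots + (v \cdot s_{\dim S})s_{\dim S}$ from the proof, and the final check verifies $s^\perp \in S^\perp$ against each basis vector.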

Proof. Suppose $v \in \ker A$. Then

$$Av = \begin{pmatrix} \mathrm{row}_1(A) \cdot v \\ \mathrm{row}_2(A) \cdot v \\ \vdots \\ \mathrm{row}_m(A) \cdot v \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix},$$

so $v$ is orthogonal to $\mathrm{row}_i(A)$ for $i = 1, \dots, m$. Therefore $v \in (\mathrm{row}(A))^\perp = (\mathrm{col}(A^T))^\perp$, i.e. $\ker A \subseteq (\mathrm{col}(A^T))^\perp$. As well, if $v \in (\mathrm{col}(A^T))^\perp$ then, in particular, we must have $v \cdot \mathrm{row}_i(A) = 0$ for $i = 1, \dots, m$. But in this case we'd again have $v \in \ker A$. Thus $(\mathrm{col}(A^T))^\perp \subseteq \ker A$. Since the sets contain each other they must be equal, and we're done.

Next we check that $(S^\perp)^\perp = S$ for any subspace. If $v \in (S^\perp)^\perp$ then clearly $v \cdot s^\perp = 0$ for all $s^\perp \in S^\perp$. If $\{s_1, \dots, s_{\dim S}\}$ is an orthonormal basis for $S$ and $\{s_1^\perp, \dots, s_{\dim S^\perp}^\perp\}$ is an orthonormal basis for $S^\perp$, we can write out

$$v = (v \cdot s_1)s_1 + \cdots + (v \cdot s_{\dim S})s_{\dim S} + (v \cdot s_1^\perp)s_1^\perp + \cdots + (v \cdot s_{\dim S^\perp}^\perp)s_{\dim S^\perp}^\perp = (v \cdot s_1)s_1 + \cdots + (v \cdot s_{\dim S})s_{\dim S} \in S,$$

where the second group of terms vanishes because $v \cdot s_j^\perp = 0$ for each $j$. So $(S^\perp)^\perp \subseteq S$. As well, clearly for $s \in S$ we have $s \in (S^\perp)^\perp$, since $s \cdot s^\perp = 0$ must be true for all $s^\perp \in S^\perp$. Thus $S \subseteq (S^\perp)^\perp$. From these facts it follows that $S = (S^\perp)^\perp$ must hold for all subspaces $S \subseteq \mathbb{R}^n$. (Make sure you can verify this statement.)

Applying this general fact to the previous proposition, we immediately get the following.

Theorem 2 (Fundamental Theorem of Linear Algebra). Suppose $A$ is an $m \times n$ matrix. Then

$$\mathrm{col}(A^T) = (\ker A)^\perp,$$

so that

$$\mathbb{R}^n = \ker(A) \oplus \mathrm{col}(A^T)$$

gives an orthogonal decomposition of $\mathbb{R}^n$ into the null space and the row space of the matrix $A$. Therefore, for $b \in \mathrm{col}(A)$, the system $Ax = b$ is solved by

$$x = p + v_h$$

for $p \in \mathrm{row}(A)$ a particular solution, $Ap = b$, and $v_h \in \ker A$ a generic vector in $\ker A$.

2.1  The Diagram

The content of this theorem, the fundamental theorem of linear algebra, is encapsulated in the following figure. If a picture is worth a thousand words, this figure is worth at least several hours' thought.
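A hand-worked instance of Proposition 1 and Theorem 2, in plain Python with exact fractions. The matrix $A$ and the vectors $v_h$, $p$ below are illustrative choices of mine, not examples from the notes.

```python
from fractions import Fraction as F

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A = [[1, 0, 1],
     [0, 1, 1]]                   # 2 x 3, rank r = 2, nullity n - r = 1
b = [2, 3]

v_h = [-1, -1, 1]                 # spans null(A)
p = [F(1, 3), F(4, 3), F(5, 3)]   # particular solution chosen inside row(A)

x = [pi + vi for pi, vi in zip(p, v_h)]   # another solution of Ax = b

# Proposition 1: v_h in ker A is orthogonal to every row of A ...
print([dot(row, v_h) for row in A])   # [0, 0]
# ... and p, being orthogonal to null(A), lies in row(A) = (ker A)^perp.
print(dot(p, v_h))                    # 0

# Theorem 2: Ap = b, and A(p + v_h) = b as well (both give 2 and 3 exactly).
print([dot(row, p) for row in A])
print([dot(row, x) for row in A])
```

The point of the two final prints is the picture's message: every solution of $Ax = b$ is the single row-space solution $p$ shifted by something from the null space.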

[Figure 1: Solving $Ax = b$ for an $m \times n$ matrix $A$ with $\mathrm{rank}(A) = r$. In $\mathbb{R}^n$: $\mathrm{row}(A)$, of dimension $r$, containing $p$ with $Ap = b$; and $\mathrm{null}(A)$, of dimension $n - r$, containing $v_h$ with $Av_h = 0$. In $\mathbb{R}^m$: $\mathrm{col}(A)$, of dimension $r$, containing $b$; and $\mathrm{null}(A^T)$, of dimension $m - r$.]

Notice this picture has all the major players we've seen in this course: there are vectors, subspaces, and dimensions, and they are all playing a role in the way we conceptualize solving a linear system of equations. Moreover, note the duality in the picture: the subspaces on the right-hand side, those living in $\mathbb{R}^m$, are the corresponding subspaces appearing on the left-hand side for the transpose of $A$ rather than $A$ (e.g. $\mathrm{col}(A)$ appearing on the upper right is just $\mathrm{row}(A^T)$, whereas we see $\mathrm{row}(A)$ on the upper left-hand side). The subspaces meet at right angles because they're orthogonal complements of each other, and the intersection of the spaces is the zero subspace of the respective ambient space ($\mathbb{R}^n$ or $\mathbb{R}^m$). Try to visualize how the picture changes as we look at limiting values of the possible rank of $A$. Namely, how would the above change in the case where $\mathrm{rank}(A) = n$, say? Or $m$? Or if $A$ is square and invertible? If $A$ is symmetric? Etc.

3  Exercises

You should be able to do the following.

1. Calculate the orthogonal complements of the improper subspaces of $\mathbb{R}^n$.
2. Let $v \in \mathbb{R}^3$ be a nonzero vector. Describe $(\mathrm{span}\{v\})^\perp$.
3. Prove that $S \cap S^\perp = \{0\}$ holds for any subspace $S \subseteq \mathbb{R}^n$.
4. Prove that $S^\perp$ is a subspace of $\mathbb{R}^n$ whenever $S$ is.
5. Prove that for $S \subseteq \mathbb{R}^n$ a subspace we have $\dim S + \dim S^\perp = n$.
6. If $S, W$ are subspaces of $\mathbb{R}^n$, show that $S \subseteq W \implies W^\perp \subseteq S^\perp$.
7. Let $S = \{x \in \mathbb{R}^4 \mid x_1 + x_2 + x_3 + x_4 = 0\}$. Prove that $S$ is a subspace and find a basis for $S^\perp$.

8. Prove that every $m \times n$ matrix $A$ defines a linear transformation from $\mathrm{row}(A)$ onto $\mathrm{col}(A)$, i.e. $T_A : \mathrm{row}(A) \to \mathrm{col}(A)$ is onto.
9. Consider $A$, an $m \times n$ matrix, and $b \in \mathbb{R}^m$. Consider the following claim[1]: One and only one of the following systems is solvable:
   (a) $Ax = b$
   (b) $A^T y = 0$ with $y^T b \neq 0$
   Prove that the above options cannot BOTH hold. In other words, if one of the above holds, the other one mustn't.

[1] A variation of this strange-sounding claim is very important in numerous applications including, very importantly, in differential equations.
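Exercise 9 still asks for a proof (hint: if both held, $y^T b = y^T A x = (A^T y)^T x = 0$), but one concrete instance may make the claim feel less strange. The matrix and vectors below are hand-picked for illustration, not taken from the notes.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A  = [[1, 1],
      [2, 2]]        # rank 1: col(A) = span{(1, 2)}
At = [[1, 2],
      [1, 2]]        # A transpose

y = [-2, 1]          # a solution of A^T y = 0, checked next
print([dot(row, y) for row in At])   # [0, 0]

b_bad = [1, 0]       # not in col(A), so option (a) fails ...
print(dot(y, b_bad))                 # -2: ... and this y certifies option (b)

b_good = [1, 2]      # equals A(1, 0), so option (a) holds ...
print(dot(y, b_good))                # 0: ... and this y fails to certify (b)
```

Since $\mathrm{null}(A^T)$ here is exactly the multiples of $y$, the last print is the whole story for $b_{\text{good}}$: no certificate for option (b) exists, just as the identity in the hint predicts.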