LECTURES 14/15: LINEAR INDEPENDENCE AND BASES

MA1111: LINEAR ALGEBRA I, MICHAELMAS 2016

Date: November 2 and 4, 2016.

1. Linear Independence

We have seen in examples of span sets of vectors that sometimes adding additional vectors doesn't increase the span of a set of vectors. For example, if $v$ is a vector in $\mathbb{R}^3$, then $\mathrm{span}(v) = \mathrm{span}(v, 2v)$. We want to develop language to exclude such cases and throw away redundant information in the description of subspaces built out of span sets. To this end, we set the following terminology.

Definition. We say that a non-empty set $S$ of vectors in $V$ is linearly dependent if there are vectors $v_1, \dots, v_n \in S$ and scalars $c_1, \dots, c_n \in F$, not all equal to zero, for which
$$c_1 v_1 + \dots + c_n v_n = 0.$$
If no such non-trivial linear dependency exists, we say that the set $S$ is linearly independent.

Example. A set with only one non-zero vector $v$ is linearly independent: if $cv = 0$, then we saw before that $c = 0$ or $v = 0$, and $v \neq 0$ by assumption. Thus any such equation must have $c = 0$, which violates the condition that not all scalars in a linear dependency are zero.

Example. In $\mathbb{R}^3$, for any vector $v$ the set $S = \{e_1, e_2, e_3, v\}$ is linearly dependent, as we can write $v = c_1 e_1 + c_2 e_2 + c_3 e_3$ for some real numbers $c_1, c_2, c_3$, and thus
$$-c_1 e_1 - c_2 e_2 - c_3 e_3 + v = 0,$$
where the coefficient in front of $v$ is $1 \neq 0$.

Example. Any set which contains the zero vector is linearly dependent. For example, we have the linear dependency $1 \cdot 0 = 0$.

Example. By definition, the empty set is always linearly independent, as there are no possible linear combinations in the definition above to check!

As we have seen, properties of linear combinations of vectors can be expressed in terms of solution sets of systems of linear equations. In the case of linear independence, suppose that we wish to determine whether $S = \{v_1, \dots, v_n\}$ is linearly independent.

Writing
$$v_i = \begin{pmatrix} a_{1i} \\ \vdots \\ a_{mi} \end{pmatrix} \in \mathbb{R}^m,$$
we are asking whether there are scalars $c_1, \dots, c_n$, not all zero, for which
$$c_1 v_1 + \dots + c_n v_n = c_1 \begin{pmatrix} a_{11} \\ \vdots \\ a_{m1} \end{pmatrix} + \dots + c_n \begin{pmatrix} a_{1n} \\ \vdots \\ a_{mn} \end{pmatrix} = 0,$$
or $Ac = 0$, with $A$ defined by $A_{ij} = a_{ij}$ (i.e., $A$ is the matrix whose columns are $v_1, \dots, v_n$) and $c = (c_1, \dots, c_n)^T$, where we additionally require that $c \neq 0$. That is, a finite set of vectors in $\mathbb{R}^m$ is linearly independent if and only if the matrix $A$ with those vectors as its columns has $\ker(A) = \{0\}$. As performing elementary row operations on an augmented matrix preserves the solution set, and as any elementary row operation keeps the vector $0$ unchanged, $A$ has a non-trivial kernel if and only if the RREF of $A$ does, which happens exactly when there is a column without a pivot in the RREF of $A$. That is, we have shown the following.

Theorem. The following are equivalent for a set of vectors $v_1, \dots, v_n \in \mathbb{R}^m$, where $A = (v_1 \ \cdots \ v_n)$ is the matrix with columns $v_1, \dots, v_n$.
(1) The vectors $v_1, \dots, v_n$ are linearly independent.
(2) The kernel of $A$ is the trivial subspace $\{0\}$.
(3) The kernel of the RREF of $A$ is the trivial subspace $\{0\}$.
(4) The RREF of $A$ has a pivot in every column.
In particular, if $n = m$, that is, if the matrix $A$ is square, these conditions are equivalent to $\det A \neq 0$.

Example. In $\mathbb{R}^3$, the set of vectors $v_1 = (1, 4, 7)$, $v_2 = (2, 5, 8)$, $v_3 = (3, 6, 9)$ is linearly dependent, as
$$\det \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix} = 0.$$
However, the set of vectors $v_1 = (1, 4, 7)$, $v_2 = (2, 5, 0)$, $v_3 = (3, 6, 9)$ is linearly independent, as
$$\det \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 0 & 9 \end{pmatrix} = -48 \neq 0.$$
Note that this doesn't say exactly the same thing as the example in the last lecture, where we discussed the spans of these vectors. However, we will see that a set of $n$ vectors in $\mathbb{R}^n$ spans all of $\mathbb{R}^n$ if and only if it is linearly independent. This is essentially a special case of the Fundamental Theorem of Linear Algebra.
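These determinant computations are easy to check by machine. Here is a minimal sketch in Python using the sympy library (an illustrative choice of tool; the notes themselves don't assume any software):

```python
from sympy import Matrix

# Columns are v1 = (1,4,7), v2 = (2,5,8), v3 = (3,6,9): dependent.
A1 = Matrix([[1, 2, 3],
             [4, 5, 6],
             [7, 8, 9]])
print(A1.det())  # 0, so the columns are linearly dependent

# Replacing v2 by (2,5,0) makes the set independent.
A2 = Matrix([[1, 2, 3],
             [4, 5, 6],
             [7, 0, 9]])
print(A2.det())  # -48, non-zero, so the columns are linearly independent
```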

Example. In $\mathbb{R}^4$, the vectors $v_1 = (1, 4, 7, 1)$, $v_2 = (2, 5, 8, 2)$, $v_3 = (3, 6, 9, 3)$ are linearly dependent, as the RREF of
$$A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \\ 1 & 2 & 3 \end{pmatrix} \quad \text{is} \quad \begin{pmatrix} 1 & 0 & -1 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix},$$
which has 2 pivots and 3 columns, and hence there are infinitely many solutions to $Ax = 0$. However, the vectors $v_1 = (1, 4, 7, 1)$, $v_2 = (2, 5, 8, 2)$, $v_3 = (3, 6, 9, 4)$ are linearly independent, as the RREF of
$$A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \\ 1 & 2 & 4 \end{pmatrix} \quad \text{is} \quad \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix},$$
which has a pivot in each column, and hence the kernel is trivial.

Example. If we have a set of $n$ vectors $v_1, \dots, v_n$ in $\mathbb{R}^m$ with $n > m$, then they must be linearly dependent, as the corresponding matrix $A$ has $n$ columns but only $m$ rows. In order for the vectors to be linearly independent, there must be a pivot in each column, that is, there must be $n$ pivots. However, there can be at most one pivot in each row, so there are at most $m < n$ pivots.

Example. By the last example, it is automatic that the set of vectors $v_1 = (1, 4, 7)$, $v_2 = (2, 5, 8)$, $v_3 = (3, 6, 9)$, $v_4 = (-1, 4, 5)$ is linearly dependent, and we don't have to do any work with row reduction to determine this.

2. Bases of vector spaces

We now combine the notions of span and linear independence to describe sets of vectors which build up vector spaces without redundant information.
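The row reductions above can be verified in the same way; rref() returns both the RREF and the tuple of pivot columns (same sympy sketch as before):

```python
from sympy import Matrix

# Columns are (1,4,7,1), (2,5,8,2), (3,6,9,3): dependent.
A = Matrix([[1, 2, 3], [4, 5, 6], [7, 8, 9], [1, 2, 3]])
R, pivots = A.rref()
print(pivots)  # (0, 1): only 2 pivots for 3 columns, so dependent

# Changing the last entry of v3 from 3 to 4 gives independence.
B = Matrix([[1, 2, 3], [4, 5, 6], [7, 8, 9], [1, 2, 4]])
print(B.rref()[1])  # (0, 1, 2): a pivot in every column, so independent
```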

Definition. A basis for a vector space $V$ is a set $S \subseteq V$ which is linearly independent and which spans $V$.

Example. We defined the span of the empty set to be $\{0\}$, and clearly there can be no linear dependence relations in a set with no elements, and so $\emptyset$ is a basis for $\{0\}$.

Example. In $\mathbb{R}^n$, we have regularly referred to the vectors $e_1, \dots, e_n$ as standard unit basis vectors. They do indeed form a basis of $\mathbb{R}^n$: we have repeatedly seen that they span $\mathbb{R}^n$, and they are linearly independent, as the matrix whose columns are the standard unit basis vectors is the identity matrix $I_n$, which of course has non-zero determinant. The same construction, setting $e_i$ to be the vector with a $1$ in the $i$-th position and zeros elsewhere, provides a basis of $F^n$ for any field $F$.

Example. The set $\{1, x, x^2, \dots\}$ of all non-negative integral powers of $x$ is a basis of $P(F)$.

Example. The set $\{M_{ij} : 1 \leq i \leq m,\ 1 \leq j \leq n\}$, where $M_{ij}$ is the $m \times n$ matrix with entry $1$ in the $(i,j)$-th position and $0$ elsewhere, is a basis of the space of matrices $M_{m \times n}(F)$.

The idea of a basis providing the smallest possible set to describe all elements of a vector space (i.e., without redundant information) is made precise by the following result.

Theorem. A set $S \subseteq V$ is a basis for $V$ if and only if every element of $V$ can be expressed uniquely as a linear combination of elements of $S$.

Proof. Suppose that $S$ is a basis for $V$. Then by definition $\mathrm{span}(S) = V$, so every element of $V$ can be written as a linear combination of elements of $S$. To show the uniqueness of this expression, suppose that $v \in V$ has two representations as linear combinations of elements $v_1, \dots, v_n \in S$:
$$v = c_1 v_1 + \dots + c_n v_n = c_1' v_1 + \dots + c_n' v_n.$$
Taking the difference of these two representations, we find that
$$(c_1 - c_1') v_1 + \dots + (c_n - c_n') v_n = 0,$$
and since $S$ is linearly independent we have $c_1 - c_1' = 0, \dots, c_n - c_n' = 0$. Hence $c_1 = c_1', \dots, c_n = c_n'$, and so the two representations of $v$ were the same all along.

Conversely, if every element of $V$ is a unique linear combination of elements of $S$, then clearly $S$ spans $V$. To show that $S$ is linearly independent, suppose that $c_1 v_1 + \dots + c_n v_n = 0$ with $v_1, \dots, v_n \in S$ and with not all $c_1, \dots, c_n$ equal to zero. Then we can find many non-unique representations of elements of $V$ as linear combinations of elements of $S$. For example,
$$v_1 = 1 \cdot v_1 = (c_1 + 1) v_1 + c_2 v_2 + \dots + c_n v_n$$
gives two different representations of $v_1$. $\square$
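To make the uniqueness concrete in $\mathbb{R}^n$: finding the coefficients of a vector with respect to a basis amounts to solving an invertible linear system. A short sketch (the basis and the vector below are illustrative choices, not taken from the notes):

```python
from sympy import Matrix

# An assumed basis of R^3: the columns of B.
B = Matrix([[1, 1, 0],
            [0, 1, 1],
            [1, 0, 1]])
v = Matrix([2, 3, 5])

# Since the columns of B form a basis, B is invertible and the
# coefficient vector c with B*c = v is unique.
c = B.LUsolve(v)
print(c.T)         # Matrix([[2, 0, 3]])
print(B * c == v)  # True: these coefficients reconstruct v
```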

We will later return to this fact, and the coefficients in these unique representations will be called the coordinates of a vector with respect to a basis. In fact, this idea will allow us to show that all vector spaces spanned by a finite number of vectors (this is most of those we have seen, other than vector spaces built out of functions like $P(\mathbb{R})$ and $C^\infty$) are really the same thing as the simple vector space $F^n$ for some $n$.

Our two most important examples of computing bases are for our two most important subspaces: kernels (or null spaces) and column spaces (or spans of vectors in $\mathbb{R}^n$). We now discuss each case.

Example. The kernel of a matrix can be found using row reduction, as we have seen. In particular, we have seen above that the kernel of a matrix $A$ is the same as the kernel of the RREF of $A$, and we can use our Gaussian elimination algorithm to easily find a basis for the kernel of any matrix. For example, we saw above that
$$A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \\ 1 & 2 & 3 \end{pmatrix}$$
has a non-trivial kernel. This is the same as the kernel of the RREF of $A$, which we have seen is
$$R = \begin{pmatrix} 1 & 0 & -1 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}.$$
The solutions to the equation $Rx = 0$ are obtained by setting the free variable $x_3 = t$ for $t \in \mathbb{R}$, and solving to get the one-parameter infinite set of solutions $x_1 = t$, $x_2 = -2t$, $x_3 = t$, which is a line in $\mathbb{R}^3$ with basis $\{(1, -2, 1)\}$.
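sympy can produce such a kernel basis directly; a quick check of the computation above (same illustrative tool as before):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [4, 5, 6],
            [7, 8, 9],
            [1, 2, 3]])
# nullspace() returns a basis of ker(A) as a list of column vectors.
for vec in A.nullspace():
    print(vec.T)  # Matrix([[1, -2, 1]]): the basis vector found by hand
```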

As another example, consider a matrix $A$ whose RREF is the $3 \times 5$ matrix
$$R = \begin{pmatrix} 1 & 0 & 0 & 2 & -2 \\ 0 & 1 & 0 & -1 & 0 \\ 0 & 0 & 1 & 2 & 3 \end{pmatrix}.$$
The solution set of $Rx = 0$ is found by setting $x_4 = t_4$, $x_5 = t_5$ for $t_4, t_5 \in \mathbb{R}$, and solving for the pivotal variables:
$$x_1 = -2t_4 + 2t_5, \quad x_2 = t_4, \quad x_3 = -2t_4 - 3t_5, \quad x_4 = t_4, \quad x_5 = t_5.$$
That is, the vectors in the kernel are those vectors $x = (x_1, x_2, x_3, x_4, x_5)^T$ of the form
$$x = t_4 \begin{pmatrix} -2 \\ 1 \\ -2 \\ 1 \\ 0 \end{pmatrix} + t_5 \begin{pmatrix} 2 \\ 0 \\ -3 \\ 0 \\ 1 \end{pmatrix},$$
i.e., $\ker(A)$ is the span of these two vectors. It is also easy to check that they are linearly independent, so that they are indeed a basis for $\ker(A)$.
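Since $\ker(A) = \ker(R)$, this basis can be checked by computing the nullspace of $R$ itself:

```python
from sympy import Matrix

R = Matrix([[1, 0, 0, 2, -2],
            [0, 1, 0, -1, 0],
            [0, 0, 1, 2, 3]])
for vec in R.nullspace():
    print(vec.T)
# Matrix([[-2, 1, -2, 1, 0]])
# Matrix([[2, 0, -3, 0, 1]])
```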

In general, a basis for the kernel of a matrix will always have the same number of elements as the number of free variables in its RREF; i.e., the vectors that this method yields will always form a linearly independent set.

The second main example of a vector space we would like to find bases for is a subspace spanned by a set of vectors in $\mathbb{R}^n$. We will give two methods for this. We can solve this problem by computing the row space or the column space of a matrix $A$; of course, we can switch between the two perspectives using transposes. We begin with the case of the row space, the span of the rows of a matrix.

Theorem. The row space of $A$ is the same as the row space of the RREF of $A$.

Proof. It suffices to show that elementary row operations don't change the row space. Indeed, suppose that the rows of a matrix $A$ are $v_1, \dots, v_n$. If we multiply row $j$ by a non-zero constant $c$, then the row space of the resulting matrix is
$$\mathrm{span}(v_1, \dots, cv_j, \dots, v_n) = \{c_1 v_1 + \dots + (c_j c) v_j + \dots + c_n v_n : c_1, \dots, c_n \in F\},$$
which is the same as the original span, since as $c_j$ ranges over all elements of $F$ so does $c_j c$, as $c \neq 0$ (think of the case of real numbers; this is true since in a field we can always divide by non-zero elements). Clearly, if we swap two rows, this doesn't change the span, as addition in vector spaces is commutative. Finally, if we add a multiple of one row to another, it doesn't change the span. For example, after swapping rows if necessary, suppose that we add $c$ times the first row to the second, with $c \neq 0$. Then the row space of the resulting matrix is
$$\mathrm{span}(v_1, v_2 + cv_1, \dots, v_n) = \{c_1 v_1 + c_2 (v_2 + cv_1) + \dots + c_n v_n : c_1, \dots, c_n \in F\}.$$
Now
$$c_1 v_1 + c_2 (v_2 + cv_1) + \dots + c_n v_n = (c_1 + c_2 c) v_1 + c_2 v_2 + \dots + c_n v_n,$$
and clearly, for any fixed $c_2$, as $c_1$ ranges over all elements of $F$ so does $c_1 + c_2 c$. $\square$

In general, the column space of a matrix isn't the same as the column space of its RREF. But you can always find the column space by using elementary column operations, or by taking the transpose of the matrix and then using row reduction to find the row space. Moreover, it isn't hard to show the following.

Theorem. The non-zero rows of a matrix in reduced row echelon form are linearly independent. Thus, the non-zero rows of the RREF of a matrix $A$ form a basis of the row space of $A$.

Another method is to use the pivots in row reduction to find which column vectors in a matrix are linearly independent. To see this, suppose that a number of columns $v_{i_1}, \dots, v_{i_n}$ of a matrix $A$ have a non-trivial linear dependency $c_1 v_{i_1} + \dots + c_n v_{i_n} = 0$. Note that $v_{i_j} = A e_{i_j}$, where $e_1, \dots$ are the usual standard basis vectors. Then if $E_1, \dots, E_k$ are elementary matrices with $R = E_1 \cdots E_k A$, where $R$ is the RREF of $A$, we clearly have
$$0 = (E_1 \cdots E_k)(c_1 v_{i_1} + \dots + c_n v_{i_n}) = (E_1 \cdots E_k)(c_1 A e_{i_1} + \dots + c_n A e_{i_n}) = c_1 R e_{i_1} + \dots + c_n R e_{i_n},$$
so that the same linear relation holds between the corresponding columns of $R$. The converse is also clearly true, as elementary matrices are always invertible. Thus, since the column space of a matrix in RREF is just the span of its pivotal columns, we have shown the following result.

Theorem. The column space of a matrix $A$ has as a basis the set of column vectors of $A$ corresponding to the columns with a pivot in the RREF of $A$.
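This pivot-column method is exactly what sympy's columnspace() computes; here is a sketch using the $4 \times 3$ matrix from the kernel example above:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [4, 5, 6],
            [7, 8, 9],
            [1, 2, 3]])
R, pivots = A.rref()
print(pivots)                          # (0, 1): pivots in the first two columns
print([v.T for v in A.columnspace()])  # those columns of A: a basis of col(A)
print(A.rowspace())                    # a row space basis (non-zero rows of an
                                       # echelon form, up to row operations)
```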

Example. Let's compute bases for the kernel, column space, and row space of a big matrix. Take
$$A = \begin{pmatrix} 3 & 4 & 0 & 7 \\ -1 & 5 & 2 & 2 \\ -1 & 4 & 0 & 3 \\ 1 & -1 & -2 & 2 \end{pmatrix}.$$
The first step is to compute the RREF of $A$, which is
$$R = \begin{pmatrix} 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & -1 \\ 0 & 0 & 0 & 0 \end{pmatrix}.$$
There are 3 pivotal variables and one free variable, so the kernel will have a basis with 1 element and the column space a basis with 3 elements. The kernel is found by solving $Rx = 0$, which has solution set $x_1 = -t$, $x_2 = -t$, $x_3 = t$, $x_4 = t$; that is, $\ker(A) = \mathrm{span}((-1, -1, 1, 1))$, with basis $\{(-1, -1, 1, 1)\}$.

We have already basically found the column space as well, as the RREF above has pivots in the first three columns, and so $\mathrm{col}(A)$ has as a basis
$$\{(3, -1, -1, 1),\ (4, 5, 4, -1),\ (0, 2, 0, -2)\}.$$
We can find another basis for the column space by noting that it is the same as the row space of $A^T$, and we compute that the RREF of $A^T$ is
$$\begin{pmatrix} 1 & 0 & 0 & 1/4 \\ 0 & 1 & 0 & -1 \\ 0 & 0 & 1 & 3/4 \\ 0 & 0 & 0 & 0 \end{pmatrix}.$$
Thus the row space of this matrix, and hence the column space of our original matrix $A$, also has basis $\{(1, 0, 0, 1/4),\ (0, 1, 0, -1),\ (0, 0, 1, 3/4)\}$.

Finally, we can find the row space of our original matrix $A$ by noting that it has as a basis the non-zero rows of $R$ above, namely $\{(1, 0, 0, 1),\ (0, 1, 0, 1),\ (0, 0, 1, -1)\}$.
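An end-to-end check of this example (same illustrative software choice as above):

```python
from sympy import Matrix

A = Matrix([[ 3,  4,  0, 7],
            [-1,  5,  2, 2],
            [-1,  4,  0, 3],
            [ 1, -1, -2, 2]])

R, pivots = A.rref()
print(R)                               # the RREF computed above
print([v.T for v in A.nullspace()])    # [Matrix([[-1, -1, 1, 1]])]
print([v.T for v in A.columnspace()])  # the first three columns of A
print(A.T.rref()[0])                   # RREF of A^T: its non-zero rows give
                                       # the second basis of col(A)
```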

Summarizing our results on kernels, column spaces, and row spaces, we have the following.

Theorem. For an $m \times n$ matrix $A$, the number of free variables in the RREF of $A$ is the number of elements in a basis for the kernel $\ker(A)$. Moreover, the columns of $A$ are linearly independent if and only if the kernel is the trivial subspace $\{0\}$, which is true if and only if the RREF of $A$ has a pivot in each column. Furthermore, the set of non-zero rows in the RREF of $A$ forms a basis of the row space, and the set of columns of $A$ corresponding to the pivotal columns of the RREF of $A$ forms a basis of the column space of $A$. In particular, the row space and column space both have a basis whose number of elements is the number of pivots in the RREF of $A$. Furthermore, the columns of $A$ span all of $\mathbb{R}^m$ if and only if there is a pivot in each row. Finally, for a square $n \times n$ matrix $A$, the following are equivalent.
(1) The columns of $A$ span $\mathbb{R}^n$.
(2) The columns of $A$ are linearly independent.
(3) The columns of $A$ form a basis of $\mathbb{R}^n$.
(4) The rows of $A$ form a basis of $\mathbb{R}^n$.
(5) The matrix $A$ is invertible.
(6) $\det(A) \neq 0$.
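As a final illustration, these equivalences can be observed numerically on the invertible $3 \times 3$ example from earlier (any invertible matrix behaves the same way):

```python
from sympy import Matrix, eye

A = Matrix([[1, 2, 3],
            [4, 5, 6],
            [7, 0, 9]])  # the linearly independent example from Section 1

print(A.det())                # -48: non-zero
print(A.rref()[0] == eye(3))  # True: a pivot in every row and column
print(A.nullspace())          # []: trivial kernel
print(A.inv() * A == eye(3))  # True: A is invertible
```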