Math 123, Week 5: Linear Independence, Basis, and Matrix Spaces

Section 1: Linear Independence

Recall that every row which can be obtained from the coefficient matrix of a linear system $A\mathbf{x} = \mathbf{b}$ by elementary row operations can be written as
\[ \mathbf{a} = c_1 \mathbf{a}_1 + \cdots + c_m \mathbf{a}_m, \]
where $\mathbf{a}_1, \ldots, \mathbf{a}_m$ are the original row vectors of the matrix $A$. When performing Gaussian elimination, we often encountered rows of all zeroes, e.g. $(0, 0, \ldots, 0)$. Such a row is very important, since it tells us that, depending on the vector $\mathbf{b}$, the corresponding equation either contains no information ("$0 = 0$") or is inconsistent ("$0 = $ something nonzero").

Notice that, in order for such a row to arise, we need to have
\[ c_1 \mathbf{a}_1 + \cdots + c_m \mathbf{a}_m = \mathbf{0} \]
for some nontrivial (i.e. not all zero) choice of $c_1, \ldots, c_m$. This condition turns out to be very important in the study of vector spaces, and it motivates the following definition.

Definition 1. A set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_m\}$, where $\mathbf{v}_i \in \mathbb{R}^n$, is said to be linearly independent if
\[ c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_m \mathbf{v}_m = \mathbf{0} \]
implies $c_1 = c_2 = \cdots = c_m = 0$. Otherwise, the vectors are said to be linearly dependent.
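To make the definition concrete computationally, here is a minimal sketch (assuming NumPy is available; the helper name is_independent is ours, not part of any library) that tests a set of vectors by comparing the rank of the matrix whose columns are the vectors against the number of vectors.

```python
import numpy as np

def is_independent(vectors, tol=1e-10):
    """Return True if the given vectors (each a tuple/list in R^n) are linearly independent.

    The vectors are placed as the columns of a matrix A; they are independent
    exactly when the homogeneous system A c = 0 has only the trivial solution,
    i.e. when rank(A) equals the number of vectors.
    """
    A = np.column_stack([np.asarray(v, dtype=float) for v in vectors])
    return np.linalg.matrix_rank(A, tol=tol) == A.shape[1]

# A quick check on two small sets: an independent one and a dependent one.
print(is_independent([(1, 0, 0), (0, 1, 0), (0, 0, 1)]))  # True
print(is_independent([(1, 2), (-1, -2)]))                 # False: the second vector is -1 times the first
```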

Note: There are some very important geometric consequences of a set of vectors $\{\mathbf{v}_1, \ldots, \mathbf{v}_m\}$ being linearly dependent.

1. If the set is linearly dependent, then at least one vector in the set can be written as a linear combination of the remaining vectors. Without loss of generality, suppose that $c_m \ne 0$, so that $\mathbf{v}_m$ is such a vector. Then we have
\[ c_1 \mathbf{v}_1 + \cdots + c_{m-1} \mathbf{v}_{m-1} + c_m \mathbf{v}_m = \mathbf{0} \;\implies\; \mathbf{v}_m = -\frac{c_1}{c_m} \mathbf{v}_1 - \cdots - \frac{c_{m-1}}{c_m} \mathbf{v}_{m-1}. \tag{1} \]
Note that linear dependence guarantees at least one coefficient, here taken to be $c_m$, is nonzero.

2. If the set is linearly dependent, then there is a vector (which we will denote $\mathbf{v}_m$) such that
\[ \operatorname{span}\{\mathbf{v}_1, \ldots, \mathbf{v}_{m-1}, \mathbf{v}_m\} = \operatorname{span}\{\mathbf{v}_1, \ldots, \mathbf{v}_{m-1}\}. \]
This follows immediately because (1) implies $\mathbf{v}_m \in \operatorname{span}\{\mathbf{v}_1, \ldots, \mathbf{v}_{m-1}\}$.

3. Geometrically, linearly dependent vectors must lie in a common hyperplane whose dimension is lower than the number of vectors. This is easiest to visualize in $\mathbb{R}^2$ and $\mathbb{R}^3$: for instance, three linearly dependent vectors in $\mathbb{R}^3$ can only span a plane (or a line), but not the whole space.

Example 1. Determine whether the vectors $\mathbf{v}_1 = (3, 1, 4)$, $\mathbf{v}_2 = (1, 1, 0)$, and $\mathbf{v}_3 = (1, 2, -2)$ are linearly independent or dependent. If they are linearly dependent, find non-trivial coefficients $c_1$, $c_2$, and $c_3$ which demonstrate this relationship.

Solution: We need to see if there is a non-trivial solution of the equation
\[ c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + c_3 \mathbf{v}_3 = \mathbf{0}. \]
With a little effort, it can be seen that we need to solve the linear system
\[ \begin{bmatrix} 3 & 1 & 1 \\ 1 & 1 & 2 \\ 4 & 0 & -2 \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}, \]
where the vectors $\mathbf{v}_1$, $\mathbf{v}_2$, and $\mathbf{v}_3$ are the columns of the coefficient matrix. We can perform Gaussian elimination to get
\[ \left[ \begin{array}{ccc|c} 3 & 1 & 1 & 0 \\ 1 & 1 & 2 & 0 \\ 4 & 0 & -2 & 0 \end{array} \right] \;\longrightarrow\; \left[ \begin{array}{ccc|c} 1 & 0 & -1/2 & 0 \\ 0 & 1 & 5/2 & 0 \\ 0 & 0 & 0 & 0 \end{array} \right]. \]
We can easily see that this has a non-trivial solution, so the set of vectors is linearly dependent. We should have expected this, since a row of zeroes was obtained in the row reduction. In fact, the solution set can be parametrized by $c_3 = t$, so that $c_1 = \tfrac{1}{2} t$ and $c_2 = -\tfrac{5}{2} t$. For a specific non-trivial solution, we can select $t = 2$ to get $c_1 = 1$, $c_2 = -5$, $c_3 = 2$. In other words, we have
\[ (1)\mathbf{v}_1 + (-5)\mathbf{v}_2 + (2)\mathbf{v}_3 = \mathbf{v}_1 - 5\mathbf{v}_2 + 2\mathbf{v}_3 = \mathbf{0}. \]
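As a sanity check on the computation above, here is a short sketch (assuming SymPy is available; the variable names are ours) that row reduces the same coefficient matrix and recovers the dependency relation.

```python
from sympy import Matrix

# Columns are v1 = (3, 1, 4), v2 = (1, 1, 0), v3 = (1, 2, -2) from Example 1.
A = Matrix([[3, 1, 1],
            [1, 1, 2],
            [4, 0, -2]])

rref, pivots = A.rref()
print(rref)    # Matrix([[1, 0, -1/2], [0, 1, 5/2], [0, 0, 0]])
print(pivots)  # (0, 1): only two pivot columns, so the three columns are dependent

# The null space gives the coefficients of a dependency; scaling by 2 gives (1, -5, 2).
c = 2 * A.nullspace()[0]
print(c.T)        # Matrix([[1, -5, 2]])
print((A * c).T)  # Matrix([[0, 0, 0]]), confirming c1*v1 + c2*v2 + c3*v3 = 0
```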

We now have a prescription for determining whether a set of $n$ vectors, each with $n$ coordinates, is linearly independent or dependent. Specifically, we have the following result.

Theorem 1. A set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n\}$, where $\mathbf{v}_i \in \mathbb{R}^n$, is linearly independent if and only if $\det(A) \ne 0$, where the matrix $A$ is defined as
\[ A = \begin{bmatrix} \mathbf{v}_1 & \mathbf{v}_2 & \cdots & \mathbf{v}_n \end{bmatrix}. \tag{2} \]

Proof. The linear independence condition can be stated as $A\mathbf{c} = \mathbf{0}$, where $A$ is defined as in (2) and $\mathbf{c} = (c_1, c_2, \ldots, c_n)$. Notice that $\mathbf{c} = \mathbf{0}$ is always a solution of this system, so that, if $A$ is invertible (i.e. $\det(A) \ne 0$), then $\mathbf{c} = \mathbf{0}$ is the only solution. This, however, is the trivial solution. It follows that, if $\det(A) \ne 0$, then the vectors are linearly independent.

On the other hand, if $\det(A) = 0$, then the Gaussian row reduction of the left-hand side produces a row of zeroes. Since the system is homogeneous, the corresponding entry on the row-reduced right-hand side is also zero. It follows that $A\mathbf{c} = \mathbf{0}$ has at least one non-trivial solution, so that the set of vectors is linearly dependent, and we are done.

Example 2. Show that the set of vectors $\{\mathbf{v}_1, \mathbf{v}_2\} = \{(1, 2), (-1, -2)\}$ is linearly dependent.

Solution: The set of vectors is linearly dependent if $c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 = \mathbf{0}$ for some $c_1, c_2 \in \mathbb{R}$ not both zero. This is equivalent to the linear system
\[ \begin{bmatrix} 1 & -1 \\ 2 & -2 \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}. \]

Our determinant test for linear independence gives
\[ \begin{vmatrix} 1 & -1 \\ 2 & -2 \end{vmatrix} = (1)(-2) - (-1)(2) = 0. \]
Since $\det(A) = 0$, the vectors are linearly dependent. This is clear geometrically, since the two vectors are scalar multiples of one another. Explicitly, we have $\mathbf{v}_1 + \mathbf{v}_2 = \mathbf{0}$.

Example 3. Show that the set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\} = \{(1, 0, 1), (0, 1, 0), (0, 1, 1)\}$ is linearly independent.

Solution: The linear independence condition corresponds to
\[ \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}. \]
We have
\[ \det \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \end{bmatrix} = (1) \begin{vmatrix} 1 & 1 \\ 0 & 1 \end{vmatrix} = (1)\big[(1)(1) - (0)(1)\big] = 1. \]
Since $\det(A) \ne 0$, the vectors are linearly independent.
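The determinant test of Theorem 1 is a one-line computation in code. A minimal sketch (assuming NumPy) applied to the matrices of Examples 2 and 3:

```python
import numpy as np

# Theorem 1: n vectors in R^n are independent iff the determinant of the
# matrix having those vectors as columns is nonzero.

# Example 2: columns (1, 2) and (-1, -2).
A2 = np.array([[1, -1],
               [2, -2]], dtype=float)
print(np.linalg.det(A2))   # 0.0 -> linearly dependent

# Example 3: columns (1, 0, 1), (0, 1, 0), (0, 1, 1).
A3 = np.array([[1, 0, 0],
               [0, 1, 1],
               [1, 0, 1]], dtype=float)
print(np.linalg.det(A3))   # 1.0 (up to floating point) -> linearly independent
```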

This result works well for sets of vectors where the number of vectors is the same as the dimension of the vectors themselves. The linear independence condition, however, is also defined for sets of vectors not satisfying this requirement. For instance, consider being asked to determine whether the vectors $\mathbf{v}_1 = (3, -1, 2)$ and $\mathbf{v}_2 = (1, 1, -2)$ are linearly independent or dependent. We need to check whether
\[ c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 = \mathbf{0} \]
has a nontrivial solution. This corresponds to the linear system
\[ \begin{bmatrix} 3 & 1 \\ -1 & 1 \\ 2 & -2 \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}. \]
We cannot apply the previous result because the coefficient matrix on the left is not a square matrix. We can either solve the system directly, or apply the following result. (We will omit the proof for time.)

Theorem 2. The set of vectors $\{\mathbf{v}_1, \ldots, \mathbf{v}_m\}$, where $\mathbf{v}_i \in \mathbb{R}^n$ and $m < n$, is linearly independent if and only if the matrix
\[ A = \begin{bmatrix} \mathbf{v}_1 & \mathbf{v}_2 & \cdots & \mathbf{v}_m \end{bmatrix} \]
has some $m \times m$ submatrix with nonzero determinant. Conversely, the set $\{\mathbf{v}_1, \ldots, \mathbf{v}_m\}$ is linearly dependent if every $m \times m$ submatrix of $A$ has a zero determinant.

Example 4. Show that the set of vectors $\{(3, -1, 2), (1, 1, -2)\}$ is linearly independent.

Solution: We have the matrix
\[ A = \begin{bmatrix} 3 & 1 \\ -1 & 1 \\ 2 & -2 \end{bmatrix}. \]
Rather than attempt to compute the determinant of this matrix (which does not exist, since $A$ is not square), we consider the determinants of square submatrices of maximal size. In this case, that is $2 \times 2$. We notice that we have
\[ A' = \begin{bmatrix} 3 & 1 \\ -1 & 1 \end{bmatrix} \;\implies\; \det(A') = (3)(1) - (-1)(1) = 4. \]
Since $\det(A') \ne 0$, we have that the set of vectors is linearly independent.

Notice that this is true even though
\[ A'' = \begin{bmatrix} -1 & 1 \\ 2 & -2 \end{bmatrix} \;\implies\; \det(A'') = (-1)(-2) - (1)(2) = 0. \]
That is, it is sufficient to find a single maximal submatrix with a nonzero determinant to conclude that the set is linearly independent. The submatrix determinants do not all need to be nonzero.

Example 5. Show that the set of vectors $\{(1, 2, -1), (-2, -4, 2)\}$ is linearly dependent.

Solution: We need to consider the $2 \times 2$ submatrices of
\[ A = \begin{bmatrix} 1 & -2 \\ 2 & -4 \\ -1 & 2 \end{bmatrix}. \]
We have
\[ \begin{bmatrix} 1 & -2 \\ 2 & -4 \end{bmatrix} \;\implies\; \det = (1)(-4) - (-2)(2) = 0, \]
\[ \begin{bmatrix} 1 & -2 \\ -1 & 2 \end{bmatrix} \;\implies\; \det = (1)(2) - (-2)(-1) = 0, \]
and
\[ \begin{bmatrix} 2 & -4 \\ -1 & 2 \end{bmatrix} \;\implies\; \det = (2)(2) - (-4)(-1) = 0. \]
Since every maximal submatrix has a zero determinant, we may conclude that the vectors are linearly dependent.

Notice that, for this example, it is probably easier to show linear dependence directly: the second vector is negative two times the first, so that
\[ (2)\,\mathbf{v}_1 + (1)\,\mathbf{v}_2 = \mathbf{0}. \]
This result could also be obtained by row reduction.
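The brute-force version of Theorem 2 is easy to script. The following sketch (assuming NumPy; the helper name independent_by_submatrices is ours) checks every maximal square submatrix, exactly as in Examples 4 and 5.

```python
import itertools
import numpy as np

def independent_by_submatrices(vectors, tol=1e-10):
    """Test m vectors in R^n (m < n) via Theorem 2: they are linearly
    independent iff some m x m submatrix of the n x m matrix [v1 ... vm]
    (obtained by choosing m of the n rows) has nonzero determinant."""
    A = np.column_stack([np.asarray(v, dtype=float) for v in vectors])
    n, m = A.shape
    for rows in itertools.combinations(range(n), m):
        if abs(np.linalg.det(A[list(rows), :])) > tol:
            return True
    return False

print(independent_by_submatrices([(3, -1, 2), (1, 1, -2)]))   # True  (Example 4)
print(independent_by_submatrices([(1, 2, -1), (-2, -4, 2)]))  # False (Example 5)
```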

Section 2: Basis and Dimension

We have seen that every subspace $W \subseteq \mathbb{R}^n$ may be represented as the span of a finite set of vectors $\{\mathbf{v}_1, \ldots, \mathbf{v}_m\}$ where $m \le n$. Furthermore, we have seen that if the set is linearly dependent, then we can reduce the size of the set required to span $W$. We might naturally ask the following questions:

(Q1) Is there a minimal set of vectors $\{\mathbf{v}_1, \ldots, \mathbf{v}_m\}$ required to span a subspace $W \subseteq \mathbb{R}^n$?

(Q2) Is the set of spanning vectors $\{\mathbf{v}_1, \ldots, \mathbf{v}_m\}$ unique?

It turns out that we have all of the tools we need to answer these questions. We have the following definition, which is foundational to the study of vector spaces.

Definition 2. A set of vectors $S = \{\mathbf{v}_1, \ldots, \mathbf{v}_m\}$ is said to be a basis of $W \subseteq \mathbb{R}^n$ if $\operatorname{span}(S) = W$ and the set of vectors in $S$ is linearly independent.

A basis allows a vector space to be represented by a finite set of generators. We will see that this is an incredibly powerful computational simplification. Reconsider the following example.

Example 6. Determine a basis for the subspace $W = \operatorname{span}(S) \subseteq \mathbb{R}^3$ where $S = \{(3, 1, 4), (1, 1, 0), (1, 2, -2)\}$.

Solution: The general trick will be to check for linear independence and, if this fails, to remove one of the vectors (carefully!) from the set. We will continue until the remaining set is linearly independent. We have already found that $(3, 1, 4)$, $(1, 1, 0)$, and $(1, 2, -2)$ are linearly dependent, and in particular that
\[ (3, 1, 4) - 5\,(1, 1, 0) + 2\,(1, 2, -2) = (0, 0, 0). \]

This can be rewritten as
\[ (3, 1, 4) = 5\,(1, 1, 0) - 2\,(1, 2, -2), \tag{3} \]
so that $(3, 1, 4) \in \operatorname{span}\{(1, 1, 0), (1, 2, -2)\}$. It follows that
\[ \operatorname{span}\{(3, 1, 4), (1, 1, 0), (1, 2, -2)\} = \operatorname{span}\{(1, 1, 0), (1, 2, -2)\}. \]
It only remains to check whether this reduced set is linearly dependent since, if it were, we would be able to remove one of the vectors from the spanning set. By our earlier result, we check the $2 \times 2$ submatrix determinant
\[ \begin{vmatrix} 1 & 1 \\ 1 & 2 \end{vmatrix} = 1 \ne 0, \]
so that the set is linearly independent. It follows that $B = \{(1, 1, 0), (1, 2, -2)\}$ is a basis of the subspace $W = \operatorname{span}(S)$.

Note: Before moving on, we should give at least passing consideration to question (Q2) above. That is, we should consider whether this is the only possible choice of basis. In fact, we should be able to easily convince ourselves that this is not the case. Notice that
\[ (3, 1, 4) - 5\,(1, 1, 0) + 2\,(1, 2, -2) = (0, 0, 0) \;\implies\; (1, 2, -2) = -\tfrac{1}{2}\,(3, 1, 4) + \tfrac{5}{2}\,(1, 1, 0). \]
It follows that
\[ \operatorname{span}\{(3, 1, 4), (1, 1, 0)\} = \operatorname{span}\{(3, 1, 4), (1, 1, 0), (1, 2, -2)\}. \]
Since the set $B' = \{(3, 1, 4), (1, 1, 0)\}$ is linearly independent, $B'$ is also a basis of the subspace $W = \operatorname{span}(S)$. That is, we have found two bases of $W$: $B$ and $B'$.
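In practice, one common way to extract a basis from a spanning set is to row reduce the matrix whose columns are the spanning vectors and keep the vectors in the pivot columns. A minimal sketch with SymPy (the list S below is the spanning set from Example 6):

```python
from sympy import Matrix

# Spanning set from Example 6, placed as the columns of A.
S = [(3, 1, 4), (1, 1, 0), (1, 2, -2)]
A = Matrix([[3, 1, 1],
            [1, 1, 2],
            [4, 0, -2]])

_, pivots = A.rref()
basis = [S[j] for j in pivots]
print(basis)   # [(3, 1, 4), (1, 1, 0)]
```

Note that this procedure keeps the earliest vectors that remain independent, so it returns the basis called $B'$ above rather than $B$; both are equally valid bases of $W$.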

We might be discouraged that a subspace $W \subseteq \mathbb{R}^n$ can be represented by multiple bases; however, this turns out to be quite useful in practice. It is often beneficial to represent a vector space by one basis in one application and by a different basis in another. First, however, we should consider how the bases of a space are related to one another.

Theorem 3. Suppose a subspace $W \subseteq \mathbb{R}^n$ has bases $B_1$ and $B_2$. Then the number of elements in $B_1$ and $B_2$ is the same. This number is called the dimension of $W$ and is denoted $\dim(W)$.

This result allows us to relax a little bit. We may not be able to find a unique basis, but at the very least we know that the number of elements needed to minimally generate a space is always the same.

Note (standard basis): For $\mathbb{R}^n$ we denote the standard basis $B = \{\mathbf{e}_1, \mathbf{e}_2, \ldots, \mathbf{e}_n\}$, where $\mathbf{e}_i \in \mathbb{R}^n$ is the vector with a one in the $i$th component and zeroes elsewhere. For example, in $\mathbb{R}^3$ we have the standard basis $B = \{\mathbf{e}_1, \mathbf{e}_2, \mathbf{e}_3\}$ where $\mathbf{e}_1 = (1, 0, 0)$, $\mathbf{e}_2 = (0, 1, 0)$, and $\mathbf{e}_3 = (0, 0, 1)$.

Note (coordinates): A key feature of having a basis $B = \{\mathbf{v}_1, \ldots, \mathbf{v}_m\}$ for a subspace $W \subseteq \mathbb{R}^n$ is that each element $\mathbf{v} \in W$ can be written as a linear combination of elements of the basis, i.e.
\[ \mathbf{v} = c_1 \mathbf{v}_1 + \cdots + c_m \mathbf{v}_m, \]
in exactly one way. That is to say, once the basis has been chosen, the representation of each vector $\mathbf{v} \in W$ in terms of that basis is fixed. The values $c_i$, $i = 1, \ldots, m$, are said to be the coordinates of the vector $\mathbf{v}$ relative to the basis $B$. For example, in the standard basis of $\mathbb{R}^3$, the vector $(2, 3, 1)$ is given uniquely by the coordinates $c_1 = 2$, $c_2 = 3$, $c_3 = 1$. This uniqueness of coordinates, however, holds in any basis.

Example 7. Show that the set $B = \{(2, 0, 1), (1, 1, 0), (0, -1, 1)\}$ is a basis of $\mathbb{R}^3$ and then determine the coordinates of $(2, 3, 1)$ with respect to this basis.

Solution: For three vectors in $\mathbb{R}^3$, the linear independence condition coincides with the condition that the vectors span $\mathbb{R}^3$. We have
\[ \begin{vmatrix} 2 & 1 & 0 \\ 0 & 1 & -1 \\ 1 & 0 & 1 \end{vmatrix} = (2) \begin{vmatrix} 1 & -1 \\ 0 & 1 \end{vmatrix} - (1) \begin{vmatrix} 0 & -1 \\ 1 & 1 \end{vmatrix} = (2)(1) - (1)(1) = 1 \ne 0. \]
It follows that $B$ is linearly independent and therefore is a basis of $\mathbb{R}^3$.

To determine the coordinates of $(2, 3, 1)$ with respect to the basis $B$, we solve
\[ c_1 \begin{bmatrix} 2 \\ 0 \\ 1 \end{bmatrix} + c_2 \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} + c_3 \begin{bmatrix} 0 \\ -1 \\ 1 \end{bmatrix} = \begin{bmatrix} 2 \\ 3 \\ 1 \end{bmatrix}. \]
This gives
\[ \left[ \begin{array}{ccc|c} 2 & 1 & 0 & 2 \\ 0 & 1 & -1 & 3 \\ 1 & 0 & 1 & 1 \end{array} \right] \;\longrightarrow\; \left[ \begin{array}{ccc|c} 1 & 0 & 0 & -2 \\ 0 & 1 & 0 & 6 \\ 0 & 0 & 1 & 3 \end{array} \right]. \]
It follows that the coordinates of $(2, 3, 1)$ with respect to the basis $B$ are $c_1 = -2$, $c_2 = 6$, and $c_3 = 3$.
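Coordinates relative to a basis are just the solution of a linear system, so they are one call away numerically. A small sketch for Example 7 (assuming NumPy; variable names are ours):

```python
import numpy as np

# Basis vectors from Example 7 as the columns of P, and the target vector v.
P = np.array([[2, 1,  0],
              [0, 1, -1],
              [1, 0,  1]], dtype=float)
v = np.array([2, 3, 1], dtype=float)

# The coordinates c satisfy P c = v; they exist and are unique because det(P) != 0.
c = np.linalg.solve(P, v)
print(c)                      # [-2.  6.  3.]
print(np.allclose(P @ c, v))  # True
```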

Section 3: Row, Column, and Nullspace

We now return to the consideration of matrices. We introduce the following matrix-related vector spaces.

Definition 3. Consider an $m \times n$ matrix $A$. We define:

1. the row space of $A$ (denoted $\operatorname{row}(A)$) to be the span of the row vectors of $A$;

2. the column space of $A$ (denoted $\operatorname{col}(A)$) to be the span of the column vectors of $A$; and

3. the null space of $A$ (denoted $\operatorname{null}(A)$) to be the set of vectors $\mathbf{v}$ such that $A\mathbf{v} = \mathbf{0}$.

Example 8. Describe the row space, column space, and null space of the following matrix in terms of spans of vectors:
\[ A = \begin{bmatrix} 2 & -3 & 1 \\ -1 & 1 & 0 \end{bmatrix}. \]

Solution: We can easily see that $\operatorname{row}(A) = \operatorname{span}\{(2, -3, 1), (-1, 1, 0)\}$ and that $\operatorname{col}(A) = \operatorname{span}\{(2, -1), (-3, 1), (1, 0)\}$. It is worth noting here that, although $\operatorname{row}(A)$ and $\operatorname{col}(A)$ are derived from the same matrix, they need not be subspaces of the same vector space. In this case, $\operatorname{row}(A) \subseteq \mathbb{R}^3$ and $\operatorname{col}(A) \subseteq \mathbb{R}^2$. In general, for an $m \times n$ matrix $A$ we have $\operatorname{row}(A) \subseteq \mathbb{R}^n$ and $\operatorname{col}(A) \subseteq \mathbb{R}^m$.

We now consider $\operatorname{null}(A)$. As at the beginning of class, we solve $A\mathbf{v} = \mathbf{0}$ by Gaussian elimination. This gives
\[ \left[ \begin{array}{ccc|c} 2 & -3 & 1 & 0 \\ -1 & 1 & 0 & 0 \end{array} \right] \;\longrightarrow\; \left[ \begin{array}{ccc|c} 1 & 0 & -1 & 0 \\ 0 & 1 & -1 & 0 \end{array} \right]. \]
Setting $v_3 = t$, we have $v_1 = t$ and $v_2 = t$. It follows that
\[ \begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix} = t \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \]
so that $\operatorname{null}(A) = \operatorname{span}\{(1, 1, 1)\}$.

Note: The null space of a matrix also goes by the name of the kernel (denoted $\ker(A)$), while the column space is often called the range or image (denoted $\operatorname{range}(A)$ and $\operatorname{im}(A)$, respectively). In this class, we will always refer to the null space and column space.
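These three spaces can also be produced directly in software. A minimal sketch for the matrix of Example 8 (assuming SymPy is available); each method returns a list of vectors spanning the corresponding space:

```python
from sympy import Matrix

A = Matrix([[2, -3, 1],
            [-1, 1, 0]])

# Each call returns a list of basis vectors for the corresponding space.
print(A.rowspace())     # two row vectors spanning row(A), a subspace of R^3
print(A.columnspace())  # [Matrix([[2], [-1]]), Matrix([[-3], [1]])], a subspace of R^2
print(A.nullspace())    # [Matrix([[1], [1], [1]])], matching null(A) = span{(1, 1, 1)}
```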

Defining a vector space as the span of an arbitrary set of vectors is not, in general, the best way to describe the space. More commonly, we are interested in a basis for the space. We have the following result.

Theorem 4. Consider the row-reduced echelon form of an $m \times n$ matrix $A$. Then:

1. A basis for $\operatorname{row}(A)$ is given by the rows of the row-reduced echelon matrix which have leading ones;

2. A basis for $\operatorname{col}(A)$ is given by the columns of the original matrix which have leading ones in the row-reduced echelon form; and

3. A basis for $\operatorname{null}(A)$ is given by the vectors found by placing the solution of $A\mathbf{v} = \mathbf{0}$ in vector form.

Example 9. Determine a basis of the row space, column space, and null space of the following matrix:
\[ A = \begin{bmatrix} 2 & -3 & 1 \\ -1 & 1 & 0 \end{bmatrix}. \]

Solution: We had the row-reduced echelon matrix
\[ \begin{bmatrix} 1 & 0 & -1 \\ 0 & 1 & -1 \end{bmatrix}. \]
It follows that the set $\{(1, 0, -1), (0, 1, -1)\}$ is a basis of $\operatorname{row}(A)$, the set $\{(2, -1), (-3, 1)\}$ is a basis of $\operatorname{col}(A)$ (since the third column is not assigned a leading one in the row reduction), and $\{(1, 1, 1)\}$ is a basis of $\operatorname{null}(A)$ (as before).
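The recipe in Theorem 4 translates directly into a few lines of SymPy. This sketch (the helper name matrix_space_bases is ours, not a library routine) recovers the three bases of Example 9 from the RREF:

```python
from sympy import Matrix

def matrix_space_bases(A):
    """Bases of row(A), col(A), and null(A), following Theorem 4."""
    R, pivots = A.rref()
    row_basis = [R.row(i) for i in range(len(pivots))]   # nonzero rows of the RREF
    col_basis = [A.col(j) for j in pivots]                # pivot columns of the ORIGINAL matrix
    null_basis = A.nullspace()                            # vector form of the solution of A v = 0
    return row_basis, col_basis, null_basis

A = Matrix([[2, -3, 1],
            [-1, 1, 0]])
rows, cols, nulls = matrix_space_bases(A)
print(rows)   # [Matrix([[1, 0, -1]]), Matrix([[0, 1, -1]])]
print(cols)   # [Matrix([[2], [-1]]), Matrix([[-3], [1]])]
print(nulls)  # [Matrix([[1], [1], [1]])]
```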

null(a). Then the rank of A is given by rank(a) = dim(row(a)) and the nullity of A is given by nullity(a) = dim(null(a)). The following result, which is known as the Rank-Nullity Theorem, demonstrates how the row, column, and null spaces relate. Theorem 5 Consider an m n matrix A with row space row(a), column space col(a), and null space null(a). Then: (1) rank(a) = dim(row(a)) = dim(col(a)); (2) rank(a) + nullity(a) = n; and (3) If B row is a basis of row(a) and B null is a basis of null(a) then B = {B row, B null } is a basis of R n. Example 10 Verify the Rank-Nullity Theorem for the matrix [ ] 2 3 1 A =. 1 1 0 Solution: We can easily see that rank(a) = dim(row(a)) = dim(col(a))= 2 and nullity(a) = 1. It also follows that rank(a)+nullity(a) = 2 + 1 = 3 = n so that the second point is satisfied, and it can be easily checked that the set B = {(1, 0, 1), (0, 1, 1), (1, 1, 1)} is a basis of R 3 since det 1 0 1 0 1 1 1 1 1 = (1) 1 1 1 1 = (2) + (1) = 3 0. + (1) 0 1 1 1 14

Example 11. Determine a basis for the row space, column space, and null space of
\[ A = \begin{bmatrix} 1 & -1 & 0 & 1 & 0 \\ 1 & -1 & 1 & 1 & 2 \\ 1 & -1 & -3 & 4 & -3 \\ 1 & -1 & 2 & -2 & 1 \end{bmatrix}. \]
Determine the rank and nullity of $A$ and verify that the Rank-Nullity Theorem is satisfied.

Solution: The matrix $A$ can be row reduced to give
\[ \begin{bmatrix} 1 & -1 & 0 & 1 & 0 \\ 1 & -1 & 1 & 1 & 2 \\ 1 & -1 & -3 & 4 & -3 \\ 1 & -1 & 2 & -2 & 1 \end{bmatrix} \;\longrightarrow\; \begin{bmatrix} 1 & -1 & 0 & 0 & -1 \\ 0 & 0 & 1 & 0 & 2 \\ 0 & 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix}. \]
A basis for the row space is given by the rows of the row-reduced matrix with leading ones, so that we have
\[ \operatorname{row}(A) = \operatorname{span}\{(1, -1, 0, 0, -1),\; (0, 0, 1, 0, 2),\; (0, 0, 0, 1, 1)\}. \]
A basis for the column space is given by the columns of the original matrix corresponding to leading ones in the row-reduced echelon form, so that we have
\[ \operatorname{col}(A) = \operatorname{span}\{(1, 1, 1, 1),\; (0, 1, -3, 2),\; (1, 1, 4, -2)\}. \]
A basis for the null space can be given by the vector form of the solution of the homogeneous system $A\mathbf{v} = \mathbf{0}$, so that we have
\[ \operatorname{null}(A) = \operatorname{span}\{(1, 1, 0, 0, 0),\; (1, 0, -2, -1, 1)\}. \]
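Here is a short SymPy check of the three bases just listed (a minimal sketch that simply reruns the row reduction on the matrix above):

```python
from sympy import Matrix

A = Matrix([[1, -1,  0,  1,  0],
            [1, -1,  1,  1,  2],
            [1, -1, -3,  4, -3],
            [1, -1,  2, -2,  1]])

R, pivots = A.rref()
print(R)                               # the row-reduced echelon form shown above
print(pivots)                          # (0, 2, 3): pivot columns give the column-space basis
print([A.col(j).T for j in pivots])    # columns 1, 3, 4 of the original matrix
print([v.T for v in A.nullspace()])    # [Matrix([[1, 1, 0, 0, 0]]), Matrix([[1, 0, -2, -1, 1]])]
```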

We have that rank(a) = dim(row(a))= 3 and nullity(a) = dim(null(a))= 2. It follows that rank(a) + nullity(a) = (3) + (2) = 5 which corresponds to the number of columns of A. This is exactly what we expect from the rank-nullity theorem. To verify the remaining condition, it can be checked that 1 0 0 1 1 1 0 0 1 0 det(a) = 0 1 0 0 2 = 5 0 0 0 1 0 1 1 0 1 0 1 so that the union of the basis of the row space and the null space is a basis of R 5. Suggested Problems: 1. Determine whether the following set of vectors is linearly independent of linearly dependent. If they are linearly dependent, determine a nontrivial set of constants c 1,..., c m such that c 1 v 1 + + c m v m = 0. (a) v 1 = ( 1, 2, 2), v 2 = ( 1, 3, 2) (b) v 1 = (1, 0, 3, 1), v 2 = ( 3, 0, 9, 3) (c) v 1 = ( 2, 1, 1), v 2 = ( 1, 2, 1), v 3 = (0, 1, 1) (d) v 1 = ( 1, 1, 1, 1), v 2 = (1, 1, 0, 2), v 3 = ( 2, 0, 1, 1) 2. Determine a basis for the following spaces: (a) W = span{v 1, v 2, v 3 } where v 1 = (1, 2), v 2 = ( 2, 4), and v 3 = (3, 6). (b) W = span{v 1, v 2, v 3 } where v 1 = (1, 1, 2), v 2 = ( 2, 1, 1), and v 3 = ( 3, 0, 1). (c) W = span{v 1, v 2, v 3, v 4 } where v 1 = (1, 1, 1, 3), v 2 = (1, 1, 3, 5), v 3 = (1, 1, 1, 1), and v 4 = (1, 1, 0, 2). 16

3. Determine a basis for the row space, column space, and null space of the following matrices. Also determine the rank and nullity. [ ] 2 5 (a) A = 2 5 (b) A = 2 2 1 1 0 2 2 3 0 (c) A = (d) A = 2 1 1 1 1 1 1 1 1 2 2 2 2 1 4 3 1 1 2 3 1 1 2 1 1 0 2 2 4. Prove that, if v m+1 span{v 1,..., v m }, then span{v 1,..., v m } = span{v 1,..., v m, v m+1 }. 5. Show that the linear system A x = b has a solution if and only if b col(a). 6. Show that, for an n n matrix A, the following are equivalent: (a) det(a) = 0 (b) A is noninvertible (singular) (c) null(a) {0} (note that 0 null(a) for any matrix A.) (d) A x = b has a unique solution (e) rank(a) = n. 7. Prove that, for any m n matrix A, row(a) and null(a) are subspaces of R n, and col(a) is a subspace of R m. 17