Dimension

We showed that adding a vector to a basis produces a linearly dependent set of vectors; more is true.

Lemma. If a vector space V has a basis B containing n vectors, then any set containing more than n vectors from V cannot be linearly independent.

Proof. Suppose {v_1, v_2, ..., v_p} is a set of p > n vectors in V. Then the coordinate vectors [v_1]_B, [v_2]_B, ..., [v_p]_B lie in R^n. But as there are more than n such coordinate vectors, there must be a linear dependence amongst them; that is, there are scalars c_1, c_2, ..., c_p, not all zero, so that

    c_1 [v_1]_B + c_2 [v_2]_B + ... + c_p [v_p]_B = 0.

Since the transformation ϕ(v) = [v]_B is linear, it follows that [c_1 v_1 + c_2 v_2 + ... + c_p v_p]_B = 0, which means that if B = {b_1, b_2, ..., b_n},

    c_1 v_1 + c_2 v_2 + ... + c_p v_p = 0 b_1 + 0 b_2 + ... + 0 b_n = 0,

which shows that {v_1, v_2, ..., v_p} is a linearly dependent set. //
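As a quick numerical illustration of the lemma (a sketch using NumPy, which these notes do not assume): take n = 2 and p = 3 coordinate vectors in R^2. Since the matrix of coordinate vectors has more columns than rows, it has a nontrivial null space, and the SVD exposes one dependence relation.

```python
import numpy as np

# Three coordinate vectors in R^2 (p = 3 > n = 2), stacked as columns.
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])

# More columns than rows forces a nontrivial null space; the last
# right-singular vector spans it here, since rank(M) <= 2 < 3.
_, s, Vt = np.linalg.svd(M)
c = Vt[-1]                      # scalars c_1, c_2, c_3, not all zero

assert np.allclose(M @ c, 0)    # c_1 [v_1]_B + c_2 [v_2]_B + c_3 [v_3]_B = 0
```

Here c is a unit vector, so the scalars are automatically not all zero, exactly as the lemma requires.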
Theorem. If a vector space V has a basis containing exactly n vectors, then every basis for V contains exactly n vectors.

Proof. Suppose B is a basis for V containing n vectors. If C is a basis for V containing more than n vectors, then by the lemma, C is linearly dependent, so cannot be a basis; if C is a basis for V containing fewer than n vectors, then by the lemma (with the roles of B and C exchanged), B is linearly dependent, so cannot be a basis. Neither of these cases is possible, so any other basis for V must contain exactly n vectors. //

This theorem makes it possible to define the notion of the dimension of a vector space: the vector space V is said to have dimension n if it has a basis consisting of exactly n vectors (hence all bases for V have the same size); in this case, we write dim V = n.

Example: Since R^n has a basis containing n vectors, namely the standard basis E = {e_1, e_2, ..., e_n}, R^n has dimension n. (See also Examples 2, 3, 4, p. 258.)

How does one find a basis for an arbitrary vector space? The primary tools for doing this are provided by the following theorems.
Theorem. Let S = {v_1, v_2, ..., v_p} be a set of vectors that spans the vector space V. Then some subset of S forms a basis for V.

Proof. If S is linearly independent, then it is a basis for V; otherwise, some one of the vectors in S, say v_k, is a linear combination of the vectors preceding it in the list. Its removal produces a subset of S which still spans V. If this subset is not linearly independent, we can continue to remove vectors as necessary without changing the fact that the set spans V. Eventually, we must obtain a linearly independent set (in the worst case, a set consisting of one nonzero vector) that spans V. This subset of S is a basis for V. //

Theorem. Let S = {v_1, v_2, ..., v_p} be a set of linearly independent vectors in the vector space V. Then S is a subset of some set of vectors in V that forms a basis for V.

Proof. If S already spans V, then it is a basis for V; otherwise, some vector v_{p+1} in V is not a linear combination of the vectors in S. Adding it to S does not change the linear independence of the set. If this larger set does not span V, we can continue to add vectors as necessary without changing the fact that the set is linearly independent. Eventually this process must stop: if V lies in some R^n, then by the lemma a linearly independent set in V can contain at most n vectors, so at some stage the set must span V. This set of vectors expanded from S is therefore a basis for V. //

Corollary. If V is an n-dimensional vector space, then any linearly independent set of n vectors in V is a basis for V, and any set of n vectors that spans V is a basis for V. //

We can now answer the question we posed earlier: is every subspace of R^n spanned by a finite set of vectors from the space?

Corollary. Every subspace H of R^n has dimension no greater than n, and thus has a basis containing no more than n vectors. //
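The first theorem's procedure of discarding vectors that depend on their predecessors can be sketched numerically (using NumPy; the function name and rank-based test are illustrative choices, not from the notes). A vector is kept exactly when it is not a linear combination of those before it, i.e. when appending its column raises the rank.

```python
import numpy as np

def basis_from_spanning_set(vectors):
    """Keep each vector that is not a combination of the ones before it."""
    A = np.column_stack(vectors)
    kept, rank = [], 0
    for j in range(A.shape[1]):
        r = np.linalg.matrix_rank(A[:, :j + 1])
        if r > rank:               # column j adds a new direction: keep it
            kept.append(vectors[j])
            rank = r
    return kept

# A spanning set for a plane in R^3, with redundancies.
S = [np.array([1.0, 0.0, 0.0]),
     np.array([2.0, 0.0, 0.0]),   # = 2 * first vector: discarded
     np.array([0.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 0.0])]   # = first + third: discarded
B = basis_from_spanning_set(S)    # two vectors remain
```

Here B consists of the first and third vectors of S, a basis for the plane they all span, matching the theorem's conclusion.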
We saw already that a basis for the column space Col A of a matrix A can be formed from the pivot columns of A. So

    dim Col A = # pivot columns in A = # basic variables in the solution of Ax = b.

What about the null space of A?

Theorem. dim Nul A = # non-pivot columns in A = # free variables in the solution of Ax = 0.

Proof. Immediate from a consideration of how one solves the matrix equation Ax = 0 by bringing A to row echelon form: each free variable contributes one vector to the parametric solution, and these vectors are linearly independent. //
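Since every column of A is either a pivot column or a non-pivot column, the two counts are complementary: dim Col A + dim Nul A = (number of columns of A). A quick NumPy check on a made-up matrix (the example matrix is my own, not from the notes):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 3.0],
              [1.0, 2.0, 1.0, 4.0]])   # third row = first row + second row

rank = np.linalg.matrix_rank(A)        # dim Col A = # pivot columns = 2
nullity = A.shape[1] - rank            # dim Nul A = # free variables = 2
print(rank, nullity, A.shape[1])       # the two counts sum to 4 columns
```

This sum formula is the rank theorem, taken up in the text's later discussion of rank.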