Footnotes to Linear Algebra (MA 540, Fall 2013), T. Goodwillie: Bases

November 18, 2013

1 Spanning and linear independence

I will outline a slightly different approach to the material in Chapter 2 of Axler's text.

1.1 Spanning

If $V$ is a vector space and $S$ is a finite subset of $V$, then the span of $S$ may be defined as the set of all vectors $v \in V$ such that $v$ can be written as a linear combination of elements of $S$:
$$v = \sum_{s \in S} c_s s.$$
The span of $S$ is always a subspace of $V$, and it contains $S$. Every subspace of $V$ that contains $S$ must contain the span of $S$, and of course every subspace of $V$ that contains the span of $S$ must contain $S$; thus the span of $S$ is the smallest subspace that contains $S$.

We say that $S$ is a spanning set for $V$ if the span of $S$ is $V$. If $S$ is a spanning set for $V$, then for every $v \in V$ there is at least one way to choose scalars $c_s$ such that $v = \sum_{s \in S} c_s s$.

1.2 Independence

We say that $S$ is a linearly independent set if for every $v \in V$ there is at most one way to choose scalars $c_s$ such that $v = \sum_{s \in S} c_s s$. An equivalent condition on the set $S$ is that whenever $0 = \sum_{s \in S} c_s s$, the scalars $c_s$ must all be zero.
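
For example, in $V = \mathbb{R}^2$ the set $S = \{(1,0), (1,1)\}$ is a spanning set, since any $(a,b)$ can be written as
$$(a,b) = (a-b)\,(1,0) + b\,(1,1),$$
and it is linearly independent, since $c_1(1,0) + c_2(1,1) = (c_1 + c_2,\, c_2) = (0,0)$ forces $c_2 = 0$ and then $c_1 = 0$. By contrast, $\{(1,0), (1,1), (0,1)\}$ still spans $\mathbb{R}^2$ but is not linearly independent: $(1,1) - (1,0) - (0,1) = (0,0)$ is a nontrivial relation.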

Let us show why this condition is equivalent to $S$ being linearly independent. First assume that $S$ is linearly independent; we show that $S$ satisfies the second condition. If scalars $c_s$ are such that $0 = \sum_{s \in S} c_s s$, then since also $0 = \sum_{s \in S} 0\,s$, for every $s \in S$ the scalar $c_s$ (the coefficient of $s$ in the first expression) must equal $0$ (the coefficient of $s$ in the second expression).

On the other hand, what if we assume that $S$ is not linearly independent? Then for some vector $v \in V$ there are two different collections of scalars both giving $v$:
$$v = \sum_{s \in S} c_s s = \sum_{s \in S} d_s s.$$
But then
$$0 = v - v = \sum_{s \in S} c_s s - \sum_{s \in S} d_s s = \sum_{s \in S} (c_s - d_s) s,$$
so that $0$ has been written as a linear combination of the elements of $S$ using coefficients $c_s - d_s$ that are not all zero. So in this case $S$ does not satisfy the second condition.

1.3 Bases

A subset of $V$ is called a basis for $V$ if it is both a spanning set and linearly independent. Thus if $B$ is a basis for $V$, then for every $v \in V$ there is exactly one way to choose scalars $c_b$ such that $v = \sum_{b \in B} c_b b$.

1.4 Bases and subspaces

It is clear that any subset of a linearly independent set must be linearly independent (just as it is clear that any subset of $V$ that contains a spanning set must be a spanning set). And every linearly independent set in $V$ is a basis for a subspace (its span).

In particular, whenever we take a basis $B$ for $V$ and write it as the union of two disjoint sets $u_1, \dots, u_m$ and $w_1, \dots, w_n$, these sets are bases for subspaces $U$ and $W$. In fact, $V$ is the direct sum of $U$ and $W$. Why? Clearly $U + W$ is $V$, because it is a subspace of $V$ that contains the basis. To see that $U \cap W = \{0\}$, suppose that $v \in U \cap W$. Then there are scalars $c_i$ and $d_j$ such that
$$c_1 u_1 + \dots + c_m u_m = v = d_1 w_1 + \dots + d_n w_n.$$
The equation
$$c_1 u_1 + \dots + c_m u_m - d_1 w_1 - \dots - d_n w_n = 0$$
implies that all $c_i$ and $d_j$ are zero, since $B$ is a basis. Thus $v = 0$, as we wished to show.

Conversely, if $V = U \oplus W$ (i.e. if $V = U + W$ and $U \cap W = \{0\}$), then the union of a basis for $U$ and a basis for $W$ is always a basis for $V$. To check this carefully, suppose that $u_1, \dots, u_m$ and $w_1, \dots, w_n$ are bases for $U$ and $W$. The span of $u_1, \dots, u_m, w_1, \dots, w_n$ is a subspace of $V$ that contains both $U$ and $W$, so it must be $V$. For linear independence, suppose that
$$(c_1 u_1 + \dots + c_m u_m) + (d_1 w_1 + \dots + d_n w_n) = 0.$$
The vector
$$c_1 u_1 + \dots + c_m u_m = -(d_1 w_1 + \dots + d_n w_n)$$
is in both $U$ and $W$, so it is zero. But this implies that all $c_i$ are zero, because $u_1, \dots, u_m$ are linearly independent, and that all $d_j$ are zero, because $w_1, \dots, w_n$ are linearly independent.
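
For example, if $B = \{e_1, e_2, e_3\}$ is the standard basis of $\mathbb{R}^3$ and we split it as $\{e_1, e_2\} \cup \{e_3\}$, then $U = \{(a,b,0)\}$ and $W = \{(0,0,c)\}$, and indeed $\mathbb{R}^3 = U \oplus W$: every vector is uniquely the sum $(a,b,0) + (0,0,c)$.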

w 1..., w n are bases for U and W. The span of u 1,..., w n is a subspace of V that contains both U and W, so it must be V. For linear independence, suppose that The vector (c 1 u 1 +... + c m u m ) + (d 1 w 1 +... + d n w n ) = 0. c 1 u 1 +... + c m u m = (d 1 w 1 +... + d n w n ) is in both U and W, so it is zero. But this implies that all c i are zero because the u 1..., u m are linearly independent, and it implies that all d j are zero because the w 1..., w n are linearly independent. 1.5 A remark We have not yet shown that V has a basis. That s because we have not used the fact that our scalars form a field. It is time to start dividing by non-zero scalars. 1.6 Two key observations (1) If Σ s S c s s = 0 and the coefficient c s0 of some particular s 0 S is not zero, then s 0 can be expressed as a linear combination of the vectors s S different from s 0. In fact, s 0 is equal to Σ s s0 ( cs c s0 )s. (2) Assume that I is a linearly independent set and that s is not in the span of I. Then I {s} is again linearly independent. In fact, if there were a nontrivial linear relation between the elements of I {s} then the coefficient of s could not be zero since that would give a nontrivial linear relation between just the elements of I, contradicting the first assumption. And if the coefficient of s is not zero then (1) applies and shows that s in the span of I, contradicting the second assumption. 1.7 Existence of a basis, one approach To show that V has a basis, we can proceed as follows: This works if V has a finite spanning set. Start with a finite spanning set S. If it is not a basis for V (not linearly independent), then there is a nontrivial linear relation between the elements of S, so that (1) above says that there is some element of S such that when we remove it we still have a spanning set. Remove such an element of S. Repeat until the spanning set becomes a basis. 3

1.8 Existence of a basis, an opposite approach

This produces a basis for $V$ as long as $V$ does not have arbitrarily large linearly independent sets. Maybe the empty set is a basis (i.e. maybe $0$ is the only vector in $V$). If not, then choose a vector $v_1 \neq 0$. Maybe the linearly independent set consisting of $v_1$ alone is a basis. If not, then that set does not span $V$, so we can choose a vector $v_2$ not in the span of $v_1$. By (2) above, the set consisting of the two vectors $v_1$ and $v_2$ is linearly independent. Maybe this set is a basis. If not, then it does not span $V$, so we can choose a vector $v_3$ not in the span of $v_1$ and $v_2$. By (2) above, the set consisting of these three vectors is linearly independent. Maybe it is a basis. And so on. This must eventually yield a basis (unless it goes on forever, producing arbitrarily large linearly independent sets).
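
This growing procedure can be sketched in the same style, with the simplifying assumption that the candidate vectors are drawn from a given finite pool rather than from all of $V$ (again the names are illustrative):

    import numpy as np

    def grow_to_basis(pool, tol=1e-10):
        # Section 1.8: keep a candidate only if it lies outside the span of
        # the vectors chosen so far; by (2) the chosen list stays independent.
        chosen, rank = [], 0
        for v in pool:
            if np.linalg.matrix_rank(np.column_stack(chosen + [v]), tol=tol) > rank:
                chosen.append(v)
                rank += 1
        return chosen

    # A redundant list whose independent survivors form a basis of R^2.
    pool = [np.array([1., 1.]), np.array([2., 2.]), np.array([0., 1.])]
    print(grow_to_basis(pool))      # keeps (1, 1) and (0, 1), skips (2, 2)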

1.9 An aside: finite-dimensional and infinite-dimensional

It turns out that there are only two possibilities for a vector space $V$:

1.9.1 The finite-dimensional case

By finite-dimensional we mean that there exists a finite set spanning $V$. In this case, as proved below:

- $V$ has a finite basis. (We showed this in 1.7.)
- Every basis has the same number of elements as every other basis. Call this number $n$.
- Every spanning set has at least $n$ elements.
- Every linearly independent set has at most $n$ elements.

We call $n$ the dimension of $V$ and denote it by $\dim(V)$.

1.9.2 The infinite-dimensional case

By infinite-dimensional we mean that no finite set spans $V$. In this case, for every $n$ there is a linearly independent set having $n$ elements. To see this, follow the procedure described in 1.8: the process never ends.

In fact, although we will not do this in the course, the concepts of spanning set, linearly independent set, and basis can be extended beyond the finite-dimensional case. It can even be shown that $V$ still has a basis if $V$ is infinite-dimensional; the basis will now be an infinite set.

1.10 Back to business

All of the assertions above about the finite-dimensional case follow immediately from the next lemma.

1.11 The fundamental inequality

Lemma: If $S$ spans $V$ and $I$ is a linearly independent set in $V$, then $S$ has at least as many elements as $I$.

Proof: We first show that if $S$ spans $V$ and $I$ is a linearly independent set in $V$ such that $S$ does not contain $I$, then there exists another spanning set $S'$ that has the same number of elements as $S$ but contains more elements of $I$ than $S$ does. To do this, we choose an element $i \in I$ such that $i \notin S$. There are scalars $c_s$ such that $i = \sum_{s \in S} c_s s$, because $S$ spans $V$. The scalars $c_s$ for $s \notin I$ cannot all be zero, because an element of $I$ cannot be a linear combination of other elements of $I$, since $I$ is linearly independent. Choose some $s \in S$ such that $c_s \neq 0$ and $s \notin I$. If $S'$ is defined to be $S$ with $i$ adjoined and $s$ removed, then $S'$ has the same number of elements as $S$ and contains more elements of $I$ than $S$ does. Also, $S'$ still spans $V$, because the deleted element $s$ is in its span, by (1) above.

To complete the proof of the Lemma, suppose that some spanning set for $V$ has $n$ elements and that the set $I$ is linearly independent and has $m$ elements. Choose a spanning set having $n$ elements and containing as many elements of $I$ as possible. It must contain all of $I$, because otherwise the previous paragraph shows how to make a new spanning set of the same size with more elements of $I$ in it. Therefore $n \geq m$.
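
To see the exchange step in action, take $V = \mathbb{R}^2$, $S = \{(1,0), (0,1)\}$, and $I = \{(1,1), (1,-1)\}$. First, $(1,1) = 1\cdot(1,0) + 1\cdot(0,1)$; the coefficient of $(1,0)$ is nonzero and $(1,0) \notin I$, so exchanging gives the spanning set $S' = \{(1,1), (0,1)\}$. Next, $(1,-1) = 1\cdot(1,1) - 2\cdot(0,1)$; exchanging $(0,1)$ for $(1,-1)$ gives $S'' = \{(1,1), (1,-1)\}$, which contains all of $I$. Hence $|S| \geq |I|$, as the lemma asserts.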

1.12 Dimension and sum and intersection

At this point we know (see 1.4, Bases and subspaces, above) that $\dim(U \oplus W) = \dim(U) + \dim(W)$. We want to show that, more generally,
$$\dim(U + W) = \dim(U) + \dim(W) - \dim(W \cap U).$$

If $V$ is a vector space and $W$ is a subspace of $V$, let us say that a subspace $C$ of $V$ is complementary to $W$ in $V$ if $V = W \oplus C$, i.e. if $W + C = V$ and $W \cap C = \{0\}$. For every subspace of $V$ there is at least one complementary subspace (if $V$ is finite-dimensional). We can obtain one by starting with a basis of $W$, adjoining new vectors to it to get a basis for $V$, and referring again to 1.4: the new vectors must necessarily form a basis for some subspace complementary to $W$.

Note that for any given subspace $W \subseteq V$ there are generally many complementary subspaces. But we know they all have the same dimension, namely $\dim(V) - \dim(W)$.

Now suppose that some finite-dimensional vector space $V$ is the sum of two subspaces, $V = W + U$, but do not assume that the sum is a direct sum; that is, do not assume that $W \cap U = \{0\}$. Choose a subspace of $W$ that is complementary to $W \cap U$ (in $W$); call it $C$. I claim that, in addition to being complementary to $W \cap U$ in $W$, $C$ is also complementary to $U$ in $V$. To prove this we have to show two things:

1. $C + U = V$. To see this, observe that
$$V = W + U = (C + (W \cap U)) + U = C + ((W \cap U) + U) = C + U,$$
where the last step uses that $W \cap U \subseteq U$.

2. $C \cap U = \{0\}$. To see this, observe that
$$C \cap U = (C \cap W) \cap U = C \cap (W \cap U) = \{0\},$$
where the first step uses that $C = C \cap W$.

It follows that $\dim(C)$ is equal to both $\dim(V) - \dim(U)$ and $\dim(W) - \dim(W \cap U)$. Since $V = U + W$, equating these two expressions gives the formula above. For example, if $U$ and $W$ are two distinct planes through the origin in $\mathbb{R}^3$, then $U \cap W$ is a line and indeed $\dim(U + W) = 2 + 2 - 1 = 3$.

2 Infinite bases

This section is an entirely optional supplement to the course. The course is mainly about finite-dimensional vector spaces, and much of what we will learn applies only in the finite-dimensional case. But the notions of spanning and linear independence can be extended to the general case in a good way. Instead of finite lists or finite sets of vectors, we use arbitrary sets of vectors.

If $S$ is a subset of $V$, not necessarily finite, then call $S$ a spanning set for $V$ if every element of $V$ may be expressed as a (finite) linear combination
$$v = \sum_s c_s s,$$
where the sum is over all $s$ in some finite subset of $S$ and for each such $s$, $c_s$ is a scalar. We can also write this as an infinite sum, with the understanding that $c_s = 0$ for all but finitely many $s \in S$. More generally, we can speak of $S$ spanning a subspace of $V$; every subset of $V$ spans some subspace.

Call $S$ a linearly independent set if whenever $\sum_s c_s s = 0$, then $c_s = 0$ for every $s \in S$. Call $S$ a basis if it spans $V$ and is also linearly independent. This means that every vector can be uniquely expressed as a linear combination of the vectors in $S$.
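
For example, let $V = F[x]$ be the space of all polynomials over $F$ and let $S = \{1, x, x^2, x^3, \dots\}$ be the set of monomials. Every polynomial is by definition a finite linear combination of monomials, so $S$ spans $V$; and a finite linear combination of monomials whose coefficients are not all zero is a nonzero polynomial, so $S$ is linearly independent. Thus $S$ is a basis in this extended sense, even though $V$ has no finite spanning set: it contains the arbitrarily large linearly independent sets $\{1, x, \dots, x^n\}$.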

The following statements can be proved without assuming that $V$ has a finite spanning set, but the proofs depend on the Axiom of Choice. Every vector space has a basis. More generally, every spanning set in $V$ has a subset that is a basis, and for every linearly independent set in $V$ there is some basis that contains it. Even more generally, we can prove that, given any spanning set $S$ and any linearly independent set $I$ such that $I \subseteq S$, there is a basis $B$ such that $I \subseteq B \subseteq S$.

The key point is observation (2) above, which is still valid if the set $I$ is infinite. Let us use this observation to prove the desired statement; in the case of a countable spanning set we do not need Choice. Take the elements of $S \setminus I$ and list them in some order $v_1, v_2, \dots$. Let $W_n$ be the subspace of $V$ spanned by $I \cup \{v_1, \dots, v_n\}$. Note that $W_{n+1}$ is spanned by $W_n$ and $v_{n+1}$, and that the set $I$ is a basis for $W_0$. Inductively make a basis $B_n$ for $W_n$. Let $B_0$ be $I$. Having made $B_n$, make $B_{n+1}$ by either adjoining the vector $v_{n+1}$ to $B_n$ or not. If $v_{n+1} \in W_n$, then let $B_{n+1} = B_n$: in this case $W_{n+1} = W_n$, so that $B_n$ is a basis for $W_{n+1}$. If $v_{n+1} \notin W_n$, then let $B_{n+1} = B_n \cup \{v_{n+1}\}$. This is linearly independent by the observation above, and it spans $W_{n+1}$.

At the end of this infinite process, we have what we want: the union of the sets
$$B_0 \subseteq B_1 \subseteq \dots$$
is linearly independent and spans $V$.

(To extend this argument to the uncountable case we need the Axiom of Choice in some form. I won't go into that here.)

One consequence of this is that for every subspace of $V$ there is a complementary subspace.

This way of making a basis generalizes 1.8 above. There does not appear to be a version of 1.7 in the infinite-dimensional case.