Spanning, linear dependence, dimension

In the crudest possible measure of these things, the real line R and the plane R^2 have the same size (and so does 3-space, R^3). That is, there is a function between R and R^2 which is one-to-one and onto. But everybody knows that there is a definite sense in which the plane is bigger than the line, and Euclidean space is bigger still. The line is 1-dimensional, the plane 2-dimensional, and 3-space 3-d. One of the purposes of these notes is to make this idea precise.

In what follows, we assume we have fixed a field F and a vector space V over F. We start, of course, with a

DEFINITION (LINEAR COMBINATION). Suppose that S ⊆ V and v ∈ V. (So S is a set of vectors from V.) We say that v is a linear combination of the vectors in S if there are v_1, ..., v_k in S and scalars α_1, ..., α_k such that v = α_1 v_1 + ... + α_k v_k.

For a simple example, suppose that F = R, V = R^3 and S = {(1,2,3), (0,1,7)}, written as column vectors. Then (2,3,-1) is a linear combination of the vectors in S because (2,3,-1) = 2(1,2,3) + (-1)(0,1,7). But (0,0,1) is not a linear combination of the vectors in S, because we cannot solve the system α_1(1,2,3) + α_2(0,1,7) = (0,0,1). (Try it.)

(This example brings up a small matter of language. In case S = {v_1, ..., v_k} is a fairly small finite set, and it usually will be, for us, and v is a linear combo of the vectors in S, we will often just say that v is a linear combination of v_1, ..., v_k. So it is correct (and standard) to say that (2,3,-1) is a linear combination of (1,2,3) and (0,1,7). However, we do want to leave open the possibility that S is infinite. But even if S is infinite, a linear combination involves only finitely many of the vectors from S.)

For a slightly more sophisticated example, again suppose that F = R, but V = C(R), the space of continuous functions on the real numbers. Then cos^2 x is a linear combination of 1 and sin^2 x. This follows immediately from the well-known fact that cos^2 x = 1 - sin^2 x (for all x).
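Span membership can be checked mechanically, since it is just the solvability of a linear system. Here is a minimal sketch using SymPy (the library choice is mine, not the notes'), with the two example vectors: a vector lies in the column span of a matrix exactly when appending it as an extra column does not raise the rank.

```python
from sympy import Matrix

# The two vectors of S as the columns of a matrix.
v1 = Matrix([1, 2, 3])
v2 = Matrix([0, 1, 7])
A = v1.row_join(v2)

def in_span(A, w):
    # w is a linear combination of the columns of A exactly when
    # appending w as a new column does not increase the rank.
    return A.rank() == A.row_join(w).rank()

w_yes = 2*v1 - v2          # a linear combination by construction
w_no = Matrix([0, 0, 1])   # no choice of scalars works here
print(in_span(A, w_yes), in_span(A, w_no))
```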

[Please note that, as usual in this context, 1 does not denote the number but the constant function.]

Here's a closely related notion:

DEFINITION (DEPENDENCE RELATION). Suppose that v_1, ..., v_k are distinct vectors in V and there are scalars α_1, ..., α_k, not all zero, such that α_1 v_1 + ... + α_k v_k = 0. In such a case we call the expression α_1 v_1 + ... + α_k v_k = 0 a (nontrivial) dependence relation. We won't count the trivial case where all the α's are zero.

So 2(1,2,3) + (-1)(0,1,7) + (-1)(2,3,-1) = (0,0,0) is a dependence relation. On the other hand, there is no dependence relation involving (1,2,3) and (0,1,7). To see this, try to solve the system α_1(1,2,3) + α_2(0,1,7) = (0,0,0). (Do it; don't just take my word for it.)

As I hope you noticed, there's a very direct connection between linear combos and dependence relations. Let's spell it out. Another

DEFINITION (DEPENDENT SETS). Suppose that S is a subset of the vector space V; S is called (linearly) dependent if there is a nontrivial dependence relation among the elements of S, that is, there are v_1, ..., v_k in S and scalars α_1, ..., α_k such that α_1 v_1 + ... + α_k v_k = 0 and at least one α_j ≠ 0. In case this doesn't happen, S is called (aw, you guessed) independent.

So the set {(1,2,3), (0,1,7), (2,3,-1)} is dependent. The set {(1,2,3), (0,1,7), (0,0,1)} is independent.

There are two points to be made here. First, why "linearly"? Well, there are other notions of dependence/independence in math, but we won't deal with them in this course. We will just say a set is either dependent or independent from now on. The other point concerns order. A set doesn't come in any particular order; that is, S = {(1,2,3), (0,1,7), (2,3,-1)} is the same thing as {(1,2,3), (2,3,-1), (0,1,7)}.
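Finding an explicit dependence relation is also a computation: the coefficient tuples (α_1, ..., α_k) that work form the null space of the matrix whose columns are the given vectors. A sketch in SymPy, with illustrative vectors of the kind used above:

```python
from sympy import Matrix

v1 = Matrix([1, 2, 3])
v2 = Matrix([0, 1, 7])
v3 = Matrix([2, 3, -1])
M = Matrix.hstack(v1, v2, v3)

# Each null-space vector gives the coefficients of a nontrivial
# dependence relation a1*v1 + a2*v2 + a3*v3 = 0.
relations = M.nullspace()
a = relations[0]
print(a.T, (M * a).T)
```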

In fact, it's also the same thing as {(1,2,3), (1,2,3), (0,1,7), (2,3,-1)}. A set is determined by what its elements are; how they are presented doesn't matter. But later in the course, we will be considering sets of vectors that come in a particular order, and that will be of some importance. The ordered sets ((1,2,3), (0,1,7), (2,3,-1)), ((1,2,3), (2,3,-1), (0,1,7)), and ((1,2,3), (1,2,3), (0,1,7), (2,3,-1)) are all different things. [Which one of them is the same as the unordered set S? None of them.]

Note that any set S which has the zero vector in it is automatically dependent. [Why?] Further, the only one-element set which is dependent is {0}. [Again, why?]

Here's the relationship between these two ideas.

PROPOSITION. Suppose that S ⊆ V. The following are equivalent:
(a) S is dependent.
(b) There are v ∈ S and v_1, ..., v_k ∈ S (different from v) such that v is a linear combination of v_1, ..., v_k.

PROOF: (a) ⇒ (b): Suppose that S is dependent; then there are distinct vectors v_1, ..., v_m ∈ S and scalars α_1, ..., α_m such that α_1 v_1 + ... + α_m v_m = 0. Also, not all of the α's are 0. We may assume that α_m ≠ 0; then α_m v_m = -α_1 v_1 - ... - α_{m-1} v_{m-1}, and hence v_m = -(α_1/α_m) v_1 - ... - (α_{m-1}/α_m) v_{m-1}. This is exactly what (b) says. [Get used to this kind of argument, using a dependence relation to solve for one vector in terms of the others. It comes up a lot.]

(b) ⇒ (a): Suppose that v is a linear combination of v_1, ..., v_k (all distinct vectors from S). Then v = β_1 v_1 + ... + β_k v_k, where β_1, ..., β_k are scalars. So (-1)v + β_1 v_1 + ... + β_k v_k = 0, and this is a (nontrivial) dependence relation. This completes the proof.

So for instance the dependence relation 2(1,2,3) + (-1)(0,1,7) + (-1)(2,3,-1) = 0 yields (1,2,3) = (1/2)(0,1,7) + (1/2)(2,3,-1), expressing the first vector as a linear combo of the others.

I should mention that the notion of linear combination/dependence, um, depends on the field. For instance, C^3 can be regarded as a vector space over C, but also over R. The vectors (1, i, 2), (2, 1, i) and (3+i, i, 2+3i) are dependent over C, since (3+i, i, 2+3i) = (1+i)(1, i, 2) + (2, 1, i), but it is not hard to check that they are independent over R; there is no nontrivial dependence relation among these three vectors involving only real scalars.

DEFINITION (LINEAR SPAN). If S ⊆ V, where as usual V is a vector space over the field F, then the (linear) span of S is the set W of all linear combinations of the vectors in S. We write W = Span(S) in this case, and also call S a spanning set for W. [Some people write span(S) or ⟨S⟩ for the span of S; the notation in the definition will be used in this class. We will usually omit the word "linear" here, too. We will also make the following convention, which not everybody uses.]

CONVENTION. Span(∅) = {0}. (∅ is the empty set; it has no elements.)

PROPOSITION. For any subset S of the space V, W = Span(S) is a subspace of V. In fact, it is the smallest subspace of V such that S ⊆ W. (That is, S ⊆ W, and for any subspace U of V with S ⊆ U, W ⊆ U.)

PROOF: The convention takes care of the trivial case where S = ∅, so we may assume that S is not empty. For any v ∈ S, 0 = 0v is a linear combination of vectors from S; thus 0 ∈ W. If w ∈ W, then there are v_1, ..., v_k in S and scalars α_1, ..., α_k such that w = α_1 v_1 + ... + α_k v_k. Now for any scalar β, βw = (βα_1)v_1 + ... + (βα_k)v_k is a linear combination of vectors from S, so βw ∈ W, and hence W is closed under scalar multiplication. Now suppose that w_1 and w_2 are in W. Then there are vectors v_1, ..., v_m in S and scalars β_1, ..., β_m and γ_1, ..., γ_m such that w_1 = β_1 v_1 + ... + β_m v_m and w_2 = γ_1 v_1 + ... + γ_m v_m. [We may assume that the vectors involved to get w_1 as a linear combo of vectors in S are the same ones involved in getting w_2, as we can always add 0v to a linear combo without changing it.] So w_1 + w_2 = (β_1 + γ_1)v_1 + ... + (β_m + γ_m)v_m,

which witnesses that w_1 + w_2 ∈ W, and thus that W is closed under addition. So it is a subspace of V. S ⊆ W, since any v ∈ S is a linear combo (namely 1v) of vectors from S. But any subspace of V which contains S must also contain all linear combos of vectors from S, because any such combo is obtained by taking scalar multiples and sums of vectors, and any subspace is closed under these operations. That's it.

Note that this result tells us that S ⊆ V is a subspace if and only if Span(S) = S.

If S = {(1,2,3), (0,1,7)} and V = R^3 (as a vector space over R), then it is not hard to see that W = Span(S) is a plane through the origin. There are many other spanning sets besides S for this plane. For instance {(1,2,3), (0,1,7), (2,3,-1)} spans the same subspace W, and so does {(1,2,3), (2,3,-1)}. And so on. It is clear that for any S_1 ⊆ S_2 ⊆ V, Span(S_1) ⊆ Span(S_2); when are they equal? Our convention says that Span(∅) = Span({0}). Apart from that, to say that Span(S_1) = Span(S_2) when S_1 ⊆ S_2 is just the same as saying that every vector in S_2 is a linear combination of the vectors in S_1. I hope this is clear. As noted above, Span{(1,2,3), (0,1,7)} = Span{(1,2,3), (0,1,7), (2,3,-1)}. But Span{(1,2,3), (0,1,7), (0,0,1)} is all of R^3. This can be seen directly, by showing that for any (a,b,c) in R^3, there is a (unique) solution to the system x_1(1,2,3) + x_2(0,1,7) + x_3(0,0,1) = (a,b,c). (Try it.)

Another way to say what we just observed is that a dependent subset of a vector space is redundant in a certain way: you can remove at least one vector from it without changing the subspace it spans. But this is not true for independent sets. With this in mind, we make the following

DEFINITION (BASIS). If W = Span(S) and S is independent, then S is called a basis for W. An ordered basis for W is simply an ordered set of vectors with no repetition where the underlying set is a basis. (So an arbitrary independent set is a basis for something: its span. Here of course S is a subset and W a subspace of some given vector space V over some given field F.)
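Whether two spanning sets span the same subspace can be phrased as a rank computation: the spans agree exactly when stacking the two sets together raises neither rank. A SymPy sketch of that check:

```python
from sympy import Matrix

def same_span(S1, S2):
    # Equal spans exactly when neither list of columns adds rank
    # to the other, i.e. all three ranks agree.
    A, B = Matrix.hstack(*S1), Matrix.hstack(*S2)
    return A.rank() == B.rank() == Matrix.hstack(A, B).rank()

v1, v2 = Matrix([1, 2, 3]), Matrix([0, 1, 7])
v3 = 2*v1 - v2
print(same_span([v1, v2], [v1, v2, v3]))   # a redundant vector changes nothing
print(same_span([v1, v2], [v1, v2, Matrix([0, 0, 1])]))
```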

In our example above with the plane W = Span{(1,2,3), (0,1,7)} = Span{(1,2,3), (2,3,-1)} = Span{(1,2,3), (0,1,7), (2,3,-1)}, the two two-element sets are bases for W, but the 3-element set is not. ((1,2,3), (0,1,7)) is an ordered basis for W.

The only basis for the trivial subspace {0} (of any V) is the empty set, by our convention. This corresponds to the usual idea that a single point is 0-dimensional. But most subspaces have many bases. However, we will see shortly that the number of vectors in one basis for W is the same number as in any other basis (in case that number is finite). If the basis has a single element, necessarily nonzero, then the space it spans is 1-dimensional; in R^2 or R^3 this just means it is a line through the origin. (Note that any set consisting of a single nonzero vector by itself is independent.) If a basis has 2 elements, the space it spans is 2-dimensional, which makes it a plane in R^2 or R^3. (Note that two vectors are dependent just if one is a scalar multiple of the other, and in R^n this is usually easily determined by sight; for 3 or more vectors, it usually requires some work.)

In case the field is F and the vector space F^n, we have the standard basis {e_1, ..., e_n}, where each e_j has all zero coordinates, except that the jth coordinate is 1. For instance, the standard basis for F^3 is {e_1, e_2, e_3} = {(1,0,0), (0,1,0), (0,0,1)}. It is readily seen that this set is independent and spans all of F^n. In the case of R^3, the standard ordered basis is often denoted (i, j, k). The independent set {(1,2,3), (0,1,7), (0,0,1)} is a nonstandard basis for R^3.

The vector space of polynomials R[x] also has a standard basis, which is {1, x, x^2, x^3, ...}. Note that this makes the space infinite-dimensional. (The 1 here is the constant polynomial, not the number.)

To show that the dimension of a space (possibly a subspace of a bigger space) is well-defined, we first prove a lemma, which contains most of the work.

LEMMA. Suppose that V is a vector space over the field F and S_1, S_2 are subsets of V such that S_1 is finite, S_2 is independent and Span(S_1) = V. Then |S_2| ≤ |S_1|. (|S| is the number of elements of the set S.)

Before starting the proof, we make a couple of comments.
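Checking that n vectors form a basis of F^n is a single rank (equivalently, determinant) computation. A sketch for a nonstandard basis of R^3 of the kind mentioned above:

```python
from sympy import Matrix, eye

# Columns are the candidate basis vectors.
nonstandard = Matrix([[1, 0, 0],
                      [2, 1, 0],
                      [3, 7, 1]])
# n vectors in F^n form a basis iff this matrix has rank n
# (equivalently, nonzero determinant).
print(nonstandard.rank(), nonstandard.det())
print(eye(3).rank())    # the standard basis, for comparison
```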

First, the assumption that S_1 is finite can be removed, but first we would have to define the size of a possibly infinite set; also, the proof in that case requires some version of the axiom of choice. We will stick to the case when S_1 is finite. (We don't need to assume that S_2 is finite, but that will follow.) Also, of course, we may assume that S_2 ≠ ∅; otherwise there is nothing to do.

Here's the basic strategy of the proof. We start with a vector v_1 ∈ S_2 and show that there is a vector (which we will call w_1) in S_1 such that if we throw w_1 out of S_1 and replace it by v_1, we still have a spanning set for (all of) V. If |S_2| = 1, we would then be done. If not, we pick a second vector v_2 ∈ S_2 and we show that there is a vector w_2 ∈ S_1 different from w_1 such that if we throw w_2 out and replace it by v_2, we still have a spanning set for V. This implies in particular that there is a second vector in S_1. Again, if S_2 has just two elements, we would at that stage be done. If not, we do it again, finding a third vector v_3 in S_2 and a third vector w_3 in S_1 such that (S_1 \ {w_1, w_2, w_3}) ∪ {v_1, v_2, v_3} still spans V. We continue like this as long as there are elements left in S_2, but we must run out before we run out of elements of S_1. Of course, we have to justify that all this can be done.

PROOF: As noted, we may assume that S_2 ≠ ∅. Let S_1 = {w_1, ..., w_k} (with no repetitions). Let v_1 be any vector in S_2; it cannot be 0. Since it is in Span(S_1), we must have k ≥ 1, and we can write v_1 = a_1 w_1 + ... + a_k w_k for some scalars a_1, ..., a_k. These scalars cannot all be zero, and by relabelling the vectors in S_1, we may assume a_1 ≠ 0. So a_1 w_1 = v_1 - a_2 w_2 - ... - a_k w_k and w_1 = (1/a_1)v_1 - (a_2/a_1)w_2 - ... - (a_k/a_1)w_k. Thus w_1 is in the span of {v_1, w_2, ..., w_k}, and so this span must be all of V. This completes the first step.

If v_1 is the only vector in S_2, we are finished, so assume there is another vector v_2 ∈ S_2. It is b v_1 + c_2 w_2 + ... + c_k w_k for some scalars b, c_2, ..., c_k. It cannot be the case that c_2 = ... = c_k = 0, because if that happened, v_2 would be equal to a scalar multiple of v_1, which is not possible by independence of S_2. Without loss of generality, c_2 ≠ 0, and thus c_2 w_2 = v_2 - b v_1 - c_3 w_3 - ... - c_k w_k. (Possibly k = 2, but it can't be 1.) Again solving for w_2 by dividing by c_2 shows that w_2 ∈ Span{v_1, v_2, w_3, ..., w_k}, and therefore this span must be all of V. Once again, if S_2 has only two elements, we are finished. Otherwise, choose any v_3 ∈ S_2 different from v_1 and v_2 and express it as v_3 = d_1 v_1 + d_2 v_2 + e_3 w_3 + ... + e_k w_k, where the d's and e's are scalars. It cannot be the case that either k = 2 or that all of the e_j's are zero, because either of these would contradict the independence of S_2. We may assume that e_3 ≠ 0, and so we can solve for w_3 in terms of v_1, v_2, v_3 and the vectors w_4, ..., w_k, if there are any of those. So w_3 is in the span of v_1, v_2, v_3 and the w_j's for j ≥ 4 (if any). If S_2 has just 3 elements, we are now finished. If not, we keep going in the same manner. By using independence of S_2 and the fact that our new set spans all of V, we can keep doing this as long as there are elements left in S_2.
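The replacement procedure in the proof can be carried out mechanically. This is a rough sketch (the function name and the greedy choice of which w_j to evict are mine), assuming the first list spans the whole space:

```python
from sympy import Matrix

def exchange(spanning, independent):
    # One exchange step per independent vector: write v in terms of the
    # current spanning list, then evict a vector that carries a nonzero
    # coefficient and is not itself one of the independent vectors.
    current = list(spanning)
    for v in independent:
        coeffs = Matrix.hstack(*current).solve(v)
        for j, c in enumerate(coeffs):
            if c != 0 and all(current[j] != u for u in independent):
                current[j] = v
                break
    return current

S1 = [Matrix([1, 0, 0]), Matrix([0, 1, 0]), Matrix([0, 0, 1])]
S2 = [Matrix([1, 2, 3]), Matrix([0, 1, 7])]
result = exchange(S1, S2)
print(Matrix.hstack(*result).rank())   # the new list still spans R^3
```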

We can't run out of elements of S_1 first, because that would contradict the independence of S_2. This is a little loose (the proper proof is by induction), but if you've made it this far, you should get the picture. I will call the proof completed.

COROLLARY. If S_1 and S_2 are bases for the same vector space V, and either of them is finite, then |S_1| = |S_2|.

PROOF: Suppose S_1 is finite. Then it is a spanning set for V and S_2 is independent, and so |S_2| ≤ |S_1|. But now S_2 is a finite spanning set for V and S_1 is independent, so |S_1| ≤ |S_2|. That's it.

DEFINITION (DIMENSION). Suppose that V is a vector space with a finite spanning set. The dimension dim(V) is the number of vectors in some (any) basis for V.

We collect some basic facts in a

PROPOSITION. Suppose that V is a vector space with finite dimension n.
1. If S is an independent subset of V, and v ∈ V \ S, then S ∪ {v} is independent if and only if v ∉ Span(S).
2. If S is an independent subset of V, then there is a set S' ⊇ S which is a basis for V. Consequently, if S has exactly n elements, then S itself is a basis for V.
3. If S is a finite spanning set for V, then there is S' ⊆ S such that S' is a basis for V. Consequently, if S has exactly n elements, then S itself is a basis for V.

PROOF: 1. This part doesn't need finite-dimensionality. We already know that if v ∈ Span(S), then S ∪ {v} is dependent. On the other hand, if S ∪ {v} is dependent, then there are v_1, v_2, ..., v_k in S and scalars a_1, a_2, ..., a_k and b such that a_1 v_1 + a_2 v_2 + ... + a_k v_k + b v = 0 is a dependence relation. b ≠ 0, since otherwise we'd have a dependence relation just involving vectors from S. So b v = -a_1 v_1 - ... - a_k v_k, hence v = -(a_1/b) v_1 - ... - (a_k/b) v_k and v ∈ Span(S).

2. If S itself is a basis for V, just take S' = S. If not, there must be some v ∉ Span(S). S ∪ {v} is still independent and must span a larger subspace than S does. If S ∪ {v} is a basis for all of V, we stop; if not, find w ∉ Span(S ∪ {v}). Then S ∪ {v, w} is still independent; if it is a basis for V, we stop. Otherwise we continue; but we must eventually stop in at most n steps. Then we have S'. If S is independent and has n elements and we actually increase it to get a basis S', that basis would have more than n elements, which is impossible. So S itself is a basis for V.
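Part 2's proof is really an algorithm: keep adjoining vectors that lie outside the current span. A sketch for R^n, drawing candidate vectors from the standard basis (that particular pool of candidates is my choice; the proposition only needs some vector outside the span):

```python
from sympy import Matrix, eye

def extend_to_basis(S, n):
    basis = list(S)
    for j in range(n):
        e = eye(n).col(j)
        A = Matrix.hstack(*basis)
        # Part 1 of the proposition: adjoining e keeps independence
        # exactly when e lies outside the current span.
        if A.row_join(e).rank() > A.rank():
            basis.append(e)
    return basis

B = extend_to_basis([Matrix([1, 2, 3]), Matrix([0, 1, 7])], 3)
print(len(B), Matrix.hstack(*B).rank())
```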

3. If S is a spanning set for V but not a basis, it must be dependent. So there is v ∈ S which is a linear combination of other elements of S. So S \ {v} still spans V; maybe it's a basis for V. If not, repeat the process, throwing out another vector w without changing the fact that we still have a spanning set for V. We repeat this process as long as necessary, and must eventually stop. When we do, we have a basis S' ⊆ S. If S itself has exactly n elements, we can't throw out any vectors to make our basis S', because that basis wouldn't have enough elements. So S itself is a basis.

A basis for a space provides a useful and fairly economical way of presenting the space. Often, we simply use the standard basis, if there is one. But for some purposes, it makes more sense to use a nonstandard basis; and some spaces don't have anything that could reasonably be called the standard basis.

We describe some spaces associated with a matrix, and we will see how to find a basis for each of them.

DEFINITION (ROW, COLUMN AND NULL SPACES). Suppose that A is an m × n matrix over the field F.
1. The row space of A, row(A), is the space spanned by the rows of A (hence it is a subspace of M_{1,n}(F)).
2. The column space of A, col(A), is the space spanned by the columns of A; it is a subspace of F^m.
3. The null space of A, null(A), is the set of all solutions to the system Ax = 0; it is a subspace of F^n.

There is also something called the row null space, defined like the null space, except with rows and multiplication on the other side; we won't deal with it. The null space is of course nothing new, except for its name. (It's the set of solutions to the homogeneous system Ax = 0.) We don't at this point need to justify that any of these are subspaces of the various larger spaces mentioned.

We know that row-reducing A does not change the null space. It is easily seen that it doesn't change the row space, either. Obviously switching two rows will not affect the span of the set of rows. Also, replacing a row R_j by R_j' = αR_j, where α is a nonzero scalar, doesn't either, since R_j' is clearly a linear combo of the rows of A, and also vice versa, since R_j = (1/α)R_j'. (If we allowed α = 0, this would probably not be the case.) Finally, replacing R_j by R_j' = R_j + αR_k, where k ≠ j, doesn't affect the row space either, since of course R_j' is a linear combo of the rows of A, but also R_j = R_j' - αR_k. (Each elementary row operation is reversible.)

Row-reduction does usually change the column space, but what it doesn't change is any dependence relations among the columns.

That is, suppose B is obtained from A by row-reduction, the columns of A are C_1, ..., C_n and the columns of B are C_1', ..., C_n'. Then for any scalars α_1, ..., α_n we have α_1 C_1 + ... + α_n C_n = 0 if and only if α_1 C_1' + ... + α_n C_n' = 0. The reason for this is simple: α_1 C_1 + ... + α_n C_n is simply Aα, where α = (α_1, ..., α_n). So Aα = 0 if and only if α ∈ null(A). This will be true if and only if α ∈ null(B), if and only if α_1 C_1' + ... + α_n C_n' = 0.

It is easy to read off bases for these spaces in case the matrix is in RREF. If B is the RREF of A, then we can take the same basis for row(A) as for row(B), and the same basis for null(A) as for null(B). For col(A), we consider a basis for col(B) among the columns of B (which is easy) and take the corresponding columns of A.

Let's see an example (over the reals). Say

A =
[ 1 2 0 1 3 ]
[ 2 4 1 3 5 ]
[ 3 6 1 4 8 ]

We produce the RREF B by performing the row operations R_2 ← R_2 - 2R_1, R_3 ← R_3 - 3R_1 and R_3 ← R_3 - R_2 to obtain

B =
[ 1 2 0 1 3 ]
[ 0 0 1 1 -1 ]
[ 0 0 0 0 0 ]

in RREF. I trust it is clear that a basis for row(B) is just {(1 2 0 1 3), (0 0 1 1 -1)}. This is also a basis for row(A). The solutions of the homogeneous system Ax = 0 have the form r(-2, 1, 0, 0, 0) + s(-1, 0, -1, 1, 0) + t(-3, 0, 1, 0, 1). A basis for null(A) is then {(-2, 1, 0, 0, 0), (-1, 0, -1, 1, 0), (-3, 0, 1, 0, 1)}, as the three vectors in this set are easily seen to be independent. It is obvious that the first and third columns of B are independent, but the second is 2 times the first, the fourth is the first plus the third, and the fifth is 3 times the first minus the third. A basis for col(B) consists of its first and third columns. The corresponding thing will be true for col(A); that is, a basis for col(A) is {(1,2,3), (0,1,1)}. It is instructive to note (and check!) that the second column of A is twice its first, the fourth column of A is the sum of its first and third, and the fifth column of A is 3 times its first minus its third. This kind of thing always works; it follows from what we said earlier about linear combos.
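The claims of this kind can be verified in SymPy: `rref` returns both the reduced matrix and the pivot columns, and `nullspace` returns a basis of the null space. A sketch with the 3 × 5 matrix used in the example above:

```python
from sympy import Matrix

A = Matrix([[1, 2, 0, 1, 3],
            [2, 4, 1, 3, 5],
            [3, 6, 1, 4, 8]])
B, pivots = A.rref()
print(pivots)             # pivot columns (0-indexed): first and third
null_basis = A.nullspace()
print(len(null_basis))    # the nullity

# Dependence relations among the columns of B hold for A too.
C = [A.col(j) for j in range(5)]
print(C[1] == 2*C[0], C[3] == C[0] + C[2], C[4] == 3*C[0] - C[2])
```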

From the example, and the preceding comments, I hope the following further comments are clear. The dimension of the row space of a matrix A is simply the number of nonzero rows in the RREF version of A. This is also the dimension of the column space, as the columns with the leading 1's in the RREF matrix B are distinct elements of the standard basis for F^m (if A has m rows), and this dimension doesn't change as we row-reduce. Finally, the dimension of the null space of an m × n matrix is n - r, where r is the dimension of the row and column space.

DEFINITION (RANK/NULLITY). If A is an m × n matrix, the rank of A, r(A) (or sometimes rk(A)), is the dimension of the row space (and column space). The nullity of A, n(A), is the dimension of the null space of A.

To summarize what we just said, here are the slogans: row rank equals column rank, and rank plus nullity equals the number of columns.

Given an ordered basis B = (v_1, ..., v_n) for a space V over F and a vector w ∈ V, it is possible to assign (uniquely) a vector in F^n to w. [It also, and crucially, depends on B.] Specifically, here's the result and the relevant definition.

PROPOSITION. Suppose that V is a vector space of finite dimension n (over the field F). Suppose that B = (v_1, ..., v_n) is an ordered basis for V. Then for any w ∈ V there is a unique tuple (a_1, ..., a_n) of scalars such that w = a_1 v_1 + ... + a_n v_n.

PROOF: The fact that such a_1, ..., a_n exist follows immediately from the fact that B spans V. The (minor) issue is their uniqueness. So suppose that w = a_1 v_1 + ... + a_n v_n = b_1 v_1 + ... + b_n v_n. Then 0 = (a_1 - b_1)v_1 + ... + (a_n - b_n)v_n. Since v_1, ..., v_n are independent, this implies that a_1 - b_1 = ... = a_n - b_n = 0; that is, a_1 = b_1, a_2 = b_2, ..., a_n = b_n, which is just what we had to show.

DEFINITION (COORDINATES OF A VECTOR WITH RESPECT TO A BASIS). Suppose that V is a finite-dimensional vector space over the field F, and B = (v_1, ..., v_n) is an ordered basis for V. If w ∈ V, the coordinates of w with respect to B are the scalars a_1, ..., a_n such that w = a_1 v_1 + ... + a_n v_n. We write [w]_B for the column vector (a_1, ..., a_n). (This is a case where the notation is more significant than the definition itself.)
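For a basis of F^n, finding [w]_B amounts to solving the square system whose coefficient matrix has the basis vectors as columns. A sketch with the nonstandard basis of R^3 from earlier (the particular w is my choice):

```python
from sympy import Matrix

# Basis vectors of R^3 as the columns of an invertible matrix.
P = Matrix([[1, 0, 0],
            [2, 1, 0],
            [3, 7, 1]])
w = Matrix([1, 1, 1])
coords = P.solve(w)    # the unique a_1, a_2, a_3 with P*coords = w
print(coords.T)
```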

For a trivial example, if w is the zero vector of V and B is any basis, [w]_B will be the zero vector of F^n. Only slightly less trivial is this kind of thing: if V = F^n and B is the standard ordered basis and w = (α_1, ..., α_n), then [w]_B = (α_1, ..., α_n), too. If it were always this simple, we wouldn't bother with the definition/notation.

Getting just slightly less trivial, consider V = P_n(x), the vector space of polynomials over the reals of degree ≤ n, and B = (1, x, ..., x^n), the standard ordered basis. If p(x) = a_0 + a_1 x + ... + a_n x^n is a polynomial, then [p(x)]_B = (a_0, a_1, ..., a_n). In this fashion, we can treat the space P_n(x) just like the space R^{n+1}, which you probably already knew.

Here's a nontrivial example. If V = Span{(1,2,3), (0,1,7)} and B is the basis in those brackets, in that order, and w = (2,3,-1), then since w = 2(1,2,3) + (-1)(0,1,7), we have [w]_B = (2, -1). Notice two things about this. Even though w ∈ R^3, its representation in terms of B is a 2-vector. Why does this happen? Another is, if we change B to B' = ((0,1,7), (1,2,3)), then [w]_{B'} = (-1, 2), which is different. The order matters.

This brings up the sticky but very important (if you care about linear algebra) question of what happens when you change the basis. Please pay close attention to this, because the traditional notation/terminology here is very screwed up (i.e., it's backwards) and many mathematicians are traditionalists. The notation here befuddles many students, even though the idea is quite simple; it caught me several times, and I'm a pro.

So here's the situation: say V is a finite-dimensional vector space over the field F, and each of B = (v_1, ..., v_n) and B' = (w_1, ..., w_n) is an ordered basis for V.

DEFINITION (CHANGE-OF-BASIS MATRIX). The change-of-basis matrix B'P_B is the n × n matrix which has jth column [v_j]_{B'}, for 1 ≤ j ≤ n. (Several texts use different notation, and often call this the change-of-basis matrix from B to B', where it really should be the other way around. We will use the notation just as above for this entire course and will not say which basis we are changing from or to.)
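For a basis of a proper subspace, the coordinate system is rectangular but still has a unique solution; SymPy's `linsolve` handles it. This sketch recomputes the coordinates of w = 2v_1 - v_2 in both orderings of the two-element basis:

```python
from sympy import Matrix, linsolve, symbols

v1, v2 = Matrix([1, 2, 3]), Matrix([0, 1, 7])
w = 2*v1 - v2
a, b = symbols('a b')

sol, = linsolve((Matrix.hstack(v1, v2), w), [a, b])   # basis (v1, v2)
rev, = linsolve((Matrix.hstack(v2, v1), w), [a, b])   # basis (v2, v1)
print(sol, rev)    # the order of the basis matters
```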

For a first example, consider V = R^2 with the standard ordered basis B, and let B' = ((3,1), (5,2)) be a nonstandard ordered basis for V. To find B'P_B, we must express the standard basis vectors e_1 and e_2 in terms of the new basis. It is not hard to see that e_1 = 2(3,1) + (-1)(5,2), so the first column of this change-of-basis matrix is (2, -1). (Generally, to find these coordinates, you would have to solve a system of equations.) Also, e_2 = (-5)(3,1) + 3(5,2). So the second column is (-5, 3), and

B'P_B =
[ 2 -5 ]
[ -1 3 ]

In general, there will of course also be the other change-of-basis matrix B P_{B'}, going the other way. In a case like the above example, this one requires no calculation, for it is trivial that (3,1) = 3e_1 + 1e_2, so the first column is just (3, 1), and the second, similarly, is (5, 2):

B P_{B'} =
[ 3 5 ]
[ 1 2 ]

It is easy to check that B'P_B and B P_{B'} are inverses of each other. As we shall soon see, this is no accident. First we explain what the change-of-basis matrix does.

PROPOSITION. Let V be a finite-dimensional vector space over a given field F, let each of B and B' be an ordered basis for V, and let v be any vector in V. Then [v]_{B'} = B'P_B [v]_B.

Thus, the change-of-basis matrix provides a means (by matrix multiplication) of translating the presentation of any given vector from the input basis B into the output basis B'. Note that the vector itself doesn't change; it's just how we present it that changes.

PROOF: Suppose B = (v_1, ..., v_n), B' = (w_1, ..., w_n) and P = B'P_B = (p_{i,j}), 1 ≤ i, j ≤ n; finally suppose that [v]_B = (a_1, ..., a_n). Then we have the following: v = a_1 v_1 + ... + a_n v_n, and for each k ≤ n, v_k = p_{1,k} w_1 + ... + p_{n,k} w_n. Then

v = a_1 (p_{1,1} w_1 + ... + p_{n,1} w_n) + ... + a_n (p_{1,n} w_1 + ... + p_{n,n} w_n)
  = (p_{1,1} a_1 + ... + p_{1,n} a_n) w_1 + ... + (p_{n,1} a_1 + ... + p_{n,n} a_n) w_n.
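The two matrices of the example can be produced and checked in a few lines; since B is the standard basis here, B P_{B'} has the B'-vectors as its columns, and B'P_B is its inverse:

```python
from sympy import Matrix

Q = Matrix([[3, 5],
            [1, 2]])   # B_P_B': columns are the B'-vectors in standard coordinates
P = Q.inv()            # B'_P_B: columns are [e1]_B' and [e2]_B'
print(P)
print(P * Q)           # the identity: the two matrices are inverses
```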

Thus the coordinates of v with respect to B' are (in order) p_{1,1} a_1 + ... + p_{1,n} a_n, ..., p_{n,1} a_1 + ... + p_{n,n} a_n. These are just the coordinates of P(a_1, ..., a_n). That does it.

An example is in order, and let's keep the first one simple. Let V = R^2 with the two bases B and B' mentioned above, so

P =
[ 2 -5 ]
[ -1 3 ]

Suppose, say, that v is represented by (3, 7) with respect to the standard basis B. Now P(3,7) = (-29, 18). So [v]_{B'} should be this; of course it is easy to check directly that (-29)(3,1) + 18(5,2) = (3,7).

Now suppose V is any finite-dimensional vector space over F and B, B' are ordered bases for V. Then the two change-of-basis matrices P = B'P_B and P' = B P_{B'} are inverses of each other; let's see why. Suppose a is any vector in F^n; there is a unique vector v ∈ V such that a = [v]_B. Specifically, if a = (a_1, ..., a_n) and B = (v_1, ..., v_n), then v is a_1 v_1 + ... + a_n v_n. Now (P'P)a = P'(Pa) = P'[v]_{B'} = [v]_B = a. That is, if we multiply P'P by any a ∈ F^n, we get a back. So P'P = I; similarly PP' = I. We will see more examples and applications of these ideas later.

Now on to a different matter. There are several ways to construct new vector spaces from given ones. We mention two of them now. Suppose that V is a vector space over F and W_1 and W_2 are subspaces of V. Consider the intersection W_1 ∩ W_2 (the set of all vectors in both W_1 and W_2); we will make it an exercise to show that this is itself a subspace of V. By contrast, the union W_1 ∪ W_2 is almost never a subspace of V. (The union consists of those vectors in either W_1 or W_2, or both. Obviously it is a subspace if W_1 ⊆ W_2, or W_2 ⊆ W_1, but these are the only cases. Again, this will be left as an exercise.) But there is a subspace corresponding to the union; in fact, it is the span of the union, but it is usually called the sum.

DEFINITION (THE SUM OF TWO SUBSPACES). Suppose that V is a vector space, and W_1 and W_2 are subspaces of V. The sum of W_1 and W_2, denoted W_1 + W_2, is the set {w_1 + w_2 : w_1 ∈ W_1, w_2 ∈ W_2}. (It will be an exercise to show that the sum is again a subspace.) In case W_1 ∩ W_2 = {0}, we call the sum the direct sum of W_1 and W_2, and will often write W_1 ⊕ W_2 in this case.

[For instance, if W_1 and W_2 are distinct lines in Euclidean 3-space, both going through the origin, W_1 + W_2 is the plane containing both of them; it is in fact a direct sum.]

(There are infinite versions of intersection and sum, but we will not deal with them.)

It is well-known that for finite sets A and B, |A ∪ B| + |A ∩ B| = |A| + |B|. There is a corresponding rule for the dimensions of finite-dimensional spaces. It is often known as the modular law, but I prefer to call it Lunch In Chinatown. Here it is.

PROPOSITION. Suppose that V is a vector space, and W_1 and W_2 are finite-dimensional subspaces of V. Then dim(W_1 + W_2) + dim(W_1 ∩ W_2) = dim(W_1) + dim(W_2).

PROOF: Suppose that dim(W_1 ∩ W_2) = k, dim(W_1) = k + l and dim(W_2) = k + m; clearly each of these has at least as big a dimension as the intersection. We must show that dim(W_1 + W_2) = k + l + m. Let {u_1, ..., u_k} be a basis for W_1 ∩ W_2. We can extend it to a basis {u_1, ..., u_k, v_1, ..., v_l} for W_1, and we can also extend it to a basis {u_1, ..., u_k, x_1, ..., x_m} for W_2. Once we show that {u_1, ..., u_k, v_1, ..., v_l, x_1, ..., x_m} is a basis for W_1 + W_2, then we will be finished. Note that each of these vectors is in W_1 + W_2, since they are all in W_1 or W_2 or both (the u's), and we can take either of w_1 or w_2 to be 0 in the definition of W_1 + W_2.

Now we show that this set of all these vectors spans W_1 + W_2. Let w = w_1 + w_2 be any vector in W_1 + W_2, where w_1 ∈ W_1 and w_2 ∈ W_2. There must be scalars a_1, ..., a_k and b_1, ..., b_l such that w_1 = a_1 u_1 + ... + a_k u_k + b_1 v_1 + ... + b_l v_l, since the u's and v's together span W_1. Similarly, there are scalars c_1, ..., c_k and d_1, ..., d_m such that w_2 = c_1 u_1 + ... + c_k u_k + d_1 x_1 + ... + d_m x_m. Then w = (a_1 + c_1)u_1 + ... + (a_k + c_k)u_k + b_1 v_1 + ... + b_l v_l + d_1 x_1 + ... + d_m x_m. So w ∈ Span{u_1, ..., u_k, v_1, ..., v_l, x_1, ..., x_m}, and so this set spans all of W_1 + W_2.

Now we show the set is independent. To this end, suppose there are scalars e_1, ..., e_k, f_1, ..., f_l, g_1, ..., g_m so that e_1 u_1 + ... + e_k u_k + f_1 v_1 + ... + f_l v_l + g_1 x_1 + ... + g_m x_m = 0. We must show that this can only happen if all the scalars are 0. The long equation we just wrote can be rewritten as e_1 u_1 + ... + e_k u_k + f_1 v_1 + ... + f_l v_l = -g_1 x_1 - ... - g_m x_m. The left-hand side of this equation is a vector in W_1, as it is a linear combination of the basis vectors for W_1; for the same reason, the right-hand side is in W_2. Both sides being equal, this vector is in W_1 ∩ W_2. Hence it is h_1 u_1 + ... + h_k u_k for some scalars h_1, ..., h_k. Using the right-hand side, we see that h_1 u_1 + ... + h_k u_k + g_1 x_1 + ... + g_m x_m = 0. By the independence of our basis for W_2, we must have that all the h's and g's are 0.

Now this forces e_1 u_1 + ... + e_k u_k + f_1 v_1 + ... + f_l v_l = 0. That makes all the e's and f's 0, and shows that the given set is indeed independent. Hence it's a basis for W_1 + W_2, and that finishes the proof.

A simple instance of this is the well-known fact that if two distinct planes in R^3 meet at all, they meet in a line.

Let's see an example of this result in action. Let V = R^4, W_1 = Span{(1,0,0,1), (0,1,1,0), (1,1,1,1)} and W_2 = Span{(1,1,0,0), (0,0,1,1), (1,1,1,1)}. We will find a basis for each of W_1, W_2, W_1 + W_2 and W_1 ∩ W_2. We start by putting all these vectors in as columns of one big matrix (I trust you understand why there's a line down the middle):

[ 1 0 1 | 1 0 1 ]
[ 0 1 1 | 1 0 1 ]
[ 0 1 1 | 0 1 1 ]
[ 1 0 1 | 0 1 1 ]

We call the columns of this matrix C_1, ..., C_6. Note that these columns span W_1 + W_2. We row-reduce this as a single matrix to get the RREF and call its columns C_1', ..., C_6':

[ 1 0 1 | 0 1 1 ]
[ 0 1 1 | 0 1 1 ]
[ 0 0 0 | 1 -1 0 ]
[ 0 0 0 | 0 0 0 ]

Note that the left half is also in RREF, and from that we can just read off the fact that {C_1, C_2} forms a basis for W_1. The right half is not quite in RREF, but I trust it is close enough to see that the columns C_4' and C_5' are independent, but C_6' is a linear combination of them (in fact, their sum). If this were less clear, we could just row-reduce this half a bit more. Anyway, a basis for W_2 is just {C_4, C_5}. Also, in the whole thing, we see that C_1', C_2', C_4' are independent, but the rest are linear combos of them. A basis for W_1 + W_2 is then just {C_1, C_2, C_4}.

What about the intersection? First, the Lunch tells us its dimension is 1 (as 3 + 1 = 2 + 2). So we need a single nonzero vector in the intersection. Looking at the RREF, it is clear that C_5' = C_1' + C_2' - C_4'; this dependence relation will also hold for the original matrix. Thus C_5 = C_1 + C_2 - C_4, i.e., C_4 + C_5 = C_1 + C_2. By the left-hand side, this vector is in W_2; by the right-hand side, it's in W_1. So it (which is (1,1,1,1)) is in W_1 ∩ W_2; the set with just this vector in it is a basis for W_1 ∩ W_2.

A couple of comments. Notice the hardest basis to find was for the intersection; this is typical of these kinds of problems. Also, we could have used C_6' = C_4' + C_5', but that gives us the same vector.
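The whole computation in such an example is a handful of rank calls. This sketch (with illustrative spanning vectors of the kind used above) checks the four dimensions via Lunch In Chinatown and exhibits a vector of the intersection:

```python
from sympy import Matrix

W1 = [Matrix([1, 0, 0, 1]), Matrix([0, 1, 1, 0]), Matrix([1, 1, 1, 1])]
W2 = [Matrix([1, 1, 0, 0]), Matrix([0, 0, 1, 1]), Matrix([1, 1, 1, 1])]
A1, A2 = Matrix.hstack(*W1), Matrix.hstack(*W2)

d1, d2 = A1.rank(), A2.rank()
dsum = Matrix.hstack(A1, A2).rank()
dint = d1 + d2 - dsum            # Lunch In Chinatown
print(d1, d2, dsum, dint)

x = W2[0] + W2[1]                # a vector of W2...
print(x == W1[0] + W1[1])        # ...which is also a vector of W1
```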

17 section this is typical of these kinds of problems Also, we could have used C 6 C 4 = C + C, but that gives us the same vector Another example, with the same question Here V = R 5, W = Span, 6 and W = Span 4, 3 4, We start by forming the big matrix , again calling its columns C,,C 6 Once more we skip the easy row-reduction details, 3 4 just moving straight to the result It is From this it is apparent that {C, C, C 3 } is a basis for W, nearly as apparent that a basis for W is {C 4, C 5, C 6 } and a basis for W +W is {C, C, C 3, C 4 } Here (since = 4 + ), W W is -dimensional We need two independent vectors, and again we use the dependence relations apparent from the RREF C 5 = 3C C +C 4, so C 5 = 3C C +C 4, and C 5 C 4 = 3C C = is good for one of them (as it s in both W and W ) Also, C 6 = 4C +C 3 +C 4, 4 9 so C 6 = 4C + C 3 + C 4 and C 6 C 4 = 4C + C 3 = is also in the 5 4 intersection A basis for W W is then, ,, 7
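The whole procedure in these examples is mechanical enough to automate. Here is a minimal sketch using sympy; since the notes' original spanning vectors are not reproduced here, the vectors below are made-up stand-ins, chosen so the answer has the same shape as the first example (dim W₁ = dim W₂ = 2, dim(W₁ + W₂) = 3, dim(W₁ ∩ W₂) = 1).

```python
# Sketch of the method above: find bases for W1, W2, W1 + W2 and W1 ∩ W2
# by row-reducing one big matrix.  The spanning vectors here are
# hypothetical stand-ins, not the ones from the notes' example.
from sympy import Matrix

# Columns C1..C3 span W1, columns C4..C6 span W2 (vectors in R^4).
W1_cols = [Matrix([1, 0, 0, 0]), Matrix([0, 1, 0, 0]), Matrix([1, 1, 0, 0])]
W2_cols = [Matrix([1, 1, 1, 0]), Matrix([0, 0, 1, 0]), Matrix([1, 1, 2, 0])]

A = Matrix.hstack(*(W1_cols + W2_cols))  # the "one big matrix"
R, pivots = A.rref()                     # RREF and the pivot-column indices

# Pivot columns of the original matrix form a basis for W1 + W2.
sum_basis = [A[:, j] for j in pivots]

# Row-reducing each half separately gives bases for W1 and W2.
_, p1 = Matrix.hstack(*W1_cols).rref()
_, p2 = Matrix.hstack(*W2_cols).rref()
W1_basis = [W1_cols[j] for j in p1]
W2_basis = [W2_cols[j] for j in p2]

# The Lunch: dim(W1 ∩ W2) = dim W1 + dim W2 - dim(W1 + W2).
dim_int = len(W1_basis) + len(W2_basis) - len(sum_basis)

# Each non-pivot column C_j in the right half satisfies a dependence
# relation C_j = sum over i of R[i, j] * C_{pivots[i]} (row operations
# preserve column relations).  Subtracting the right-half part of that
# combination from C_j leaves a vector lying in both W1 and W2.
n1 = len(W1_cols)
candidates = []
for j in range(n1, A.cols):
    if j not in pivots:
        v = A[:, j]
        for i, piv in enumerate(pivots):
            if piv >= n1:
                v = v - R[i, j] * A[:, piv]
        candidates.append(v)

# A maximal independent subset of the candidates is a basis for W1 ∩ W2.
intersection_basis = []
if candidates:
    _, p = Matrix.hstack(*candidates).rref()
    intersection_basis = [candidates[j] for j in p]

print(len(W1_basis), len(W2_basis), len(sum_basis), len(intersection_basis))
# prints: 2 2 3 1
```

The loop at the end is exactly the by-hand trick used above: each dependence relation from the RREF is split into its W₁ part and its W₂ part, and the common value lands in the intersection.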


More information

Linear Algebra, 4th day, Thursday 7/1/04 REU Info:

Linear Algebra, 4th day, Thursday 7/1/04 REU Info: Linear Algebra, 4th day, Thursday 7/1/04 REU 004. Info http//people.cs.uchicago.edu/laci/reu04. Instructor Laszlo Babai Scribe Nick Gurski 1 Linear maps We shall study the notion of maps between vector

More information

Math 344 Lecture # Linear Systems

Math 344 Lecture # Linear Systems Math 344 Lecture #12 2.7 Linear Systems Through a choice of bases S and T for finite dimensional vector spaces V (with dimension n) and W (with dimension m), a linear equation L(v) = w becomes the linear

More information

4. Linear Subspaces Addition and scaling

4. Linear Subspaces Addition and scaling 71 4 Linear Subspaces There are many subsets of R n which mimic R n For example, a plane L passing through the origin in R 3 actually mimics R 2 in many ways First, L contains zero vector O as R 2 does

More information

2 Systems of Linear Equations

2 Systems of Linear Equations 2 Systems of Linear Equations A system of equations of the form or is called a system of linear equations. x + 2y = 7 2x y = 4 5p 6q + r = 4 2p + 3q 5r = 7 6p q + 4r = 2 Definition. An equation involving

More information