Sept 11. Rank, Homogeneous Systems, Linear Independence

If (A B) is a linear system represented as an augmented matrix, then A is called the COEFFICIENT MATRIX and B is (usually) called the RIGHT HAND SIDE (RHS).

Definition: A linear system (A B), represented as an augmented matrix, is HOMOGENEOUS if B = O, the zero vector.

Definition: If (A B) is a linear system, then (A O) is called its ASSOCIATED HOMOGENEOUS SYSTEM.

VECTOR REPRESENTATION OF A LINEAR SYSTEM

If S is a linear system and (A B) an augmented matrix representation with C_1, C_2, ..., C_n the columns of A, then the associated VECTOR REPRESENTATION of S is the vector equation

    x_1 C_1 + x_2 C_2 + ... + x_n C_n = B

That is, S is represented by B written as a linear combination of the columns of A with variable coefficients. A solution to S is then a choice of values for the variables for which the expression is an equality of vectors.

EXAMPLE:

    (A B) =   x   y | RHS
              1   2 |  -1
              2   4 |  -2

Here the coefficient matrix is

    A :=  1  2
          2  4

and the RHS is B = (-1, -2). The associated homogeneous system is

    x   y | RHS
    1   2 |   0
    2   4 |   0

We determine the parametric solution to (A B) by calculating the REF

    x   y | RHS
    1   2 |  -1
    0   0 |   0

from which x = -1 - 2y, y = y.
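To make the REF step concrete, here is a minimal Python sketch of row reduction on this example. It is not part of the notes: the helper name `ref` and the use of exact fractions are my own choices, and the entries assume the example system is x + 2y = -1, 2x + 4y = -2.

```python
from fractions import Fraction

def ref(rows):
    """Reduce an augmented matrix (a list of rows) to a row echelon form."""
    rows = [[Fraction(x) for x in r] for r in rows]
    pivot = 0
    for col in range(len(rows[0]) - 1):        # last column is the RHS
        for r in range(pivot, len(rows)):      # find a usable pivot entry
            if rows[r][col] != 0:
                rows[pivot], rows[r] = rows[r], rows[pivot]
                for rr in range(pivot + 1, len(rows)):   # clear entries below it
                    f = rows[rr][col] / rows[pivot][col]
                    rows[rr] = [a - f * b for a, b in zip(rows[rr], rows[pivot])]
                pivot += 1
                break
    return rows

# augmented matrix of the (assumed) example: x + 2y = -1, 2x + 4y = -2
# the second row reduces to all zeros, so y is a free variable
print(ref([[1, 2, -1], [2, 4, -2]]))
```

Exact fractions are used instead of floats so that "equals zero" tests are reliable; with floating point, a tiny rounding error could make a zero row look non-zero.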
We have written the general solution as a linear combination of vectors:

    (*)   X = (-1, 0) + y(-2, 1),   with X = (x, y).

We write this as

    X = X_p + X_h,   where X_p = (-1, 0) and X_h = y(-2, 1).

The "p" is for "particular" and the "h" is for "homogeneous". The interpretation is that any solution X can be written as this one specific solution, X_p, plus some solution X_h to the associated homogeneous system. Indeed, if we solve the associated homogeneous system

    x   y | RHS
    1   2 |   0
    2   4 |   0

we get the REF

    x   y | RHS
    1   2 |   0
    0   0 |   0

from which we get x = -2y, y = y; that is, (x, y) = y(-2, 1), which is exactly the X_h = y(-2, 1) in line (*).

Thus solving a linear system can be interpreted as a 2-part process:
(a) Find a particular solution X_p. (We will see that any particular solution will work.)
(b) Solve the homogeneous system.
Then the solutions all have the form X = X_p + X_h, where X_h is some solution to the homogeneous system. These can be done in either order. Our parametric process does them simultaneously.

NOTE: HOMOGENEOUS SYSTEMS ARE ALWAYS CONSISTENT, because the zero vector O = (0, ..., 0) is always a solution. THIS IS CALLED THE TRIVIAL SOLUTION. So even if a system is inconsistent, its associated homogeneous system will always be consistent.

The system (A B) has infinitely many solutions if and only if (a) it is consistent, and (b) the associated homogeneous system (A O) has infinitely many solutions.
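The claim that X_p plus any homogeneous solution is again a solution is easy to check numerically. A small sketch (assuming the example system above; `matvec` is a hypothetical helper name, not from the notes):

```python
def matvec(A, x):
    """Multiply the matrix A (a list of rows) by the vector x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 2], [2, 4]]              # coefficient matrix of the example (assumed)
B = [-1, -2]                      # right hand side (assumed)
Xp = [-1, 0]                      # a particular solution
assert matvec(A, Xp) == B

for y in [-3, -1, 0, 2]:          # homogeneous solutions Xh = y * (-2, 1)
    Xh = [-2 * y, y]
    assert matvec(A, Xh) == [0, 0]        # Xh solves (A O)
    X = [p + h for p, h in zip(Xp, Xh)]
    assert matvec(A, X) == B              # so X = Xp + Xh solves (A B)

print("X = Xp + Xh is a solution for every choice of y")
```

The underlying reason is linearity: A(X_p + X_h) = A X_p + A X_h = B + O = B.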
The homogeneous system (A O) has infinitely many solutions if and only if there is a free variable.

Recall that if A is a matrix, then the RANK of A is the number of pivots in any REF of A. Equivalently, the rank of A is the number of non-zero rows in any REF of A. This is not by any means "obvious": different sequences of row operations could conceivably produce REFs with different numbers of pivots. We will take it as a fact for now and prove it when we have more tools.

If A is an m by n matrix (m rows and n columns), then of course the rank of A is the number of pivot variables. Since there is a variable for each column of A, the rank is at most the number of columns. Since there is a pivot in each non-zero row of the REF, and the REF and A have the same number of rows, we see that the rank of A is at most the number of rows of A. So if A is an m by n matrix, then

    0 <= RANK(A) <= min(m, n)

Since there is a free variable exactly when there is a variable that is not a pivot variable, we see:

THE HOMOGENEOUS SYSTEM (A O) HAS ONLY THE TRIVIAL SOLUTION IF AND ONLY IF THE RANK OF A IS EQUAL TO THE NUMBER OF COLUMNS OF A.

SAID ANOTHER WAY: THE HOMOGENEOUS SYSTEM (A O) HAS ONLY THE TRIVIAL SOLUTION means that if C_1, C_2, ..., C_n are the columns of A, then the only way to write

    x_1 C_1 + x_2 C_2 + ... + x_n C_n = O   (the zero vector)

is with x_1 = x_2 = ... = x_n = 0.

DEFINITION: The set of vectors { C_1, C_2, ..., C_n } is LINEARLY INDEPENDENT if the only way to write O as a linear combination of C_1, C_2, ..., C_n is with all coefficients equal to 0.

Thus: THE COLUMNS OF THE MATRIX A ARE LINEARLY INDEPENDENT IF AND ONLY IF THE RANK OF A IS EQUAL TO THE NUMBER OF COLUMNS OF A.

Given any set of n vectors { C_1, C_2, ..., C_n } in R^m, we can always make them the columns of a matrix and calculate its rank. Thus we have a simple way to determine if a set of vectors is linearly independent:
(a) assemble the vectors into a matrix A
(b) calculate a REF of A
Then: the vectors are linearly independent if and only if the rank of A is n.
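The two-step test above, (a) assemble the vectors into a matrix and (b) compute a REF and count pivots, can be sketched in Python. The helper names `rank` and `independent` are my own, and exact rational arithmetic is used so the pivot count is not spoiled by rounding:

```python
from fractions import Fraction

def rank(rows):
    """Number of pivots found while reducing to a row echelon form."""
    rows = [[Fraction(x) for x in r] for r in rows]
    pivots = 0
    for col in range(len(rows[0])):
        for r in range(pivots, len(rows)):
            if rows[r][col] != 0:
                rows[pivots], rows[r] = rows[r], rows[pivots]
                for rr in range(pivots + 1, len(rows)):
                    f = rows[rr][col] / rows[pivots][col]
                    rows[rr] = [a - f * b for a, b in zip(rows[rr], rows[pivots])]
                pivots += 1
                break
    return pivots

def independent(vectors):
    """(a) make the vectors the columns of a matrix, (b) compare rank with n."""
    A = [list(entries) for entries in zip(*vectors)]   # i-th vector -> i-th column
    return rank(A) == len(vectors)

print(independent([(1, 0), (0, 1)]))   # True: rank 2 equals the number of columns
print(independent([(1, 2), (2, 4)]))   # False: rank 1 < 2, so the set is dependent
```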
If the rank is less than n, then there are non-trivial ways to write O as a linear combination of the { C_1, C_2, ..., C_n }. These are just the non-trivial solutions to the homogeneous system (A O).

Example: The two vectors C_1 = (1, 2) and C_2 = (2, 4) are linearly dependent, since

    (**)   2 C_1 + (-1) C_2 = O

What are all of the ways to write O as a linear combination of C_1 and C_2?

Solution: The pairs (x, y) such that x C_1 + y C_2 = O are the solutions to the homogeneous system (A O), where

    A :=  1  2
          2  4

We found above that the parametric solution to this system is (x, y) = y(-2, 1). So every way to write O as a combination of C_1 and C_2 has the form

    (-2y) C_1 + y C_2 = O

where y is a real number. Note that y = 0 provides the trivial solution, and y = -1 gives the solution (**).

Example: Consider the matrix

    A :=  1-0  4  1
          -    0  1  0
          1-4  0  4
          4-1  -  0  7  1

with REF

    M =   1-0  4  1
          0  0-8 -1
          0  0  0  0  1
          0  0  0  0  0

(M has three non-zero rows, with its pivots in the first, third and fifth columns, and a zero fourth row.)

QUESTIONS:

1. What is the rank of A? Are the columns of A an independent set? Why or why not?

Answer: The rank is 3. The columns are not independent: the condition for independence is that rank(A) equals the number of columns of A, and here the rank is smaller.

2. What is the largest number of columns of A that could be independent? Why?

Answer: 3. Given any subset of the columns of A, if they form the columns of a matrix, then the row operations that produced M would reduce that matrix to the one consisting of the corresponding columns of M. That matrix would have at most three non-zero rows, so it would have at most rank 3.
3. If C_i is the i-th column of A, which of the following are linearly independent?

(a) { C_1, C_ }   (b) { C_ , C_4, C_6 }   (c) { C_ , C_4 }   (d) { C_ , C_ }   (e) { C_1 }   (f) { C_ }

ANS: (a), (b), (c), (e). In each case the matrix whose columns are the given vectors has rank equal to the number of columns.

4. The matrix M was produced from A by the sequence of row operations

    R2 -> R2 - *R1,  R3 -> R3 - *R1,  R4 -> R4 + R1,  R3 -> R3 - R2,  R4 -> R4 + *R2,  R4 <-> R3

The sequence converting M back into A is the sequence of inverses of these operations, in the reverse order. That is,

    R4 <-> R3,  R4 -> R4 - *R2,  R3 -> R3 + R2,  R4 -> R4 - R1,  R3 -> R3 + *R1,  R2 -> R2 + *R1

Use these to modify B in the linear system (A B), where A is the matrix above with variables x, y, z, w, t, to a vector C such that (A C) is inconsistent.

Solution: This is just the previous matrix, whose REF M is shown above. If the REF had instead had the last row

    0  0  0  0  0 | -1

(a zero row with a non-zero RHS), then the system would have been inconsistent. Thus we want to replace the vector B
with a vector C which the given sequence of elementary steps would take to D, the RHS column of that modified REF (last entry -1). To do this we apply the reverse sequence to D, one operation at a time:

    D  --R4<->R3-->  --R4->R4-*R2-->  --R3->R3+R2-->  --R4->R4-R1-->  --R3->R3+*R1-->  --R2->R2+*R1-->  C

The resulting vector C is the desired right hand side: row-reducing (A C) with the original sequence of operations reproduces the REF above except with last row (0 0 0 0 0 | -1), so (A C) is inconsistent.
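The backwards-operation idea in question 4 can be sketched on the small 2 by 2 example from earlier (this is my illustration on the assumed matrix A = [[1, 2], [2, 4]], not the notes' 4 by 5 computation): choose a target REF with a zero row but a non-zero right hand side, then run the inverse row operation to recover an inconsistent RHS C.

```python
# A = [[1, 2], [2, 4]] reduces to REF by the single operation R2 -> R2 - 2*R1,
# so its inverse R2 -> R2 + 2*R1 carries any chosen REF back to an augmented system.

# choose a target REF whose last row reads 0 = 1, i.e. an inconsistent system
target = [[1, 2, -1],
          [0, 0,  1]]

# apply the inverse operation R2 -> R2 + 2*R1 to recover the augmented (A C)
aug = [row[:] for row in target]
aug[1] = [a + 2 * b for a, b in zip(aug[1], aug[0])]

A = [row[:2] for row in aug]
C = [row[2] for row in aug]
print("A =", A)        # the original coefficient matrix [[1, 2], [2, 4]]
print("C =", C)        # C = [-1, -1], a right hand side making (A C) inconsistent

# sanity check: the forward operation R2 -> R2 - 2*R1 reproduces the target REF
fwd = [aug[0][:], [a - 2 * b for a, b in zip(aug[1], aug[0])]]
assert fwd == target
```

This works because elementary row operations are invertible: any REF with the right pivot structure can be carried back to an augmented system that reduces to it.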