Sections 1.5, 1.7 Ma 322 Fall 2013

Ma 322 Sept. 9-13 Summary
- Solutions of homogeneous equations AX = 0.
- Using the rank.
- Parametric solution of AX = B.
- Linear dependence and independence of vectors in R^n.
- Using REF and RREF as convenient.
The Homogeneous Equation AX = 0.
- A linear system (A | 0) is said to be homogeneous when the RHS entries are all 0. In this case, the REF of M = (A | 0) can be seen to be M' = (A' | 0), i.e. the 0 column can be omitted through the reduction process, if convenient.
- Clearly, rank(M) = rank(A), so a homogeneous system is always consistent. Indeed, it is also clear that X = 0 is a solution to AX = 0 and hence consistency is directly obvious!
- Let the common rank be r. Then there are exactly r pivot variables and n - r free variables.
- The final solution will consist of solving for the pivot variables in terms of the free variables and reporting the conclusion.
- Here is a worked example.

An Example.
- Consider a system in REF:

      x   y   z   w   t | RHS
      2  -6   0  -4   6 |   0
      0   0   1  -3   2 |   0
      0   0   0   0   7 |   0
      0   0   0   0   0 |   0

- Identify the pc list as (1, 3, 5), with no pivot in the fourth row; the pivot variables are x, z, t and the free variables are y, w.
- The fourth equation is ignored. The third gives t = 0, the second gives z = 3w - 2t = 3w, and the first gives x = 3y + 2w - 3t = 3y + 2w.
Reporting the Solution.
- The above solution is best reported as a vector:

      X = (x, y, z, w, t)^T = (3y + 2w, y, 3w, w, 0)^T = y v_1 + w v_2,

  where v_1 = (3, 1, 0, 0, 0)^T and v_2 = (2, 0, 3, 1, 0)^T.

Conclusion.
- Thus the solution is seen to be a member of Span{v_1, v_2}.
- Denoting a general member of the span as v_h, we write X = v_h. It is important to remember that v_h stands for any one of an infinite collection of vectors and should not be confused with a specific single vector or with the whole span!
- In general, the solution of AX = 0 is always a member of the span of n - r vectors, where n is the number of variables and r = rank(A).
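The homogeneous example above can be double-checked with a computer algebra system. Here is a minimal sketch using Python's sympy (the matrix entries are taken from the REF displayed above; `nullspace` computes exactly the basis vectors v_1, v_2 of the solution span):

```python
import sympy as sp

# Coefficient matrix of the homogeneous example above
# (REF form, variables x, y, z, w, t).
A = sp.Matrix([
    [2, -6, 0, -4, 6],
    [0,  0, 1, -3, 2],
    [0,  0, 0,  0, 7],
    [0,  0, 0,  0, 0],
])

print(A.rank())          # r = 3, so n - r = 5 - 3 = 2 free variables
for v in A.nullspace():  # a basis of the solution set of AX = 0
    print(list(v))       # [3, 1, 0, 0, 0] and [2, 0, 3, 1, 0]
```

sympy builds the basis the same way the notes do: each free variable (here y, then w) is set to 1 in turn while the others are set to 0.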
The Non-homogeneous Case.
- More generally, if we put the augmented matrix M = (A | B) of a non-homogeneous system into REF, say M' = (A' | B'), then we need to verify the consistency condition first.
- If the condition fails, then there is no solution.
- If the condition holds, then the solution process and reporting are just as above, except the final answer is a fixed vector plus a span of (n - r) vectors as before. Here is an example.

A Non-homogeneous Example.
- Consider a system in REF:

      x   y   z   w   t | RHS
      2  -6   0  -4   6 |  18
      0   0   1  -3   2 |   5
      0   0   0   0   7 |  35
      0   0   0   0   0 |   0

- Identify the pc list as (1, 3, 5), with no pivot in the fourth row; the pivot variables are x, z, t and the free variables are y, w.
- The fourth equation is ignored. The third gives t = 5, the second gives z = 3w - 2t + 5 = 3w - 10 + 5 = 3w - 5, and the first gives x = 3y + 2w - 3t + 9 = 3y + 2w - 15 + 9 = 3y + 2w - 6.
Reporting the Full Solution.
- The above solution is best reported as a vector:

      X = (x, y, z, w, t)^T = (3y + 2w - 6, y, 3w - 5, w, 5)^T = v_p + y v_1 + w v_2,

  where v_p = (-6, 0, -5, 0, 5)^T, v_1 = (3, 1, 0, 0, 0)^T and v_2 = (2, 0, 3, 1, 0)^T.

Conclusion.
- Thus the solution is seen to be v_p plus a member of Span{v_1, v_2}.
- Denoting a general member of the span as v_h, we write X = v_p + v_h. It is important to remember that v_h stands for any one of an infinite collection of vectors and should not be confused with a specific single vector or with the whole span!
- In general, the solution of a consistent system AX = B is always X = v_p + v_h, where v_p is one particular solution and v_h is a general member of a span of n - r vectors, with n the number of variables and r = rank(A).
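The same kind of check works for the non-homogeneous example; here is a sketch using sympy's `linsolve` on the augmented matrix above. The particular solution v_p is recovered by setting the free symbols to 0, and v_1, v_2 are the coefficients of y and w in the general solution:

```python
import sympy as sp

x, y, z, w, t = sp.symbols('x y z w t')

# Augmented matrix (A | B) of the non-homogeneous example above.
M = sp.Matrix([
    [2, -6, 0, -4, 6, 18],
    [0,  0, 1, -3, 2,  5],
    [0,  0, 0,  0, 7, 35],
    [0,  0, 0,  0, 0,  0],
])

# General solution with y, w left free:
# x = 3y + 2w - 6, z = 3w - 5, t = 5.
(sol,) = sp.linsolve(M, x, y, z, w, t)

vp = [e.subs({y: 0, w: 0}) for e in sol]   # particular solution v_p
v1 = [sp.diff(e, y) for e in sol]          # coefficient of y: v_1
v2 = [sp.diff(e, w) for e in sol]          # coefficient of w: v_2
print(vp)   # [-6, 0, -5, 0, 5]
print(v1)   # [3, 1, 0, 0, 0]
print(v2)   # [2, 0, 3, 1, 0]
```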
Trivial Solutions.
- If we consider a homogeneous equation AX = 0 and C_1, C_2, ..., C_n are the n columns of A, then we are solving the equation: x_1 C_1 + x_2 C_2 + ... + x_n C_n = 0.
- As already observed, this system is consistent and always has the solution x_1 = x_2 = ... = x_n = 0, or briefly X = 0.

Def. 1: Trivial Solution. The solution X = 0 is said to be the trivial solution of the system AX = 0.
- As already observed, the solution of AX = 0 is X = v_h, where v_h is a general member of a span of n - r vectors, where r = rank(A).

Linear Dependence.
- Def. 2: Linearly Dependent Vectors. The columns C_1, C_2, ..., C_n of A are said to be linearly dependent if the system AX = 0 has a non-trivial solution, or equivalently n - r > 0, i.e. the rank of A is less than its number of columns.
- For future use, we restate this definition more generally thus:

Def. 2 (general): Linearly Dependent Vectors. Any set S of vectors is said to be linearly dependent if there is a positive integer n such that n distinct vectors v_1, v_2, ..., v_n of S satisfy c_1 v_1 + c_2 v_2 + ... + c_n v_n = 0 where at least one of the c_i is non-zero.
- Note that this makes sense even for an infinite set S.
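The connection between non-trivial solutions and dependence can be seen concretely in the homogeneous example worked earlier: the non-trivial solution X = (3, 1, 0, 0, 0)^T is exactly a dependence relation 3 C_1 + 1 C_2 = 0 among the columns. A small sympy sketch:

```python
import sympy as sp

# Coefficient matrix of the earlier homogeneous example.
A = sp.Matrix([
    [2, -6, 0, -4, 6],
    [0,  0, 1, -3, 2],
    [0,  0, 0,  0, 7],
    [0,  0, 0,  0, 0],
])
X = sp.Matrix([3, 1, 0, 0, 0])  # a non-trivial solution of AX = 0

print(A * X)              # the zero vector, i.e. 3*C_1 + 1*C_2 = 0
print(A.rank() < A.cols)  # True: rank 3 < 5 columns, so the columns are dependent
```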
Linear Independence.
- Def. 3: Linearly Independent Vectors. A set S of vectors is said to be linearly independent if it is not linearly dependent!
- Def. 4: Notations rownum, colnum. For any matrix A, we shall define colnum(A) to be the number of columns of A and rownum(A) to be the number of rows of A.
- Though logically clear, linear independence can be difficult to verify without better tools. We describe such a tool next.
- Convention: We shall often drop the word "linearly" from the terms linearly dependent and linearly independent.

Test of Linear Dependence/Independence.
- For a finite set of vectors in R^m, there is a simple criterion for dependence/independence.
- Given vectors v_1, v_2, ..., v_n in R^m, make a matrix A by taking these as columns and find its rank (by using REF, for example).
- Suppose rank(A) = r. It is obvious that r <= n = colnum(A). Then we have: v_1, v_2, ..., v_n are linearly dependent iff r < n, and thus they are linearly independent iff r = n.
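The rank test is easy to mechanize. Here is a sketch (the helper name `are_independent` is ours, not from the notes):

```python
import sympy as sp

def are_independent(vectors):
    """Rank test: vectors in R^m are independent iff the matrix
    having them as columns has rank equal to their number."""
    A = sp.Matrix.hstack(*[sp.Matrix(v) for v in vectors])
    return A.rank() == len(vectors)

print(are_independent([[1, 0, 2], [0, 1, 3]]))             # True: r = n = 2
print(are_independent([[1, 2, 3], [2, 4, 6], [0, 1, 1]]))  # False: second = 2 * first
```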
A Proof of a Better Criterion.
- Given vectors v_1, v_2, ..., v_r in R^m, here is a more general criterion for their independence.
- Make a matrix A using v_1, v_2, ..., v_r as columns and assume that A has r different rows with different (finite) pivot columns. Then v_1, v_2, ..., v_r are linearly independent.
- Proof: Observe that each of the r columns must have a pivot in it, and so each of 1, 2, ..., r appears in the pc list.
- Consider the equation x_1 v_1 + x_2 v_2 + ... + x_r v_r = 0. If we consider the row which has its pivot in column r, then clearly its equation gives x_r = 0.
- Now if we consider the row which has its pivot in column r - 1, then its equation involves x_{r-1} and may also involve x_r. Since x_r is already known to be zero, we get x_{r-1} = 0.
- Proceeding thus, we get x_1 = x_2 = ... = x_r = 0, i.e. the columns are linearly independent.

Using REF, RREF.
- We have illustrated how to make and use REF, the row echelon form.
- We defined RREF, but did not work much with it. Roughly, it needs twice as much work as REF, and hence we avoided using it when it was not really needed.
- It is, however, needed for some of the later work, and we include an illustration of the method to get that form.
- Unlike REF, the form RREF is well defined (unique), and thus it has theoretical merit.
- Typically, once RREF is achieved, the act of solving the equations becomes simply writing down the answers.
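The criterion in the proof can also be illustrated numerically. In this sketch (our own example matrix, not from the notes), the three rows have pivots in the distinct columns 1, 2, 3, so back substitution as in the proof forces the trivial solution and the columns are independent:

```python
import sympy as sp

# Three rows with pivots in distinct columns 1, 2, 3 (an REF pattern).
A = sp.Matrix([
    [5, 4, 7],
    [0, 2, 1],
    [0, 0, 3],
])

print(A.nullspace())  # []  -- AX = 0 has only the trivial solution
print(A.rank())       # 3 = number of columns, so the columns are independent
```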
An Example of RREF.
- Problem: Consider the following augmented matrix (already in REF) and convert it to RREF. Use it to write out the solution of the associated linear system.

      x   y   z | RHS
      1   1   3 |  12
      0   1   6 |  20
      0   0   9 |  27

- Notice that the pc list is (1, 2, 3) and the respective pivot entries are 1, 1, 9.
- The process is to start with the last pivot, make all entries above it equal to zero, and make the pivot itself 1. Then repeat with the earlier pivots.

RREF Example Concluded.
- Thus, the operations R_1 - (3/9) R_3, R_2 - (6/9) R_3, (1/9) R_3 give:

      x   y   z | RHS          x   y   z | RHS
      1   1   3 |  12          1   1   0 |   3
      0   1   6 |  20   -->    0   1   0 |   2
      0   0   9 |  27          0   0   1 |   3

- Then the operation R_1 - R_2 finishes off the RREF:

      x   y   z | RHS
      1   0   0 |   1
      0   1   0 |   2
      0   0   1 |   3

- The answer to the system can now simply be read off: x = 1, y = 2, z = 3.
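The RREF computation above can be confirmed with sympy's `rref` method (note that sympy reports pivot columns with 0-based indices, so its (0, 1, 2) matches the pc list (1, 2, 3) of the notes):

```python
import sympy as sp

# Augmented matrix of the RREF example, already in REF.
M = sp.Matrix([
    [1, 1, 3, 12],
    [0, 1, 6, 20],
    [0, 0, 9, 27],
])

R, pivots = M.rref()
print(R)       # rows (1,0,0,1), (0,1,0,2), (0,0,1,3): x = 1, y = 2, z = 3
print(pivots)  # (0, 1, 2) -- sympy's 0-based pc list
```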