Solution to Homework 8, Math 568

S 5.4: No. 10. Use the property in Theorem 5 to test the following set of cubic polynomials for linear independence in P3: S = { x^3 - x^2, x^2 - x, x - 1, x^3 + 1 }.

Solution: Relative to the standard basis B = {1, x, x^2, x^3} of P3, the set of coordinate vectors in R^4 is

T = { [x^3 - x^2]_B, [x^2 - x]_B, [x - 1]_B, [x^3 + 1]_B }
  = { [0, 0, -1, 1], [0, -1, 1, 0], [-1, 1, 0, 0], [1, 0, 0, 1] }.

By Theorem 5, S is linearly independent in P3 exactly when T is linearly independent in R^4. So we set

c1 [0, 0, -1, 1] + c2 [0, -1, 1, 0] + c3 [-1, 1, 0, 0] + c4 [1, 0, 0, 1] = [0, 0, 0, 0]

and ask whether (c1, c2, c3, c4) = (0, 0, 0, 0) is the only solution. The augmented matrix of this linear system for the unknowns (c1, c2, c3, c4) row reduces as shown:

[ 0  0 -1  1 | 0 ]             [ -1  1  0  0 | 0 ]           [ -1  1  0  0 | 0 ]
[ 0 -1  1  0 | 0 ]  R1 <-> R3  [  0 -1  1  0 | 0 ]  R4 + R1  [  0 -1  1  0 | 0 ]
[ -1 1  0  0 | 0 ]  ---------> [  0  0 -1  1 | 0 ]  -------> [  0  0 -1  1 | 0 ]
[ 1  0  0  1 | 0 ]             [  1  0  0  1 | 0 ]           [  0  1  0  1 | 0 ]

               [ -1  1  0  0 | 0 ]
R4 + R2, then  [  0 -1  1  0 | 0 ]
R4 + R3        [  0  0 -1  1 | 0 ]
------------>  [  0  0  0  2 | 0 ]

Even without any further row reduction, it is clear that every column of the coefficient part has a pivot, so c1 = c2 = c3 = c4 = 0. Thus the set S is linearly independent in P3.

S 5.4: No. 14. Let V be the vector space of all 2x2 matrices and let S = {A1, A2, A3, A4} be as given below. Find a basis for Sp S.

A1 = [ 1  2 ]   A2 = [ 2 -1 ]   A3 = [ 1  2 ]   A4 = [ 2  1 ]
     [ 1  3 ],       [ 2  1 ],       [ 1 -3 ],       [ 2  0 ]

Solution: Choose the standard basis of M(2,2), the space of all 2x2 matrices: B = {E11, E12, E21, E22}, where Eij is the matrix whose entries are all zero except for a 1 in the (i, j)-th position. The coordinate vectors in this basis are then

T = { [A1]_B, [A2]_B, [A3]_B, [A4]_B } = { [1, 2, 1, 3], [2, -1, 2, 1], [1, 2, 1, -3], [2, 1, 2, 0] }.
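As an aside, coordinate-vector computations like the one in No. 10 are easy to double-check numerically. The sketch below (Python with numpy; it is an illustration, not part of the homework) stacks the four coordinate vectors from No. 10 as the rows of a matrix and confirms that the matrix has full rank, which is equivalent to linear independence.

```python
import numpy as np

# Coordinate vectors from No. 10, relative to the basis {1, x, x^2, x^3}.
# Rank 4 means the four cubic polynomials are linearly independent.
T = np.array([
    [0, 0, -1, 1],   # [x^3 - x^2]_B
    [0, -1, 1, 0],   # [x^2 - x]_B
    [-1, 1, 0, 0],   # [x - 1]_B
    [1, 0, 0, 1],    # [x^3 + 1]_B
])

rank = np.linalg.matrix_rank(T)
print(rank)  # 4
```

A rank below 4 would instead mean the set is dependent, and any nonzero null vector of T would give the dependence relation among the polynomials.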
Writing the matrix C whose rows are the coordinate vectors in T, we row reduce in the way shown:

C = [ 1  2  1  3 ]  R2 - 2R1  [ 1  2  1  3 ]  -(1/5)R2   [ 1  2  1  3 ]
    [ 2 -1  2  1 ]  R3 - R1   [ 0 -5  0 -5 ]  -(1/6)R3   [ 0  1  0  1 ]
    [ 1  2  1 -3 ]  R4 - 2R1  [ 0  0  0 -6 ]  R4 + 3R2   [ 0  0  0  1 ]
    [ 2  1  2  0 ]  --------> [ 0 -3  0 -6 ]  ---------> [ 0  0  0 -3 ]

              [ 1  2  1  3 ]  R2 - R3    [ 1  0  1  0 ]
R4 + 3R3      [ 0  1  0  1 ]  R1 - 3R3   [ 0  1  0  0 ]
------------> [ 0  0  0  1 ]  R1 - 2R2   [ 0  0  0  1 ]
              [ 0  0  0  0 ]  ---------> [ 0  0  0  0 ]

It follows that { [1, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1] } is a basis for the row space of C, and hence a basis for Sp T. Translating back to matrices, a basis for Sp S is

B0 = { E11 + E21, E12, E22 } = { [ 1 0 ]  [ 0 1 ]  [ 0 0 ] }
                               { [ 1 0 ], [ 0 0 ], [ 0 1 ] }.

S 5.4: No. 16. In P2, let Q = {p1(x), p2(x), p3(x)}, where p1(x) = 1 + x + x^2, p2(x) = x + 3x^2, p3(x) = 1 + 2x + 8x^2. Use the basis B = {1, x, x^2} to show that Q is a basis for P2.

Solution: Relative to the standard basis B of P2, the coordinate vectors of Q form the set

T = { [p1]_B, [p2]_B, [p3]_B } = { [1, 1, 1], [0, 1, 3], [1, 2, 8] }.

To determine whether Q is a basis for P2 it is enough to show that T is a basis for R^3. Since T consists of three vectors in R^3, it suffices to show that T is linearly independent. Setting c1 [1, 1, 1] + c2 [0, 1, 3] + c3 [1, 2, 8] = [0, 0, 0] gives a linear system for c1, c2, c3 whose augmented matrix row reduces as shown below:

[ 1 0 1 | 0 ]  R2 - R1  [ 1 0 1 | 0 ]  R3 - 3R2  [ 1 0 1 | 0 ]  (1/4)R3, then   [ 1 0 0 | 0 ]
[ 1 1 2 | 0 ]  R3 - R1  [ 0 1 1 | 0 ]  --------> [ 0 1 1 | 0 ]  R2 - R3, R1-R3  [ 0 1 0 | 0 ]
[ 1 3 8 | 0 ]  -------> [ 0 3 7 | 0 ]            [ 0 0 4 | 0 ]  --------------> [ 0 0 1 | 0 ]

implying c1 = c2 = c3 = 0. Therefore T is linearly independent and the set Q is a basis for P2.
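Both row reductions above can be verified mechanically with sympy's rref. The sketch below (not part of the homework; matrix entries as worked out above) reproduces the reduced forms for No. 14 and No. 16.

```python
from sympy import Matrix

# No. 14: rows are the coordinate vectors [A1]_B, ..., [A4]_B.
C = Matrix([
    [1, 2, 1, 3],
    [2, -1, 2, 1],
    [1, 2, 1, -3],
    [2, 1, 2, 0],
])
C_rref, C_pivots = C.rref()
print(C_rref)   # nonzero rows [1,0,1,0], [0,1,0,0], [0,0,0,1]

# No. 16: rows of the system matrix, columns are [p1]_B, [p2]_B, [p3]_B.
P = Matrix([
    [1, 0, 1],
    [1, 1, 2],
    [1, 3, 8],
])
P_rref, P_pivots = P.rref()
print(P_rref)   # the 3x3 identity, so Q is linearly independent
```

The pivot tuple returned alongside each reduced matrix also identifies which original rows/columns carry the pivots, which is convenient when picking out a basis from a spanning set.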
S 5.4: No. 34. Show that B = { e^x, e^{2x}, e^{3x}, e^{4x} } is a basis for V, where V is the set of functions f of the form f(x) = a e^x + b e^{2x} + c e^{3x} + d e^{4x} for real numbers a, b, c, d.

Solution: We note that, by the definition of V, every f in V is a linear combination of elements of B, so Sp B = V. Hence we only need to show that B is linearly independent. We set

c1 e^x + c2 e^{2x} + c3 e^{3x} + c4 e^{4x} = 0   for all x,

which on repeated differentiation gives, for all x,

c1 e^x + 2 c2 e^{2x} + 3 c3 e^{3x} + 4 c4 e^{4x} = 0,
c1 e^x + 4 c2 e^{2x} + 9 c3 e^{3x} + 16 c4 e^{4x} = 0,
c1 e^x + 8 c2 e^{2x} + 27 c3 e^{3x} + 64 c4 e^{4x} = 0.

By setting x = 0 in the above we obtain a set of linear equations for (c1, c2, c3, c4) whose augmented matrix row reduces as follows:

[ 1 1  1  1 | 0 ]  R2 - R1  [ 1 1  1  1 | 0 ]  R3 - 3R2  [ 1 1  1  1 | 0 ]
[ 1 2  3  4 | 0 ]  R3 - R1  [ 0 1  2  3 | 0 ]  R4 - 7R2  [ 0 1  2  3 | 0 ]
[ 1 4  9 16 | 0 ]  R4 - R1  [ 0 3  8 15 | 0 ]  --------> [ 0 0  2  6 | 0 ]
[ 1 8 27 64 | 0 ]  -------> [ 0 7 26 63 | 0 ]            [ 0 0 12 42 | 0 ]

(1/2)R3       [ 1 1 1 1 | 0 ]  (1/6)R4, then     [ 1 0 0 0 | 0 ]
R4 - 12R3     [ 0 1 2 3 | 0 ]  back substitution [ 0 1 0 0 | 0 ]
------------> [ 0 0 1 3 | 0 ]  ---------------->  [ 0 0 1 0 | 0 ]
              [ 0 0 0 6 | 0 ]                     [ 0 0 0 1 | 0 ]

which implies (c1, c2, c3, c4) = (0, 0, 0, 0). So the functions in B are linearly independent, and B is a basis for V.

S 5.4: No. 36. Prove that if Q = {v1, v2, v3, ..., vm} is a linearly independent set of vectors in a vector space V, and if w is a vector of V such that w is not in Sp Q, then the set Q' = {v1, v2, v3, ..., vm, w} is a linearly independent set in V.

Solution: Assume to the contrary that Q' is dependent, which would imply that the equation

c1 v1 + c2 v2 + ... + cm vm + c_{m+1} w = 0        (1)

has a nonzero solution (c1, c2, c3, ..., cm, c_{m+1}). We claim that the last component c_{m+1} of this nonzero solution is nonzero: if it were zero, then (1) with a nonzero (c1, ..., cm) would imply linear dependence of {v1, v2, ..., vm}, which is not the case. Therefore c_{m+1} is nonzero, and (1) gives

w = -(1/c_{m+1}) (c1 v1 + c2 v2 + ... + cm vm),
so that w is in Sp Q, which is not the case. This contradicts the assumption that Q' is dependent; therefore Q' must be independent.

S 5.4: No. 38. The result of the previous exercise (No. 37, S 5.4) is that a set S = {v1, v2, ..., vn} is linearly dependent if and only if at least one of the vectors vj can be expressed as a linear combination of the remaining vectors. Use this result to obtain a necessary and sufficient condition for a set {u, v} of two vectors to be linearly dependent. Then determine by inspection whether the following sets are linearly dependent or independent.

Solution: When S contains only two vectors, "one vector is a linear combination of the remaining vectors" simply says that one vector is a scalar multiple of the other. Therefore {u, v} is linearly dependent if and only if one of the two vectors is a scalar multiple of the other.

a. { 1 + x, x^2 }. Clearly x^2 is not a scalar multiple of 1 + x, since the scalar cannot depend on x. Thus the set is independent.

b. { x, e^x }. Once again, e^x is not a constant multiple of x for all x, and so the set is independent.

c. { x, 3x }. The second function is three times the first, so this set of two functions is linearly dependent.

d. { [ 1  2 ]  [ -2  -4 ] }
   { [ 3  6 ], [ -6 -12 ] }

We notice that the second matrix is (-2) times the first, so the set is linearly dependent.

e. { [ 0 0 ]  [ 1 0 ] }
   { [ 0 0 ], [ 0 2 ] }

Once again the set is linearly dependent, since the first matrix is a scalar multiple (namely, zero times) of the second. Note also that any set containing the zero element of a vector space is always linearly dependent.

S 3.6: No. 4. Verify that S = {u1, u2, u3} is an orthogonal set, where

u1 = [1, 2, 2], u2 = [2, 1, -2], u3 = [-2, 2, -1].

Solution: We check

u1 . u2 = (1)(2) + (2)(1) + (2)(-2) = 0,
u3 . u1 = (-2)(1) + (2)(2) + (-1)(2) = 0,
u2 . u3 = (2)(-2) + (1)(2) + (-2)(-1) = 0.
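These pairwise dot products can be confirmed with a short numerical check (vector entries as above; an illustrative sketch, not part of the homework):

```python
import numpy as np

u1, u2, u3 = np.array([1, 2, 2]), np.array([2, 1, -2]), np.array([-2, 2, -1])
# All three pairwise scalar products must vanish for an orthogonal set.
dots = [int(u1 @ u2), int(u3 @ u1), int(u2 @ u3)]
print(dots)  # [0, 0, 0]
```
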
It follows that the set S above is an orthogonal set in R^3.

S 3.6: No. 6. Find values of a, b, c such that {u1, u2, u3}, defined as below, is an orthogonal set:

u1 = [2, 0, 1], u2 = [1, 1, -2], u3 = [a, b, c].

Solution: We first check the orthogonality of u1, u2: u1 . u2 = (2)(1) + (0)(1) + (1)(-2) = 0. Now we require 0 = u1 . u3 = (2)(a) + (0)(b) + (1)(c) = 2a + c, implying c = -2a. Next we require 0 = u2 . u3 = (1)(a) + (1)(b) + (-2)(c) = a + b - 2c, implying b = 2c - a = -5a. Therefore u3 = [a, b, c] = a [1, -5, -2] for an arbitrary scalar a.

S 3.6: No. 10. Express v = [0, 1, 2] in terms of the orthogonal basis B = {u1, u2, u3}, where

u1 = [1, 1, 1], u2 = [-1, 0, 1], u3 = [-1, 2, -1].

Solution: We know the representation

v = (v . u1 / u1 . u1) u1 + (v . u2 / u2 . u2) u2 + (v . u3 / u3 . u3) u3.        (2)

So we calculate the scalars v . u1 = (0)(1) + (1)(1) + (2)(1) = 3, v . u2 = (0)(-1) + (1)(0) + (2)(1) = 2, v . u3 = (0)(-1) + (1)(2) + (2)(-1) = 0, and the squared lengths u1 . u1 = 1 + 1 + 1 = 3, u2 . u2 = (-1)^2 + 0^2 + 1^2 = 2, u3 . u3 = (-1)^2 + 2^2 + (-1)^2 = 6. Therefore equation (2) becomes

v = (3/3) u1 + (2/2) u2 + 0 u3 = u1 + u2,

as may be double-checked from the expressions for u1, u2, and v.

S 3.6: No. 11. Same question as No. 10, except that v = [1, 2, 1].

Solution: Once again (2) is valid, except that we now have to calculate the coefficients anew. So we calculate the scalars v . u1 = (1)(1) + (2)(1) + (1)(1) = 4, v . u2 = (1)(-1) + (2)(0) + (1)(1) = 0, v . u3 = (1)(-1) + (2)(2) + (1)(-1) = 2, and the squared lengths are the same as in the last exercise: u1 . u1 = 3, u2 . u2 = 2, u3 . u3 = 6. Therefore equation (2) becomes

v = (4/3) u1 + 0 u2 + (2/6) u3 = (4/3) u1 + (1/3) u3,

as may be double-checked from the expressions for u1, u3, and v: [1, 2, 1] = (4/3)[1, 1, 1] + (1/3)[-1, 2, -1].

S 3.6: No. 12. Let S = {u1, u2, u3} be an orthogonal set of nonzero vectors in R^3. Define the 3x3 matrix A by A = [u1, u2, u3], i.e. the column vectors of A are u1, u2, u3. Show that A^T A is diagonal.
Solution: Using the properties of matrix multiplication, note that the i-th row of A^T is the row vector u_i^T and the j-th column of A is u_j, so the (i, j)-th element of A^T A must be the scalar product u_i . u_j, which can be nonzero only for i = j; this yields a diagonal matrix. Thus, for the 3x3 matrices involved, we have

        [ u1^T ]                [ u1.u1  u1.u2  u1.u3 ]   [ u1.u1    0      0   ]
A^T A = [ u2^T ] (u1 u2 u3)  =  [ u2.u1  u2.u2  u2.u3 ] = [   0    u2.u2    0   ]
        [ u3^T ]                [ u3.u1  u3.u2  u3.u3 ]   [   0      0    u3.u3 ],

which is a diagonal matrix.

S 3.6: No. 14. Use the Gram-Schmidt process to generate an orthogonal set from the given linearly independent vectors {w1, w2, w3}, where

w1 = [1, 0, 1, 2], w2 = [2, 1, 0, 2], w3 = [1, -1, 0, 1].

Solution: We take u1 = w1 = [1, 0, 1, 2]. We note that w2 . u1 = [2, 1, 0, 2] . [1, 0, 1, 2] = (2)(1) + (1)(0) + (0)(1) + (2)(2) = 6 and u1 . u1 = 1 + 0 + 1 + 4 = 6. Hence

u2 = w2 - (w2 . u1 / u1 . u1) u1 = w2 - (6/6) u1 = [2, 1, 0, 2] - [1, 0, 1, 2] = [1, 1, -1, 0].

We can double-check against algebraic mistakes by noting that u2 . u1 = (1)(1) + (1)(0) + (-1)(1) + (0)(2) = 0. We also have u2 . u2 = 1^2 + 1^2 + (-1)^2 + 0^2 = 3, and w3 . u1 = [1, -1, 0, 1] . [1, 0, 1, 2] = (1)(1) + (-1)(0) + (0)(1) + (1)(2) = 3, and w3 . u2 = [1, -1, 0, 1] . [1, 1, -1, 0] = (1)(1) + (-1)(1) + (0)(-1) + (1)(0) = 0, and so

u3 = w3 - (w3 . u1 / u1 . u1) u1 - (w3 . u2 / u2 . u2) u2 = [1, -1, 0, 1] - (3/6) [1, 0, 1, 2] = [1/2, -1, -1/2, 0].

We can double-check against algebraic mistakes by noting that u3 . u1 = 0 and u3 . u2 = 0, as readily verified.

S 3.6: No. 16. Same question as No. 14 above, with

w1 = [0, 1, 2], w2 = [3, 6, 2], w3 = [0, -5, 5].

Solution: Once again we take u1 = w1 = [0, 1, 2]. We note that w2 . u1 = [3, 6, 2] . [0, 1, 2] = (3)(0) + (6)(1) + (2)(2) = 10 and u1 . u1 = 0 + 1 + 4 =
5. Thus

u2 = w2 - (w2 . u1 / u1 . u1) u1 = w2 - (10/5) u1 = [3, 6, 2] - [0, 2, 4] = [3, 4, -2].

We can double-check against algebraic mistakes by noting that u2 . u1 = 0, as readily verified. We also have u2 . u2 = 3^2 + 4^2 + (-2)^2 = 29, and w3 . u1 = [0, -5, 5] . [0, 1, 2] = -5 + 10 = 5, and w3 . u2 = [0, -5, 5] . [3, 4, -2] = (0)(3) + (-5)(4) + (5)(-2) = -30, and so

u3 = w3 - (w3 . u1 / u1 . u1) u1 - (w3 . u2 / u2 . u2) u2
   = [0, -5, 5] - (5/5) [0, 1, 2] + (30/29) [3, 4, -2]
   = [0, -6, 3] + [90/29, 120/29, -60/29]
   = [90/29, -54/29, 27/29] = (9/29) [10, -6, 3].

We can double-check against algebraic mistakes by noting that u3 . u1 = 0 and u3 . u2 = 0, as readily verified.

S 3.6: No. 20. Find a basis for the null space and the range of the given matrix A below. Then use Gram-Schmidt to obtain orthogonal bases.

A = [ 1 -2  2 14 ]
    [ 2  1  7  0 ]
    [ 1 -1  1 10 ]

Solution: To find a basis for the null space we just solve Ax = 0 and describe the solution set as a linear combination of independent vectors. For that purpose we row reduce the augmented system as below:

[ 1 -2 2 14 | 0 ]  R2 - 2R1  [ 1 -2  2  14 | 0 ]  R2 <-> R3  [ 1 -2  2  14 | 0 ]
[ 2  1 7  0 | 0 ]  R3 - R1   [ 0  5  3 -28 | 0 ]  R3 - 5R2   [ 0  1 -1  -4 | 0 ]
[ 1 -1 1 10 | 0 ]  --------> [ 0  1 -1  -4 | 0 ]  ---------> [ 0  0  8  -8 | 0 ]

(1/8)R3, then       [ 1 0 0  6 | 0 ]
R2 + R3, R1 + 2R2,  [ 0 1 0 -5 | 0 ]
R1 - 2R3            [ 0 0 1 -1 | 0 ]
----------------->

This row reduction implies that the fourth component x4 = t is arbitrary, with x1 = -6t, x2 = 5t, x3 = t, x4 = t, i.e. x = t [-6, 5, 1, 1], meaning that Sp{ [-6, 5, 1, 1] } is the null space of A. Since a set containing one nonzero vector has to be independent (c v = 0 implies c = 0 for nonzero v), it follows that a basis for the null space of A is B_N := { [-6, 5, 1, 1] }.

The above row reduction also shows that the columns of A, were we to check them for linear independence, would be linearly dependent, since x4 is arbitrary. However, if we discard the fourth column of A, the resulting 3x3 matrix would row reduce to the identity (look at the first three columns of the RREF above), implying at once that the first three columns of A form a basis for the column space of A, i.e.

B = { [1, 2, 1], [-2, 1, -1], [2, 7, 1] }

is a basis for the column space of A. Now we call this set of vectors {w1, w2, w3} and carry out the Gram-Schmidt process.
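The Gram-Schmidt computations in these exercises can also be carried out mechanically. Here is a short, self-contained sketch in Python (numpy), applied to the column-space basis found above; the function name gram_schmidt is our own, and the script is an illustration rather than part of the homework.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: returns an orthogonal (not normalized) basis."""
    basis = []
    for w in vectors:
        u = np.array(w, dtype=float)
        for b in basis:
            u = u - ((w @ b) / (b @ b)) * b   # subtract the projection of w onto b
        basis.append(u)
    return basis

w1, w2, w3 = np.array([1, 2, 1]), np.array([-2, 1, -1]), np.array([2, 7, 1])
u1, u2, u3 = gram_schmidt([w1, w2, w3])
print(u2)  # [-11/6, 4/3, -5/6]
print(u3)  # (8/35)*[3, 1, -5], i.e. [24/35, 8/35, -8/7]
```

Because the previous u's are already mutually orthogonal, projecting the original w (classical Gram-Schmidt) gives the same result as projecting the partially reduced vector (modified Gram-Schmidt), up to floating-point rounding.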
We have u1 = w1 = [1, 2, 1]. We note that w2 . u1 = [-2, 1, -1] . [1, 2, 1] = -2 + 2 - 1 = -1 and u1 . u1 = 1 + 4 + 1 = 6. Thus

u2 = w2 - (w2 . u1 / u1 . u1) u1 = w2 + (1/6) u1 = [-2, 1, -1] + (1/6) [1, 2, 1] = [-11/6, 4/3, -5/6] = (1/6) [-11, 8, -5].

We can double-check against algebraic mistakes by noting that u2 . u1 = (1/6) ((-11)(1) + (8)(2) + (-5)(1)) = 0. We also have u2 . u2 = (1/36) ((-11)^2 + 8^2 + (-5)^2) = 210/36 = 35/6, and

w3 . u1 = [2, 7, 1] . [1, 2, 1] = 2 + 14 + 1 = 17,
w3 . u2 = (1/6) [2, 7, 1] . [-11, 8, -5] = (1/6) (-22 + 56 - 5) = 29/6,

so that (w3 . u2) / (u2 . u2) = (29/6) / (35/6) = 29/35. Thus, using the calculations above,

u3 = w3 - (w3 . u1 / u1 . u1) u1 - (w3 . u2 / u2 . u2) u2
   = [2, 7, 1] - (17/6) [1, 2, 1] - (29/35)(1/6) [-11, 8, -5]
   = (1/210) [420 - 595 + 319, 1470 - 1190 - 232, 210 - 595 + 145]
   = (1/210) [144, 48, -240] = (8/35) [3, 1, -5].

We can double-check against algebraic mistakes by noting that u3 . u1 = (8/35)(3 + 2 - 5) = 0 and u3 . u2 = (8/210)((3)(-11) + (1)(8) + (-5)(-5)) = 0.

The set {u1, u2, u3} as calculated above is an orthogonal basis for the column space of A. Note that for the null space, since B_N contains only one vector, no Gram-Schmidt step is involved.

S 3.6: No. 23. Let W be a p-dimensional subspace of R^n. If v is a vector in W such that v . w = 0 for every w in W, show that v = 0.

Solution: Since v . w = 0 for every w in W, the statement holds in particular for w = v, and we then have

||v||^2 = v . v = 0,

implying that v1^2 + v2^2 + v3^2 + ... + vn^2 = 0, so every component of v is zero, i.e. v = 0.
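Finally, the null-space computation in No. 20 can be cross-checked symbolically. The sketch below (not part of the homework) assumes the matrix A from that exercise and asks sympy for a null-space basis directly.

```python
from sympy import Matrix

# Matrix A from S 3.6, No. 20.
A = Matrix([
    [1, -2, 2, 14],
    [2, 1, 7, 0],
    [1, -1, 1, 10],
])
null_basis = A.nullspace()
print(null_basis)  # a single column vector with entries -6, 5, 1, 1
```

Since sympy sets each free variable to 1 when building the basis, the single returned vector matches the hand computation exactly, and A times that vector is the zero vector.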