Solution to Homework 8, Math 2568


S 5.4: No. 20. Use the property of Theorem 5 to test for linear independence in P3 the following set of cubic polynomials: S = { x^3 - x^2, x^2 - x, x - 1, x^3 + 1 }.

Solution: If we use the standard basis of P3, B = { 1, x, x^2, x^3 }, the set of coordinate vectors in R^4 is

T = { [x^3 - x^2]_B, [x^2 - x]_B, [x - 1]_B, [x^3 + 1]_B } = { [0, 0, -1, 1], [0, -1, 1, 0], [-1, 1, 0, 0], [1, 0, 0, 1] }.

So by Theorem 5, S is linearly independent in P3 if and only if T is linearly independent in R^4. So we set

c1 [0, 0, -1, 1] + c2 [0, -1, 1, 0] + c3 [-1, 1, 0, 0] + c4 [1, 0, 0, 1] = [0, 0, 0, 0]

and ask whether [c1, c2, c3, c4] is necessarily [0, 0, 0, 0] or not. The augmented matrix of this linear system for the unknowns c1, c2, c3, c4 row reduces as shown:

[ 0  0 -1  1 | 0 ]            [ 1  0  0  1 | 0 ]                [ 1  0  0  1 | 0 ]
[ 0 -1  1  0 | 0 ]  R1<->R4   [ 0 -1  1  0 | 0 ]  -R2, R3+R1   [ 0  1 -1  0 | 0 ]
[-1  1  0  0 | 0 ]  ------->  [-1  1  0  0 | 0 ]  ---------->  [ 0  1  0  1 | 0 ]
[ 1  0  0  1 | 0 ]            [ 0  0 -1  1 | 0 ]                [ 0  0 -1  1 | 0 ]

and then R3 - R2, R4 + R3, (1/2)R4 give

[ 1  0  0  1 | 0 ]
[ 0  1 -1  0 | 0 ]
[ 0  0  1  1 | 0 ]
[ 0  0  0  1 | 0 ]

Even without any further row reduction, it is clear that every column of the coefficient part of the reduced matrix has a leading one, so c1 = c2 = c3 = c4 = 0. Thus the set S is linearly independent in P3.

S 5.4: No. 24. Let V be the vector space of all 2x2 matrices and S = {A1, A2, A3, A4} as given below. Find a basis for Sp S.

A1 = [ 1 2 ]   A2 = [ 1 -1 ]   A3 = [ 2 1 ]   A4 = [ 1 1 ]
     [ 2 3 ]        [ 2  2 ]        [ 4 3 ]        [ 2 0 ]

Solution: We choose the standard basis of M_{2,2}, the set of all 2x2 matrices: B = {E11, E12, E21, E22}, where Eij is the matrix that is all zeros except for a 1 in the (i, j)-th entry. Then the coordinate vectors in this basis are

T = { [A1]_B, [A2]_B, [A3]_B, [A4]_B } = { [1, 2, 2, 3], [1, -1, 2, 2], [2, 1, 4, 3], [1, 1, 2, 0] }.

Writing the matrix C whose rows are the coordinate vectors in T, we row reduce in the way shown:

C = [ 1  2  2  3 ]                  [ 1  2  2  3 ]                 [ 1  2  2  3 ]
    [ 1 -1  2  2 ]  R2-R1, R3-2R1  [ 0 -3  0 -1 ]  R2<->R4, -R2   [ 0  1  0  3 ]
    [ 2  1  4  3 ]  R4-R1          [ 0 -3  0 -3 ]  ----------->   [ 0 -3  0 -3 ]
    [ 1  1  2  0 ]  ------------>  [ 0 -1  0 -3 ]                 [ 0 -3  0 -1 ]

and then R3 + 3R2, R4 + 3R2, (1/6)R3, R4 - 8R3, followed by back substitution (R2 - 3R3, R1 - 3R3, R1 - 2R2), give the reduced echelon form

[ 1  0  2  0 ]
[ 0  1  0  0 ]
[ 0  0  0  1 ]
[ 0  0  0  0 ]

It follows that { [1, 0, 2, 0], [0, 1, 0, 0], [0, 0, 0, 1] } is a basis for the row space of C, that is, for the span of the coordinate vectors in T, and therefore the corresponding matrices form a basis for Sp S. Therefore a basis for Sp S is, on computation,

B0 = { E11 + 2 E21, E12, E22 }, i.e. the matrices

[ 1 0 ]   [ 0 1 ]   [ 0 0 ]
[ 2 0 ]   [ 0 0 ]   [ 0 1 ]

S 5.4: No. 26. In P2, let Q = {p1(x), p2(x), p3(x)}, where p1(x) = 1 + x + 2x^2, p2(x) = x + 3x^2, p3(x) = 1 + 2x + 8x^2. Use the basis B = {1, x, x^2} to show that Q is a basis for P2.

Solution: If we use the standard basis B of P2, we note that the coordinate vectors of the set Q form the set

T = { [p1]_B, [p2]_B, [p3]_B } = { [1, 1, 2], [0, 1, 3], [1, 2, 8] }.

Therefore, to determine whether Q is a basis it is enough to show that T is a basis for R^3. But a set of three vectors in R^3 is a basis if we can show that it is linearly independent. Setting c1 [1, 1, 2] + c2 [0, 1, 3] + c3 [1, 2, 8] = [0, 0, 0], we get a linear system of equations for c1, c2, c3 whose augmented matrix row reduces as shown below:

[ 1  0  1 | 0 ]                  [ 1  0  1 | 0 ]           [ 1  0  1 | 0 ]
[ 1  1  2 | 0 ]  R2-R1, R3-2R1   [ 0  1  1 | 0 ]  R3-3R2   [ 0  1  1 | 0 ]
[ 2  3  8 | 0 ]  ------------->  [ 0  3  6 | 0 ]  ------>  [ 0  0  3 | 0 ]

and (1/3)R3, R2 - R3, R1 - R3 reduce this to the identity, implying c1, c2, c3 are all zero. Therefore the set Q is linearly independent, and hence a basis for P2.
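The coordinate-vector tests above are easy to double-check mechanically. Below is a small sketch (not part of the assigned solutions) that row reduces a matrix over the rationals and counts pivots; a full pivot count confirms linear independence. The helper name `pivot_count` is ours, not the textbook's.

```python
from fractions import Fraction

def pivot_count(rows):
    """Row reduce a matrix (list of rows) over the rationals and count pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots = 0
    for col in range(len(m[0])):
        # find a row at or below position `pivots` with a nonzero entry in this column
        pr = next((r for r in range(pivots, len(m)) if m[r][col] != 0), None)
        if pr is None:
            continue
        m[pivots], m[pr] = m[pr], m[pivots]
        piv = m[pivots][col]
        m[pivots] = [x / piv for x in m[pivots]]
        for r in range(len(m)):
            if r != pivots and m[r][col] != 0:
                m[r] = [a - m[r][col] * b for a, b in zip(m[r], m[pivots])]
        pivots += 1
    return pivots

# No. 20: coordinate vectors of x^3 - x^2, x^2 - x, x - 1, x^3 + 1 w.r.t. {1, x, x^2, x^3}
S = [[0, 0, -1, 1], [0, -1, 1, 0], [-1, 1, 0, 0], [1, 0, 0, 1]]
print(pivot_count(S))   # a pivot in every column -> the four cubics are independent

# No. 26: coordinate vectors of p1, p2, p3 w.r.t. {1, x, x^2}
Q = [[1, 1, 2], [0, 1, 3], [1, 2, 8]]
print(pivot_count(Q))   # three pivots -> Q is a basis for P2
```

The same routine applies to any of the independence checks in this homework, since each one reduces to the rank of a matrix of coordinate vectors.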

S 5.4: No. 34. Show that B = { e^x, e^{2x}, e^{3x}, e^{4x} } is a basis for V, where V is the set of functions f of the form f(x) = a e^x + b e^{2x} + c e^{3x} + d e^{4x} for real numbers a, b, c, d.

Solution: We note from the definition of V that any f in V is a linear combination of elements of B, and so Sp B = V. So we only need to show that B is linearly independent. We set

c1 e^x + c2 e^{2x} + c3 e^{3x} + c4 e^{4x} = 0 for all x,

which on repeated differentiation gives

c1 e^x + 2 c2 e^{2x} + 3 c3 e^{3x} + 4 c4 e^{4x} = 0 for all x,
c1 e^x + 4 c2 e^{2x} + 9 c3 e^{3x} + 16 c4 e^{4x} = 0 for all x,
c1 e^x + 8 c2 e^{2x} + 27 c3 e^{3x} + 64 c4 e^{4x} = 0 for all x.

By setting x = 0 in the above we obtain a set of linear equations for (c1, c2, c3, c4) whose augmented matrix row reduces as follows:

[ 1  1  1  1 | 0 ]                 [ 1  1  1  1 | 0 ]                  [ 1  1  1  1 | 0 ]
[ 1  2  3  4 | 0 ]  R2-R1, R3-R1  [ 0  1  2  3 | 0 ]  R3-3R2, R4-7R2  [ 0  1  2  3 | 0 ]
[ 1  4  9 16 | 0 ]  R4-R1         [ 0  3  8 15 | 0 ]  ------------->  [ 0  0  2  6 | 0 ]
[ 1  8 27 64 | 0 ]  ----------->  [ 0  7 26 63 | 0 ]                  [ 0  0 12 42 | 0 ]

and then (1/2)R3, R4 - 12R3, (1/6)R4 give rows [0 0 1 3 | 0] and [0 0 0 1 | 0], so every column of the coefficient matrix has a leading one. Back substitution then implies (c1, c2, c3, c4) = (0, 0, 0, 0), and so the set of functions in B is linearly independent. So B is a basis for V.

S 5.4: No. 36. Prove that if Q = {v1, v2, v3, ..., vm} is a linearly independent set of vectors in a vector space V, and if w is a vector in V such that w is not in Sp Q, then the set T = {v1, v2, v3, ..., vm, w} is a linearly independent set in V.

Solution: Assume on the contrary that T is dependent, which would imply that the equation

c1 v1 + c2 v2 + ... + cm vm + c_{m+1} w = 0    (1)

has a nonzero solution (c1, c2, c3, ..., cm, c_{m+1}). Now we claim that the last component c_{m+1} of this nonzero solution is nonzero, for if it were zero, then statement (1) for nonzero (c1, c2, ...) would imply linear dependence of {v1, v2, ..., vm}, which is not the case. Therefore c_{m+1} is nonzero, implying from (1) that

w = -(1/c_{m+1}) (c1 v1 + c2 v2 + ... + cm vm),

so that w is in Sp Q, which is not the case. Therefore we have a contradiction with the assumption that T is dependent. So T must be independent.

S 5.4: No. 38. The result from the previous exercise (No. 37, S 5.4) is that S = {v1, v2, ..., vn} is linearly dependent if and only if at least one of the vectors vj can be expressed as a linear combination of the remaining vectors. Use this result to obtain necessary and sufficient conditions for a set {u, v} of two vectors to be linearly dependent. Determine by inspection whether the following sets are linearly dependent or independent.

Solution: When S contains only two vectors, "one vector is a linear combination of the remaining vectors" means precisely that one vector is a scalar multiple of the other. Therefore linear dependence of {u, v} is equivalent to one vector being a scalar multiple of the other.

a. { 1 + x, x^2 }. Clearly x^2 is not a scalar multiple of 1 + x, since the scalar cannot depend on x. Thus the set is independent.

b. { x, e^x }. Once again, e^x is not a constant multiple of x for all x, and so the set is independent.

c. { x, 3x }. Clearly the second function is a multiple (by three) of the first, and so this set of two functions is linearly dependent.

d. [ -2  4 ]   [  2 -4 ]
   [  3 -6 ] , [ -3  6 ]

We notice that the second matrix is (-1) times the first matrix, and so the set above is linearly dependent.

e. [ 0 0 ]   [ 1 2 ]
   [ 0 0 ] , [ 0 0 ]

Once again the set above is linearly dependent, since the first matrix is a multiple (which is zero in this case) of the second. Note also that any set containing the zero element of the vector space is always linearly dependent.

S 3.6: No. 14. Verify that S = {u1, u2, u3} is an orthogonal set, where

u1 = [ 2 ]    u2 = [  1 ]    u3 = [ -2 ]
     [ 1 ]         [  2 ]         [  2 ]
     [ 2 ]         [ -2 ]         [  1 ]

Solution: We check

u1 . u2 = (2)(1) + (1)(2) + (2)(-2) = 0,
u3 . u2 = (-2)(1) + (2)(2) + (1)(-2) = 0,
u1 . u3 = (2)(-2) + (1)(2) + (2)(1) = 0.
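Pairwise dot products like these are quick to confirm with a few lines of code; a minimal sketch (not part of the assignment), using the vectors of No. 14:

```python
# Pairwise dot-product check for S = {u1, u2, u3} from No. 14.
u1, u2, u3 = [2, 1, 2], [1, 2, -2], [-2, 2, 1]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

# every distinct pair must have dot product zero for S to be orthogonal
print([dot(u1, u2), dot(u2, u3), dot(u1, u3)])  # [0, 0, 0]
```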

It follows that the set S above is an orthogonal set in R^3.

S 3.6: No. 16. Find values of a, b, c such that {u1, u2, u3} as defined below is an orthogonal set:

u1 = [ 2 ]    u2 = [  1 ]    u3 = [ a ]
     [ 0 ]         [  1 ]         [ b ]
     [ 1 ]         [ -2 ]         [ c ]

Solution: We first check the orthogonality of u1, u2: u1 . u2 = (2)(1) + (0)(1) + (1)(-2) = 0. Now, we require 0 = u1 . u3 = (2)(a) + (0)(b) + (1)(c) = 2a + c, implying c = -2a. Next, we require 0 = u2 . u3 = (1)(a) + (1)(b) + (-2)(c) = a + b + 4a, implying that b = -5a. Therefore u3 = [a, b, c] = a [1, -5, -2] for an arbitrary scalar a.

S 3.6: No. 20. Express v = [0, 1, 2] in terms of the orthogonal basis B = {u1, u2, u3}, where

u1 = [ 1 ]    u2 = [ -1 ]    u3 = [ -1 ]
     [ 1 ]         [  0 ]         [  2 ]
     [ 1 ]         [  1 ]         [ -1 ]

Solution: We know the representation

v = (v . u1)/(u1 . u1) u1 + (v . u2)/(u2 . u2) u2 + (v . u3)/(u3 . u3) u3.    (2)

So we calculate the scalars v . u1 = (0)(1) + (1)(1) + (2)(1) = 3, v . u2 = (0)(-1) + (1)(0) + (2)(1) = 2, v . u3 = (0)(-1) + (1)(2) + (2)(-1) = 0, and the squared lengths ||u1||^2 = u1 . u1 = 1 + 1 + 1 = 3, ||u2||^2 = (-1)^2 + (0)^2 + (1)^2 = 2, ||u3||^2 = (-1)^2 + (2)^2 + (-1)^2 = 6. Therefore equation (2) above becomes

v = (3/3) u1 + (2/2) u2 + 0 u3 = u1 + u2,

as may be double-checked from the expressions for u1, u2 and v.

S 3.6: No. 21. Same question as No. 20, except that v = [1, 2, 1].

Solution: Once again (2) is valid, except that we now have to calculate the coefficients anew. So we calculate the scalars v . u1 = (1)(1) + (2)(1) + (1)(1) = 4, v . u2 = (1)(-1) + (2)(0) + (1)(1) = 0, v . u3 = (1)(-1) + (2)(2) + (1)(-1) = 2, while the squared lengths are the same as in the last exercise: ||u1||^2 = 3, ||u2||^2 = 2, ||u3||^2 = 6. Therefore equation (2) above becomes

v = (4/3) u1 + 0 u2 + (2/6) u3 = (4/3) u1 + (1/3) u3,

as may be double-checked from the expressions for u1, u3 and v: [1, 2, 1] = (4/3)[1, 1, 1] + (1/3)[-1, 2, -1].

S 3.6: No. 22. Let S = {u1, u2, u3} be an orthogonal set of nonzero vectors in R^3. Define the 3x3 matrix A by A = [u1, u2, u3], i.e. the column vectors of A are u1, u2, u3. Show that A^T A is diagonal.

Solution: Using properties of matrix multiplication, and realizing that u_i^T is the i-th row of A^T, the (i, j)-th element of A^T A must be the scalar product u_i . u_j, which can be nonzero only for i = j; this results in a diagonal matrix. Thus, for the 3x3 matrices involved, we have

A^T A = [ u1^T ]                [ u1 . u1  u1 . u2  u1 . u3 ]   [ u1 . u1     0        0    ]
        [ u2^T ] (u1 u2 u3)  =  [ u2 . u1  u2 . u2  u2 . u3 ] = [    0     u2 . u2     0    ]
        [ u3^T ]                [ u3 . u1  u3 . u2  u3 . u3 ]   [    0        0     u3 . u3 ]

which is a diagonal matrix.

S 3.6: No. 24. Use the Gram-Schmidt process to generate an orthogonal set from the given linearly independent vectors {w1, w2, w3} below:

w1 = [ 1 ]    w2 = [ 2 ]    w3 = [  1 ]
     [ 0 ]         [ 1 ]         [ -1 ]
     [ 1 ]         [ 0 ]         [  0 ]
     [ 2 ]         [ 2 ]         [  1 ]

Solution: We take u1 = w1 = [1, 0, 1, 2]. We note that w2 . u1 = [2, 1, 0, 2] . [1, 0, 1, 2] = (2)(1) + (1)(0) + (0)(1) + (2)(2) = 6 and ||u1||^2 = u1 . u1 = 1 + 0 + 1 + 4 = 6. Thus

u2 = w2 - (w2 . u1)/(u1 . u1) u1 = w2 - (6/6) u1 = [2, 1, 0, 2] - [1, 0, 1, 2] = [1, 1, -1, 0].

We can double-check against algebraic mistakes by noting that u2 . u1 = (1)(1) + (1)(0) + (-1)(1) + (0)(2) = 0. We also have ||u2||^2 = u2 . u2 = (1)^2 + (1)^2 + (-1)^2 + (0)^2 = 3, and w3 . u1 = [1, -1, 0, 1] . [1, 0, 1, 2] = (1)(1) + (-1)(0) + (0)(1) + (1)(2) = 3, and w3 . u2 = [1, -1, 0, 1] . [1, 1, -1, 0] = (1)(1) + (-1)(1) + (0)(-1) + (1)(0) = 0, and so

u3 = w3 - (w3 . u1)/(u1 . u1) u1 - (w3 . u2)/(u2 . u2) u2 = [1, -1, 0, 1] - (3/6) [1, 0, 1, 2] = [1/2, -1, -1/2, 0].

We can double-check against algebraic mistakes by noting that u3 . u1 = 0 and u3 . u2 = 0, as readily checked.

S 3.6: No. 26. Same question as No. 24 above, with

w1 = [ 0 ]    w2 = [ 3 ]    w3 = [  0 ]
     [ 1 ]         [ 6 ]         [ -5 ]
     [ 2 ]         [ 2 ]         [  5 ]

Solution: Once again we take u1 = w1 = [0, 1, 2]. We note that w2 . u1 = [3, 6, 2] . [0, 1, 2] = (3)(0) + (6)(1) + (2)(2) = 10 and ||u1||^2 = u1 . u1 = 0 + 1 + 4 =

5. Thus,

u2 = w2 - (w2 . u1)/(u1 . u1) u1 = w2 - (10/5) u1 = [3, 6, 2] - [0, 2, 4] = [3, 4, -2].

We can double-check against algebraic mistakes by noting that u2 . u1 = 0, as readily checked. We also have ||u2||^2 = u2 . u2 = (3)^2 + (4)^2 + (-2)^2 = 29, and w3 . u1 = [0, -5, 5] . [0, 1, 2] = -5 + 10 = 5, and w3 . u2 = [0, -5, 5] . [3, 4, -2] = (0)(3) + (-5)(4) + (5)(-2) = -30, and so

u3 = w3 - (w3 . u1)/(u1 . u1) u1 - (w3 . u2)/(u2 . u2) u2 = [0, -5, 5] - (5/5) [0, 1, 2] + (30/29) [3, 4, -2] = (9/29) [10, -6, 3].

We can double-check against algebraic mistakes by noting that u3 . u1 = (9/29)(0 - 6 + 6) = 0 and u3 . u2 = (9/29)(30 - 24 - 6) = 0.

S 3.6: No. 30. Find a basis for the null space and the range of the given matrix A below. Then use Gram-Schmidt to obtain orthogonal bases.

A = [ 1 -1  1   9 ]
    [ 1  2  7 -18 ]
    [ 2  1  1   5 ]

Solution: To find a basis for the null space we just solve Ax = 0 and describe the solution set as a linear combination of independent vectors. For that purpose we row reduce the augmented system as below:

[ 1 -1  1   9 | 0 ]                  [ 1 -1  1   9 | 0 ]                  [ 1 -1  1  9 | 0 ]
[ 1  2  7 -18 | 0 ]  R2-R1, R3-2R1   [ 0  3  6 -27 | 0 ]  (1/3)R2, R3-3R2 [ 0  1  2 -9 | 0 ]
[ 2  1  1   5 | 0 ]  ------------->  [ 0  3 -1 -13 | 0 ]  --------------> [ 0  0 -7 14 | 0 ]

and then (-1/7)R3, R2 - 2R3, R1 + R2, R1 - R3 give the reduced form

[ 1  0  0  6 | 0 ]
[ 0  1  0 -5 | 0 ]
[ 0  0  1 -2 | 0 ]

This row reduction implies that the fourth component x4 = t is arbitrary, whereas x1 = -6t, x2 = 5t, x3 = 2t, or x = t [-6, 5, 2, 1], meaning that Sp{ [-6, 5, 2, 1] } is the null space of A. Since a set containing one nonzero vector has to be independent (c v = 0 implies c = 0 for nonzero v), it follows that a basis of the null space of A is B_N := { [-6, 5, 2, 1] }.

The above row reduction also shows that if we were to check linear independence of the columns of A, they would be linearly dependent, since x4 is arbitrary. However, if we dropped the fourth column of A, the resulting 3x3 matrix would row reduce to the identity (look at the first three columns of the reduced form above), implying at once that the first three columns of A form a basis for the column space of A, i.e.

B = { [1, 1, 2], [-1, 2, 1], [1, 7, 1] }

is a basis for the column space of A. Now we call this set of vectors {w1, w2, w3} and carry out the Gram-Schmidt process.

We take u1 = w1 = [1, 1, 2]. We note that w2 . u1 = [-1, 2, 1] . [1, 1, 2] = -1 + 2 + 2 = 3 and ||u1||^2 = u1 . u1 = 1 + 1 + 4 = 6. Thus

u2 = w2 - (w2 . u1)/(u1 . u1) u1 = w2 - (3/6) u1 = [-1, 2, 1] - [1/2, 1/2, 1] = [-3/2, 3/2, 0].

We can double-check against algebraic mistakes by noting that u2 . u1 = -3/2 + 3/2 + 0 = 0. We also have ||u2||^2 = u2 . u2 = 9/4 + 9/4 + 0 = 9/2, and w3 . u1 = [1, 7, 1] . [1, 1, 2] = 1 + 7 + 2 = 10, and w3 . u2 = [1, 7, 1] . [-3/2, 3/2, 0] = -3/2 + 21/2 + 0 = 9, and so

u3 = w3 - (w3 . u1)/(u1 . u1) u1 - (w3 . u2)/(u2 . u2) u2 = [1, 7, 1] - (10/6) [1, 1, 2] - (9/(9/2)) [-3/2, 3/2, 0] = [1, 7, 1] - [5/3, 5/3, 10/3] - [-3, 3, 0] = [7/3, 7/3, -7/3].

We can double-check against algebraic mistakes by noting that u3 . u1 = (7/3)(1 + 1 - 2) = 0 and u3 . u2 = (7/3)(-3/2 + 3/2 + 0) = 0. The set {u1, u2, u3} as calculated above is an orthogonal basis for the column space of A. Note that for the null space, since B_N contains only one vector, no Gram-Schmidt is involved.

S 3.6: No. 32. Let W be a p-dimensional subspace of R^n. If v is a vector in W such that v . w = 0 for every w in W, show that v = 0.

Solution: Since v . w = 0 for every w in W, this holds in particular for w = v, and we then have ||v||^2 = v . v = 0, implying that v1^2 + v2^2 + v3^2 + ... + vn^2 = 0. So every component of v is zero, implying v = 0.
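As a cross-check, Gram-Schmidt computations like those in Nos. 24-30 can be reproduced in exact rational arithmetic. A short sketch using the data of No. 24 above; the helper `gram_schmidt` is our own illustration, not a library routine:

```python
from fractions import Fraction

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt(vectors):
    """Orthogonalize a list of linearly independent vectors, exactly."""
    basis = []
    for w in vectors:
        u = [Fraction(x) for x in w]
        for b in basis:
            coeff = Fraction(dot(u, b), dot(b, b))  # projection coefficient (w . u_i)/(u_i . u_i)
            u = [x - coeff * y for x, y in zip(u, b)]
        basis.append(u)
    return basis

# the vectors {w1, w2, w3} of No. 24
u1, u2, u3 = gram_schmidt([[1, 0, 1, 2], [2, 1, 0, 2], [1, -1, 0, 1]])
print(u2 == [1, 1, -1, 0])                                             # True
print(u3 == [Fraction(1, 2), -1, Fraction(-1, 2), 0])                  # True
print(all(dot(a, b) == 0 for a, b in [(u1, u2), (u1, u3), (u2, u3)]))  # True
```

Using `Fraction` rather than floats avoids the rounding noise that would otherwise make the zero dot products only approximate.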