Lecture 18: Section 4.3 Shuanglin Shao November 6, 2013
Linear Independence and Linear Dependence. We will discuss linear independence of vectors in a vector space.
Definition. If $S = \{v_1, v_2, \dots, v_r\}$ is a nonempty set of vectors in a vector space $V$, then the vector equation
$$k_1 v_1 + k_2 v_2 + \cdots + k_r v_r = 0$$
has at least one solution, namely $k_1 = 0,\ k_2 = 0,\ \dots,\ k_r = 0$. We call it the trivial solution. If this is the only solution, then $S$ is said to be a linearly independent set. If there are solutions in addition to the trivial solution, then $S$ is said to be a linearly dependent set.
Example 1. Linear independence of the standard unit vectors in $\mathbb{R}^n$. Let
$$e_1 = (1, 0, 0, \dots, 0),\quad e_2 = (0, 1, 0, \dots, 0),\quad \dots,\quad e_n = (0, 0, \dots, 1).$$
Suppose that $c_1 e_1 + c_2 e_2 + \cdots + c_n e_n = 0$. Then $(c_1, c_2, \dots, c_n) = (0, 0, \dots, 0)$, which implies that $c_1 = 0,\ c_2 = 0,\ \dots,\ c_n = 0$. Therefore $\{e_1, e_2, \dots, e_n\}$ is linearly independent.
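The computation in Example 1 can be checked by machine: the standard unit vectors are the columns of the identity matrix, which has full rank. A minimal sketch using `sympy` (the choice of $n = 4$ is arbitrary):

```python
from sympy import eye

n = 4  # any n works the same way
# The standard unit vectors e1, ..., en are the columns of the
# n x n identity matrix; every column is a pivot column.
E = eye(n)

# Full rank means the only solution of E*c = 0 is c = 0,
# i.e. the standard unit vectors are linearly independent.
print(E.rank())  # 4
```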
Example 2. Linear independence in $\mathbb{R}^3$. Determine whether the vectors $v_1 = (1, -2, 3)$, $v_2 = (5, 6, -1)$ and $v_3 = (3, 2, 1)$ are linearly independent or linearly dependent in $\mathbb{R}^3$. Solution. Suppose that there exist $k_1$, $k_2$ and $k_3$ such that $k_1 v_1 + k_2 v_2 + k_3 v_3 = (0, 0, 0)$. Thus
$$(k_1 + 5k_2 + 3k_3,\; -2k_1 + 6k_2 + 2k_3,\; 3k_1 - k_2 + k_3) = (0, 0, 0). \quad (1)$$
Writing it as a linear system,
$$k_1 + 5k_2 + 3k_3 = 0,\quad -2k_1 + 6k_2 + 2k_3 = 0,\quad 3k_1 - k_2 + k_3 = 0.$$
The augmented matrix is
$$\begin{pmatrix} 1 & 5 & 3 & 0 \\ -2 & 6 & 2 & 0 \\ 3 & -1 & 1 & 0 \end{pmatrix}.$$
By elementary row operations, this reduces to
$$\begin{pmatrix} 1 & 5 & 3 & 0 \\ 0 & 2 & 1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}.$$
Since the last row consists only of zeros, there are only two pivot equations in three unknowns, so $k_3$ is a free variable and there exist nontrivial solutions $(k_1, k_2, k_3)$ of equation (1). Therefore $v_1$, $v_2$, $v_3$ are linearly dependent.
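The row reduction in Example 2 can be verified with exact arithmetic. A sketch using `sympy`, assuming the vectors $v_1 = (1, -2, 3)$, $v_2 = (5, 6, -1)$, $v_3 = (3, 2, 1)$ of Example 2 (the use of `sympy` is my choice, not the text's):

```python
from sympy import Matrix

# Put v1, v2, v3 as the columns of A, so A*k = 0 is exactly
# k1*v1 + k2*v2 + k3*v3 = 0.
A = Matrix([[1, 5, 3],
            [-2, 6, 2],
            [3, -1, 1]])

# rank < number of vectors  <=>  nontrivial solutions exist
# <=>  the vectors are linearly dependent.
print(A.rank())        # 2
print(A.nullspace())   # a basis for the nontrivial solutions (k1, k2, k3)
```

Multiplying `A` by the nullspace vector returned confirms it is a genuine nontrivial solution of equation (1).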
Example 3. Linear independence in $\mathbb{R}^4$. Determine whether the vectors $v_1 = (1, 2, 2, -1)$, $v_2 = (4, 9, 9, -4)$, $v_3 = (5, 8, 9, -5)$ in $\mathbb{R}^4$ are linearly dependent or linearly independent. Solution. Suppose that there exist $k_1$, $k_2$, $k_3$ such that $k_1 v_1 + k_2 v_2 + k_3 v_3 = (0, 0, 0, 0)$.
Componentwise, this gives the system
$$k_1 + 4k_2 + 5k_3 = 0,\quad 2k_1 + 9k_2 + 8k_3 = 0,\quad 2k_1 + 9k_2 + 9k_3 = 0,\quad -k_1 - 4k_2 - 5k_3 = 0.$$
The fourth equation is $-1$ times the first, so it contributes no new information. The augmented matrix is then
$$\begin{pmatrix} 1 & 4 & 5 & 0 \\ 2 & 9 & 8 & 0 \\ 2 & 9 & 9 & 0 \end{pmatrix}.$$
By elementary row operations, we have
$$\begin{pmatrix} 1 & 4 & 5 & 0 \\ 0 & 1 & -2 & 0 \\ 0 & 0 & 1 & 0 \end{pmatrix}.$$
Back substitution gives $k_3 = 0$, $k_2 = 0$, $k_1 = 0$. Therefore $\{v_1, v_2, v_3\}$ is linearly independent.
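As with Example 2, the conclusion of Example 3 can be checked by machine. A sketch using `sympy`, assuming the vectors $v_1 = (1, 2, 2, -1)$, $v_2 = (4, 9, 9, -4)$, $v_3 = (5, 8, 9, -5)$ of Example 3:

```python
from sympy import Matrix

# Columns are v1, v2, v3; rows are the four components.
A = Matrix([[1, 4, 5],
            [2, 9, 8],
            [2, 9, 9],
            [-1, -4, -5]])

# Full column rank means the only solution of A*k = 0 is k = 0,
# i.e. the three vectors are linearly independent.
print(A.rank())       # 3
print(A.nullspace())  # [] -- only the trivial solution
```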
Example 4. An important linearly independent set in $P_n$. Show that the polynomials $1, x, \dots, x^n$ form a linearly independent set in $P_n$. Solution. Suppose that there exist $a_0, a_1, \dots, a_n$ such that
$$a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n = 0$$
for all $x$. A nonzero polynomial of degree at most $n$ has at most $n$ roots, so a polynomial that vanishes for every $x$ must have all coefficients equal to zero:
$$a_0 = a_1 = a_2 = \cdots = a_n = 0.$$
Hence the set of polynomials $\{1, x, \dots, x^n\}$ is linearly independent.
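One concrete way to see Example 4: if the polynomial vanishes at $n + 1$ distinct points, the coefficients satisfy a Vandermonde system with nonzero determinant, so they must all be zero. A sketch, assuming sample points of my own choosing:

```python
from sympy import Matrix

n = 3  # degree; any n works the same way
points = list(range(n + 1))  # n + 1 distinct sample points (my choice)

# Vandermonde matrix: row i is (1, x_i, x_i^2, ..., x_i^n), so
# V * (a0, ..., an)^T collects the polynomial's values at the points.
V = Matrix([[x**j for j in range(n + 1)] for x in points])

# A nonzero determinant means V*a = 0 forces a = 0, so the
# polynomials 1, x, ..., x^n are linearly independent in P_n.
print(V.det() != 0)  # True
```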
Theorem. A set $S$ with two or more vectors is
(a) linearly dependent if and only if at least one of the vectors in $S$ is expressible as a linear combination of the other vectors in $S$;
(b) linearly independent if and only if no vector in $S$ is expressible as a linear combination of the other vectors in $S$.
Proof. We observe that the second statement follows from the first, so we only need to prove part (a). Let $S = \{v_1, v_2, \dots, v_k\}$. ($\Leftarrow$) If $v_1 = c_2 v_2 + \cdots + c_k v_k$, then $0 = v_1 - c_2 v_2 - \cdots - c_k v_k$. Thus the nonzero tuple $(1, -c_2, \dots, -c_k)$ satisfies the equation, so $S$ is linearly dependent.
($\Rightarrow$) Suppose that $S$ is linearly dependent, i.e., there exists $(c_1, c_2, \dots, c_k)$, not all zero, such that
$$c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = 0.$$
Relabeling if necessary, suppose that $c_1 \neq 0$. Then
$$v_1 = -\frac{c_2}{c_1} v_2 - \cdots - \frac{c_k}{c_1} v_k.$$
This proves the statement.
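Part (a) can be made concrete on the dependent set of Example 2 by solving for one vector in terms of the others. A sketch using `sympy`, assuming the vectors $v_1 = (1, -2, 3)$, $v_2 = (5, 6, -1)$, $v_3 = (3, 2, 1)$ of Example 2:

```python
from sympy import Matrix, linsolve

v1 = Matrix([1, -2, 3])
v2 = Matrix([5, 6, -1])
v3 = Matrix([3, 2, 1])

# Solve c1*v1 + c2*v2 = v3: if a solution exists, v3 is a linear
# combination of the other two vectors, as part (a) asserts.
sol = linsolve((v1.row_join(v2), v3))
print(sol)  # v3 = (1/2) v1 + (1/2) v2
```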
Theorem. (a) A finite set that contains $\mathbf{0}$ is linearly dependent. (b) A set with exactly one vector is linearly independent if and only if that vector is nonzero. (c) A set with exactly two vectors is linearly independent if and only if neither vector is a scalar multiple of the other. The proof is omitted.
Theorem. Let $S = \{v_1, v_2, \dots, v_r\}$ be a set of vectors in $\mathbb{R}^n$. If $r > n$, then $S$ is linearly dependent. Proof. Suppose that there exist $x_1, x_2, \dots, x_r$ such that
$$x_1 v_1 + x_2 v_2 + \cdots + x_r v_r = 0. \quad (2)$$
We need to show that there exists a nonzero solution $(x_1, x_2, \dots, x_r)$ to this equation. Write $v_i = (a_{1i}, a_{2i}, \dots, a_{ni})$. Then we have the system of equations
$$a_{11} x_1 + a_{12} x_2 + \cdots + a_{1r} x_r = 0,$$
$$a_{21} x_1 + a_{22} x_2 + \cdots + a_{2r} x_r = 0,$$
$$\vdots$$
$$a_{n1} x_1 + a_{n2} x_2 + \cdots + a_{nr} x_r = 0.$$
The augmented matrix is
$$\begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1r} & 0 \\ a_{21} & a_{22} & \cdots & a_{2r} & 0 \\ \vdots & \vdots & & \vdots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nr} & 0 \end{pmatrix}.$$
By Theorem 1.2.2, since this homogeneous system has more unknowns than equations, it has infinitely many solutions. So there exists a nonzero solution to equation (2).
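As a quick illustration of this theorem, any four vectors in $\mathbb{R}^3$ must be linearly dependent. A sketch in `sympy`; the particular vectors below are arbitrary choices of mine, not from the text:

```python
from sympy import Matrix

# Four vectors in R^3 (r = 4 > n = 3), placed as columns.
A = Matrix([[1, 0, 2, 1],
            [0, 1, 3, 1],
            [1, 1, 0, 2]])

# A*x = 0 has more unknowns (4) than equations (3), so the
# nullspace is nontrivial: the columns are linearly dependent.
print(len(A.nullspace()) >= 1)  # True
```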
By using the standard unit basis in $\mathbb{R}^n$, each $v_i$ can be expressed as a linear combination of $e_1, e_2, \dots, e_n$:
$$v_1 = a_{11} e_1 + a_{12} e_2 + \cdots + a_{1n} e_n,$$
$$v_2 = a_{21} e_1 + a_{22} e_2 + \cdots + a_{2n} e_n,$$
$$\vdots$$
$$v_n = a_{n1} e_1 + a_{n2} e_2 + \cdots + a_{nn} e_n,$$
$$\vdots$$
$$v_r = a_{r1} e_1 + a_{r2} e_2 + \cdots + a_{rn} e_n.$$
We write the first $n$ equations in matrix form:
$$\begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{pmatrix} \begin{pmatrix} e_1 \\ e_2 \\ \vdots \\ e_n \end{pmatrix}.$$
Let $A$ be the coefficient matrix.
We divide it into two cases. Case 1. If $A$ is invertible, then the column vector of $e_1, e_2, \dots, e_n$ can be written as $A^{-1}$ times the column vector of $v_1, v_2, \dots, v_n$, so each $e_i$ is a linear combination of $v_1, v_2, \dots, v_n$. Since $v_r$ is a linear combination of $e_1, e_2, \dots, e_n$, it follows that $v_r$ is a linear combination of $v_1, v_2, \dots, v_n$. Hence the set of vectors $\{v_1, v_2, \dots, v_r\}$ is linearly dependent. Case 2. If $A$ is not invertible, then by elementary row operations we can reduce the last row of $A$ to the zero row. Applying the same elementary row operations to the left-hand side, we see that a linear combination of $\{v_1, v_2, \dots, v_n\}$ with coefficients not all zero equals zero. Thus $\{v_1, \dots, v_n\}$ is linearly dependent, and hence $\{v_1, \dots, v_n, \dots, v_r\}$ is linearly dependent.
Homework and Reading. Homework: Ex. #2, #3, #4, #6, #10, #12, #13, #16; the true or false questions on page 200. Reading: Section 4.4.