MTH 362: Advanced Engineering Mathematics, Lecture 5
Jonathan A. Chávez Casillas, University of Rhode Island, Department of Mathematics
September 26, 2017
Linear Independence and Dependence of Vectors
Definition
Given any set of $n$ vectors $\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n$ (with the same number of components), a linear combination of these vectors is an expression of the form
$$c_1\mathbf{x}_1 + c_2\mathbf{x}_2 + \cdots + c_n\mathbf{x}_n,$$
where $c_1, c_2, \ldots, c_n$ are scalars.

Definition
A set of $n$ vectors $\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n$ is said to be linearly independent if the only scalars that make a linear combination of them zero are all 0. That is, if
$$c_1\mathbf{x}_1 + c_2\mathbf{x}_2 + \cdots + c_n\mathbf{x}_n = \mathbf{0},$$
then $c_1 = c_2 = \cdots = c_n = 0$. If a set of vectors is not linearly independent, the vectors are said to be linearly dependent.
Example
If
$$\mathbf{x}_1 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \quad \mathbf{x}_2 = \begin{bmatrix} 1 \\ -1 \\ 2 \end{bmatrix}, \quad \mathbf{x}_3 = \begin{bmatrix} 3 \\ 1 \\ 4 \end{bmatrix},$$
are $\mathbf{x}_1$, $\mathbf{x}_2$ and $\mathbf{x}_3$ linearly independent?

Solution
Since $2\mathbf{x}_1 + \mathbf{x}_2 - \mathbf{x}_3 = \mathbf{0}$ (please check!), the set $\{\mathbf{x}_1, \mathbf{x}_2, \mathbf{x}_3\}$ is linearly dependent. (Why?)
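To double-check the dependence relation, here is a minimal NumPy sketch; NumPy is not part of the lecture and is used here only as a quick calculator on the vectors above:

```python
# Minimal check of the dependence relation 2*x1 + x2 - x3 = 0.
import numpy as np

x1 = np.array([1, 1, 1])
x2 = np.array([1, -1, 2])
x3 = np.array([3, 1, 4])

print(2 * x1 + x2 - x3)  # -> [0 0 0], so the set is linearly dependent
```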
Example
If
$$\mathbf{x}_1 = \begin{bmatrix} 1 \\ 2 \\ 1 \end{bmatrix}, \quad \mathbf{x}_2 = \begin{bmatrix} 1 \\ 0 \\ -1 \end{bmatrix}, \quad \mathbf{x}_3 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix},$$
are $\mathbf{x}_1$, $\mathbf{x}_2$ and $\mathbf{x}_3$ linearly independent?

Solution
The vectors are linearly dependent only if the homogeneous system
$$\begin{bmatrix} 1 & 1 & 1 \\ 2 & 0 & 1 \\ 1 & -1 & 1 \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$$
has a nontrivial solution. Solving by Gaussian elimination, we find that the only solution (check again!) is $c_1 = c_2 = c_3 = 0$. Thus, the vectors are linearly independent.
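A sketch of the same computation using SymPy's exact row reduction; SymPy is an assumed outside tool, and the matrix columns are the vectors $\mathbf{x}_1, \mathbf{x}_2, \mathbf{x}_3$ from above:

```python
# Row-reduce the coefficient matrix of c1*x1 + c2*x2 + c3*x3 = 0.
from sympy import Matrix

M = Matrix([[1, 1, 1],
            [2, 0, 1],
            [1, -1, 1]])  # columns are x1, x2, x3

print(M.rref()[0])    # the 3x3 identity: every column has a pivot
print(M.nullspace())  # [] -- only the trivial solution, so the vectors are independent
```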
Definition
The rank of a matrix $A$ is the maximum number of linearly independent row vectors of $A$. It is denoted by rank $A$.

Example
The matrix
$$\begin{bmatrix} 1 & 2 & 1 \\ 1 & 0 & -1 \\ 1 & 1 & 1 \end{bmatrix}$$
has rank 3, while the matrix
$$\begin{bmatrix} 1 & 1 & 1 \\ 1 & -1 & 2 \\ 3 & 1 & 4 \end{bmatrix}$$
has rank 2. Why?
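For readers who want to verify the two ranks, a minimal NumPy sketch (again an illustration, not part of the original slides):

```python
# Compute the ranks of the two example matrices numerically.
import numpy as np

A = np.array([[1, 2, 1],
              [1, 0, -1],
              [1, 1, 1]])
B = np.array([[1, 1, 1],
              [1, -1, 2],
              [3, 1, 4]])

print(np.linalg.matrix_rank(A))  # 3
print(np.linalg.matrix_rank(B))  # 2, since row 3 = 2*row 1 + row 2
```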
Definition
We call a matrix $A_1$ row-equivalent to a matrix $A_2$ if $A_2$ can be obtained from $A_1$ by (finitely many!) elementary row operations.

Since the maximum number of linearly independent rows does not change when we exchange rows, multiply a row by a nonzero scalar, or add a multiple of one row to another (WHY?), we have the following results.

Theorem
Row-equivalent matrices have the same rank.

Theorem
Consider $p$ vectors that each have $n$ components. These vectors are linearly independent if the matrix formed with these vectors as row vectors has rank $p$. However, these vectors are linearly dependent if that matrix has rank less than $p$.
A very important result is the following:

Rank of $A$ and $A^T$
The rank $r$ of a matrix $A$ equals the maximum number of linearly independent column vectors of $A$. Hence $A$ and its transpose $A^T$ have the same rank.

Example (Homework)
Check the proof in the textbook.

Theorem
Consider $p$ vectors each having $n$ components. If $n < p$, then these vectors are linearly dependent.

Why is the previous result true? Hint: Arrange the vectors as a matrix.
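Both statements are easy to probe numerically; a small sketch, with NumPy as the assumed tool:

```python
# rank(A) == rank(A^T), and p > n vectors in R^n are always dependent.
import numpy as np

B = np.array([[1, 1, 1],
              [1, -1, 2],
              [3, 1, 4]])
print(np.linalg.matrix_rank(B), np.linalg.matrix_rank(B.T))  # 2 2

C = np.random.rand(4, 3)         # four row vectors with three components each
print(np.linalg.matrix_rank(C))  # at most 3 < 4, so the rows are dependent
```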
What is a vector space?

Definition
Let $V$ be a nonempty set of objects (elements) with two operations.
Vector Addition: for any $u, v \in V$, the sum $u + v \in V$.
Scalar Multiplication: for any $v \in V$ and $k \in \mathbb{R}$, the product $kv \in V$.
Then $V$ is a vector space if it satisfies the Axioms of Addition and the Axioms of Scalar Multiplication that follow. In this case, the elements of $V$ are called vectors.
Axioms of Addition
A1. Addition is commutative: $u + v = v + u$ for all $u, v \in V$.
A2. Addition is associative: $(u + v) + w = u + (v + w)$ for all $u, v, w \in V$.
A3. Existence of an additive identity: there exists an element $0 \in V$ so that $u + 0 = u$ for all $u \in V$.
A4. Existence of an additive inverse: for each $u \in V$ there exists an element $-u \in V$ so that $u + (-u) = 0$.
Axioms of Scalar Multiplication
S1. Scalar multiplication distributes over vector addition: $a(u + v) = au + av$ for all $a \in \mathbb{R}$ and $u, v \in V$.
S2. Scalar multiplication distributes over scalar addition: $(a + b)u = au + bu$ for all $a, b \in \mathbb{R}$ and $u \in V$.
S3. Scalar multiplication is associative: $a(bu) = (ab)u$ for all $a, b \in \mathbb{R}$ and $u \in V$.
S4. Existence of a multiplicative identity for scalar multiplication: $1u = u$ for all $u \in V$.
A linearly independent set in $V$ consisting of a maximum possible number of vectors in $V$ is called a basis for $V$. In other words, any largest possible set of independent vectors in $V$ forms a basis for $V$. That means that if we add one or more vectors to that set, the set becomes linearly dependent. The number of vectors in a basis of $V$ is called the dimension of $V$, denoted by dim $V$.
Span of a Set of Vectors

Definition
Let $\{u_1, u_2, \ldots, u_k\}$ be a set of vectors in $\mathbb{R}^n$. Then the collection of all linear combinations of these vectors is called the span of these vectors, written $\mathrm{span}\{u_1, u_2, \ldots, u_k\}$.

Problem
Describe the span of the vectors $u = \begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix}$ and $v = \begin{bmatrix} 0 \\ 2 \\ 3 \end{bmatrix}$.
Solution
Notice that any linear combination of the vectors $u$ and $v$ yields a vector $\begin{bmatrix} 0 \\ y \\ z \end{bmatrix}$ in the YZ-plane. Conversely, take an arbitrary vector $\begin{bmatrix} 0 \\ y \\ z \end{bmatrix}$ in the YZ-plane. It turns out we can write any such vector as a linear combination of $u$ and $v$:
$$\begin{bmatrix} 0 \\ y \\ z \end{bmatrix} = (-3y + 2z) \begin{bmatrix} 0 \\ 1 \\ 2 \end{bmatrix} + (2y - z) \begin{bmatrix} 0 \\ 2 \\ 3 \end{bmatrix}.$$
Hence, $\mathrm{span}\{u, v\}$ is the YZ-plane.
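The coefficients found above are easy to test on a sample point; a minimal NumPy sketch, where the point $(y, z) = (5, -7)$ is an arbitrary choice of mine:

```python
# Verify that (0, y, z) = (-3y + 2z) u + (2y - z) v for a sample (y, z).
import numpy as np

u = np.array([0, 1, 2])
v = np.array([0, 2, 3])

y, z = 5.0, -7.0
w = (-3 * y + 2 * z) * u + (2 * y - z) * v
print(w)  # -> [ 0.  5. -7.], matching (0, y, z)
```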
Subspaces

Definition
Let $V$ be a nonempty collection of vectors in $\mathbb{R}^n$. Then $V$ is a subspace if whenever $a$ and $b$ are scalars and $u$ and $v$ are vectors in $V$, the combination $au + bv$ is also in $V$.

Subspaces are closely related to the span of a set of vectors, which we discussed earlier.

Theorem
Let $V$ be a nonempty collection of vectors in $\mathbb{R}^n$. Then $V$ is a subspace of $\mathbb{R}^n$ if and only if there exist vectors $\{u_1, \ldots, u_k\}$ in $V$ such that $V = \mathrm{span}\{u_1, \ldots, u_k\}$.
Subspaces
Subspaces are also related to the property of linear independence.

Theorem
If $V$ is a subspace of $\mathbb{R}^n$, then there exist linearly independent vectors $\{u_1, \ldots, u_k\}$ of $V$ such that $V = \mathrm{span}\{u_1, \ldots, u_k\}$.

In other words, subspaces of $\mathbb{R}^n$ consist of spans of finite, linearly independent collections of vectors in $\mathbb{R}^n$.
Basis of a Subspace

Definition
Let $V$ be a subspace of $\mathbb{R}^n$. Then $\{u_1, \ldots, u_k\}$ is called a basis for $V$ if the following conditions hold:
$\mathrm{span}\{u_1, \ldots, u_k\} = V$, and
$\{u_1, \ldots, u_k\}$ is linearly independent.

The following theorem claims that any two bases of a subspace must be of the same size.

Theorem
Let $V$ be a subspace of $\mathbb{R}^n$ and suppose $\{u_1, \ldots, u_k\}$ and $\{v_1, \ldots, v_m\}$ are two bases for $V$. Then $k = m$.
Dimension
The previous theorem shows that all bases of a subspace have the same size. This size is called the dimension of the subspace.

Definition
Let $V$ be a subspace of $\mathbb{R}^n$. Then the dimension of $V$ is the number of vectors in a basis of $V$.
Properties of $\mathbb{R}^n$
Note that the dimension of $\mathbb{R}^n$ is $n$. There are some other important properties of vectors in $\mathbb{R}^n$.

Theorem
If $\{u_1, \ldots, u_n\}$ is a linearly independent set of $n$ vectors in $\mathbb{R}^n$, then $\{u_1, \ldots, u_n\}$ is a basis for $\mathbb{R}^n$.
Suppose $\{u_1, \ldots, u_m\}$ spans $\mathbb{R}^n$. Then $m \geq n$.
If $\{u_1, \ldots, u_n\}$ spans $\mathbb{R}^n$, then $\{u_1, \ldots, u_n\}$ is linearly independent.
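A quick illustration of the first statement; the three vectors here are an arbitrary example of mine, not from the slides:

```python
# Three vectors in R^3: full rank certifies independence, hence a basis of R^3.
import numpy as np

vecs = np.array([[1, 0, 1],
                 [0, 1, 1],
                 [1, 1, 0]])        # one vector per row

print(np.linalg.matrix_rank(vecs))  # 3 -> linearly independent -> a basis
```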
Row and Column Space

Definition
Let $A$ be an $m \times n$ matrix. The column space of $A$ is the span of the columns of $A$. The row space of $A$ is the span of the rows of $A$.

Theorem
The row space and the column space of a matrix $A$ have the same dimension, equal to rank $A$.
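A sketch checking the theorem on the rank-2 matrix from earlier; SymPy's rowspace and columnspace helpers return explicit basis vectors:

```python
# The row space and column space of B both have dimension rank(B) = 2.
from sympy import Matrix

B = Matrix([[1, 1, 1],
            [1, -1, 2],
            [3, 1, 4]])

print(B.rank())              # 2
print(len(B.rowspace()))     # 2 basis vectors for the row space
print(len(B.columnspace()))  # 2 basis vectors for the column space
```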
Null Space

Definition
The null space of $A$, or kernel of $A$, is defined as
$$\ker(A) = \{X : AX = 0\}.$$
We also speak of the image of $A$, $\mathrm{Im}(A)$, which is the set of all vectors of the form $AX$ where $X$ is in $\mathbb{R}^n$. To find $\ker(A)$, we solve the system of equations $AX = 0$.

Problem
Find $\ker(A)$ for the matrix
$$A = \begin{bmatrix} 1 & 2 & 1 \\ 0 & 1 & -1 \\ 2 & 3 & 3 \end{bmatrix}.$$
Null Space

Solution
The first step is to set up the augmented matrix:
$$\left[\begin{array}{ccc|c} 1 & 2 & 1 & 0 \\ 0 & 1 & -1 & 0 \\ 2 & 3 & 3 & 0 \end{array}\right]$$
Place this matrix in reduced row-echelon form:
$$\left[\begin{array}{ccc|c} 1 & 0 & 3 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right]$$
Null Space

Solution (continued)
The solution to this system of equations is
$$\begin{bmatrix} -3t \\ t \\ t \end{bmatrix}, \quad t \in \mathbb{R}.$$
Therefore the null space of $A$ is the span of this vector:
$$\ker(A) = \mathrm{span}\left\{\begin{bmatrix} -3 \\ 1 \\ 1 \end{bmatrix}\right\}$$
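The same kernel can be reproduced with SymPy's exact null-space routine, used here only to confirm the hand computation:

```python
# ker(A) for the example matrix; nullspace() returns a basis of the kernel.
from sympy import Matrix

A = Matrix([[1, 2, 1],
            [0, 1, -1],
            [2, 3, 3]])

print(A.nullspace())  # [Matrix([[-3], [1], [1]])], matching the span above
```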
Nullity

Definition
The dimension of the null space of a matrix is called the nullity, denoted $\mathrm{null}(A)$.

Theorem
Let $A$ be an $m \times n$ matrix. Then
$$\mathrm{rank}(A) + \mathrm{null}(A) = n.$$
For instance, in the last example, $A$ was a $3 \times 3$ matrix. The rank was 2 and the nullity was 1 (since the null space had dimension 1):
$$\mathrm{rank}(A) + \mathrm{null}(A) = 2 + 1 = 3 = n.$$
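The rank-nullity identity itself can be confirmed on the same matrix; a minimal sketch:

```python
# Check rank(A) + null(A) = n for the 3x3 example above.
from sympy import Matrix

A = Matrix([[1, 2, 1],
            [0, 1, -1],
            [2, 3, 3]])

rank, nullity = A.rank(), len(A.nullspace())
print(rank, nullity, rank + nullity)  # 2 1 3, and n = 3 columns
```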