Linear Algebra. Chih-Wei Yi, Dept. of Computer Science, National Chiao Tung University. November 2008.
Section 1 Definition and Examples
Definition (Vector Spaces)
Let V be a set on which two operations (vector addition and scalar multiplication) are defined. If the following axioms are satisfied for every u, v, w ∈ V and all scalars (real numbers) a, b, then V is called a vector space.
1. u + v ∈ V. (closure under addition)
2. u + v = v + u. (commutative property of addition)
3. (u + v) + w = u + (v + w). (associative property of addition)
4. V has a zero vector 0 such that for every u ∈ V, u + 0 = u. (zero vector)
5. For each u ∈ V, there exists −u ∈ V such that u + (−u) = 0. (inverse vector)
Definition (Vector Spaces (Cont.))
6. au ∈ V. (closure under scalar multiplication)
7. a(u + v) = au + av. (distributive property)
8. (a + b)u = au + bu. (distributive property)
9. (ab)u = a(bu). (associative property of scalar multiplication)
10. 1u = u. (multiplicative identity)
Examples
For any integer n ≥ 1, R^n is a vector space.
For any integers m, n ≥ 1, let M_mn denote the set of all m × n matrices. Then M_mn is a vector space.
Let C[a, b] denote the set of all continuous functions on [a, b]. For any f, g ∈ C[a, b] and c ∈ R, we define (f + g)(x) = f(x) + g(x) and (cf)(x) = c·f(x). It can be verified that C[a, b] is a vector space.
Problem
For any integer n ≥ 0, let P_n denote the set of all polynomials with degree at most n. Prove that P_n is a vector space.
Theorem (The Zero Vector and Inverse Vector)
If V is a vector space and u ∈ V, then
1. 0u = 0.
2. If u + v = 0, then v = −u. (uniqueness of the inverse vector)
3. (−1)u = −u.
Proof.
1. Since 0u + 0u = (0 + 0)u = 0u, after adding −(0u) to both sides, we have 0u = 0.
2. Adding −u to both sides of u + v = 0, we have (−u + u) + v = −u + 0. So, v = −u.
3. Since u + (−1)u = (1 + (−1))u = 0u = 0, we have (−1)u = −u.
Section 2 Subspaces
Definition (Subspaces)
If S ⊆ V is nonempty and satisfies the following conditions, then S is called a subspace of V.
1. For any vectors u, v ∈ S, u + v ∈ S.
2. For any scalar a and any vector u ∈ S, au ∈ S.
According to the definition, to verify whether a subset of a vector space is a subspace, we verify closure under addition and closure under scalar multiplication. For a vector space V, both V and {0} are subspaces of V; {0} is called the zero space. Any subspace of V except V itself is called a proper subspace. Note that 0 belongs to every vector space and every subspace.
Examples
Let S = {(x_1, x_2, x_3)^T | x_1 = x_2}. S is a subspace of R^3.
S is nonempty since (0, 0, 0)^T ∈ S.
If x = (x_1, x_2, x_3)^T ∈ S, we have x_1 = x_2. So, for any scalar c, since cx_1 = cx_2, cx = (cx_1, cx_2, cx_3)^T ∈ S.
If x = (x_1, x_2, x_3)^T, y = (y_1, y_2, y_3)^T ∈ S, we have x_1 = x_2 and y_1 = y_2. Then, since x_1 + y_1 = x_2 + y_2, x + y = (x_1 + y_1, x_2 + y_2, x_3 + y_3)^T ∈ S.
Examples
Let S = {A ∈ R^{2×2} | a_12 = −a_21}. S forms a subspace of R^{2×2}.
Examples
Let S = {(x_1, 1)^T | x_1 is a real number}. S is not a subspace of R^2. We have several reasons:
If x = (x_1, x_2)^T ∈ S, we have x_2 = 1. For any scalar c ≠ 1, since cx_2 = c ≠ 1, cx = (cx_1, cx_2)^T ∉ S.
0 = (0, 0)^T ∉ S. Etc.
Examples
Let S = {p(x) | p(0) = 0 and deg(p(x)) ≤ n}. S is a subspace of P_n.
Examples
Let C^n[a, b] denote the set of all functions f that have a continuous nth derivative on [a, b]. Then, C^n[a, b] is a subspace of C[a, b].
The Nullspace of a Matrix
Let A be an m × n matrix. Let N(A) denote the set of all solutions to the homogeneous system Ax = 0, i.e. N(A) = {x | Ax = 0}. N(A) is a subspace of R^n:
N(A) is nonempty: 0 ∈ N(A) since A0 = 0.
N(A) is closed under addition: if Ax = 0 and Ay = 0, then A(x + y) = Ax + Ay = 0.
N(A) is closed under scalar multiplication: if Ax = 0, then A(cx) = c(Ax) = 0.
N(A) is called the nullspace of A.
Example
Determine N(A) if
A = [ 1 1 1 0 ]
    [ 2 1 0 1 ]
Solution
Solve Ax = 0 by Gauss-Jordan reduction:
[ 1 1 1 0 ]    [ 1 1  1  0 ]    [ 1 0 -1  1 ]
[ 2 1 0 1 ] -> [ 0 1  2 -1 ] -> [ 0 1  2 -1 ]
Here x_3, x_4 are free variables. Let x_3 = s and x_4 = t. Then x_1 = s − t and x_2 = −2s + t.
Solution (Cont.)
So, we have
(s − t, −2s + t, s, t)^T = (s, −2s, s, 0)^T + (−t, t, 0, t)^T = s(1, −2, 1, 0)^T + t(−1, 1, 0, 1)^T,
and N(A) = { s(1, −2, 1, 0)^T + t(−1, 1, 0, 1)^T | s, t ∈ R }.
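As a quick numerical check of this example (a sketch using numpy; the matrix A here is the one reconstructed for this example):

```python
import numpy as np

# Matrix from the example (assumed: A = [[1,1,1,0],[2,1,0,1]]).
A = np.array([[1, 1, 1, 0],
              [2, 1, 0, 1]])

# The two spanning vectors of N(A) found above.
v1 = np.array([1, -2, 1, 0])
v2 = np.array([-1, 1, 0, 1])

# Every linear combination s*v1 + t*v2 must satisfy Ax = 0.
for s, t in [(1, 0), (0, 1), (3, -2)]:
    x = s * v1 + t * v2
    assert np.all(A @ x == 0)
```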
The Span of a Set of Vectors
Definition (Linear Combination)
Let v_1, v_2, …, v_n be vectors in a vector space V. A sum of the form α_1 v_1 + α_2 v_2 + … + α_n v_n, where α_1, α_2, …, α_n are scalars, is called a linear combination of v_1, v_2, …, v_n.
Definition (Span)
The set of all linear combinations of v_1, v_2, …, v_n, denoted Span(v_1, v_2, …, v_n), is called the span of v_1, v_2, …, v_n.
Spans Are Subspaces
Theorem
If v_1, v_2, …, v_n are vectors of a vector space V, then Span(v_1, v_2, …, v_n) is a subspace of V.
Proof.
For simplicity, let S denote Span(v_1, v_2, …, v_n). Obviously, S is not empty.
If v ∈ S and v = α_1 v_1 + α_2 v_2 + … + α_n v_n, then cv = cα_1 v_1 + cα_2 v_2 + … + cα_n v_n ∈ S.
If v, w ∈ S with v = α_1 v_1 + α_2 v_2 + … + α_n v_n and w = β_1 v_1 + β_2 v_2 + … + β_n v_n, then v + w = (α_1 + β_1)v_1 + (α_2 + β_2)v_2 + … + (α_n + β_n)v_n ∈ S.
Examples
The nullspace of
A = [ 1 1 1 0 ]
    [ 2 1 0 1 ]
is the span of the vectors (1, −2, 1, 0)^T and (−1, 1, 0, 1)^T.
Examples
In R^3, Span(e_1) is {(α, 0, 0)^T | α ∈ R}. Span(e_1, e_2) is {(α, β, 0)^T | α, β ∈ R}.
Spanning Set
Definition (Spanning Set)
The set {v_1, v_2, …, v_n} is a spanning set for V if and only if every vector in V can be written as a linear combination of v_1, v_2, …, v_n, i.e. V = Span(v_1, v_2, …, v_n).
Example
{e_1, e_2, e_3, (1, 2, 3)^T} is a spanning set for R^3: (α, β, γ)^T = αe_1 + βe_2 + γe_3.
How to Verify a Spanning Set
Example
Is {(1, 1, 1)^T, (1, 1, 0)^T, (1, 0, 0)^T} a spanning set for R^3?
Solution
Verify that (a, b, c)^T = α(1, 1, 1)^T + β(1, 1, 0)^T + γ(1, 0, 0)^T is solvable for every a, b, c ∈ R. In other words, verify that the following system is consistent for every right-hand side:
[ 1 1 1 ] [ α ]   [ a ]
[ 1 1 0 ] [ β ] = [ b ]
[ 1 0 0 ] [ γ ]   [ c ]
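The spanning check above can be run numerically (a sketch using numpy; the matrix columns are the candidate vectors of this example):

```python
import numpy as np

# Columns are the candidate spanning vectors (1,1,1), (1,1,0), (1,0,0).
V = np.array([[1, 1, 1],
              [1, 1, 0],
              [1, 0, 0]], dtype=float)

# The set spans R^3 iff V c = b is solvable for every b,
# i.e. iff V is nonsingular.
b = np.array([4.0, -2.0, 7.0])          # an arbitrary target vector
c = np.linalg.solve(V, b)               # coefficients alpha, beta, gamma
assert np.allclose(V @ c, b)
assert abs(np.linalg.det(V)) > 1e-12    # nonsingular, so every b works
```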
Example
Is {(1, 0, 1)^T, (0, 1, 0)^T} a spanning set for R^3?
Solution
Verify whether (a, b, c)^T = α(1, 0, 1)^T + β(0, 1, 0)^T is solvable for every a, b, c ∈ R. Every vector in the span has equal first and third components, so take, for instance, (a, b, c)^T = (1, 0, 2)^T as a counterexample. The system
[ 1 0 ] [ α ]   [ 1 ]
[ 0 1 ] [ β ] = [ 0 ]
[ 1 0 ]         [ 2 ]
is inconsistent.
Example
Is {(1, 2, 4)^T, (2, 1, 3)^T, (4, −1, 1)^T} a spanning set for R^3?
Solution
Verify whether (a, b, c)^T = α(1, 2, 4)^T + β(2, 1, 3)^T + γ(4, −1, 1)^T is solvable for every a, b, c ∈ R:
[ 1 2  4 ] [ α ]   [ a ]
[ 2 1 -1 ] [ β ] = [ b ]
[ 4 3  1 ] [ γ ]   [ c ]
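For this triple the coefficient matrix turns out to be singular, so the system is not solvable for every right-hand side (a sketch using numpy; the three vectors are assumed to be (1,2,4), (2,1,3), (4,−1,1) as reconstructed above):

```python
import numpy as np

# Candidate vectors as the columns of V.
V = np.array([[1, 2, 4],
              [2, 1, -1],
              [4, 3, 1]], dtype=float)

# det(V) = 0 means V c = b is not solvable for every b,
# so the three vectors do not span R^3.
assert abs(np.linalg.det(V)) < 1e-12
assert np.linalg.matrix_rank(V) == 2   # the span is only a plane in R^3
```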
Example
Prove that x^2, x + 1, and x span P_2.
Solution
For any f(x) ∈ P_2, let f(x) = ax^2 + bx + c. The problem asks whether ax^2 + bx + c = α·x^2 + β·(x + 1) + γ·x is solvable for every a, b, c ∈ R. In other words, we solve
[ 1 0 0 ] [ α ]   [ a ]
[ 0 1 1 ] [ β ] = [ b ]
[ 0 1 0 ] [ γ ]   [ c ]
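Working in coordinates relative to the monomials turns the polynomial question into the matrix system above (a sketch using numpy; the sample polynomial 3x^2 + 5x − 2 is my own choice for illustration):

```python
import numpy as np

# Coordinates of x^2, x+1, x in the ordered basis (x^2, x, 1) form the columns.
M = np.array([[1, 0, 0],   # x^2 coefficients
              [0, 1, 1],   # x coefficients
              [0, 1, 0]],  # constant terms
             dtype=float)

# Solve for alpha, beta, gamma with f(x) = 3x^2 + 5x - 2, i.e. (a,b,c) = (3,5,-2).
abc = np.array([3.0, 5.0, -2.0])
coeffs = np.linalg.solve(M, abc)
assert np.allclose(M @ coeffs, abc)   # M is nonsingular, so every (a,b,c) works
```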
Section 3 Linear Independence
Linearly Independent
Definition (Linearly Independent)
The vectors v_1, v_2, …, v_n in a vector space V are said to be linearly independent if α_1 v_1 + α_2 v_2 + … + α_n v_n = 0 has exactly one solution, α_1 = α_2 = … = α_n = 0.
Example
The vectors (1, 2)^T and (2, 1)^T are linearly independent, since the equation α(1, 2)^T + β(2, 1)^T = (0, 0)^T has exactly one solution, α = β = 0.
Linearly Dependent
Definition (Linearly Dependent)
The vectors v_1, v_2, …, v_n in a vector space V are said to be linearly dependent if α_1 v_1 + α_2 v_2 + … + α_n v_n = 0 has a solution with α_1, α_2, …, α_n not all zero.
Example
e_1, e_2, e_3, (1, 2, 3)^T are linearly dependent, since (−1)e_1 + (−2)e_2 + (−3)e_3 + (1, 2, 3)^T = 0.
Linearly Independent or Linearly Dependent?
Given vectors v_1, v_2, …, v_n, how can we verify whether they are linearly independent or dependent? Solve the homogeneous linear system α_1 v_1 + α_2 v_2 + … + α_n v_n = 0. If there are nontrivial solutions, i.e. α_1, α_2, …, α_n not all zero, then v_1, v_2, …, v_n are linearly dependent; otherwise, the only solution is α_1 = α_2 = … = α_n = 0, and v_1, v_2, …, v_n are linearly independent.
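The test above amounts to checking whether the matrix whose columns are the vectors has full column rank (a sketch using numpy; the helper name `linearly_independent` is my own):

```python
import numpy as np

def linearly_independent(vectors, tol=1e-12):
    """The vectors are independent iff the homogeneous system has only the
    trivial solution, i.e. the matrix of columns has full column rank."""
    V = np.column_stack(vectors).astype(float)
    return np.linalg.matrix_rank(V, tol=tol) == V.shape[1]

assert linearly_independent([np.array([1, 2]), np.array([2, 1])])
assert not linearly_independent([np.array([1, 2]), np.array([2, 4])])
```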
Exercise
1. Prove that if one of v_1, v_2, …, v_n is 0, the zero vector, then v_1, v_2, …, v_n are linearly dependent.
2. Prove that if the vectors v_1, v_2, …, v_n are linearly dependent, there exists a vector v_i, 1 ≤ i ≤ n, such that v_i = α_1 v_1 + … + α_{i−1} v_{i−1} + α_{i+1} v_{i+1} + … + α_n v_n.
Example
Which of the following collections of vectors are linearly independent in R^3?
(a) {(1, 1, 1)^T, (1, 1, 0)^T, (1, 0, 0)^T};
(b) {(1, 0, 1)^T, (0, 1, 0)^T};
(c) {(1, 2, 4)^T, (2, 1, 3)^T, (4, −1, 1)^T}.
Solution ((c))
The vectors are linearly independent if and only if the equation c_1(1, 2, 4)^T + c_2(2, 1, 3)^T + c_3(4, −1, 1)^T = (0, 0, 0)^T has only the trivial solution c_1 = c_2 = c_3 = 0. In other words, solve the homogeneous linear system
[ 1 2  4 ] [ c_1 ]   [ 0 ]
[ 2 1 -1 ] [ c_2 ] = [ 0 ]
[ 4 3  1 ] [ c_3 ]   [ 0 ]
Here the system has nontrivial solutions, e.g. (c_1, c_2, c_3) = (2, −3, 1), since 2(1, 2, 4)^T − 3(2, 1, 3)^T + (4, −1, 1)^T = 0. So the vectors are linearly dependent.
Verify by Determinants
Theorem
Let x_1, x_2, …, x_n be n vectors in R^n and let X = (x_1, x_2, …, x_n).
The vectors x_1, x_2, …, x_n are linearly dependent if and only if X is singular.
The vectors x_1, x_2, …, x_n are linearly dependent if and only if det(X) = 0.
The vectors x_1, x_2, …, x_n are linearly independent if and only if det(X) ≠ 0.
Example
Determine whether the vectors (4, 2, 3)^T, (2, 3, 1)^T, (2, −5, 3)^T are linearly dependent.
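The determinant test for this example can be evaluated directly (a sketch using numpy; the three vectors are assumed to be (4,2,3), (2,3,1), (2,−5,3) as reconstructed above):

```python
import numpy as np

# Columns of X are the vectors (4,2,3), (2,3,1), (2,-5,3).
X = np.array([[4, 2,  2],
              [2, 3, -5],
              [3, 1,  3]], dtype=float)

# det(X) = 0, so by the theorem the vectors are linearly dependent.
assert abs(np.linalg.det(X)) < 1e-9
```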
Span vs. Linear Independence
Theorem
Let v_1, v_2, …, v_n be vectors in a vector space V. A vector v in Span(v_1, v_2, …, v_n) can be written uniquely as a linear combination of v_1, v_2, …, v_n if and only if v_1, v_2, …, v_n are linearly independent.
Proof.
First, we prove the "if" part: v ∈ Span(v_1, v_2, …, v_n) can be written uniquely as a linear combination of v_1, v_2, …, v_n if v_1, v_2, …, v_n are linearly independent. By contradiction, assume v has at least two different linear combination representations:
v = α_1 v_1 + α_2 v_2 + … + α_n v_n,
v = β_1 v_1 + β_2 v_2 + … + β_n v_n.
Subtracting the 2nd equation from the 1st, we have (α_1 − β_1)v_1 + (α_2 − β_2)v_2 + … + (α_n − β_n)v_n = 0. By linear independence, α_i = β_i for every i, so the two representations are identical, a contradiction.
Cont.
Next, we prove the "only if" part: v ∈ Span(v_1, v_2, …, v_n) can be written uniquely as a linear combination of v_1, v_2, …, v_n only if v_1, v_2, …, v_n are linearly independent. By contradiction, assume v_1, v_2, …, v_n are not linearly independent. Then there exist α_1, α_2, …, α_n, not all zero, such that α_1 v_1 + α_2 v_2 + … + α_n v_n = 0. If v = β_1 v_1 + β_2 v_2 + … + β_n v_n, then v = (α_1 + β_1)v_1 + (α_2 + β_2)v_2 + … + (α_n + β_n)v_n is a second, different representation of v.
Linear Independence of Functions
Let f_1(x), f_2(x), …, f_n(x) be functions in C^{(n−1)}[a, b]. If f_1(x), f_2(x), …, f_n(x) are linearly dependent, there exist c_1, c_2, …, c_n, not all zero, such that for every x ∈ [a, b]:
c_1 f_1(x) + c_2 f_2(x) + … + c_n f_n(x) = 0
c_1 f_1'(x) + c_2 f_2'(x) + … + c_n f_n'(x) = 0
…
c_1 f_1^{(n−1)}(x) + c_2 f_2^{(n−1)}(x) + … + c_n f_n^{(n−1)}(x) = 0
Wronskian
Let f_1(x), f_2(x), …, f_n(x) be functions in C^{(n−1)}[a, b]. Define the function W[f_1, f_2, …, f_n](x) on [a, b] by
W[f_1, f_2, …, f_n](x) = det [ f_1(x)         f_2(x)         …  f_n(x)         ]
                             [ f_1'(x)        f_2'(x)        …  f_n'(x)        ]
                             [ …                                               ]
                             [ f_1^{(n−1)}(x) f_2^{(n−1)}(x) …  f_n^{(n−1)}(x) ]
Theorem
If there exists a point x_0 in [a, b] such that W[f_1, f_2, …, f_n](x_0) ≠ 0, then f_1, f_2, …, f_n are linearly independent.
Examples
Show that e^x and e^{−x} are linearly independent in C(−∞, ∞).
Show that 1, x, x^2, x^3 are linearly independent in P_3.
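For the first example, the Wronskian of e^x and e^{−x} can be evaluated at a single point (a sketch using numpy; x_0 = 0 is my choice of evaluation point):

```python
import numpy as np

# Wronskian of f1 = e^x and f2 = e^{-x}:
#   W(x) = det [ e^x    e^{-x} ]
#              [ e^x   -e^{-x} ]
x0 = 0.0
W = np.array([[np.exp(x0),  np.exp(-x0)],
              [np.exp(x0), -np.exp(-x0)]])

# W(0) = -2 != 0, so e^x and e^{-x} are linearly independent.
assert abs(np.linalg.det(W)) > 1e-12
```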
Section 4 Basis and Dimension
Definition (Basis)
The vectors v_1, v_2, …, v_n form a basis for a vector space V if and only if
1. v_1, v_2, …, v_n are linearly independent;
2. v_1, v_2, …, v_n span V.
Examples
{e_1, e_2, e_3} is the standard basis of R^3, but it is not the only one: for instance, {(1, 1, 1)^T, (1, 1, 0)^T, (1, 0, 0)^T} is also a basis of R^3. Note that any basis of R^3 must have exactly three elements.
Example
In R^{2×2}, the set {E_11, E_12, E_21, E_22} is a basis, where
E_11 = [ 1 0 ]   E_12 = [ 0 1 ]   E_21 = [ 0 0 ]   E_22 = [ 0 0 ]
       [ 0 0 ]          [ 0 0 ]          [ 1 0 ]          [ 0 1 ]
Example
The nullspace of the matrix
A = [ 1 1 1 0 ]
    [ 2 1 0 1 ]
is N(A) = { s(1, −2, 1, 0)^T + t(−1, 1, 0, 1)^T | s, t ∈ R }. Obviously, {(1, −2, 1, 0)^T, (−1, 1, 0, 1)^T} is a basis of N(A).
The Number of Vectors in a Basis
Theorem
If {v_1, v_2, …, v_n} is a spanning set for a vector space V, then any collection of m vectors in V, where m > n, is linearly dependent.
Corollary
If {v_1, v_2, …, v_n} and {u_1, u_2, …, u_m} are both bases for a vector space V, then n = m.
Proof
Let u_1, u_2, …, u_m be m vectors in V, where m > n. Since {v_1, v_2, …, v_n} spans V, we may write u_i = a_{i1} v_1 + a_{i2} v_2 + … + a_{in} v_n for each 1 ≤ i ≤ m. Consider the equation c_1 u_1 + c_2 u_2 + … + c_m u_m = 0. We have
(Σ_{i=1}^m a_{i1} c_i) v_1 + (Σ_{i=1}^m a_{i2} c_i) v_2 + … + (Σ_{i=1}^m a_{in} c_i) v_n = 0.
Proof (Cont.)
Since m > n, the homogeneous system
[ a_11 a_21 … a_m1 ] [ c_1 ]   [ 0 ]
[ a_12 a_22 … a_m2 ] [ c_2 ] = [ 0 ]
[ …                ] [ …   ]   [ … ]
[ a_1n a_2n … a_mn ] [ c_m ]   [ 0 ]
of n equations in m unknowns has a nontrivial solution. So, c_1 u_1 + c_2 u_2 + … + c_m u_m = 0 has nontrivial solutions, and u_1, u_2, …, u_m are linearly dependent.
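The same phenomenon can be observed numerically: any m = 4 vectors in R^3 admit a nontrivial dependency, which the SVD exposes directly (a sketch using numpy; the random vectors are for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)

# Four vectors in R^3 (m = 4 > n = 3), stored as the columns of U.
U = rng.standard_normal((3, 4))

# rank(U) <= 3 < 4, so the columns cannot be independent.
assert np.linalg.matrix_rank(U) < U.shape[1]

# A nontrivial c with U c = 0: the last right singular vector spans
# the null space of the 3x4 system.
c = np.linalg.svd(U)[2][-1]
assert np.allclose(U @ c, 0, atol=1e-10)
assert np.linalg.norm(c) > 0.5          # c is a unit vector, not the trivial solution
```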
Dimension
Definition
Let V be a vector space. If V has a basis consisting of n vectors, we say that V has dimension n. The subspace {0} of V is said to have dimension 0. V is said to be finite-dimensional if there is a finite set of vectors that spans V; otherwise, we say that V is infinite-dimensional.
Examples
R^n, R^{m×n}, and P_n are finite-dimensional vector spaces. C^{(n−1)}[a, b] is an infinite-dimensional vector space.
Properties of the Dimension of Vector Spaces
Theorem
If V is a vector space of dimension n > 0, then
1. any set of n linearly independent vectors spans V;
2. any n vectors that span V are linearly independent.
Theorem
If V is a vector space of dimension n > 0, then
1. no set of fewer than n vectors can span V;
2. any linearly independent set of fewer than n vectors can be extended to form a basis for V;
3. any spanning set containing more than n vectors can be pared down to form a basis for V.
Section 5 Change of Basis
Coordinates
Assume B = {v_1, v_2, …, v_n} is an (ordered) basis of a vector space V. Any vector u in V can be expressed as a unique linear combination of the basis vectors. Assume u = c_1 v_1 + c_2 v_2 + … + c_n v_n, where the order of v_1, v_2, …, v_n is fixed. Then (c_1, c_2, …, c_n)^T_B is shorthand for the linear combination c_1 v_1 + c_2 v_2 + … + c_n v_n, and is called the coordinate vector of u with respect to the ordered basis B.
Example
Let B_1 = {(1, 0, 1)^T, (1, 1, 0)^T, (0, 0, 1)^T} and B_2 = {(1, 1, 1)^T, (0, 1, 0)^T, (0, 0, 1)^T}, and consider the vector (1, −1, 1)^T (w.r.t. the standard basis).
Since (1, −1, 1)^T = 2(1, 0, 1)^T + (−1)(1, 1, 0)^T + (−1)(0, 0, 1)^T, it has coordinate vector (2, −1, −1)^T_{B_1}.
Since (1, −1, 1)^T = 1(1, 1, 1)^T + (−2)(0, 1, 0)^T + 0(0, 0, 1)^T, it has coordinate vector (1, −2, 0)^T_{B_2}.
We can see that a non-zero vector may have different coordinate vectors w.r.t. different bases.
Problem
Now, we know that if B_1 = {(1, 0, 1)^T, (1, 1, 0)^T, (0, 0, 1)^T} and B_2 = {(1, 1, 1)^T, (0, 1, 0)^T, (0, 0, 1)^T}, then (1, −1, 1)^T = (2, −1, −1)^T_{B_1} = (1, −2, 0)^T_{B_2}. How can we convert coordinates between the two bases?
An Example of Changing Coordinates
Step 1: express the vectors of B_1 in terms of B_2:
(1, 0, 1) = 1(1, 1, 1) + (−1)(0, 1, 0) + 0(0, 0, 1)
(1, 1, 0) = 1(1, 1, 1) + 0(0, 1, 0) + (−1)(0, 0, 1)
(0, 0, 1) = 0(1, 1, 1) + 0(0, 1, 0) + 1(0, 0, 1)
Step 2:
(1, −1, 1) = 2(1, 0, 1) + (−1)(1, 1, 0) + (−1)(0, 0, 1)
= 2(1(1, 1, 1) + (−1)(0, 1, 0) + 0(0, 0, 1))
+ (−1)(1(1, 1, 1) + 0(0, 1, 0) + (−1)(0, 0, 1))
+ (−1)(0(1, 1, 1) + 0(0, 1, 0) + 1(0, 0, 1))
= 1(1, 1, 1) + (−2)(0, 1, 0) + 0(0, 0, 1)
A Simple Observation
[ 1  ]        [ 1   1  0 ]            [ 2  ]
[ -2 ]      = [ -1  0  0 ]            [ -1 ]
[ 0  ]_{B_2}  [ 0  -1  1 ]_{B_1→B_2}  [ -1 ]_{B_1}
From the Other Side
Step 1: express the vectors of B_2 in terms of B_1:
(1, 1, 1) = 0(1, 0, 1) + 1(1, 1, 0) + 1(0, 0, 1)
(0, 1, 0) = (−1)(1, 0, 1) + 1(1, 1, 0) + 1(0, 0, 1)
(0, 0, 1) = 0(1, 0, 1) + 0(1, 1, 0) + 1(0, 0, 1)
Step 2:
(1, −1, 1) = 1(1, 1, 1) + (−2)(0, 1, 0) + 0(0, 0, 1)
= 1(0(1, 0, 1) + 1(1, 1, 0) + 1(0, 0, 1))
+ (−2)((−1)(1, 0, 1) + 1(1, 1, 0) + 1(0, 0, 1))
+ 0(0(1, 0, 1) + 0(1, 1, 0) + 1(0, 0, 1))
= 2(1, 0, 1) + (−1)(1, 1, 0) + (−1)(0, 0, 1)
Verify the Observation
[ 2  ]        [ 0 -1 0 ]            [ 1  ]
[ -1 ]      = [ 1  1 0 ]            [ -2 ]
[ -1 ]_{B_1}  [ 1  1 1 ]_{B_2→B_1}  [ 0  ]_{B_2}
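The whole example can be checked in a few lines: putting the basis vectors into the columns of matrices B1 and B2, the transition matrix from B_1-coordinates to B_2-coordinates is B2^{-1} B1 (a sketch using numpy; the bases and coordinates are the reconstructed ones from this example):

```python
import numpy as np

# Basis vectors as columns: B1 = {(1,0,1),(1,1,0),(0,0,1)},
#                           B2 = {(1,1,1),(0,1,0),(0,0,1)}.
B1 = np.array([[1, 1, 0],
               [0, 1, 0],
               [1, 0, 1]], dtype=float)
B2 = np.array([[1, 0, 0],
               [1, 1, 0],
               [1, 0, 1]], dtype=float)

# Transition matrix from B1-coordinates to B2-coordinates: S = B2^{-1} B1.
S = np.linalg.solve(B2, B1)

c = np.array([2.0, -1.0, -1.0])   # coordinates w.r.t. B1
d = S @ c                          # coordinates w.r.t. B2
assert np.allclose(B1 @ c, B2 @ d)   # both describe the same vector (1,-1,1)
```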
Changing Coordinates
Problem
B_1 = {v_1, v_2, …, v_n} and B_2 = {u_1, u_2, …, u_n} are two bases of a vector space V. If the coordinate vector of w w.r.t. B_1 is c = (c_1, c_2, …, c_n)^T_{B_1}, and the coordinate vector of w w.r.t. B_2 is d = (d_1, d_2, …, d_n)^T_{B_2}, what is the relation between c and d?
Assume v_i = a_{1i} u_1 + a_{2i} u_2 + … + a_{ni} u_n for each 1 ≤ i ≤ n. Then
w = c_1 v_1 + c_2 v_2 + … + c_n v_n
= c_1 (Σ_{i=1}^n a_{i1} u_i) + c_2 (Σ_{i=1}^n a_{i2} u_i) + … + c_n (Σ_{i=1}^n a_{in} u_i)
= (Σ_{j=1}^n a_{1j} c_j) u_1 + (Σ_{j=1}^n a_{2j} c_j) u_2 + … + (Σ_{j=1}^n a_{nj} c_j) u_n.
Changing Coordinates (Cont.)
So, we have d_i = Σ_{j=1}^n a_{ij} c_j = a_{i1} c_1 + a_{i2} c_2 + … + a_{in} c_n. In matrix form,
[ d_1 ]         [ a_11 a_12 … a_1n ]            [ c_1 ]
[ d_2 ]       = [ a_21 a_22 … a_2n ]            [ c_2 ]
[ …   ]         [ …                ]            [ …   ]
[ d_n ]_{B_2}   [ a_n1 a_n2 … a_nn ]_{B_1→B_2}  [ c_n ]_{B_1}
Similarly, if u_i = b_{1i} v_1 + b_{2i} v_2 + … + b_{ni} v_n for each 1 ≤ i ≤ n, then
[ c_1 ]         [ b_11 b_12 … b_1n ]            [ d_1 ]
[ …   ]       = [ …                ]            [ …   ]
[ c_n ]_{B_1}   [ b_n1 b_n2 … b_nn ]_{B_2→B_1}  [ d_n ]_{B_2}
Changing Coordinates (Cont.)
Write the two transition matrices as A = (a_ij) and B = (b_ij). Then d = Ac and c = Bd, so d = A(Bd) = (AB)d for every d. We have AB = I. So, A = B^{−1} and B = A^{−1}.
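The inverse relationship between the two transition matrices can be confirmed numerically (a sketch using numpy; the bases are the ones assumed for the earlier worked example):

```python
import numpy as np

# Basis vectors as columns of B1 and B2 (assumed example bases).
B1 = np.array([[1, 1, 0],
               [0, 1, 0],
               [1, 0, 1]], dtype=float)
B2 = np.array([[1, 0, 0],
               [1, 1, 0],
               [1, 0, 1]], dtype=float)

A = np.linalg.solve(B2, B1)   # transition matrix B1 -> B2
B = np.linalg.solve(B1, B2)   # transition matrix B2 -> B1

# The two transition matrices are inverses of each other: AB = I.
assert np.allclose(A @ B, np.eye(3))
```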
Section 6 Row Space and Column Space
Definition (Row Spaces and Column Spaces)
If A is an m × n matrix, then
1. the subspace of R^{1×n} spanned by the row vectors of A is called the row space of A;
2. the subspace of R^m spanned by the column vectors a_1, a_2, …, a_n is called the column space of A.
Theorem (Invariance of Row Spaces under Elementary Row Operations)
If A and B are row-equivalent matrices, then the row space of A is the same as the row space of B. In other words, elementary row operations do not change the row space of a matrix.
Proof.
This can be easily verified by the reversibility of EROs.
Dimension of Column Spaces under EROs
Theorem
If A and B are row-equivalent matrices, then
1. a given set of column vectors of A is linearly independent if and only if the corresponding columns of B are linearly independent;
2. a given set of column vectors of A forms a basis of the column space of A if and only if the corresponding columns of B form a basis of the column space of B.
Proof.
Since Ax = 0 and Bx = 0 are equivalent, x_1 a_1 + x_2 a_2 + … + x_n a_n = 0 and x_1 b_1 + x_2 b_2 + … + x_n b_n = 0 have exactly the same solutions.
Rank
Theorem
For any matrix, the dimension of its row space is the same as the dimension of its column space.
Proof.
Reduce the matrix to row echelon form.
Definition (Rank)
If A is a matrix, the dimension of the row (or column) space of A is called the rank of A, denoted by rank(A).
Null Spaces and Nullity
Definition (Nullity)
Let A be an m × n matrix. All solutions of Ax = 0 form a subspace of R^n, called the null space of A and denoted by null(A). The dimension of null(A) is called the nullity of A and denoted by nullity(A).
Theorem
If A is an m × n matrix, we have rank(A) + nullity(A) = n.
Proof.
Consider A in reduced row echelon form: the rank is the number of lead variables, the nullity is the number of free variables, and each of the n variables is one or the other.
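The rank-nullity theorem can be verified on the earlier nullspace example, where rank(A) = 2 and nullity(A) = 2 with n = 4 (a sketch using numpy; the nullity is computed as the number of columns minus the number of nonzero singular values):

```python
import numpy as np

A = np.array([[1, 1, 1, 0],
              [2, 1, 0, 1]], dtype=float)

n = A.shape[1]
rank = np.linalg.matrix_rank(A)

# nullity = n - (number of nonzero singular values) = dim null(A).
s = np.linalg.svd(A, compute_uv=False)
nullity = n - np.count_nonzero(s > 1e-12)

assert rank + nullity == n   # rank-nullity: 2 + 2 = 4
```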
The Solutions of Ax = b
Theorem
If x_0 is a solution of Ax = b, then the solution set of Ax = b is {x_0 + z | Az = 0}.
Proof.
For any z such that Az = 0, x_0 + z is a solution of Ax = b since A(x_0 + z) = Ax_0 + Az = b. On the other hand, for any x' such that Ax' = b, let z = x' − x_0. We have Az = A(x' − x_0) = Ax' − Ax_0 = b − b = 0 and x' = x_0 + z.
The Consistency Theorem
Theorem
Ax = b is consistent if and only if rank(A) = rank([A | b]).
Proof.
If Ax = b is consistent, b is in the column space of A, so the dimension of the column space of [A | b] is the same as that of A. This implies the dimensions of the row spaces of A and [A | b] are the same; in other words, rank(A) = rank([A | b]). Conversely, if rank(A) = rank([A | b]), then appending b does not enlarge the column space of A, so b is a linear combination of the columns of A, and Ax = b is consistent.
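The rank test for consistency is easy to run in practice (a sketch using numpy; the matrix reuses the earlier non-spanning pair (1,0,1), (0,1,0) as columns, and the helper name `consistent` is my own):

```python
import numpy as np

A = np.array([[1, 0],
              [0, 1],
              [1, 0]], dtype=float)

b_good = np.array([1.0, 0.0, 1.0])   # in the column space of A
b_bad = np.array([1.0, 0.0, 2.0])    # not in the column space of A

def consistent(A, b):
    # Ax = b is consistent iff rank(A) == rank([A | b]).
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(np.column_stack([A, b]))

assert consistent(A, b_good)
assert not consistent(A, b_bad)
```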