Math 544, Exam 2 Information. 10/12/10, LC 115, 2:00-3:15.

Exam 2 will be based on: Sections 1.7, 1.9, 3.2, 3.3, 3.4; the corresponding assigned homework problems (see http://www.math.sc.edu/ boylan/sccourses/544fa10/544.html). At minimum, you need to understand how to do the homework problems. Lecture notes: 9/16-10/7.

Topic List (not necessarily comprehensive): You will need to know: theorems, results, and definitions from class.

1.7: Linear independence and non-singular matrices.

You will need to know/identify the following terms: linear combination of vectors; linear dependence of a set of vectors; linear independence of a set of vectors.

Let $S = \{\vec{v}_1, \ldots, \vec{v}_n\} \subseteq \mathbb{R}^m$. The set $S$ is linearly independent $\iff$ the vector equation
$$x_1 \vec{v}_1 + \cdots + x_n \vec{v}_n = \vec{0}$$
has only the trivial solution $\vec{x} = (x_1, \ldots, x_n)^T = \vec{0}$. Conversely, $S$ is linearly dependent $\iff$ the vector equation has a non-trivial solution $\vec{x} \neq \vec{0}$. The vector equation can be written in matrix form as
$$A\vec{x} = (\vec{v}_1 \; \cdots \; \vec{v}_n)\,\vec{x} = \vec{0},$$
where the vectors in $S$ form the columns of $A \in \mathrm{Mat}_{m \times n}(\mathbb{R})$.

Only sets of vectors can be linearly dependent/linearly independent. It does not make sense to speak of matrices or systems of equations being linearly dependent/independent.

Notes:
1. Any set of vectors $S$ with $\vec{0} \in S$ is linearly dependent.
2. Suppose that $\vec{v} \neq \vec{0}$. Then $\{\vec{v}\}$ is linearly independent.
3. Two vectors are linearly dependent if and only if one is a scalar multiple of the other.
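The definition above gives a computational test: the columns of $A$ are independent exactly when $A\vec{x} = \vec{0}$ forces $\vec{x} = \vec{0}$, i.e. when the rank of $A$ equals the number of columns. A minimal sketch using NumPy; the three vectors are my own illustrative example, not from the notes.

```python
import numpy as np

# Columns of A are the vectors v1 = (1,0), v2 = (0,1), v3 = (1,1) in R^2.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# S is linearly independent exactly when A x = 0 has only the trivial
# solution, i.e. when rank(A) equals the number of columns of A.
independent = np.linalg.matrix_rank(A) == A.shape[1]
print(independent)  # False: three vectors in R^2 must be dependent
```

Here rank$(A) = 2$ but there are $3$ columns, so the set is dependent (this also illustrates the $m < n$ theorem in Section 1.7).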
Singular matrices; non-singular matrices. Let $A \in \mathrm{Mat}_{n \times n}(\mathbb{R})$. The matrix $A$ is non-singular $\iff$ the equation $A\vec{x} = \vec{0}$ has only the trivial solution, $\vec{x} = \vec{0}$. Conversely, $A$ is singular $\iff$ this equation has a non-trivial solution $\vec{x} \neq \vec{0}$.

Only square matrices can be singular/non-singular. It does not make sense to speak of a system of equations or a set of vectors as being singular/non-singular. Note: Similarly, only systems of linear equations can be consistent/inconsistent. It does not make sense to speak of matrices or sets of vectors as being consistent/inconsistent.

Theorem. Suppose that $S = \{\vec{v}_1, \ldots, \vec{v}_n\} \subseteq \mathbb{R}^m$ with $m < n$. Then $S$ is a linearly dependent set.

Facts: Let $A$ and $B \in \mathrm{Mat}_{n \times n}(\mathbb{R})$.
1. $AB$ is non-singular if and only if $A$ and $B$ are both non-singular.
2. $AB$ is singular if and only if one (or both) of $A$ and $B$ is singular.

1.9: Matrix inverses and their properties.

Matrix inverses. A matrix $A \in \mathrm{Mat}_{n \times n}(\mathbb{R})$ is invertible if and only if there exists $B \in \mathrm{Mat}_{n \times n}(\mathbb{R})$ with $BA = I_n = AB$. We say that $B$ is the inverse of $A$, and we write $B = A^{-1}$.
1. Only square matrices are allowed to be invertible.
2. Suppose that $A \in \mathrm{Mat}_{n \times n}(\mathbb{R})$ is invertible. Then the inverse, $A^{-1}$, is unique.

Theorem. Let $A \in \mathrm{Mat}_{n \times n}(\mathbb{R})$. Then the following conditions on $A$ are equivalent.
1. $A$ is non-singular.
2. $A\vec{x} = \vec{0}$ has only the trivial solution, $\vec{x} = \vec{0}$.
3. The columns of $A$ are a linearly independent set of vectors.
4. For all $\vec{b} \in \mathbb{R}^n$, $A\vec{x} = \vec{b}$ has a unique solution.
5. $A$ is invertible.
6. $A$ is row-equivalent to $I_n$.

Facts: Suppose that $A \in \mathrm{Mat}_{n \times n}(\mathbb{R})$ is invertible.
1. To compute $A^{-1}$, form the augmented matrix $(A \mid I_n) \in \mathrm{Mat}_{n \times 2n}(\mathbb{R})$. Apply the Gauss-Jordan algorithm to convert it to its reduced echelon form, $(I_n \mid A^{-1})$.
2. For all $\vec{b} \in \mathbb{R}^n$, the system $A\vec{x} = \vec{b}$ is consistent (i.e., it has a solution); the unique solution is $\vec{x} = A^{-1}\vec{b}$.
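The $(A \mid I_n) \to (I_n \mid A^{-1})$ procedure can be sketched in SymPy, whose `rref` method performs exactly the Gauss-Jordan reduction described above. The $2 \times 2$ matrix is an illustrative example of my own, not one from the notes.

```python
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [1, 1]])          # non-singular: det(A) = 2*1 - 1*1 = 1

aug = A.row_join(eye(2))      # form the augmented matrix (A | I_2)
rref_form, _ = aug.rref()     # Gauss-Jordan to reduced echelon form
A_inv = rref_form[:, 2:]      # right half is A^{-1} when A is invertible

print(A_inv)                  # the inverse, [[1, -1], [-1, 2]]
```

If $A$ were singular, the left half of the reduced form would fail to be $I_n$, signaling that no inverse exists.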
Theorem. Suppose that $A, B \in \mathrm{Mat}_{n \times n}(\mathbb{R})$ are invertible. Then we have
1. $(A^{-1})^{-1} = A$.
2. $(AB)^{-1} = B^{-1}A^{-1}$. (Note: $(AB)^{-1} = A^{-1}B^{-1} \iff AB = BA$.)
3. Let $k \neq 0$ in $\mathbb{R}$. Then we have $(kA)^{-1} = \frac{1}{k}A^{-1}$.
4. $(A^T)^{-1} = (A^{-1})^T$.

Proposition. Let $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \in \mathrm{Mat}_{2 \times 2}(\mathbb{R})$. Then we have
- $ad - bc = 0 \implies A$ is not invertible.
- $ad - bc \neq 0 \implies A$ is invertible.

3.2: Vector space properties of $\mathbb{R}^n$.

Vector space. A vector space is a collection of objects called vectors and a collection of constants called scalars, together with the operations of vector addition ($+$) and scalar multiplication ($\cdot$), which satisfy the following axioms:
- Two closure axioms: closure under $+$ and $\cdot$.
- Four addition axioms: associativity, identity, inverses, commutativity.
- Four scalar multiplication axioms: associativity, distributivity (two axioms), identity.

Subspace. Suppose that $V$ is a vector space. A subset $W \subseteq V$ is a subspace of $V$ if and only if $W$ is itself a vector space (with the same scalars, addition, and scalar multiplication as $V$).
1. Let $V$ be a vector space. Then $\{\vec{0}\}$ and $V$ are subspaces of $V$. These are the trivial subspaces of $V$.
2. The non-trivial subspaces of $\mathbb{R}^2$ are lines through the origin; the non-trivial subspaces of $\mathbb{R}^3$ are lines through the origin and planes through the origin.

Theorem. Let $V$ be a vector space, and let $W \subseteq V$. Then $W$ is a subspace of $V$ if and only if $W$ satisfies the following three axioms:
1. $\vec{0} \in W$.
2. For all $\vec{x}, \vec{y} \in W$, we have $\vec{x} + \vec{y} \in W$.
3. For all $a \in \mathbb{R}$ and for all $\vec{x} \in W$, we have $a\vec{x} \in W$.

Note: One can combine axioms 2 and 3: for all $a \in \mathbb{R}$ and for all $\vec{x}, \vec{y} \in W$, we have $a\vec{x} + \vec{y} \in W$.

Question: How does one determine whether a subset $W$ of a vector space $V$ is a subspace?
- To show that $W$ is a subspace, you need to verify the three subspace axioms.
- To show that $W$ is not a subspace, it suffices to provide a simple numerical example in which one of the axioms is violated.
Note: If a subset $W$ of a vector space $V$ is defined by a system of linear homogeneous equations satisfied by the coordinates of the vectors in $W$, then it is a subspace.

Facts: Suppose that $U$ and $V$ are subspaces of $\mathbb{R}^n$. Then we have:
1. $U + V = \{\vec{u} + \vec{v} : \vec{u} \in U, \vec{v} \in V\}$ is always a subspace of $\mathbb{R}^n$.
2. $U \cap V$ is always a subspace of $\mathbb{R}^n$.
3. $U \cup V$ is not generally a subspace of $\mathbb{R}^n$.

3.3: Subspaces.

Span. The span of a set $S = \{\vec{v}_1, \ldots, \vec{v}_r\} \subseteq \mathbb{R}^n$ is the set
$$\mathrm{Span}(S) = \{\text{all linear combinations of vectors in } S\} = \{\vec{y} = a_1\vec{v}_1 + \cdots + a_r\vec{v}_r : a_1, \ldots, a_r \in \mathbb{R}\} \subseteq \mathbb{R}^n.$$

Examples.
- Let $\vec{v} \neq \vec{0}$. Then $\mathrm{Span}\{\vec{v}\}$ is a line through the origin in the direction of $\vec{v}$.
- Let $\vec{u}, \vec{v}$ be non-zero vectors in $\mathbb{R}^n$. Then $\vec{u} \in \mathrm{Span}\{\vec{v}\}$ if and only if $\vec{u}$ and $\vec{v}$ determine the same line.

Theorem. Suppose that $S = \{\vec{v}_1, \ldots, \vec{v}_r\} \subseteq \mathbb{R}^n$. Then $\mathrm{Span}(S)$ is a subspace of $\mathbb{R}^n$.

Let $A \in \mathrm{Mat}_{m \times n}(\mathbb{R})$. Important subspaces associated to $A$ are:
- The null space of $A$ is $\mathrm{Null}(A) = \{\vec{x} \in \mathbb{R}^n : A\vec{x} = \vec{0}\}$. It is a subspace of $\mathbb{R}^n$.
- The range of $A$ is $\mathrm{Range}(A) = \{\vec{y} \in \mathbb{R}^m : \text{there exists } \vec{x} \in \mathbb{R}^n \text{ such that } A\vec{x} = \vec{y}\}$. It is a subspace of $\mathbb{R}^m$.
- The column space of $A$ is $\mathrm{Col}(A) = \mathrm{Span}\{\text{columns of } A\}$. It is a subspace of $\mathbb{R}^m$.
- The row space of $A$ is $\mathrm{Row}(A) = \mathrm{Span}\{\text{rows of } A\}$. It is a subspace of $\mathbb{R}^n$.

Theorem. Suppose that $A \in \mathrm{Mat}_{m \times n}(\mathbb{R})$. Then we have $\mathrm{Range}(A) = \mathrm{Col}(A) = \mathrm{Row}(A^T) \subseteq \mathbb{R}^m$.

Theorem. Suppose that $A, B \in \mathrm{Mat}_{m \times n}(\mathbb{R})$ are row-equivalent. Then we have $\mathrm{Row}(A) = \mathrm{Row}(B)$.

Problem: Given a matrix $A \in \mathrm{Mat}_{m \times n}(\mathbb{R})$, give a basis (algebraic description) of the important subspaces associated to $A$.

3.4: Bases.

A set of vectors $S$ spans a subspace $W$ of a vector space $V$ if and only if $\mathrm{Span}(S) = W$.

Theorem. Suppose that $\mathrm{Span}(S) = W$.
1. Suppose that $S$ is linearly dependent. Then there exists $T \subseteq S$ with $|T| < |S|$ for which $\mathrm{Span}(T) = W$.
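The four subspaces associated to a matrix can be inspected with SymPy, which has built-in methods for each. A minimal sketch; the rank-1 example matrix is mine, not from the notes.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])       # rank 1: row 2 = 2 * row 1

null_basis = A.nullspace()    # basis vectors for Null(A), a subspace of R^3
col_basis = A.columnspace()   # basis vectors for Col(A) = Range(A), in R^2
row_basis = A.rowspace()      # basis vectors for Row(A), in R^3

print(len(null_basis), len(col_basis), len(row_basis))  # 2 1 1
```

Note that $\mathrm{Null}(A)$ and $\mathrm{Row}(A)$ live in $\mathbb{R}^3$ (dimension = number of columns) while $\mathrm{Col}(A) = \mathrm{Range}(A)$ lives in $\mathbb{R}^2$ (dimension = number of rows), matching the theorem above.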
2. Suppose that $S$ is linearly independent. Then no set $T$ with $|T| < |S|$ has $\mathrm{Span}(T) = W$.

Basis. Let $W \neq \{\vec{0}\}$ be a subspace of $\mathbb{R}^n$. Then a subset $S \subseteq W$ is a basis for $W$ if and only if
1. $S$ is linearly independent.
2. $\mathrm{Span}(S) = W$.

Note: Bases are not unique.

Given a matrix $A \in \mathrm{Mat}_{m \times n}(\mathbb{R})$, compute bases for $\mathrm{Null}(A)$, $\mathrm{Range}(A)$, $\mathrm{Col}(A)$, and $\mathrm{Row}(A)$. To do this, you first compute the reduced echelon form of $A$. Call it $B$.
- $\mathrm{Row}(A)$: The non-zero rows of $B$ form a basis for $\mathrm{Row}(A)$.
- $\mathrm{Col}(A)$: The columns of $B$ with the leading 1's identify the columns of $A$ which form a basis.
- $\mathrm{Range}(A)$: Since the range and column space of $A$ agree, you compute $\mathrm{Range}(A)$ just as you would $\mathrm{Col}(A)$.
- $\mathrm{Null}(A)$: $\mathrm{Null}(A)$ is the set of solutions to the homogeneous system $A\vec{x} = \vec{0}$. Therefore, begin by solving $A\vec{x} = \vec{0}$. Convert your solution to vector form; i.e., write your solution as $\mathrm{Null}(A) = \mathrm{Span}(S)$. Verify that the vectors in $S$ are linearly independent (which is usually easy to do and requires little or no justification).

Theorem. Suppose that $B$ is a basis for a subspace $W$ of $\mathbb{R}^n$. Then every vector $\vec{x} \in W$ is expressible as a linear combination of vectors from $B$ in a unique way.
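The basis recipes above can be sketched with SymPy's `rref`, which returns both the reduced echelon form $B$ and the pivot-column indices. The $3 \times 4$ example matrix is my own illustration, not one from the notes.

```python
from sympy import Matrix

A = Matrix([[1, 2, 0, 1],
            [0, 0, 1, 1],
            [1, 2, 1, 2]])    # row 3 = row 1 + row 2, so rank(A) = 2

B, pivots = A.rref()          # B is the reduced echelon form of A

# Col(A): take the columns of A (not of B!) indexed by the pivot positions.
col_basis = [A[:, j] for j in pivots]

# Row(A): take the non-zero rows of B; there are rank(A) = len(pivots) of them.
row_basis = [B[i, :] for i in range(len(pivots))]

print(pivots)          # (0, 2): leading 1's sit in columns 0 and 2
print(len(col_basis))  # 2
```

A common pitfall the recipe guards against: the pivot columns of $B$ itself span a different subspace in general, so the basis for $\mathrm{Col}(A)$ must be drawn from the original columns of $A$.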