Theory Review: Solutions. Math 08 F, Spring 05.

1. What two properties must a function T : R^m → R^n satisfy to be a linear transformation?

(a) For all vectors u and v in R^m, T(u + v) = T(u) + T(v).
(b) For all vectors u in R^m and all scalars r, T(ru) = rT(u).

2. If T : R^m → R^n is a linear transformation, what is the definition of onto? Of one-to-one? Explain how these definitions relate to the matrix A such that T(x) = Ax.

T is onto if, for every vector w in the codomain R^n, we can find a vector v in R^m such that T(v) = w. This relates to matrices because, if T(x) = Ax, this says the equation Ax = w has a solution for every vector w, which is equivalent to saying the columns of A span the codomain R^n.

T is one-to-one if, for every vector w in the codomain R^n, there is at most one vector v in R^m such that T(v) = w. Equivalently, T is one-to-one if T(u) = T(v) implies u = v. Note: these are the DEFINITIONS of one-to-one. We can prove these definitions are equivalent to the following: T is one-to-one if and only if the only vector x such that T(x) = 0 is x = 0. From this characterization we can relate this to matrices: it says Ax = 0 has only the trivial solution, which is equivalent to saying the columns of A are linearly independent.

3. Three functions are given below. Determine if they are linear transformations (by checking that they satisfy the two properties above) and, if a function is a linear transformation, determine if it is onto or one-to-one (or both).

(a) T : R^2 → R^2 given by T(x1, x2) = (x1 + x2, x2 + 1).

This is not a linear transformation. For example, let u = (0, 0) and v = (0, 0). Then T(u + v) = (0, 1), but T(u) + T(v) = (0, 1) + (0, 1) = (0, 2), so T(u + v) ≠ T(u) + T(v).
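The two defining properties can also be spot-checked numerically. The sketch below (Python with NumPy) tests additivity and homogeneity on an illustrative affine map of the kind in part (a); the specific formula T(x1, x2) = (x1 + x2, x2 + 1) here is an assumption chosen to show how any "+ constant" term breaks both properties.

```python
import numpy as np

# Illustrative affine (not linear) map, as in part (a):
# T(x1, x2) = (x1 + x2, x2 + 1).  The "+1" makes T(0) != 0.
def T(x):
    return np.array([x[0] + x[1], x[1] + 1.0])

rng = np.random.default_rng(0)
u = rng.normal(size=2)
v = rng.normal(size=2)
r = 3.0

# Both checks fail for this map, no matter which u, v, r we pick:
additive = np.allclose(T(u + v), T(u) + T(v))   # the "+1" is counted twice on the right
homogeneous = np.allclose(T(r * u), r * T(u))   # the "+1" is not scaled on the left
```

A single failing pair (u, v) is already a proof that T is not linear; passing random checks, by contrast, only suggests linearity and does not prove it.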
(b) T : R^3 → R^2 given by T(x1, x2, x3) = (x1 + x2, 4x2 + x3).

This is a linear transformation. If u = (u1, u2, u3) and v = (v1, v2, v3), then

T(u + v) = ((u1 + v1) + (u2 + v2), 4(u2 + v2) + (u3 + v3)) = (u1 + u2, 4u2 + u3) + (v1 + v2, 4v2 + v3) = T(u) + T(v).

Also, for any scalar r we have

T(ru) = (ru1 + ru2, 4ru2 + ru3) = r(u1 + u2, 4u2 + u3) = rT(u).

Also, this linear transformation corresponds to the matrix A = [1 1 0; 0 4 1] (meaning T(x) = Ax). Because A reduces to [1 0 -1/4; 0 1 1/4], the columns of A span R^2 but are not linearly independent, so T is onto but not one-to-one. (Note: we could also see that T is not one-to-one by exhibiting two different vectors u and v such that u ≠ v but T(u) = T(v); for example u = (0, 0, 0) and v = (1, -1, 4).)

(c) Let y be a fixed vector in R^3; then let T : R^3 → R^3 be given by T(x) = (y · x)y.
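The span/independence criteria from question 2 translate directly into a rank computation: T is onto exactly when rank(A) equals the number of rows, and one-to-one exactly when rank(A) equals the number of columns. A short sketch (the 2×3 matrix below is an illustrative stand-in for the matrix of part (b)):

```python
import numpy as np

# Illustrative matrix of a map T : R^3 -> R^2 (a stand-in for part (b)'s matrix).
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 4.0, 1.0]])
n, m = A.shape                       # T : R^m -> R^n

rank = np.linalg.matrix_rank(A)
onto = (rank == n)                   # columns span R^n
one_to_one = (rank == m)             # columns linearly independent
```

Since a 2×3 matrix has rank at most 2, no map R^3 → R^2 can ever be one-to-one, which matches the conclusion above.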
This is a linear transformation. For any vectors u, v, because y · (u + v) = y · u + y · v and y · (ru) = r(y · u) (which we remember from learning about the dot product in calculus!), we know

T(u + v) = (y · (u + v))y = (y · u + y · v)y = (y · u)y + (y · v)y = T(u) + T(v).

Also, T(ru) = (y · (ru))y = r(y · u)y = rT(u).

However, this linear transformation sends every vector u to a multiple of y, so it is not onto (vectors that are not multiples of y are not in the range). It is also not one-to-one: any nonzero vector x orthogonal to y satisfies T(x) = (y · x)y = 0y = 0.

4. Give an example of the following, or explain why such an example does not exist.

(a) A linear transformation T : R^2 → R^2 such that T(1, 0) = (1, 1).

This exists; for example, T(x1, x2) = (x1, x1 + x2).

(b) A linear transformation T : R^2 → R^2 such that T(1, 1) = (1, 2).

This exists; for example, T(x1, x2) = (x2, x1 + x2).

(c) A linear transformation T : R^2 → R such that T(1, 0) = 0 and T(0, 1) = 0 and T(1, 1) = 4.

This does NOT exist because, if T is a linear transformation, we must have T(1, 1) = T(1, 0) + T(0, 1) = 0 + 0 = 0, which is not true in this case.
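The map T(x) = (y · x)y from 3(c) is easy to experiment with numerically. The sketch below verifies both linearity properties on sample vectors and confirms that a vector orthogonal to y lands in the kernel; the particular y and test vectors are illustrative choices, not part of the original problem.

```python
import numpy as np

y = np.array([1.0, 2.0, 2.0])        # any fixed vector y works

def T(x):
    return np.dot(y, x) * y          # T(x) = (y . x) y

u = np.array([3.0, -1.0, 0.5])
v = np.array([0.0, 2.0, -4.0])

additive = np.allclose(T(u + v), T(u) + T(v))
homogeneous = np.allclose(T(2.5 * u), 2.5 * T(u))

# A nonzero vector orthogonal to y: y . x_perp = 1*2 + 2*(-1) + 2*0 = 0.
x_perp = np.array([2.0, -1.0, 0.0])
kills_orthogonal = np.allclose(T(x_perp), 0.0)
```

Because T(x_perp) = 0 for a nonzero x_perp, the numerical check mirrors the argument that T is not one-to-one.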
(d) Non-square matrices A and B such that AB = I.

These exist; for example, A = [1 0 0; 0 1 0] and B = [1 0; 0 1; 0 0] satisfy AB = I (the 2 × 2 identity matrix).

(e) Non-square matrices A and B such that AB = I and BA = I.

This does NOT exist. To prove it does not exist, we can use question 5(h): if A and B are not square matrices, but their products AB and BA are square, we must have A an n × m matrix and B an m × n matrix for some m and n. Because they are not square, either m < n or m > n. If m < n, B is a matrix with more columns than rows, so the columns of B must be linearly dependent. But, using 5(h), this implies that the columns of AB are linearly dependent, so it is impossible for AB = I (because the columns of I are linearly independent). Similarly, if m > n, then A is a matrix with more columns than rows, so the columns of A are linearly dependent; again by 5(h), the columns of BA must be linearly dependent, so it is impossible for BA = I. Therefore, these matrices do not exist.

(f) An n × n matrix A with no zero entries such that A is nonsingular.

This exists; for example, A = [1 2; 3 4] has no zero entries and is nonsingular (it reduces to the identity).

(g) An n × n matrix A with no zero entries such that A is singular.

This exists; for example, A = [1 1; 1 1] has no zero entries but is singular (its columns are linearly dependent).

(h) A singular matrix A whose columns are linearly independent.

This does NOT exist, because if A is singular, then A is an n × n matrix with no inverse. By the Big Theorem, this implies the columns of A are linearly dependent.

(i) A square matrix A such that A = A^T.

This exists; for example, A = [0 1; 1 0] satisfies A = A^T.
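Parts (d) and (e) contrast nicely in code: a non-square pair can satisfy AB = I on the smaller side, but never both AB = I and BA = I. The sketch below uses one standard choice for (d) (an assumption; any similar pair works): A drops the third coordinate and B = A^T re-embeds R^2 into R^3.

```python
import numpy as np

# One standard example for 4(d): A : R^3 -> R^2 forgets the third coordinate,
# B = A^T : R^2 -> R^3 re-embeds the plane into R^3.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])      # 2 x 3
B = A.T                              # 3 x 2

AB_is_identity = np.allclose(A @ B, np.eye(2))   # AB = I_2
BA_is_identity = np.allclose(B @ A, np.eye(3))   # BA zeroes the third coordinate
```

Here BA is the projection diag(1, 1, 0), not I, exactly as the rank argument in (e) predicts.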
5. For each statement below, determine if it is True or False. Justify your answer. The justification is the most important part!

(a) If T : R^m → R^n is a linear transformation, then T(0) = 0.

TRUE. If we plug r = 0 into the second condition needed to be a linear transformation, we find that T(0) = T(0u) = 0T(u) = 0.

(b) If T : R^m → R^n is a linear transformation, then the only vector x such that T(x) = 0 is x = 0.

FALSE. This is true precisely when T is one-to-one; if T is not one-to-one, it is false. For example, we could take the zero transformation T(x) = 0; then every vector x satisfies T(x) = 0.

(c) If {v1, v2, v3} is a linearly dependent set of vectors in R^m and T : R^m → R^n is a linear transformation, then {T(v1), T(v2), T(v3)} is a linearly dependent set.

TRUE. If these vectors are linearly dependent, we can write one as a linear combination of the others, say v1 = c2 v2 + c3 v3. But this implies T(v1) = T(c2 v2 + c3 v3). Using the properties of linear transformations, this implies T(v1) = c2 T(v2) + c3 T(v3), i.e., T(v1) is a linear combination of T(v2) and T(v3). But this implies that {T(v1), T(v2), T(v3)} is linearly dependent.

(d) If {v1, v2, v3} is a linearly independent set of vectors in R^m and T : R^m → R^n is a linear transformation, then {T(v1), T(v2), T(v3)} is a linearly independent set.

FALSE. If T is the zero transformation T(x) = 0, then no matter what {v1, v2, v3} are, {T(v1), T(v2), T(v3)} = {0, 0, 0}, which is a linearly dependent set.

(e) For any matrix A, there is a matrix B such that AB = A.

TRUE. Let B = I.

(f) For any matrix A, there is a matrix B such that AB = I.

FALSE. For example, if A is the zero matrix, then AB is the zero matrix for every B, so no such B exists. (For a square matrix A, such a B exists exactly when A is invertible.)
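Statement 5(c), dependent vectors stay dependent after applying T, can be illustrated with a rank computation: stack the vectors as columns, apply A, and check the rank before and after. The particular A and vectors below are illustrative assumptions; the conclusion holds for any choice.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 3))          # matrix of an arbitrary linear map T : R^3 -> R^4

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, -1.0])
v3 = 2.0 * v1 + 3.0 * v2             # dependent by construction: v3 = 2 v1 + 3 v2

V = np.column_stack([v1, v2, v3])    # columns v1, v2, v3
TV = A @ V                           # columns T(v1), T(v2), T(v3)

dependent_before = np.linalg.matrix_rank(V) < 3
dependent_after = np.linalg.matrix_rank(TV) < 3
```

This mirrors the proof: the relation v3 = 2v1 + 3v2 is carried to T(v3) = 2T(v1) + 3T(v2) by linearity, so the image columns can never have full rank.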
(g) If AB = AC and A is not the zero matrix, then B = C.

FALSE. For example, let A = [1 0; 0 0], B = [1 0; 0 1], C = [1 0; 0 2]. Multiplying these out, we can see that AB = AC = [1 0; 0 0] and A is not the zero matrix, but B ≠ C. Note, however, that if we add the additional hypothesis that A is nonsingular, this is a true statement (we can multiply both sides by A^(-1) to get B = C).

(h) If B is a matrix whose columns are linearly dependent, then the columns of AB are linearly dependent (where A is any matrix such that the product AB is defined).

TRUE. Write B = [b1 ... bn], where b1, ..., bn are the columns of B. Then, by definition, AB = [Ab1 ... Abn]. If the columns of B are linearly dependent, then we can write one of them as a linear combination of the others, say bn = c1 b1 + ... + c_(n-1) b_(n-1). But, multiplying this equation by A, we see that

A bn = A(c1 b1 + ... + c_(n-1) b_(n-1)) = c1 Ab1 + ... + c_(n-1) Ab_(n-1),

which shows that the vector Abn is a linear combination of {Ab1, ..., Ab_(n-1)}, i.e., the set {Ab1, ..., Abn} of columns of AB is linearly dependent.

(i) If B is a matrix whose columns are linearly independent, then the columns of AB are linearly independent (where A is any matrix such that the product AB is defined).

FALSE. If B is any matrix and A is the zero matrix, then AB is the zero matrix, whose columns are linearly dependent. (However, if we require A to be a nonsingular matrix, then this statement is true! Can you prove this?)

(j) If A is not the zero matrix, then A has an inverse.

FALSE. Let A = [1 0; 0 0]. Then A is not the zero matrix but has no inverse (its columns are linearly dependent).

(k) There is a nonsingular matrix A such that A^2 = O (the zero matrix).

FALSE. If A were nonsingular, then we could multiply both sides of A^2 = O by A^(-1) to get A = O. But A cannot be the zero matrix because the zero matrix is singular! Therefore, such a matrix does not exist.
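The failure of cancellation in 5(g) can be checked directly. The matrices below are an illustrative choice (any singular, nonzero A admits such a B and C): this A keeps only the first row of whatever it multiplies, so AB and AC agree even though B and C differ in the second row.

```python
import numpy as np

# A keeps the first row of X in A @ X and zeroes the second row.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
B = np.array([[1.0, 0.0],
              [0.0, 1.0]])
C = np.array([[1.0, 0.0],
              [0.0, 2.0]])

products_equal = np.allclose(A @ B, A @ C)   # AB = AC
matrices_equal = np.allclose(B, C)           # yet B != C
```

Cancelling A from AB = AC amounts to multiplying by A^(-1), which is exactly what a singular A lacks.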
(l) There is a nonsingular matrix A such that A^2 = A.

TRUE. Multiplying both sides by A^(-1), we find that A = I, and the identity matrix satisfies I^2 = I (and is nonsingular).

(m) If A and B are nonsingular (n × n) matrices, then A + B is also nonsingular.

FALSE. If A = I and B = -I, then A and B are both nonsingular, but A + B is the zero matrix, which is singular.

(n) If A and B are nonsingular (n × n) matrices, then AB is also nonsingular.

TRUE. Being nonsingular just means the matrix has an inverse. If A and B are nonsingular, then A^(-1) and B^(-1) exist, so AB has an inverse: (AB)^(-1) = B^(-1) A^(-1). Therefore AB is nonsingular.
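The formula (AB)^(-1) = B^(-1) A^(-1) from 5(n), including the reversal of the factors, can be confirmed numerically. The two matrices below are illustrative nonsingular choices:

```python
import numpy as np

# Two illustrative nonsingular matrices (both have nonzero determinant).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.array([[1.0, 3.0],
              [0.0, 1.0]])

inv_AB = np.linalg.inv(A @ B)

# The inverse of a product is the product of inverses IN REVERSE ORDER.
formula_holds = np.allclose(inv_AB, np.linalg.inv(B) @ np.linalg.inv(A))
order_matters = not np.allclose(inv_AB, np.linalg.inv(A) @ np.linalg.inv(B))
```

The reversed order is forced by the definition: (AB)(B^(-1) A^(-1)) = A(BB^(-1))A^(-1) = AA^(-1) = I, whereas A^(-1) B^(-1) generally fails, as the second check shows for these matrices.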