Chapter 3: Theory Review: Solutions Math 308 F Spring 2015

1. What two properties must a function T : R^m → R^n satisfy to be a linear transformation?

(a) For all vectors u and v in R^m, T(u + v) = T(u) + T(v).
(b) For all vectors u in R^m and all scalars r, T(ru) = rT(u).

2. If T : R^m → R^n is a linear transformation, what is the definition of onto? Of one-to-one? Explain how these definitions relate to the matrix A such that T(x) = Ax.

T is onto if, for every vector w in the codomain R^n, we can find a vector v in R^m such that T(v) = w. This relates to matrices because, if T(x) = Ax, it says the equation Ax = w has a solution for every vector w, which is equivalent to saying the columns of A span the codomain R^n.

T is one-to-one if, for every vector w in the codomain R^n, there is at most one vector v in R^m such that T(v) = w. Equivalently, T is one-to-one if T(u) = T(v) implies u = v. Note: these are the DEFINITIONS of one-to-one. We can prove they are equivalent to the following: T is one-to-one if and only if the only vector x such that T(x) = 0 is x = 0. From this characterization we can relate one-to-one to matrices: it says Ax = 0 has only the trivial solution, which is equivalent to saying the columns of A are linearly independent.

3. Three functions are given below. Determine whether each is a linear transformation (by checking that it satisfies the two properties above) and, if it is a linear transformation, determine whether it is onto or one-to-one (or both).

(a) T : R^2 → R^2 given by T(x1, x2) = (x1 + x2, x2 + 1).

This is not a linear transformation. For example, let u = (0, 0) and v = (1, 1). Then T(u + v) = T(1, 1) = (2, 2), but T(u) + T(v) = (0, 1) + (2, 2) = (2, 3), so T(u + v) ≠ T(u) + T(v).
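The additivity failure in 3(a) can be replayed numerically. This is a minimal sketch; the map's entries are an assumption for illustration, but any map with a nonzero constant term fails the same way:

```python
# Hypothetical affine (non-linear) map of the shape in 3(a): the "+1"
# constant term breaks additivity. Entries are assumed for illustration.
def T(x):
    x1, x2 = x
    return (x1 + x2, x2 + 1)

def vec_add(u, v):
    return tuple(a + b for a, b in zip(u, v))

u = (0, 0)
v = (1, 1)

lhs = T(vec_add(u, v))        # T(u + v) = (2, 2)
rhs = vec_add(T(u), T(v))     # T(u) + T(v) = (2, 3): the constant is counted twice
additive = (lhs == rhs)       # False, so T is not a linear transformation
```

The same two helper functions can be reused to test any candidate map.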

(b) T : R^3 → R^2 given by T(x1, x2, x3) = (x1 + 2x2 + 3x3, x2 + 4x3).

This is a linear transformation. If u = (u1, u2, u3) and v = (v1, v2, v3), then

T(u + v) = ((u1 + v1) + 2(u2 + v2) + 3(u3 + v3), (u2 + v2) + 4(u3 + v3))
         = (u1 + 2u2 + 3u3, u2 + 4u3) + (v1 + 2v2 + 3v3, v2 + 4v3)
         = T(u) + T(v).

Also, for any scalar r we have

T(ru) = (ru1 + 2ru2 + 3ru3, ru2 + 4ru3) = r(u1 + 2u2 + 3u3, u2 + 4u3) = rT(u).

This linear transformation corresponds to the matrix

A = [ 1 2 3 ]
    [ 0 1 4 ]

(meaning T(x) = Ax). Because A row reduces to

[ 1 0 -5 ]
[ 0 1  4 ],

there is a pivot in every row but not in every column: the columns of A span R^2 but are not linearly independent, so T is onto but not one-to-one. (Note: we could also see that T is not one-to-one by exhibiting two different vectors u and v such that u ≠ v but T(u) = T(v); for example u = (0, 0, 0) and v = (5, -4, 1).)

(c) Let y be a fixed vector in R^3, and let T : R^3 → R^3 be given by T(x) = (y · x)y.
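The pivot count used in 3(b) can be checked by row reducing in exact arithmetic. A sketch, with the matrix entries an assumption matching the shape of the example (a 2 × 3 matrix, so T : R^3 → R^2):

```python
from fractions import Fraction

def rank(M):
    """Row-reduce a small matrix (list of rows) and count pivots."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue                       # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]    # move pivot row into place
        M[r] = [x / M[r][c] for x in M[r]]
        for i in range(rows):
            if i != r:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# Entries assumed for illustration (a 2x3 matrix with a wide shape).
A = [[1, 2, 3],
     [0, 1, 4]]

onto = (rank(A) == 2)        # pivot in every row: columns span R^2
one_to_one = (rank(A) == 3)  # would need a pivot in every column; impossible here
```

A pivot in every row means Ax = w is solvable for all w (onto); a pivot in every column means Ax = 0 has only the trivial solution (one-to-one). A wide matrix can never have a pivot in every column.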

This is a linear transformation. For any vectors u and v, because y · (u + v) = y · u + y · v and y · (ru) = r(y · u) (which we remember from learning about the dot product in calculus!), we know

T(u + v) = (y · (u + v))y = (y · u + y · v)y = (y · u)y + (y · v)y = T(u) + T(v).

Also,

T(ru) = (y · (ru))y = r(y · u)y = rT(u).

However, this linear transformation sends every vector u to a multiple of y, so it is not onto (vectors that are not multiples of y are not in the range). It is also not one-to-one: choosing any nonzero vector x orthogonal to y gives T(x) = (y · x)y = 0y = 0.

4. Give an example of the following, or explain why such an example does not exist.

(a) A linear transformation T : R^2 → R^2 such that T(1, 0) = (1, 1).

This exists; for example, T(x1, x2) = (x1, x1 + x2).

(b) A linear transformation T : R^2 → R^2 such that T(1, 2) = (3, 1).

This exists; for example, T(x1, x2) = (x1 + x2, x2 - x1).

(c) A linear transformation T : R^2 → R^2 such that T(1, 0) = (1, 2), T(0, 1) = (2, 1), and T(1, 1) = (4, 4).

This does NOT exist because, if T is a linear transformation, we must have T(1, 1) = T(1, 0) + T(0, 1) = (3, 3), which is not true in this case.
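The map T(x) = (y · x)y from 3(c) can be checked concretely. The particular y, u, v below are arbitrary choices for the demonstration, not values from the problem:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def scale(c, v):
    return tuple(c * x for x in v)

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

# Fixed vector y (chosen arbitrarily for the demonstration).
y = (1, 2, 2)

def T(x):
    return scale(dot(y, x), y)   # T(x) = (y . x) y

u, v, r = (1, 0, 3), (2, -1, 1), 5

# Both linearity properties hold:
additive = T(add(u, v)) == add(T(u), T(v))
homogeneous = T(scale(r, u)) == scale(r, T(u))

# Not one-to-one: any nonzero x orthogonal to y is sent to the zero vector.
x = (2, -1, 0)                   # y . x = 2 - 2 + 0 = 0
kernel_hit = (dot(y, x) == 0 and T(x) == (0, 0, 0))
```

Since x ≠ 0 but T(x) = T(0) = 0, two different inputs share an output, so T cannot be one-to-one.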

(d) Non-square matrices A and B such that AB = I.

This exists; for example,

A = [ 1 0 0 ]      B = [ 1 0 ]
    [ 0 1 0 ],         [ 0 1 ]
                       [ 0 0 ].

Then AB is the 2 × 2 identity matrix.

(e) Non-square matrices A and B such that AB = I and BA = I.

This does NOT exist. To prove it does not exist, we can use question 5(h): if A and B are not square matrices, but their products AB and BA are square, we must have A an n × m matrix and B an m × n matrix for some m and n. Because they are not square, either m < n or m > n. If m < n, B is a matrix with more columns than rows, so the columns of B must be linearly dependent. But, using 5(h), this implies that the columns of AB are linearly dependent, so it is impossible for AB = I (because the columns of I are linearly independent). Similarly, if m > n, then A is a matrix with more columns than rows, so the columns of A are linearly dependent; again by 5(h), the columns of BA must be linearly dependent, so it is impossible for BA = I. Therefore, these matrices do not exist.

(f) An n × n matrix A with no zero entries such that A is nonsingular.

This exists; for example,

A = [ 1 2 ]
    [ 3 4 ]

has no zero entries and det A = 1 · 4 - 2 · 3 = -2 ≠ 0, so A is nonsingular.

(g) An n × n matrix A with no zero entries such that A is singular.

This exists; for example,

A = [ 1 1 ]
    [ 1 1 ]

has no zero entries, but its columns are equal (hence linearly dependent), so A is singular.

(h) A singular matrix A whose columns are linearly independent.

This does NOT exist, because if A is singular, then A is an n × n matrix with no inverse. By the Big Theorem, this implies the columns of A are linearly dependent.

(i) A square matrix A such that A ≠ A^T.

This exists; for example,

A = [ 1 2 ]
    [ 0 0 ]

is not equal to its transpose, since A^T has a 2 in the (2, 1) entry while A has a 0 there.
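The pair in 4(d), and the contrast with 4(e), can be verified by direct multiplication. A sketch using the standard 2 × 3 / 3 × 2 pair:

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# A 2x3 matrix and a 3x2 matrix whose product is the 2x2 identity.
A = [[1, 0, 0],
     [0, 1, 0]]
B = [[1, 0],
     [0, 1],
     [0, 0]]

AB = matmul(A, B)   # the 2x2 identity
BA = matmul(B, A)   # 3x3, but NOT the identity: its last column is all zeros
```

This is consistent with the argument in 4(e): AB can be the identity here, but BA never can, because the three columns of the 3 × 2 matrix's product BA live in the column space of B and must be linearly dependent.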

5. For each statement below, determine whether it is True or False. Justify your answer. The justification is the most important part!

(a) If T : R^m → R^n is a linear transformation, then T(0) = 0.

TRUE. Plugging r = 0 into the second condition needed to be a linear transformation gives T(0) = T(0u) = 0 · T(u) = 0.

(b) If T : R^m → R^n is a linear transformation, then the only vector x such that T(x) = 0 is x = 0.

FALSE. This is true only when T is one-to-one. If T is not one-to-one, it is false. For example, we could take the linear transformation T(x) = 0; then every vector x satisfies T(x) = 0.

(c) If {v1, v2, v3} is a linearly dependent set of vectors in R^m and T : R^m → R^n is a linear transformation, then {T(v1), T(v2), T(v3)} is a linearly dependent set.

TRUE. If these vectors are linearly dependent, we can write one as a linear combination of the others, say

v3 = c1 v1 + c2 v2.

But this implies

T(v3) = T(c1 v1 + c2 v2),

and, using the properties of linear transformations, this implies

T(v3) = c1 T(v1) + c2 T(v2),

so T(v3) is a linear combination of T(v1) and T(v2). This implies that {T(v1), T(v2), T(v3)} is linearly dependent.

(d) If {v1, v2, v3} is a linearly independent set of vectors in R^m and T : R^m → R^n is a linear transformation, then {T(v1), T(v2), T(v3)} is a linearly independent set.

FALSE. If T(x) = 0, then no matter what {v1, v2, v3} are, {T(v1), T(v2), T(v3)} = {0, 0, 0}, which is a linearly dependent set.

(e) For any matrix A, there is a matrix B such that AB = A.

TRUE. Let B = I.

(f) For any matrix A, there is a matrix B such that AB = I.

FALSE. Such a B exists only if A is invertible. For example, if A is the zero matrix, no such B exists.
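The argument in 5(c) says the images of a dependent set satisfy the same dependence relation. A quick sketch (the matrix and the coefficients are arbitrary illustrative choices):

```python
def mat_vec(A, x):
    """Apply a matrix (list of rows) to a vector (tuple)."""
    return tuple(sum(a * b for a, b in zip(row, x)) for row in A)

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def scale(c, v):
    return tuple(c * x for x in v)

# Any matrix A and any dependent triple will do; these are illustrative.
A = [[1, 2],
     [0, 1],
     [3, 1]]
v1, v2 = (1, 0), (1, 1)
v3 = add(scale(2, v1), scale(3, v2))   # v3 = 2 v1 + 3 v2, so {v1, v2, v3} is dependent

# The images satisfy the SAME relation: A v3 = 2 (A v1) + 3 (A v2).
dependent_images = mat_vec(A, v3) == add(scale(2, mat_vec(A, v1)),
                                         scale(3, mat_vec(A, v2)))
```

This is exactly the statement T(c1 v1 + c2 v2) = c1 T(v1) + c2 T(v2) applied with c1 = 2, c2 = 3.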

(g) If AB = AC and A is not the zero matrix, then B = C.

FALSE. For example, let

A = [ 1 0 ]    B = [ 1 0 ]    C = [ 1 0 ]
    [ 0 0 ],       [ 0 1 ],       [ 0 2 ].

Multiplying these out, we can see that AB = AC and A is not the zero matrix, but B ≠ C. Note, however, that if we add the additional hypothesis that A is nonsingular, this is a true statement (we can multiply both sides by A^{-1} to get B = C).

(h) If B is a matrix whose columns are linearly dependent, then the columns of AB are linearly dependent (where A is any matrix such that the product AB is defined).

TRUE. Write B = [b_1 ... b_n], where b_1, ..., b_n are the columns of B. Then, by definition, AB = [Ab_1 ... Ab_n]. If the columns of B are linearly dependent, then we can write one of them as a linear combination of the others, say

b_n = c_1 b_1 + ... + c_{n-1} b_{n-1}.

But, multiplying this equation by A, we see that

Ab_n = A(c_1 b_1 + ... + c_{n-1} b_{n-1}) = c_1 Ab_1 + ... + c_{n-1} Ab_{n-1},

which shows that the vector Ab_n is a linear combination of {Ab_1, ..., Ab_{n-1}}, i.e. the set {Ab_1, ..., Ab_n} of columns of AB is linearly dependent.

(i) If B is a matrix whose columns are linearly independent, then the columns of AB are linearly independent (where A is any matrix such that the product AB is defined).

FALSE. If B is any matrix and A is the zero matrix, then AB is the zero matrix, whose columns are linearly dependent. (However, note that if we require A to be a nonsingular matrix, then this statement is true! Can you prove this?)

(j) If A is not the zero matrix, then A has an inverse.

FALSE. Let

A = [ 1 1 ]
    [ 1 1 ].

A is not the zero matrix, but its columns are linearly dependent, so A has no inverse.

(k) There is a nonsingular matrix A such that A^2 = O (the zero matrix).

FALSE. If A were nonsingular, we could multiply both sides of A^2 = O by A^{-1} to get A = O. But A cannot be the zero matrix, because the zero matrix is singular! Therefore, such a matrix does not exist.
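The cancellation failure in 5(g) is easy to exhibit by multiplying out a singular A. The specific entries below are one convenient choice, assumed for illustration:

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# A is nonzero but singular, so it cannot be cancelled from AB = AC.
A = [[1, 0],
     [0, 0]]
B = [[1, 0],
     [0, 1]]
C = [[1, 0],
     [0, 2]]

AB = matmul(A, B)   # [[1, 0], [0, 0]]
AC = matmul(A, C)   # also [[1, 0], [0, 0]]
cancellation_fails = (AB == AC) and (B != C)
```

Intuitively, A's zero bottom row discards the rows of B and C on which they differ, so the products agree even though B ≠ C.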

(l) There is a nonsingular matrix A such that A^2 = A.

TRUE. Multiplying both sides by A^{-1}, we find that A = I, and the identity matrix satisfies I^2 = I, so A = I is such a matrix.

(m) If A and B are nonsingular (n × n) matrices, then A + B is also nonsingular.

FALSE. If A = I and B = -I, then A and B are both nonsingular, but A + B is the zero matrix, which is singular.

(n) If A and B are nonsingular (n × n) matrices, then AB is also nonsingular.

TRUE. Being nonsingular just means the matrix has an inverse. If A and B are nonsingular, then A^{-1} and B^{-1} exist, so AB has an inverse: (AB)^{-1} = B^{-1} A^{-1}. Therefore AB is nonsingular.
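The product-inverse formula used in 5(n) can be verified on a small example. The two matrices below are arbitrary nonsingular 2 × 2 choices, assumed for the demonstration:

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    assert det != 0, "matrix is singular"
    return [[d / det, -b / det], [-c / det, a / det]]

# Two nonsingular 2x2 matrices (chosen for illustration).
A = [[2, 1],
     [1, 1]]
B = [[1, 2],
     [0, 1]]

AB = matmul(A, B)
lhs = inv2(AB)                    # (AB)^{-1}
rhs = matmul(inv2(B), inv2(A))    # B^{-1} A^{-1}  -- note the reversed order
match = (lhs == rhs)
```

The order reversal matters: inv2(A) followed by inv2(B) would give A^{-1} B^{-1}, which is generally not the inverse of AB.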