Answers in blue. If you have questions or spot an error, let me know.

1. Find all matrices that commute with $A = \begin{bmatrix} 3 & -4 \\ 4 & 3 \end{bmatrix}$.

If we set $B = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$ and set $AB = BA$, we see that $3a + 4b = 3a - 4c$, $-4a + 3b = 3b - 4d$, $3c + 4d = 4a + 3c$, and $-4c + 3d = 4b + 3d$. Solving these equations, $a = d$ and $b = -c$, so all matrices of the form $B = \begin{bmatrix} a & -c \\ c & a \end{bmatrix}$ will commute with $A$, that is, all rotation-scaling matrices. (If you prefer, you could solve for $c$ instead of for $b$.)

2. Let $A = \begin{bmatrix} 1/2 & -\sqrt{3}/2 \\ \sqrt{3}/2 & 1/2 \end{bmatrix}$. Compute $A^{1001}$ by finding a pattern or by using geometry.

Geometrically, this is a rotation by $2\pi/6 = \pi/3$, so that $A^6 = I_2$. Since $1002$ is a multiple of $6$, this tells us that $A^{1002} = I_2$, or $A \cdot A^{1001} = I_2$, so that $A^{1001} = A^{-1} = \begin{bmatrix} 1/2 & \sqrt{3}/2 \\ -\sqrt{3}/2 & 1/2 \end{bmatrix}$.

3. Let $A = \frac{1}{5}\begin{bmatrix} 3 & 4 \\ 4 & -3 \end{bmatrix}$. Compute $A^{1005}$ by finding a pattern or by using geometry.

Geometrically, this is a reflection across a line, so $A^2 = I_2$ (or, just compute it). Thus, $A^{1005} = (A^2)^{502} A = A = \frac{1}{5}\begin{bmatrix} 3 & 4 \\ 4 & -3 \end{bmatrix}$.

4. Let $A$ be an $n \times n$ diagonal matrix with diagonal entries $a_{11}, \ldots, a_{nn}$. Find a formula for $A^t$ in terms of $a_{11}, \ldots, a_{nn}$, where $t$ is any positive integer.

Do this a couple of times to establish a pattern (or, if you have the formal background for it, prove it by induction by looking at $A^t = A^{t-1}A$). You'll find that $A^t$ is the $n \times n$ diagonal matrix with diagonal entries $(a_{11})^t, \ldots, (a_{nn})^t$.

5. Are the following matrices invertible? If so, find their inverses. If not, explain how you know that they aren't.

$A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}$ is not invertible (as it has rank $2$: its third row is twice the second minus the first), while the other two matrices given, $B$ and $C$, are both invertible, with inverses found by row-reducing $[\,B \mid I\,]$ to $[\,I \mid B^{-1}\,]$ (and likewise for $C$).

6. For which values of $k$ are the vectors $\begin{bmatrix} 1 \\ 0 \\ k \end{bmatrix}$, $\begin{bmatrix} 0 \\ 1 \\ k \end{bmatrix}$, and $\begin{bmatrix} 1 \\ 3 \\ k^2 \end{bmatrix}$ linearly dependent?

Consider a relation $a \begin{bmatrix} 1 \\ 0 \\ k \end{bmatrix} + b \begin{bmatrix} 0 \\ 1 \\ k \end{bmatrix} + c \begin{bmatrix} 1 \\ 3 \\ k^2 \end{bmatrix} = \vec{0}$. From looking at the first two components, it's clear that we must have $a = \lambda$, $b = 3\lambda$, $c = -\lambda$ (for some scalar $\lambda$), and that this will be a nontrivial relation for $\lambda \ne 0$. Looking at the last component, we see that such a relation works for $\lambda \ne 0$ if and only if $\lambda(k + 3k - k^2) = 0$, that is, if and only if $k = 0$ or $k = 4$. Alternatively, put the three vectors in the columns of a matrix and see whether it row-reduces to the identity.
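Aside: these first few answers are easy to sanity-check numerically. Below is a short, purely illustrative NumPy sketch; the test values $a = 2$, $c = 5$ are arbitrary, and the exponent $1001$ is just an example of a power that is one less than a multiple of $6$.

```python
import numpy as np

# Check #1: any matrix of the form [[a, -c], [c, a]] should commute with A.
A = np.array([[3.0, -4.0],
              [4.0,  3.0]])
a, c = 2.0, 5.0                      # arbitrary test values
B = np.array([[a, -c],
              [c,  a]])
print(np.allclose(A @ B, B @ A))     # True

# Check #2: a rotation by pi/3 satisfies R^6 = I, so large powers reduce mod 6.
R = np.array([[0.5, -np.sqrt(3) / 2],
              [np.sqrt(3) / 2, 0.5]])
print(np.allclose(np.linalg.matrix_power(R, 6), np.eye(2)))            # True
print(np.allclose(np.linalg.matrix_power(R, 1001), np.linalg.inv(R)))  # True, since 1001 = 6*167 - 1
```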

7. Consider the matrix $A$ (a matrix with five columns) together with its reduced row-echelon form $\operatorname{rref}(A)$, which has leading $1$s in exactly two columns.

(a) Find a basis for $\operatorname{im}(A)$.

We know that $\operatorname{im}(A)$ is spanned by the column vectors of $A$ and that all column vectors corresponding to a column of $\operatorname{rref}(A)$ without a leading $1$ are redundant (since the same relations among the columns of $A$ also hold among the columns of $\operatorname{rref}(A)$ and vice versa), so $\operatorname{im}(A)$ is spanned by the two columns of $A$ that correspond to the columns of $\operatorname{rref}(A)$ containing leading $1$s; those two columns form a basis of $\operatorname{im}(A)$.

(b) Find a basis for $\ker(A)$.

Using $\operatorname{rref}(A)\,\vec{x} = \vec{0}$, we see that the solutions to $A\vec{x} = \vec{0}$ are found by assigning the parameters $r$, $s$, $t$ to the three free variables and solving for the leading variables; writing the general solution as $\vec{x} = r\,\vec{w}_1 + s\,\vec{w}_2 + t\,\vec{w}_3$ shows that $\ker(A) = \operatorname{span}(\vec{w}_1, \vec{w}_2, \vec{w}_3)$. It's a good idea to use Rank-Nullity to check your work: $\dim(\operatorname{im}(A)) + \dim(\ker(A)) = 2 + 3 = 5$, as required.

8. Suppose that the vectors $\vec{v}_1, \ldots, \vec{v}_n$ form a basis of $\mathbb{R}^n$. Let $A$ be the matrix with column vectors $\vec{v}_1, \ldots, \vec{v}_n$. Find $\operatorname{im}(A)$ and $\ker(A)$.

By the summary in Section 3.3, $\operatorname{im}(A) = \mathbb{R}^n$ and $\ker(A) = \{\vec{0}\}$.

9. Can you find a $2 \times 2$ matrix $A$ such that $\operatorname{im}(A) = \ker(A)$? What about a $3 \times 3$ matrix? $4 \times 4$? $n \times n$?

An example of a $2 \times 2$ matrix with this property is $A = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$. It is impossible to find a $3 \times 3$ matrix with this property, since then $\dim(\operatorname{im}(A)) = \dim(\ker(A))$ (i.e., $\operatorname{rank}(A) = \operatorname{nullity}(A)$), so that the Rank-Nullity Theorem tells us that $\operatorname{rank}(A) + \operatorname{rank}(A) = 3$, or $\operatorname{rank}(A) = 1.5$. Since the rank of a matrix is always an integer, no matrix with this property exists. For a $4 \times 4$ matrix, one matrix with this property is $A = \begin{bmatrix} 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}$. Generalizing this pattern, one can check that no such $n \times n$ matrix $A$ exists if $n$ is odd and that one possibility when $n$ is even is the matrix $A$ whose first $n/2$ columns are $\vec{0}$ and whose last $n/2$ columns are $\vec{e}_1, \ldots, \vec{e}_{n/2}$.
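Aside: the bookkeeping in #7 is easy to check with a computer algebra system. The SymPy sketch below uses a stand-in matrix of my own with the same shape of answer (five columns and rank $2$, hence a $3$-dimensional kernel); it is the method, not the particular entries, that matters here.

```python
from sympy import Matrix

# A stand-in 3x5 matrix of rank 2 (illustrative only, not the matrix from the problem).
A = Matrix([[1, 2, 0, 3, 1],
            [2, 4, 1, 8, 4],
            [1, 2, 1, 5, 3]])

R, pivots = A.rref()
print(R)          # reduced row-echelon form
print(pivots)     # indices of the pivot columns, here (0, 2)

# Basis of im(A): the columns of A in the pivot positions.
im_basis = [A.col(j) for j in pivots]
# Basis of ker(A): one vector per free variable.
ker_basis = A.nullspace()

# Rank-Nullity check: dim im(A) + dim ker(A) = number of columns.
print(len(im_basis) + len(ker_basis) == A.cols)   # True: 2 + 3 = 5
```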

10. Let $a, b, c \in \mathbb{R}$ where $a \ne 0$. Find a basis for the subspace $V$ of $\mathbb{R}^3$ defined by $V = \left\{ \begin{bmatrix} x \\ y \\ z \end{bmatrix} \in \mathbb{R}^3 : ax + by + cz = 0 \right\}$.

Note that $V = \ker(A)$ where $A = \begin{bmatrix} a & b & c \end{bmatrix}$. Since $a \ne 0$, we compute $\operatorname{rref}(A) = \begin{bmatrix} 1 & \frac{b}{a} & \frac{c}{a} \end{bmatrix}$, so that one possible basis is $\mathcal{B} = \left\{ \begin{bmatrix} -\frac{b}{a} \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} -\frac{c}{a} \\ 0 \\ 1 \end{bmatrix} \right\}$. Doing this by inspection, you can also note that $V$ is a plane (and so of dimension $2$), so that any two linearly independent solutions to $ax + by + cz = 0$ will form a basis.

11. Let $A$ be an $n \times p$ matrix such that $\ker(A) = \{\vec{0}\}$ and $B$ be a $p \times m$ matrix. How does $\ker(AB)$ relate to $\ker(B)$?

We saw in Section 3.4, #39 that $\ker(B) \subseteq \ker(AB)$ (see the HW solutions for details). Now, suppose that $\vec{x} \in \ker(AB)$. Then $AB\vec{x} = \vec{0}$. Since $\ker(A) = \{\vec{0}\}$, this can happen only if $B\vec{x} = \vec{0}$ (since $A(B\vec{x}) = \vec{0}$), or, in other words, if $\vec{x} \in \ker(B)$, so that $\ker(AB) \subseteq \ker(B)$ also. So, in this case, we have $\ker(AB) = \ker(B)$.

12. Let $V$ and $W$ be subspaces of $\mathbb{R}^n$ and define $V - W = \{ \vec{v} - \vec{w} \in \mathbb{R}^n : \vec{v} \in V,\ \vec{w} \in W \}$. Show that $V - W$ is a subspace of $\mathbb{R}^n$.

Since $V$ and $W$ are subspaces of $\mathbb{R}^n$, we know $\vec{0} \in V$ and $\vec{0} \in W$. Thus, $\vec{0} = \vec{0} - \vec{0} \in V - W$. Suppose that $\vec{v}_1 - \vec{w}_1 \in V - W$ and $\vec{v}_2 - \vec{w}_2 \in V - W$ for some vectors $\vec{v}_1, \vec{v}_2 \in V$ and $\vec{w}_1, \vec{w}_2 \in W$. Then $(\vec{v}_1 - \vec{w}_1) + (\vec{v}_2 - \vec{w}_2) = (\vec{v}_1 + \vec{v}_2) - (\vec{w}_1 + \vec{w}_2) \in V - W$, since $\vec{v}_1 + \vec{v}_2 \in V$ and $\vec{w}_1 + \vec{w}_2 \in W$ (as $V$ and $W$ are subspaces of $\mathbb{R}^n$). Finally, suppose that $\vec{v} - \vec{w} \in V - W$ for some $\vec{v} \in V$ and $\vec{w} \in W$, and let $k \in \mathbb{R}$. Then $k(\vec{v} - \vec{w}) = k\vec{v} - k\vec{w} \in V - W$, since $k\vec{v} \in V$ and $k\vec{w} \in W$ (as $V$ and $W$ are subspaces of $\mathbb{R}^n$). Thus, $V - W$ is a subspace of $\mathbb{R}^n$.

13. Let $V$ be a subspace of $\mathbb{R}^n$ and define $V^{\perp} = \{ \vec{x} \in \mathbb{R}^n : \vec{x} \cdot \vec{v} = 0 \text{ for all } \vec{v} \in V \}$. Show that $V^{\perp}$ is a subspace of $\mathbb{R}^n$.

Note that $\vec{0} \cdot \vec{v} = 0$ for all vectors $\vec{v} \in \mathbb{R}^n$, so $\vec{0} \in V^{\perp}$. Suppose that $\vec{w}_1, \vec{w}_2 \in V^{\perp}$. That is, $\vec{w}_1 \cdot \vec{v} = 0$ and $\vec{w}_2 \cdot \vec{v} = 0$ for all vectors $\vec{v} \in V$. Then $(\vec{w}_1 + \vec{w}_2) \cdot \vec{v} = \vec{w}_1 \cdot \vec{v} + \vec{w}_2 \cdot \vec{v} = 0 + 0 = 0$ for any vector $\vec{v} \in V$ (note that we proved the distributivity of the dot product on the review for the first exam). Finally, suppose that $\vec{w} \in V^{\perp}$, that is, $\vec{w} \cdot \vec{v} = 0$ for all $\vec{v} \in V$, and let $k \in \mathbb{R}$. Then $(k\vec{w}) \cdot \vec{v} = k(\vec{w} \cdot \vec{v}) = k \cdot 0 = 0$ for all $\vec{v} \in V$ (note that we proved the associativity of scalars for the dot product on the review for the first exam). Thus, $V^{\perp}$ is a subspace of $\mathbb{R}^n$.
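Aside: here is a small SymPy check of #11 with concrete matrices (arbitrary choices of mine, purely for illustration): $A$ has trivial kernel, and the kernels of $B$ and $AB$ come out the same.

```python
from sympy import Matrix

# A is n x p with ker(A) = {0} (independent columns); B is p x m.
A = Matrix([[1, 0],
            [2, 1],
            [0, 3]])          # 3 x 2, rank 2, so ker(A) = {0}
B = Matrix([[1, 2, 3],
            [2, 4, 6]])       # 2 x 3, rank 1

print(A.nullspace())          # []  : ker(A) = {0}
print(B.nullspace())          # two basis vectors
print((A * B).nullspace())    # the same two vectors: ker(AB) = ker(B)
```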

14. Let $\vec{v}_1, \ldots, \vec{v}_m \in \mathbb{R}^n$. Let $T(\vec{x})$ be a linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^p$. If $\vec{v}_1, \ldots, \vec{v}_m$ are linearly dependent, are $T(\vec{v}_1), \ldots, T(\vec{v}_m)$ necessarily linearly dependent? If $\vec{v}_1, \ldots, \vec{v}_m$ are linearly independent, are $T(\vec{v}_1), \ldots, T(\vec{v}_m)$ necessarily linearly independent?

Most of the time, the best way (in my opinion) to answer questions about linear (in)dependence is to look at linear relations (although you may use any part of Summary 3.2.9 if you prefer, or the summary in Section 3.3 in some cases, and you should at least know the other ways, as they're occasionally useful). So, if $\vec{v}_1, \ldots, \vec{v}_m$ are linearly dependent, then we may suppose that $c_1 \vec{v}_1 + \cdots + c_m \vec{v}_m = \vec{0}$ where not all of the $c_i = 0$. Taking $T$ of both sides and using the properties of linear transformations, we see that $c_1 T(\vec{v}_1) + \cdots + c_m T(\vec{v}_m) = \vec{0}$ (note that $T(\vec{0}) = A\vec{0} = \vec{0}$ for some matrix $A$), so that $T(\vec{v}_1), \ldots, T(\vec{v}_m)$ are linearly dependent (since not all of the $c_i = 0$).

For linear independence, the same trick won't work: it'll tell us that if we start with the trivial relation among $\vec{v}_1, \ldots, \vec{v}_m$, then we'll get the trivial relation among $T(\vec{v}_1), \ldots, T(\vec{v}_m)$ also, but this doesn't prove that other relations don't exist. If we knew that $T$ were invertible, we could try the same thing in reverse to transform a relation among $T(\vec{v}_1), \ldots, T(\vec{v}_m)$ into a relation among $\vec{v}_1, \ldots, \vec{v}_m$ and use the linear independence of $\vec{v}_1, \ldots, \vec{v}_m$ to show the linear independence of $T(\vec{v}_1), \ldots, T(\vec{v}_m)$. However, what if $T$ isn't invertible? Let's try an example where $T$ is not invertible (note that while examples are generally useless for proving that something is true, one example is all you need to prove something is false): the simplest such example is $T(\vec{x}) = \vec{0}$ for all $\vec{x}$ (from $\mathbb{R}^2$ to $\mathbb{R}^2$, say). Then $\vec{e}_1, \vec{e}_2$ are linearly independent, but $T(\vec{e}_1), T(\vec{e}_2)$ are both the zero vector and so not linearly independent. Thus, the answer to the second part is: no, they are not necessarily linearly independent (although we saw that they will be if $T$ is invertible).

An informal way to think about this is to imagine that the coordinates (that is to say, coefficients) of a set of linearly independent vectors are "information" (while the coordinates of linearly dependent vectors don't carry the same amount of information, since there isn't a unique way to extract the coordinates). Then, these results say that applying a linear transformation to a set of vectors won't gain any new information, but may lose some of the existing information.

15. Consider two $3 \times 3$ matrices $A$ and $B$ and a vector $\vec{v} \in \mathbb{R}^3$ such that $AB\vec{v} = \vec{0}$, $BA\vec{v} = \vec{0}$, and $A^2\vec{v} = \vec{0}$, but $A\vec{v} \ne \vec{0}$ and $B^2\vec{v} \ne \vec{0}$ (so in particular $B\vec{v} \ne \vec{0}$).

(a) Show that the vectors $\mathcal{B} = \{A\vec{v}, B\vec{v}, \vec{v}\}$ form a basis of $\mathbb{R}^3$.
(b) Find the $\mathcal{B}$-matrix of the transformation $T(\vec{x}) = A\vec{x}$.

(a) By the summary in Section 3.3, it's enough to show that these vectors are linearly independent. (Showing that they span $\mathbb{R}^3$ would also be sufficient, but showing linear independence is usually easier.) As above, we'll start by looking at a relation of the form $c_1 A\vec{v} + c_2 B\vec{v} + c_3 \vec{v} = \vec{0}$ and calculate that $c_1 = c_2 = c_3 = 0$ to show linear independence. First, apply the matrix $A$ to both sides on the left: $A(c_1 A\vec{v} + c_2 B\vec{v} + c_3 \vec{v}) = A\vec{0}$, or $c_1 A^2\vec{v} + c_2 AB\vec{v} + c_3 A\vec{v} = \vec{0}$, which implies $c_3 A\vec{v} = \vec{0}$ as the other vectors are $\vec{0}$. Since $A\vec{v} \ne \vec{0}$, this means that $c_3 = 0$. Now, apply $B$ to both sides on the left: $B(c_1 A\vec{v} + c_2 B\vec{v}) = B\vec{0}$, so that $c_2 B^2\vec{v} = \vec{0}$ since $BA\vec{v} = \vec{0}$. As $B^2\vec{v} \ne \vec{0}$, this shows that $c_2 = 0$. Thus, we have $c_1 A\vec{v} = \vec{0}$, and since $A\vec{v} \ne \vec{0}$ we have $c_1 = 0$. Thus, the only relation among the vectors in $\mathcal{B}$ is the trivial relation, so they are linearly independent.

(b) The easiest way to go about this is to use the (unnumbered) theorem following Definition 3.4.3 in your book to calculate the $\mathcal{B}$-matrix column by column (using the other methods may be possible, but probably gets more than a bit messy). We compute $[T(A\vec{v})]_{\mathcal{B}} = [A^2\vec{v}]_{\mathcal{B}} = [\vec{0}]_{\mathcal{B}} = \vec{0}$, $[T(B\vec{v})]_{\mathcal{B}} = [AB\vec{v}]_{\mathcal{B}} = [\vec{0}]_{\mathcal{B}} = \vec{0}$, and $[T(\vec{v})]_{\mathcal{B}} = [A\vec{v}]_{\mathcal{B}} = \vec{e}_1$, so that the $\mathcal{B}$-matrix of $T$ is $B = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$.
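Aside: it can be reassuring to see #15 happen with one concrete choice of $A$, $B$, and $\vec{v}$ satisfying the hypotheses (this particular choice is mine, not part of the problem). The SymPy sketch below verifies the hypotheses and computes the $\mathcal{B}$-matrix as $S^{-1}AS$, where $S$ has the basis vectors as columns.

```python
from sympy import Matrix, zeros

# One concrete choice satisfying ABv = BAv = A^2 v = 0, Av != 0, B^2 v != 0.
A = Matrix([[0, 0, 1],
            [0, 0, 0],
            [0, 0, 0]])
B = Matrix([[0, 0, 0],
            [0, 1, 1],
            [0, 0, 0]])
v = Matrix([0, 1, 1])

assert A*B*v == zeros(3, 1) and B*A*v == zeros(3, 1) and A*A*v == zeros(3, 1)
assert A*v != zeros(3, 1) and B*B*v != zeros(3, 1)

# Basis matrix S = [Av  Bv  v]; the B-matrix of T(x) = Ax is S^(-1) A S.
S = Matrix.hstack(A*v, B*v, v)
print(S.inv() * A * S)   # Matrix([[0, 0, 1], [0, 0, 0], [0, 0, 0]])
```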

16. We say that a set of vectors $\vec{v}_1, \ldots, \vec{v}_m \in \mathbb{R}^n$ is orthonormal if they are perpendicular unit vectors. That is, if
$$\vec{v}_i \cdot \vec{v}_j = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \ne j. \end{cases}$$
Show that if $\vec{u}_1, \ldots, \vec{u}_n \in \mathbb{R}^n$ are orthonormal, then $\mathcal{B} = \{\vec{u}_1, \ldots, \vec{u}_n\}$ forms a basis of $\mathbb{R}^n$. (Hint: Let $c_1 \vec{u}_1 + \cdots + c_n \vec{u}_n = \vec{0}$ and take the dot product of both sides with $\vec{u}_j$ for $j = 1, \ldots, n$.)

Note that $\vec{u}_j \cdot (c_1 \vec{u}_1 + \cdots + c_n \vec{u}_n) = \vec{u}_j \cdot \vec{0} = 0$ simplifies to $c_1 (\vec{u}_j \cdot \vec{u}_1) + \cdots + c_n (\vec{u}_j \cdot \vec{u}_n) = 0$ and that
$$\vec{u}_j \cdot \vec{u}_i = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \ne j. \end{cases}$$
So, this tells us that $c_1(0) + \cdots + c_j(1) + \cdots + c_n(0) = 0$, or that $c_j = 0$. Doing this for $j = 1, \ldots, n$, we see that $c_1 = \cdots = c_n = 0$, so that the only relation among the vectors of $\mathcal{B}$ is the trivial relation. As above, this is sufficient to show that $\mathcal{B}$ is a basis, by the summary in Section 3.3.

17. Let $A$ be the $3 \times 3$ matrix given in the problem and let $\mathcal{B} = \{\vec{v}_1, \vec{v}_2, \vec{v}_3\}$ be the given basis of $\mathbb{R}^3$. Find the $\mathcal{B}$-matrix of the transformation $T(\vec{x}) = A\vec{x}$.

Form the matrix $S = \begin{bmatrix} \vec{v}_1 & \vec{v}_2 & \vec{v}_3 \end{bmatrix}$; then the $\mathcal{B}$-matrix of $T$ is $B = S^{-1} A S$ (equivalently, compute it column by column, the $i$th column being $[T(\vec{v}_i)]_{\mathcal{B}}$). Carrying out this computation with the given $A$ and basis, $B = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}$.
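Aside: the argument in #16 can also be packaged as follows: if $Q$ is the matrix whose columns are the orthonormal vectors, then $Q^{\mathsf T} Q = I_n$, so $Q$ is invertible and its columns form a basis. A NumPy sketch of this (the orthonormal columns below are produced by a QR factorization of an arbitrary matrix, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(M)          # the columns of Q are orthonormal

print(np.allclose(Q.T @ Q, np.eye(4)))   # True: u_i . u_j = 1 if i = j, else 0
print(abs(np.linalg.det(Q)))             # 1.0 (up to rounding), so Q is invertible
```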

18. Let $A$ be a matrix of the form $A = \begin{bmatrix} a & b \\ b & -a \end{bmatrix}$ where $a^2 + b^2 = 1$ and $a \ne -1$, and let $\mathcal{B} = \left\{ \begin{bmatrix} 1+a \\ b \end{bmatrix}, \begin{bmatrix} -b \\ 1+a \end{bmatrix} \right\}$ be a basis of $\mathbb{R}^2$. Find the $\mathcal{B}$-matrix of the transformation $T(\vec{x}) = A\vec{x}$. Interpret your answer geometrically.

The $\mathcal{B}$-matrix is $B = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}$. Geometrically, $A$ is the matrix of a reflection across the line spanned by $\begin{bmatrix} 1+a \\ b \end{bmatrix}$, and the vector $\begin{bmatrix} -b \\ 1+a \end{bmatrix}$ is perpendicular to this line, so this result shows that if we write a vector as a linear combination of a vector on the line (i.e., $\vec{x}^{\parallel}$) and a vector perpendicular to the line (i.e., $\vec{x}^{\perp}$), then the vector on the line is unaltered by $T$, while the coefficient of the vector perpendicular to the line is multiplied by $-1$ (that is, its direction is replaced by its opposite). In other words, $T(\vec{x}^{\parallel} + \vec{x}^{\perp}) = \vec{x}^{\parallel} - \vec{x}^{\perp}$. Aside: It's easy to verify that $\mathcal{B}$ really is a basis of $\mathbb{R}^2$, for example by #6.

19. Let $A$ be a matrix of the form $A = \begin{bmatrix} u_1^2 & u_1 u_2 \\ u_1 u_2 & u_2^2 \end{bmatrix}$ where $u_1^2 + u_2^2 = 1$, and let $\mathcal{B} = \left\{ \begin{bmatrix} u_1 \\ u_2 \end{bmatrix}, \begin{bmatrix} -u_2 \\ u_1 \end{bmatrix} \right\}$ be a basis of $\mathbb{R}^2$. Find the $\mathcal{B}$-matrix of the transformation $T(\vec{x}) = A\vec{x}$. Interpret your answer geometrically.

The $\mathcal{B}$-matrix is $B = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}$. Geometrically, $A$ is the matrix of an orthogonal projection onto the line spanned by $\begin{bmatrix} u_1 \\ u_2 \end{bmatrix}$, and $\begin{bmatrix} -u_2 \\ u_1 \end{bmatrix}$ is a vector perpendicular to this line. So, this result shows that if we write a vector $\vec{x}$ as a linear combination of a vector on the line (i.e., $\vec{x}^{\parallel}$) and a vector perpendicular to the line (i.e., $\vec{x}^{\perp}$), then the vector on the line is unaltered by $T$, while the vector perpendicular to the line is mapped to the zero vector. In other words, $T(\vec{x}^{\parallel} + \vec{x}^{\perp}) = \vec{x}^{\parallel}$. Aside: It's easy to verify that $\mathcal{B}$ really is a basis of $\mathbb{R}^2$, for example by #16.

20. True or False:

(a) Let $\mathcal{B} = \{\vec{e}_n, \ldots, \vec{e}_1\}$ (the standard basis vectors in the opposite order) and let $A = \begin{bmatrix} \vec{v}_1 & \cdots & \vec{v}_n \end{bmatrix}$. Then the $\mathcal{B}$-matrix of $T(\vec{x}) = A\vec{x}$ is the matrix $\begin{bmatrix} \vec{v}_n & \cdots & \vec{v}_1 \end{bmatrix}$ (the column vectors in the opposite order). False. Both rows and columns are swapped, so the $\mathcal{B}$-matrix $B$ is defined entrywise by $b_{i,j} = a_{n+1-i,\,n+1-j}$.

(b) Let $A$ be an $n \times n$ matrix. If $A^2 = A$, then $A = I_n$. False. For example, in the $2 \times 2$ case, let $A$ be the matrix of an orthogonal projection.

(c) Let $A$ be an $n \times m$ matrix. Then $\dim(\operatorname{im}(A)) + \dim(\ker(A)) = n$. False. By Rank-Nullity, this is equal to $m$.

(d) If $A$ is a $2 \times 2$ matrix representing an orthogonal projection, then $A$ is similar to $\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}$. True. See #19.

(e) Let $A$ and $B$ be $n \times n$ matrices. If $A$ is similar to $B$, then $A^t$ is similar to $B^t$ for every positive integer $t$. True. See Example 7 in the text.

(f) Let $A$ and $B$ be $n \times n$ matrices such that $B$ is similar to $A$. If $A$ is invertible, then $B$ is invertible. True. Write $B = S^{-1} A S$ and use Theorem 2.4.7 to show that $B^{-1} = S^{-1} A^{-1} S$. (Note: This also shows that $B^{-1}$ is similar to $A^{-1}$.)

(g) If $A$ and $B$ are invertible $n \times n$ matrices, then $AB$ is invertible and $(AB)^{-1} = A^{-1} B^{-1}$. False. See Theorem 2.4.7: the correct formula is $(AB)^{-1} = B^{-1} A^{-1}$.

(h) Let $V$ be the set of all unit vectors in $\mathbb{R}^n$ (that is, the set of all vectors $\vec{x}$ such that $\vec{x} \cdot \vec{x} = 1$). Then $V$ is a subspace of $\mathbb{R}^n$. False. This actually fails all three properties for being a subspace. Most easily, $\vec{0} \cdot \vec{0} = 0 \ne 1$, so $\vec{0} \notin V$.

(i) If $V$ is a subspace of $\mathbb{R}^n$, then $\dim(V) \le n$. True. See the corresponding theorem in Chapter 3 of the text. (Note: In the context of Chapter 4, this is a special case of the result that if $V$ is a subspace of $W$, then $\dim(V) \le \dim(W)$, but don't worry about that now, since we haven't even defined what this means in an abstract vector space yet.)

(j) If the vectors $\vec{v}_1, \ldots, \vec{v}_m \in \mathbb{R}^n$ are linearly independent and the vector $\vec{v} \in \mathbb{R}^n$ is not in $\operatorname{span}(\vec{v}_1, \ldots, \vec{v}_m)$, then the vectors $\vec{v}_1, \ldots, \vec{v}_m, \vec{v}$ are linearly independent. True. We know that none of $\vec{v}_1, \ldots, \vec{v}_m$ are redundant since they are linearly independent, and $\vec{v}$ can't be redundant since if it were, it would be in $\operatorname{span}(\vec{v}_1, \ldots, \vec{v}_m)$. Aside: This is one trick for finding a basis of a subspace of $\mathbb{R}^n$: pick any nonzero vector in the space. If it doesn't span the space, pick a vector not in the span and add it to the set of vectors you've picked. Then repeat until you end up with a set of vectors that do span the subspace. Since you can find at most $n$ linearly independent vectors in $\mathbb{R}^n$, this process is guaranteed to terminate after finitely many steps, and it is constructed in such a way that you're guaranteed that the set of vectors you end up with will both span the space and be linearly independent.

(k) If the $i$th column of an $n \times m$ matrix $A$ is equal to the $j$th column of $A$ for some $i \ne j$, then the vector $\vec{e}_i - \vec{e}_j$ is in the kernel of $A$. True. If the columns of $A$ are $\vec{v}_k$ for $k = 1, \ldots, m$ (so that $\vec{v}_i = \vec{v}_j$), then $A(\vec{e}_i - \vec{e}_j) = A\vec{e}_i - A\vec{e}_j = \vec{v}_i - \vec{v}_j = \vec{0}$.

(l) Suppose the vectors $\vec{v}_1, \ldots, \vec{v}_m \in \mathbb{R}^n$ are linearly independent. Define the vectors $\vec{w}_j = \sum_{i=1}^{j} \vec{v}_i$ for $j = 1, \ldots, m$ to be the sums of the first $j$ vectors from $\vec{v}_1, \ldots, \vec{v}_m$ (so, for example, $\vec{w}_1 = \vec{v}_1$, $\vec{w}_2 = \vec{v}_1 + \vec{v}_2$, ..., $\vec{w}_m = \vec{v}_1 + \cdots + \vec{v}_m$). Then the vectors $\vec{w}_1, \ldots, \vec{w}_m$ are linearly independent. True. Use the fact that $\vec{v}_1, \ldots, \vec{v}_j$ are linearly independent to show that the vector $\vec{w}_j$ is not a linear combination of $\vec{w}_1, \ldots, \vec{w}_{j-1}$ for $j = 2, \ldots, m$.

(m) Let $\vec{v}_1, \ldots, \vec{v}_m, \vec{v}, \vec{w} \in \mathbb{R}^n$. If $\vec{v}$ is not a linear combination of $\vec{v}_1, \ldots, \vec{v}_m$ and $\vec{w}$ is not a linear combination of $\vec{v}_1, \ldots, \vec{v}_m$, then $\vec{v} + \vec{w}$ is not a linear combination of $\vec{v}_1, \ldots, \vec{v}_m$. False. For example, let $\vec{w}$ be any vector that isn't a linear combination of $\vec{v}_1, \ldots, \vec{v}_m$ and then let $\vec{v} = \vec{v}_1 - \vec{w}$.
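Aside: items (g) and (k) are easy to spot-check numerically; the matrices in the sketch below are arbitrary small examples of my own, chosen only to illustrate the point.

```python
import numpy as np

# (g): for invertible A and B, (AB)^(-1) equals B^(-1) A^(-1), not A^(-1) B^(-1).
A = np.array([[1.0, 2.0], [3.0, 5.0]])
B = np.array([[2.0, 1.0], [7.0, 4.0]])
AB_inv = np.linalg.inv(A @ B)
print(np.allclose(AB_inv, np.linalg.inv(B) @ np.linalg.inv(A)))  # True
print(np.allclose(AB_inv, np.linalg.inv(A) @ np.linalg.inv(B)))  # False

# (k): if columns 1 and 3 of C are equal, then e_1 - e_3 lies in ker(C).
C = np.array([[1.0, 4.0, 1.0],
              [2.0, 5.0, 2.0],
              [3.0, 6.0, 3.0]])   # first and third columns are equal
x = np.array([1.0, 0.0, -1.0])    # e_1 - e_3
print(C @ x)                      # [0. 0. 0.]
```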

(f) Let A and B be n n matrices such that B is similar to A. If A is invertible, then B is invertible. True. Write B = S AS and use Theorem.4.7 to show that B = S A S. (Note: This also shows that B is similar to A.) (g) If A and B are invertible n n matrices, then AB is invertible and (AB) = A B. False. See Theorem.4.7. (h) Let V be the set of all unit vectors in R n (that is, the set of all vectors x such that x x = ). Then V is a subspace of R n. False. This actually fails all three properties for being a subspace. Most easily, =, so V. (i) If V is a subspace of R n, then dim(v ) n. True. See for example Theorem 3..8. (Note: In the context of Chapter 4, this is a special case of the result that if V is a subspace of W, then dim(v ) dim(w ), but don t worry about that now since we haven t even defined what this means in an abstract vector space yet.) (j) If the vectors v,..., v m R n are linearly independent and the vector v R n is not in span( v,..., v m ), then the vectors v,..., v m, v are linearly independent. True. We know that none of v,..., v m are redundant since they are linearly independent, and v can t be redundant since if it were it would be in span( v,..., v m ). Aside: This is one trick for finding a basis of a subspace of R n : pick any nonzero vector in the space. If it doesn t span the space, pick a vector not in the span and add it to the set of vectors you ve picked. Then repeat until you end up with a set of vectors that do span the subspace. Since you can find at most n linearly independent vectors in R n, this process is guaranteed to terminate after finitely many steps and is constructed in such a way that you re guaranteed that the set of vectors you end up with will both span the space and be linearly independent. (k) If the ith column of an n m matrix A is equal to the jth column of A for some i j, then the vector e i e j is in the kernel of A. True. If the columns of A are v k for k =,..., m (so that v i = v j ), then A( e i e j ) = A e i A e j = v i v j =. (l) Suppose the vectors v,..., v m R n are linearly independent. Define the vectors w j = j v i for j =,..., m to be the sum of the first j vectors from v,..., v m (so, for example, w = v, w = v + v,..., w m = v + + v m ). Then the vectors w,..., w m are linearly independent. True. Use the fact that v,..., v j are linearly independent to show that the vector w j is not a linear combination of w,..., w j for j =,..., m. (m) Let v,..., v m, v, w R n. If v is not a linear combination of v,..., v m and w is not a linear combination of v,..., v m, then ( v + w) is not a linear combination of v,..., v m. False. For example, let w be any vector that isn t a linear combination of v,..., v m and then let v = v w. i= 7