- Russell Hardy
- 6 years ago
Answers in blue. If you have questions or spot an error, let me know.

1. Find all matrices that commute with A = [3, −4; 4, 3].

If we set B = [a, b; c, d] and set AB = BA, we see that 3a + 4b = 3a − 4c, −4a + 3b = 3b − 4d, 3c + 4d = 4a + 3c, and −4c + 3d = 4b + 3d. Solving these equations, a = d and b = −c, so all matrices of the form B = [a, −c; c, a] will commute with A; that is, all rotation-scaling matrices. (If you prefer, you could solve for c instead of for b.)

2. Let A = (1/2)·[1, −√3; √3, 1]. Compute A^101 by finding a pattern or by using geometry.

Geometrically, this is a rotation through π/3 (that is, 2π/6), so that A^6 = I_2. Since 102 is a multiple of 6, this tells us that A^102 = I_2, or A^101·A = I_2, so that A^101 = A^(−1) = (1/2)·[1, √3; −√3, 1].

3. Let A = (1/2)·[1, √3; √3, −1]. Compute A^105 by finding a pattern or by using geometry.

Geometrically, this is a reflection across a line, so A^2 = I_2 (or, just compute it). Thus, A^105 = (A^2)^52 · A = A = (1/2)·[1, √3; √3, −1].

4. Let A be an n×n diagonal matrix with diagonal entries a_11, ..., a_nn. Find a formula for A^t in terms of a_11, ..., a_nn, where t is any positive integer.

Do this a couple of times to establish a pattern (or, if you have the formal background for it, prove it by induction by looking at A^(t−1)·A). You'll find that A^t is the n×n diagonal matrix with diagonal entries (a_11)^t, ..., (a_nn)^t.

5. Are the following matrices A, B, and C invertible? If so, find their inverses. If not, explain how you know that they aren't.

A is not invertible (as it has rank 1), while B and C are invertible; their inverses can be read off by row-reducing [B | I] to [I | B^(−1)], and likewise for C.

6. For which values of k are the three given vectors (whose entries involve k) linearly dependent?

Consider a relation a·v_1 + b·v_2 + c·v_3 = 0. From looking at the first two components, it's clear that the coefficients (a, b, c) are determined up to a common scalar factor λ, and that this will be a
nontrivial relation for λ ≠ 0. Looking at the last component, we see that such a relation works for λ ≠ 0 if and only if k = 1 or k = 4. Alternatively, put the three vectors in the columns of a matrix and see whether it row-reduces to the identity.

7. Consider a matrix A with five columns, together with its reduced row-echelon form rref(A), which has leading 1's in exactly two columns.

(a) Find a basis for im(A).

We know that im(A) is spanned by the column vectors of A, and that all column vectors corresponding to a column of rref(A) without a leading 1 are redundant (since the same relations among the columns of A also hold among the columns of rref(A), and vice-versa), so im(A) is spanned by the two columns of A corresponding to the leading 1's.

(b) Find a basis for ker(A).

Using rref(A)·x = 0, solve for the leading variables in terms of the three free variables r, s, t. Collecting terms, the solutions to A·x = 0 take the form x = r·w_1 + s·w_2 + t·w_3, so that ker(A) = span(w_1, w_2, w_3). It's a good idea to use Rank-Nullity to check your work: dim(im(A)) + dim(ker(A)) = 2 + 3 = 5, as required.

8. Suppose that the vectors v_1, ..., v_n form a basis of R^n. Let A be the matrix with column vectors v_1, ..., v_n. Find im(A) and ker(A).

By Summary 3.3.10, im(A) = R^n and ker(A) = {0}.

9. Can you find a 2×2 matrix A such that im(A) = ker(A)? What about a 3×3 matrix? 4×4? n×n?

An example of a 2×2 matrix with this property is A = [0, 1; 0, 0]. It is impossible to find a 3×3 matrix with this property, since then dim(im(A)) = dim(ker(A)) (i.e., rank(A) = nullity(A)), so that the Rank-Nullity Theorem tells us that rank(A) + rank(A) = 3, or rank(A) = 1.5. Since the rank of a matrix is always an integer, no 3×3 matrix with this property exists. For a
4×4 matrix, one matrix with this property is A = [0, 0, 1, 0; 0, 0, 0, 1; 0, 0, 0, 0; 0, 0, 0, 0]. Generalizing this pattern, one can check that no such n×n matrix A exists if n is odd, and that one possibility when n is even is the matrix A whose first n/2 columns are 0 and whose second n/2 columns are e_1, ..., e_(n/2).

10. Let a, b, c ∈ R where a ≠ 0. Find a basis for the subspace V of R^3 defined by V = {(x, y, z) ∈ R^3 : ax + by + cz = 0}.

Note that V = ker(A), where A = [a, b, c]. Since a ≠ 0, we compute rref(A) = [1, b/a, c/a], so that one possible basis is B = {(−b/a, 1, 0), (−c/a, 0, 1)}. Doing this by inspection, you can also note that V is a plane (and so of dimension 2), so that any two linearly independent solutions to ax + by + cz = 0 will form a basis.

11. Let A be an n×p matrix such that ker(A) = {0} and let B be a p×m matrix. How does ker(AB) relate to ker(B)?

We saw in Section 3.4, #39 that ker(B) ⊆ ker(AB) (see the HW solutions for details). Now, suppose that x ∈ ker(AB). Then AB·x = 0. Since ker(A) = {0}, this can happen only if B·x = 0 (since A(B·x) = 0), or, in other words, only if x ∈ ker(B), so that ker(AB) ⊆ ker(B) also. So, in this case, we have ker(AB) = ker(B).

12. Let V and W be subspaces of R^n and define V − W = {v − w ∈ R^n : v ∈ V, w ∈ W}. Show that V − W is a subspace of R^n.

Since V and W are subspaces of R^n, we know 0 ∈ V and 0 ∈ W. Thus, 0 = 0 − 0 ∈ V − W. Suppose that v_1 − w_1 ∈ V − W and v_2 − w_2 ∈ V − W for some vectors v_1, v_2 ∈ V and w_1, w_2 ∈ W. Then (v_1 − w_1) + (v_2 − w_2) = (v_1 + v_2) − (w_1 + w_2) ∈ V − W, since v_1 + v_2 ∈ V and w_1 + w_2 ∈ W (as V and W are subspaces of R^n). Finally, suppose that v − w ∈ V − W for some v ∈ V and w ∈ W, and let k ∈ R. Then k(v − w) = k·v − k·w ∈ V − W, since k·v ∈ V and k·w ∈ W (as V and W are subspaces of R^n). Thus, V − W is a subspace of R^n.

13. Let V be a subspace of R^n and define V^⊥ = {x ∈ R^n : x·v = 0 for all v ∈ V}. Show that V^⊥ is a subspace of R^n.

Note that 0·v = 0 for all vectors v ∈ R^n, so 0 ∈ V^⊥. Suppose that w_1, w_2 ∈ V^⊥; that is, w_1·v = 0 and w_2·v = 0 for all vectors v ∈ V. Then (w_1 + w_2)·v = w_1·v + w_2·v = 0 + 0 = 0 for any vector v ∈ V (note that we proved the distributivity of the dot product on the review for the first exam).
Finally, suppose that w ∈ V^⊥ (that is, w·v = 0 for all v ∈ V) and let k ∈ R. Then (k·w)·v = k(w·v) = k·0 = 0 for all v ∈ V (note that we proved the associativity of scalars for the dot product on the review for the first exam). Thus, V^⊥ is a subspace of R^n.

14. Let v_1, ..., v_m ∈ R^n and let T(x) be a linear transformation from R^n to R^p. If v_1, ..., v_m are linearly dependent, are T(v_1), ..., T(v_m) necessarily linearly dependent? If v_1, ..., v_m are linearly independent, are T(v_1), ..., T(v_m) necessarily linearly independent?

Most of the time, the best way (in my opinion) to answer questions about linear (in)dependence is to look at linear relations (although you may use any part of Summary 3.2.9 if you prefer, or Summary 3.3.10 in some cases, and you should at least know the other ways, as they're occasionally useful). So, if v_1, ..., v_m are linearly dependent, then we may suppose that c_1·v_1 + ... + c_m·v_m = 0, where not all of the c_i = 0. Taking T of both sides and using the properties of linear transformations, we see that c_1·T(v_1) + ... + c_m·T(v_m) = 0 (note that T(0) = A·0 = 0, where A is the matrix of T), so that T(v_1), ..., T(v_m) are linearly dependent (since not all of the c_i = 0).

For linear independence, the same trick won't work: it'll tell us that if we start with the trivial relation among v_1, ..., v_m, then we'll get the trivial relation among T(v_1), ..., T(v_m) also, but this doesn't prove that other relations don't exist. If we knew that T were invertible, we could try the same thing in reverse to transform a relation among T(v_1), ..., T(v_m) into a relation among v_1, ..., v_m, and use the linear independence of v_1, ..., v_m to show the linear independence of T(v_1), ..., T(v_m). However, what if T isn't invertible? Let's try an example where T is not invertible (note that while examples are generally useless for proving that something is true, one example is all you need to prove that something is false): the simplest such example is T(x) = 0 for all x (from R^2 to R^2, say).
Then e_1, e_2 are linearly independent, but T(e_1), T(e_2) are both the zero vector and so not linearly independent. Thus, the answer to the second part is: no, they are not necessarily linearly independent (although we saw that they will be if T is invertible).

An informal way to think about this is to imagine that the coordinates (that is to say, coefficients) of a set of linearly independent vectors are information (while the coordinates of linearly dependent vectors don't carry the same amount of information, since there isn't a unique way to extract the coordinates). Then these results say that applying a linear transformation to a set of vectors won't gain any new information, but may lose some of the existing information.

15. Consider two 3×3 matrices A and B and a vector v ∈ R^3 such that AB·v = 0, BA·v = 0, and A^2·v = 0, but A·v ≠ 0 and B^2·v ≠ 0.

(a) Show that the vectors B = {A·v, B·v, v} form a basis of R^3.
(b) Find the B-matrix of the transformation T(x) = A·x.

(a) By Summary 3.3.10, it's enough to show that these vectors are linearly independent. (Showing that they span R^3 would also be sufficient, but showing linear independence is usually easier.) As above, we'll start by looking at a relation of the form c_1·Av + c_2·Bv + c_3·v = 0 and calculate that c_1 = c_2 = c_3 = 0 to show linear independence. First, apply the matrix A to both sides on the left: A(c_1·Av + c_2·Bv + c_3·v) = A·0, or c_1·A^2·v + c_2·AB·v + c_3·A·v = 0
, which implies c_3·A·v = 0, as the other terms are 0. Since A·v ≠ 0, this means that c_3 = 0. Now, apply B to both sides on the left: B(c_1·Av + c_2·Bv) = B·0, so that c_2·B^2·v = 0, since BA·v = 0. As B^2·v ≠ 0, this shows that c_2 = 0. Thus, we have c_1·A·v = 0, and since A·v ≠ 0 we have c_1 = 0. Thus, the only relation among the vectors in B is the trivial relation, so they are linearly independent.

(b) The easiest way to go about this is to use the (unnumbered) theorem following the definition of the B-matrix in your book to calculate the B-matrix column-by-column (using the other methods may be possible, but probably gets more than a bit messy). We compute [T(Av)]_B = [A^2·v]_B = [0]_B = 0, [T(Bv)]_B = [AB·v]_B = [0]_B = 0, and [T(v)]_B = [A·v]_B = e_1, so that the B-matrix of T is B = [0, 0, 1; 0, 0, 0; 0, 0, 0].

16. We say that a set of vectors v_1, ..., v_m ∈ R^n is orthonormal if they are perpendicular unit vectors; that is, if v_i·v_j = 1 when i = j and v_i·v_j = 0 when i ≠ j. Show that if u_1, ..., u_n ∈ R^n are orthonormal, then B = {u_1, ..., u_n} forms a basis of R^n. (Hint: Let c_1·u_1 + ... + c_n·u_n = 0 and take the dot product of both sides with u_j for j = 1, ..., n.)

Note that u_j·(c_1·u_1 + ... + c_n·u_n) = u_j·0 = 0 simplifies to c_1(u_j·u_1) + ... + c_n(u_j·u_n) = 0, and that u_j·u_i = 1 when i = j and 0 when i ≠ j. So this tells us that c_1(0) + ... + c_j(1) + ... + c_n(0) = 0, or that c_j = 0. Doing this for j = 1, ..., n, we see that c_1 = ... = c_n = 0, so that the only relation among the vectors of B is the trivial relation. As above, this is sufficient to show that B is a basis by Summary 3.3.10.

17. Let A be a given 3×3 matrix and let B = {b_1, b_2, b_3} be a basis of R^3. Find the B-matrix of the transformation T(x) = A·x.

The B-matrix of T is B = S^(−1)·A·S, where S = [b_1 b_2 b_3] is the matrix with the basis vectors as its columns; equivalently, the jth column of B is [T(b_j)]_B.
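The recipe in #17 can be checked on any example. The 2×2 matrix A and basis below are made-up stand-ins of my own (not the handout's data); the point is only to illustrate forming S from the basis vectors and computing B = S^(−1)·A·S with exact rational arithmetic:

```python
from fractions import Fraction as F

def matmul(X, Y):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det],
            [-c / det, a / det]]

# Made-up data: a matrix A and a basis s1 = (1, 1), s2 = (1, 2) of R^2.
A = [[F(2), F(1)],
     [F(0), F(3)]]
S = [[F(1), F(1)],   # the columns of S are the basis vectors
     [F(1), F(2)]]

B_matrix = matmul(inv2(S), matmul(A, S))   # B = S^(-1) A S
print(B_matrix == [[3, 2], [0, 2]])        # True
```

The result also matches the column-by-column description: the first column, (3, 0), says T(s_1) = 3·s_1 + 0·s_2, which you can verify directly since A·(1, 1) = (3, 3).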
18. Let A be a 2×2 matrix of the form A = [a, b; b, −a], where a^2 + b^2 = 1 and a ≠ −1, and let B = {(1+a, b), (−b, 1+a)} be a basis of R^2. Find the B-matrix of the transformation T(x) = A·x. Interpret your answer geometrically.

The B-matrix is B = [1, 0; 0, −1]. Geometrically, A is the matrix of a reflection across the line spanned by (1+a, b), and the vector (−b, 1+a) is perpendicular to this line, so this result shows that if we write a vector as a linear combination of a vector on the line (i.e., x_∥) and a vector perpendicular to the line (i.e., x_⊥), then the vector on the line is unaltered by T, while the coefficient of the vector perpendicular to the line is multiplied by −1 (that is, its direction is replaced by its opposite). In other words, T(x_∥ + x_⊥) = x_∥ − x_⊥. Aside: It's easy to verify that B really is a basis of R^2, for example by #16.

19. Let A be a 2×2 matrix of the form A = [u_1^2, u_1·u_2; u_1·u_2, u_2^2], where u_1^2 + u_2^2 = 1, and let B = {(u_1, u_2), (−u_2, u_1)} be a basis of R^2. Find the B-matrix of the transformation T(x) = A·x. Interpret your answer geometrically.

The B-matrix is B = [1, 0; 0, 0]. Geometrically, A is the matrix of an orthogonal projection onto the line spanned by (u_1, u_2), and (−u_2, u_1) is a vector perpendicular to this line. So this result shows that if we write a vector x as a linear combination of a vector on the line (i.e., x_∥) and a vector perpendicular to the line (i.e., x_⊥), then the vector on the line is unaltered by T, while the vector perpendicular to the line is mapped to the zero vector. In other words, T(x_∥ + x_⊥) = x_∥. Aside: It's easy to verify that B really is a basis of R^2, for example by #16.

20. True or False:

(a) Let B = {e_n, ..., e_1} (the standard basis vectors in the opposite order) and let A = [v_1 ... v_n]. Then the B-matrix of T(x) = A·x is the matrix [v_n ... v_1] (the column vectors in the opposite order). False. Both rows and columns are swapped, so the B-matrix B is defined entrywise by b_(i,j) = a_(n+1−i, n+1−j).

(b) Let A be an n×n matrix. If A^2 = A, then A = I_n. False.
For example, in the 2×2 case, let A be the matrix of an orthogonal projection.

(c) Let A be an n×m matrix. Then dim(im(A)) + dim(ker(A)) = n. False. By Rank-Nullity, this sum is equal to m.

(d) If A is a 2×2 matrix representing an orthogonal projection, then A is similar to [1, 0; 0, 0]. True. See #19.

(e) Let A and B be n×n matrices. If A is similar to B, then A^t is similar to B^t for every positive integer t. True. See Example 7 in the book.
(f) Let A and B be n×n matrices such that B is similar to A. If A is invertible, then B is invertible. True. Write B = S^(−1)·A·S and use Theorem 2.4.7 to show that B^(−1) = S^(−1)·A^(−1)·S. (Note: This also shows that B^(−1) is similar to A^(−1).)

(g) If A and B are invertible n×n matrices, then AB is invertible and (AB)^(−1) = A^(−1)·B^(−1). False. See Theorem 2.4.7: the correct formula is (AB)^(−1) = B^(−1)·A^(−1).

(h) Let V be the set of all unit vectors in R^n (that is, the set of all vectors x such that x·x = 1). Then V is a subspace of R^n. False. This actually fails all three properties for being a subspace. Most easily, 0·0 = 0 ≠ 1, so 0 ∉ V.

(i) If V is a subspace of R^n, then dim(V) ≤ n. True. See, for example, the results on dimension in Section 3.3. (Note: In the context of Chapter 4, this is a special case of the result that if V is a subspace of W, then dim(V) ≤ dim(W), but don't worry about that now, since we haven't even defined what this means in an abstract vector space yet.)

(j) If the vectors v_1, ..., v_m ∈ R^n are linearly independent and the vector v ∈ R^n is not in span(v_1, ..., v_m), then the vectors v_1, ..., v_m, v are linearly independent. True. We know that none of v_1, ..., v_m are redundant, since they are linearly independent, and v can't be redundant, since if it were it would be in span(v_1, ..., v_m). Aside: This is one trick for finding a basis of a subspace of R^n: pick any nonzero vector in the space. If it doesn't span the space, pick a vector not in the span and add it to the set of vectors you've picked. Then repeat until you end up with a set of vectors that does span the subspace. Since you can find at most n linearly independent vectors in R^n, this process is guaranteed to terminate after finitely many steps, and it is constructed in such a way that the set of vectors you end up with will both span the space and be linearly independent.

(k) If the ith column of an n×m matrix A is equal to the jth column of A for some i ≠ j, then the vector e_i − e_j is in the kernel of A. True.
If the columns of A are v_k for k = 1, ..., m (so that v_i = v_j), then A(e_i − e_j) = A·e_i − A·e_j = v_i − v_j = 0.

(l) Suppose the vectors v_1, ..., v_m ∈ R^n are linearly independent. Define the vectors w_j = v_1 + ... + v_j for j = 1, ..., m to be the sum of the first j vectors from v_1, ..., v_m (so, for example, w_1 = v_1, w_2 = v_1 + v_2, ..., w_m = v_1 + ... + v_m). Then the vectors w_1, ..., w_m are linearly independent. True. Use the fact that v_1, ..., v_j are linearly independent to show that the vector w_j is not a linear combination of w_1, ..., w_(j−1) for j = 2, ..., m.

(m) Let v_1, ..., v_m, v, w ∈ R^n. If v is not a linear combination of v_1, ..., v_m and w is not a linear combination of v_1, ..., v_m, then v + w is not a linear combination of v_1, ..., v_m. False. For example, let w be any vector that isn't a linear combination of v_1, ..., v_m and then let v = v_1 − w.
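The claim in 20(l) can be spot-checked numerically. The independent triple v_1, v_2, v_3 below is an arbitrary choice of mine, with independence tested via a 3×3 determinant:

```python
# Spot-check of 20(l): if v1, v2, v3 in R^3 are linearly independent,
# so are the partial sums w1 = v1, w2 = v1 + v2, w3 = v1 + v2 + v3.

def det3(M):
    """Determinant of a 3x3 matrix given as a list of rows."""
    a, b, c = M
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

v1, v2, v3 = [1, 0, 2], [0, 1, 1], [1, 1, 0]
assert det3([v1, v2, v3]) != 0   # the v's are independent

w1 = v1
w2 = [v1[i] + v2[i] for i in range(3)]
w3 = [w2[i] + v3[i] for i in range(3)]
print(det3([w1, w2, w3]))        # -3, nonzero, so the w's are independent too
```

In fact the two determinants agree, since the change of coordinates from the v's to the w's is given by a triangular matrix with 1's on the diagonal.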
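The hypotheses of #15 can also be realized concretely. The matrices A, B and vector v below are my own example (not the handout's) satisfying AB·v = BA·v = A^2·v = 0 while A·v ≠ 0 and B^2·v ≠ 0:

```python
# Concrete matrices realizing the hypotheses of #15 (my own choices):
# ABv = BAv = A^2 v = 0, while Av != 0 and B^2 v != 0.

def mat_vec(M, x):
    """Multiply a 3x3 matrix (list of rows) by a vector in R^3."""
    return [sum(M[i][k] * x[k] for k in range(3)) for i in range(3)]

A = [[0, 0, 1],
     [0, 0, 0],
     [0, 0, 0]]   # sends e3 -> e1 and kills e1, e2
B = [[0, 0, 0],
     [0, 1, 1],
     [0, 0, 0]]   # sends e2 -> e2 and e3 -> e2, and kills e1
v = [0, 0, 1]

Av, Bv = mat_vec(A, v), mat_vec(B, v)
print(Av, Bv)                    # [1, 0, 0] [0, 1, 0] -- both nonzero
print(mat_vec(A, Bv))            # [0, 0, 0]  (ABv = 0)
print(mat_vec(B, Av))            # [0, 0, 0]  (BAv = 0)
print(mat_vec(A, Av))            # [0, 0, 0]  (A^2 v = 0)
print(mat_vec(B, Bv))            # [0, 1, 0]  (B^2 v != 0)
```

Relative to the basis {A·v, B·v, v} (here just the standard basis e_1, e_2, e_3), the B-matrix of T(x) = A·x has columns 0, 0, and e_1, exactly as computed in #15(b).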
More information18.06 Quiz 2 April 7, 2010 Professor Strang
18.06 Quiz 2 April 7, 2010 Professor Strang Your PRINTED name is: 1. Your recitation number or instructor is 2. 3. 1. (33 points) (a) Find the matrix P that projects every vector b in R 3 onto the line
More information(v, w) = arccos( < v, w >
MA322 F all203 Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: For all v,
More informationis Use at most six elementary row operations. (Partial
MATH 235 SPRING 2 EXAM SOLUTIONS () (6 points) a) Show that the reduced row echelon form of the augmented matrix of the system x + + 2x 4 + x 5 = 3 x x 3 + x 4 + x 5 = 2 2x + 2x 3 2x 4 x 5 = 3 is. Use
More informationChapters 5 & 6: Theory Review: Solutions Math 308 F Spring 2015
Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 205. If A is a 3 3 triangular matrix, explain why det(a) is equal to the product of entries on the diagonal. If A is a lower triangular or diagonal
More informationM340L Final Exam Solutions May 13, 1995
M340L Final Exam Solutions May 13, 1995 Name: Problem 1: Find all solutions (if any) to the system of equations. Express your answer in vector parametric form. The matrix in other words, x 1 + 2x 3 + 3x
More informationReview for Exam Find all a for which the following linear system has no solutions, one solution, and infinitely many solutions.
Review for Exam. Find all a for which the following linear system has no solutions, one solution, and infinitely many solutions. x + y z = 2 x + 2y + z = 3 x + y + (a 2 5)z = a 2 The augmented matrix for
More informationMATH 31 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL
MATH 3 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL MAIN TOPICS FOR THE FINAL EXAM:. Vectors. Dot product. Cross product. Geometric applications. 2. Row reduction. Null space, column space, row space, left
More informationa s 1.3 Matrix Multiplication. Know how to multiply two matrices and be able to write down the formula
Syllabus for Math 308, Paul Smith Book: Kolman-Hill Chapter 1. Linear Equations and Matrices 1.1 Systems of Linear Equations Definition of a linear equation and a solution to a linear equations. Meaning
More informationSolutions to Math 51 First Exam October 13, 2015
Solutions to Math First Exam October 3, 2. (8 points) (a) Find an equation for the plane in R 3 that contains both the x-axis and the point (,, 2). The equation should be of the form ax + by + cz = d.
More informationMath 320, spring 2011 before the first midterm
Math 320, spring 2011 before the first midterm Typical Exam Problems 1 Consider the linear system of equations 2x 1 + 3x 2 2x 3 + x 4 = y 1 x 1 + 3x 2 2x 3 + 2x 4 = y 2 x 1 + 2x 3 x 4 = y 3 where x 1,,
More informationElementary maths for GMT
Elementary maths for GMT Linear Algebra Part 2: Matrices, Elimination and Determinant m n matrices The system of m linear equations in n variables x 1, x 2,, x n a 11 x 1 + a 12 x 2 + + a 1n x n = b 1
More informationFinal Review Written by Victoria Kala SH 6432u Office Hours R 12:30 1:30pm Last Updated 11/30/2015
Final Review Written by Victoria Kala vtkala@mathucsbedu SH 6432u Office Hours R 12:30 1:30pm Last Updated 11/30/2015 Summary This review contains notes on sections 44 47, 51 53, 61, 62, 65 For your final,
More informationChapter 2 Notes, Linear Algebra 5e Lay
Contents.1 Operations with Matrices..................................1.1 Addition and Subtraction.............................1. Multiplication by a scalar............................ 3.1.3 Multiplication
More informationLINEAR ALGEBRA REVIEW
LINEAR ALGEBRA REVIEW SPENCER BECKER-KAHN Basic Definitions Domain and Codomain. Let f : X Y be any function. This notation means that X is the domain of f and Y is the codomain of f. This means that for
More informationSolution to Homework 1
Solution to Homework Sec 2 (a) Yes It is condition (VS 3) (b) No If x, y are both zero vectors Then by condition (VS 3) x = x + y = y (c) No Let e be the zero vector We have e = 2e (d) No It will be false
More informationMATH 310, REVIEW SHEET 2
MATH 310, REVIEW SHEET 2 These notes are a very short summary of the key topics in the book (and follow the book pretty closely). You should be familiar with everything on here, but it s not comprehensive,
More informationLecture Summaries for Linear Algebra M51A
These lecture summaries may also be viewed online by clicking the L icon at the top right of any lecture screen. Lecture Summaries for Linear Algebra M51A refers to the section in the textbook. Lecture
More informationMath 416, Spring 2010 The algebra of determinants March 16, 2010 THE ALGEBRA OF DETERMINANTS. 1. Determinants
THE ALGEBRA OF DETERMINANTS 1. Determinants We have already defined the determinant of a 2 2 matrix: det = ad bc. We ve also seen that it s handy for determining when a matrix is invertible, and when it
More information(v, w) = arccos( < v, w >
MA322 Sathaye Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: For all v
More informationAssignment 1 Math 5341 Linear Algebra Review. Give complete answers to each of the following questions. Show all of your work.
Assignment 1 Math 5341 Linear Algebra Review Give complete answers to each of the following questions Show all of your work Note: You might struggle with some of these questions, either because it has
More informationMATH 20F: LINEAR ALGEBRA LECTURE B00 (T. KEMP)
MATH 20F: LINEAR ALGEBRA LECTURE B00 (T KEMP) Definition 01 If T (x) = Ax is a linear transformation from R n to R m then Nul (T ) = {x R n : T (x) = 0} = Nul (A) Ran (T ) = {Ax R m : x R n } = {b R m
More informationIMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET
IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each
More informationLinear Algebra II. 2 Matrices. Notes 2 21st October Matrix algebra
MTH6140 Linear Algebra II Notes 2 21st October 2010 2 Matrices You have certainly seen matrices before; indeed, we met some in the first chapter of the notes Here we revise matrix algebra, consider row
More informationThis MUST hold matrix multiplication satisfies the distributive property.
The columns of AB are combinations of the columns of A. The reason is that each column of AB equals A times the corresponding column of B. But that is a linear combination of the columns of A with coefficients
More informationx 1 + 2x 2 + 3x 3 = 0 x 1 + 2x 2 + 3x 3 = 0, x 2 + x 3 = 0 x 3 3 x 3 1
. Orthogonal Complements and Projections In this section we discuss orthogonal complements and orthogonal projections. The orthogonal complement of a subspace S is the complement that is orthogonal to
More informationMatrices. 1 a a2 1 b b 2 1 c c π
Matrices 2-3-207 A matrix is a rectangular array of numbers: 2 π 4 37 42 0 3 a a2 b b 2 c c 2 Actually, the entries can be more general than numbers, but you can think of the entries as numbers to start
More information1 9/5 Matrices, vectors, and their applications
1 9/5 Matrices, vectors, and their applications Algebra: study of objects and operations on them. Linear algebra: object: matrices and vectors. operations: addition, multiplication etc. Algorithms/Geometric
More informationChapter SSM: Linear Algebra Section Fails to be invertible; since det = 6 6 = Invertible; since det = = 2.
SSM: Linear Algebra Section 61 61 Chapter 6 1 2 1 Fails to be invertible; since det = 6 6 = 0 3 6 3 5 3 Invertible; since det = 33 35 = 2 7 11 5 Invertible; since det 2 5 7 0 11 7 = 2 11 5 + 0 + 0 0 0
More informationFebruary 20 Math 3260 sec. 56 Spring 2018
February 20 Math 3260 sec. 56 Spring 2018 Section 2.2: Inverse of a Matrix Consider the scalar equation ax = b. Provided a 0, we can solve this explicity x = a 1 b where a 1 is the unique number such that
More informationSUMMARY OF MATH 1600
SUMMARY OF MATH 1600 Note: The following list is intended as a study guide for the final exam. It is a continuation of the study guide for the midterm. It does not claim to be a comprehensive list. You
More informationLinear Algebra Notes. Lecture Notes, University of Toronto, Fall 2016
Linear Algebra Notes Lecture Notes, University of Toronto, Fall 2016 (Ctd ) 11 Isomorphisms 1 Linear maps Definition 11 An invertible linear map T : V W is called a linear isomorphism from V to W Etymology:
More informationFamily Feud Review. Linear Algebra. October 22, 2013
Review Linear Algebra October 22, 2013 Question 1 Let A and B be matrices. If AB is a 4 7 matrix, then determine the dimensions of A and B if A has 19 columns. Answer 1 Answer A is a 4 19 matrix, while
More informationMODULE 8 Topics: Null space, range, column space, row space and rank of a matrix
MODULE 8 Topics: Null space, range, column space, row space and rank of a matrix Definition: Let L : V 1 V 2 be a linear operator. The null space N (L) of L is the subspace of V 1 defined by N (L) = {x
More information