
MAT2342 : Introduction to Applied Linear Algebra
Mike Newman, fall 2017

These notes are intended for students in mike's MAT2342. For other uses please say hi to mnewman@uottawa.ca.

9. Projections

introduction

One reason to consider projections is to understand approximate solutions to linear systems. A common example of this is to find the best-fit line to a given set of points: this is the regression line.

Problem 9.1. Find the best line that approximates the three given points.

Situations like this arise in any context where it is felt that, in some sense, the value of some parameter (x) should determine the value of some outcome (y), but where there are other sources of variability in the data. So an experimenter measures outcomes (y) for different values of some parameter (x), with the goal of determining the relationship between them. Some examples: one might wish to determine the gravitational constant g at a particular location, and use the time taken for an object to fall a given distance to determine g. The actual times will vary due to air turbulence, measurement error, etc., but on average should correlate with the true value of g.

Problem 9.1 is of course somewhat oversimplified: three data points don't really indicate much. But it is small enough that one could plot the three points and try and fit a line as close as possible. This can be seen as an application of projection in a vector space; in fact, a projection in a high-dimensional vector space. Projections have a number of other uses, so we start by establishing a bit of theory before returning to our application above.

scalar product

The (standard) scalar product of two vectors in Rⁿ is defined as follows.

    x·y = xᵗy = x₁y₁ + x₂y₂ + ⋯ + xₙyₙ

Some useful geometrical facts about the scalar product.

Proposition 9.2. For x·y = xᵗy we have the following.
- The Euclidean length of a vector x is ‖x‖ = √(x·x). We sometimes call this the norm of a vector. In two dimensions this is the Pythagoras theorem. Furthermore, ‖x‖ = 0 if and only if x·x = 0, if and only if x = 0.
- Two vectors x and y are orthogonal if the angle between x and y is 90°; we write x ⊥ y. We have x ⊥ y if and only if x·y = 0.
- More generally, if we let θ be the angle between x and y, then we have x·y = ‖x‖ ‖y‖ cos θ.

Problem 9.3. If we draw x and y, there are two angles that could be considered the angle between them. Furthermore, we could measure this angle in two different directions, so it is unclear if this angle is positive or negative. Explain how these ambiguities do not affect the expression ‖x‖ ‖y‖ cos θ.
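None of this is special to hand computation: the scalar product, norms and angles are one-liners in any numerical language. Here is a small sketch in Python with numpy; the two vectors are arbitrary sample values, not taken from any example in these notes.

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])   # sample vectors, chosen arbitrarily
y = np.array([2.0, -2.0, 1.0])

print(x @ y)                    # scalar product x.y = x^t y
print(np.sqrt(x @ x))           # norm ||x|| = sqrt(x.x); here 3.0
cos_theta = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
print(cos_theta)                # 0.0 here, so these two vectors are orthogonal
```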

There are also useful algebraic properties of the scalar product.

Proposition 9.4. For x, y, u, v ∈ Rⁿ and a, b ∈ R we have the following.
- x·y = y·x
- x·(au + bv) = a(x·u) + b(x·v)

Proof. We leave this as an exercise, using known properties of matrix multiplication.

The first property of Proposition 9.4 is called commutativity of the scalar product; the second is called linearity of the scalar product.

orthogonal projection

The (orthogonal) projection of a vector x onto a vector y is the part of x that is in the direction of y. In order to make this precise, we decompose x into two parts: one in the direction of y and the other perpendicular to y.

[figure: x drawn as the sum of a vector u along y and a vector v perpendicular to y]

In this picture the vector u represents the projection of x onto y. We write u = proj_y(x). The vector v represents the part of x that is orthogonal to y. We have x = u + v. Knowing the projection u, we could find the orthogonal part as v = x − u.

Problem 9.5. Is proj_y(x) = proj_x(y)?

We can calculate the projection using the following formula.

Proposition 9.6. The projection of x onto y is

    proj_y(x) = (y·x)/(y·y) y

In particular, if we let u = proj_y(x) = (y·x)/(y·y) y and v = x − u, then u is parallel to y, v is orthogonal to y, and x = u + v.

Proof. We see that u is parallel to y because it is a scalar multiple of y. By definition we have x = u + v. So we prove that v is orthogonal to y.

    y·v = yᵗ( x − (y·x)/(y·y) y ) = y·x − (y·x)/(y·y) (y·y) = y·x − y·x = 0

Note that we used Proposition 9.4 in simplifying the expression.
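Proposition 9.6 (and Proposition 9.10 below) are easy to experiment with numerically. The following sketch, again with arbitrary sample vectors, computes proj_y(x), checks that the leftover part is orthogonal to y, and checks that rescaling y does not change the projection.

```python
import numpy as np

def proj(y, x):
    """Orthogonal projection of x onto y, as in Proposition 9.6."""
    return (y @ x) / (y @ y) * y

x = np.array([3.0, 1.0, 2.0])    # arbitrary sample vectors
y = np.array([6.0, 0.0, 12.0])

u = proj(y, x)                   # the part of x in the direction of y
v = x - u                        # the part of x orthogonal to y
print(u, v, y @ v)               # y.v is 0 (up to rounding)
print(proj(-2.5 * y, x))         # rescaling y leaves proj_y(x) unchanged
```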

Example 9.7. Let x = … and y = … be the given vectors in R³. Calculate the projection of x onto y.

Solution. We calculate the projection as

    proj_y(x) = (y·x)/(y·y) y

We can then calculate the rest of x, that is to say the part orthogonal to y, as x − proj_y(x). We obtain a decomposition of x in two parts: the first in the direction of y (this is proj_y(x)) and the second orthogonal to y.

Problem 9.8. Check that the decomposition in the previous example has the right directions: the first vector should be a multiple of y and the second should be perpendicular to y.

Problem 9.9. Let x = … and y = … be the given vectors. Calculate proj_y(x) and give a decomposition of x into two vectors: one parallel to y and the other orthogonal to y. Also calculate proj_x(y) and give a decomposition of y into two vectors: one parallel to x and the other orthogonal to x.

projection onto subspaces

We can think of Proposition 9.6 in a slightly more general way. We are given a vector x and a subspace U, and we want to express x = u + v, where u ∈ U and v is perpendicular to U. Putting aside the fact that we don't (yet) have an idea of what it might mean for a vector to be perpendicular to a subspace, we can think of Proposition 9.6 as corresponding to the case where U is the span of a single vector y. In this case at least, it seems reasonable to say that v is perpendicular to U exactly when it is perpendicular to y. This is in fact the real notion of projection that we want; we will see that Proposition 9.6 essentially extends to projection onto subspaces, but we need a bit of work first. In the meantime, and inspired by the thought that projection onto y might be the same as projection onto the space spanned by y, we notice the following.

Proposition 9.10. Let z = γy, where neither y nor z is the zero vector. Then proj_y(x) = proj_z(x).

bases (review)

Let B = {v₁, v₂, …, vₖ} be a set of vectors in a vector space V. We say that B spans V (or B is a spanning set for V) if every vector in V can be written as a linear combination of the elements of B. We say that B is linearly independent if the only linear combination of the elements of B that gives 0 is the trivial one. In other words, B is linearly independent if

    α₁v₁ + α₂v₂ + ⋯ + αₖvₖ = 0  ⟹  α₁ = α₂ = ⋯ = αₖ = 0
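Linear independence is also something we can test mechanically: the vectors v₁, …, vₖ are linearly independent exactly when the homogeneous system with those vectors as columns has only the trivial solution, that is, when the matrix of columns has rank k. A sketch, with assumed sample vectors:

```python
import numpy as np

# Columns are the vectors to test; they are linearly independent exactly
# when the rank equals the number of columns. Sample vectors assumed.
V = np.column_stack([[1.0, 0.0, 1.0],
                     [0.0, 1.0, 1.0],
                     [1.0, 1.0, 2.0]])
print(np.linalg.matrix_rank(V) == V.shape[1])   # False: v3 = v1 + v2
```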

A basis of a vector space V is a set of vectors in V that is linearly independent and spanning. We have already seen examples in this course. We found bases for eigenspaces of a matrix. For a diagonalizable matrix we saw that, taking the union of the bases for each eigenspace, we get a basis for Rⁿ. We called this an eigenbasis of Rⁿ. This was useful in studying dynamical systems.

A basis is useful because it gives a unique expression for every vector.

Theorem 9.11. Let B = {v₁, v₂, …, vₖ} be a set of vectors in some vector space V. Then B is a basis if and only if for every vector x in V there exist unique values α₁, α₂, …, αₖ that give

    x = α₁v₁ + α₂v₂ + ⋯ + αₖvₖ

These values αᵢ in Theorem 9.11 are the coordinates of x with respect to the basis B.

Proof. Assume B is a basis. We must show that any x can be uniquely expressed in terms of B. First of all, since B spans V, there certainly are some numbers α₁, …, αₖ such that x = α₁v₁ + ⋯ + αₖvₖ. Now assume this expression is not unique, that is,

    x = α₁v₁ + α₂v₂ + ⋯ + αₖvₖ = β₁v₁ + β₂v₂ + ⋯ + βₖvₖ

Then

    0 = x − x = (α₁v₁ + ⋯ + αₖvₖ) − (β₁v₁ + ⋯ + βₖvₖ) = (α₁ − β₁)v₁ + (α₂ − β₂)v₂ + ⋯ + (αₖ − βₖ)vₖ

But B is linearly independent, so we must have αᵢ − βᵢ = 0 for each i, which means that αᵢ = βᵢ and there is exactly one way to express x as a linear combination of the elements of B.

Now assume that for every vector x ∈ V there is exactly one way to express x as a linear combination of the elements of B. In particular, there is some way to express each x, so B spans V. And for x = 0 there are constants such that 0 = γ₁v₁ + γ₂v₂ + ⋯ + γₖvₖ. But we know that 0 = 0v₁ + 0v₂ + ⋯ + 0vₖ, so by uniqueness we must have γᵢ = 0 for each i, which shows that B is linearly independent.

Right about now we should admit to an abuse of notation. When we are thinking of the coordinates of a vector with respect to a basis, we are thinking of the basis as an ordered set (sometimes called a tuple). In other words, we need to have a way to tell which vector is first, so we know which element α₁ corresponds to, and so on for the second, third, etc. This doesn't usually cause any problem, but note that when we say that a basis is a set of vectors we really mean a set with an ordering.
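Theorem 9.11 also tells us how to compute coordinates in practice: finding the αᵢ means solving a linear system whose coefficient matrix has the basis vectors as columns. A sketch, with an assumed sample basis of R²:

```python
import numpy as np

B = np.column_stack([[1.0, 0.0],      # basis vectors as columns
                     [1.0, 1.0]])     # (sample basis, assumed)
x = np.array([3.0, 4.0])

alpha = np.linalg.solve(B, x)         # unique solution, since B is a basis
print(alpha)                          # coordinates of x: here (-1, 4)
```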

Example 9.12. Check that B = {(1, 0)ᵗ, (0, 1)ᵗ} and B′ = {…} are both bases of R². Then find the coordinates of the given vector with respect to both of these.

Solution. In order to check that these are both bases, we need to check that they are linearly independent and that they span R². Alternatively, we need to check that every vector x can be uniquely expressed in terms of the elements of the alleged basis. For B we need to show that for every vector x ∈ R² there is a unique solution to

    α₁ (1, 0)ᵗ + α₂ (0, 1)ᵗ = (x₁, x₂)ᵗ

But this is true if and only if α₁ = x₁ and α₂ = x₂. So there is a unique solution for every x. For B′, writing its elements as v₁′ and v₂′, we want a unique solution to

    α₁ v₁′ + α₂ v₂′ = (x₁, x₂)ᵗ

and we would consider the corresponding augmented matrix. This will have a solution for every x if it has a pivot in every row; it will have a unique solution if it has a pivot in every column (no free variables). So we conclude that a set of vectors is a basis for Rⁿ if and only if, when we put these vectors as columns in a matrix, we have a pivot in every row and column. We leave it as an exercise to apply this to the example above.

Now to find coordinates. With respect to B we are looking for numbers α₁, α₂ with the given vector equal to α₁ (1, 0)ᵗ + α₂ (0, 1)ᵗ; here we can see the coordinates directly. With respect to B′ we are looking for numbers α₁, α₂ with the given vector equal to α₁v₁′ + α₂v₂′; as before, we might switch to an augmented matrix perspective and row-reduce to find them.

We notice that a vector can have different coordinates with respect to different bases. In fact, we might notice that B and B′ have exactly one element in common, but it is the coordinate of that element that changed. So when we speak of the coordinates of a vector with respect to a basis, it is with respect to the whole basis.

There are some other results about bases we will find useful.

Theorem 9.13. Let B and B′ be bases for a vector space. Then |B| = |B′|.

We say that the dimension of a vector space is the size of a basis. By the previous theorem this is well-defined. In fact, we can say a little more than Theorem 9.13.

Theorem 9.14. Let V be a vector space of dimension n and B a set of vectors in V. If any two of the following are true, then so is the third.
- B is linearly independent
- B spans V
- |B| = n
In particular, B is then a basis of V.

orthogonal bases

Orthogonality is not just a geometric property: it is an algebraic one also. A set B is an orthogonal set if every pair of vectors in B is orthogonal and it does not contain the zero vector.
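Checking whether a given set is orthogonal is purely mechanical: compute all pairwise scalar products. A small sketch (sample vectors assumed):

```python
import numpy as np
from itertools import combinations

def is_orthogonal_set(vectors, tol=1e-12):
    """True if all vectors are nonzero and pairwise orthogonal."""
    return (all(v @ v > tol for v in vectors) and
            all(abs(u @ v) < tol for u, v in combinations(vectors, 2)))

S = [np.array([1.0, 1.0, -1.0]), np.array([-1.0, 1.0, 0.0])]
print(is_orthogonal_set(S))          # True for this sample pair
```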

Theorem 9.15. Let {u₁, u₂, …, uₖ} be an orthogonal set. Then {u₁, u₂, …, uₖ} is linearly independent. The converse is not true: a linearly independent set is not necessarily orthogonal.

Proof. Assume {u₁, u₂, …, uₖ} is an orthogonal set and that

    0 = α₁u₁ + α₂u₂ + ⋯ + αₖuₖ

Now we take the scalar product of each side with uᵢ, for some 1 ≤ i ≤ k.

    0 = uᵢ·(α₁u₁ + α₂u₂ + ⋯ + αₖuₖ) = α₁(uᵢ·u₁) + α₂(uᵢ·u₂) + ⋯ + αₖ(uᵢ·uₖ) = αᵢ(uᵢ·uᵢ)

so αᵢ = 0. We used Proposition 9.4 to distribute the product; we used the fact that uᵢ ≠ 0 and Proposition 9.2 to divide by uᵢ·uᵢ at the end.

An important consequence of this is that if we have n (nonzero) orthogonal vectors in Rⁿ then, using Theorem 9.15 and Theorem 9.14, we have a basis for Rⁿ. We say that a basis whose elements are pairwise orthogonal is an orthogonal basis. Similarly, if we have k orthogonal vectors in a subspace U of dimension k, we would have an orthogonal basis for U.

Problem 9.16. Let B be an orthogonal set of vectors from Rⁿ. Show that it is an orthogonal basis for the subspace spanned by B.

Problem 9.17. Verify that {…} is an orthogonal basis for R². Verify that {…} is not an orthogonal basis for R².

An orthonormal basis is an orthogonal basis where each vector is of length (norm) equal to 1. We can get an orthonormal basis from an orthogonal one by dividing each vector by its norm.

Example 9.18. Verify that {u₁, u₂} = {…} is an orthogonal basis for the subspace spanned by these two vectors. Then transform it into an orthonormal basis.

Solution. We check that u₁·u₂ = 0 (expanding the three products and summing), which means that they are orthogonal and hence (Theorem 9.15) linearly independent. They certainly span the space that they span, so they are a basis for that subspace. We calculate their norms ‖u₁‖ and ‖u₂‖. Dividing each vector by its norm, we have an orthonormal basis (for the space spanned by the two vectors):

    { u₁/‖u₁‖ , u₂/‖u₂‖ }

Problem 9.19. In an orthonormal basis B, the scalar product of two different vectors of B is 0 and the scalar product of a vector of B with itself is 1. Explain why this is true.
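Passing from an orthogonal basis to an orthonormal one, as in Example 9.18, is a one-line normalization in code (sample orthogonal pair assumed):

```python
import numpy as np

u1 = np.array([1.0, 1.0, -1.0])      # a sample orthogonal pair: u1.u2 = 0
u2 = np.array([-1.0, 1.0, 0.0])

q1 = u1 / np.linalg.norm(u1)         # divide each vector by its norm
q2 = u2 / np.linalg.norm(u2)
print(q1 @ q1, q2 @ q2, q1 @ q2)     # 1, 1, 0: an orthonormal set
```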

This can be understood as follows.

Proposition 9.20. Let B = {u₁, …, uₖ} be an orthonormal basis for some subspace of Rⁿ (which might be Rⁿ itself). If we define the matrix B by Bᵢⱼ = uᵢ·uⱼ, then B = I. Alternatively, if we let Q be the matrix whose columns are the vectors of B, then QᵗQ = I.

A square matrix Q with the property that QᵗQ = I is said to be an orthogonal matrix. Even if Q is not square, it is still a nice matrix: see Exercise 9.8.

orthogonal bases and projections

We would like to know how to compute projections in an efficient manner. Orthogonal bases accomplish this perfectly. Our immediate goal is to obtain a formula for (orthogonal) projection onto a subspace. We will see that this is essentially the same thing as the projection of a vector onto a vector.

Theorem 9.21. Let U be a subspace of Rⁿ (which might be all of Rⁿ) and let {u₁, u₂, …, uₖ} be an orthogonal basis for U. For any vector x ∈ U we have the following.

    x = (u₁·x)/(u₁·u₁) u₁ + (u₂·x)/(u₂·u₂) u₂ + ⋯ + (uₖ·x)/(uₖ·uₖ) uₖ

In other words, x is equal to the sum of the projections of x onto each vector in the orthogonal basis.

Proof. We know from Theorem 9.11 that there are unique numbers αᵢ such that

    x = α₁u₁ + α₂u₂ + ⋯ + αₖuₖ

Now we take the scalar product of each side with uᵢ, for some 1 ≤ i ≤ k.

    uᵢ·x = uᵢ·(α₁u₁ + α₂u₂ + ⋯ + αₖuₖ) = α₁(uᵢ·u₁) + α₂(uᵢ·u₂) + ⋯ + αₖ(uᵢ·uₖ) = αᵢ(uᵢ·uᵢ)

so (uᵢ·x)/(uᵢ·uᵢ) = αᵢ, which means the αᵢ are exactly what they were claimed to be. It is worth comparing this to the proof of Theorem 9.15.

We restate Theorem 9.21 in terms of the coordinates, although this is little more than a repetition.

Corollary 9.22. Let U be a subspace of Rⁿ and let {u₁, …, uₖ} be an orthogonal basis for U. Then the coordinates of x ∈ U with respect to this basis are αᵢ = (uᵢ·x)/(uᵢ·uᵢ).
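Proposition 9.20 and Theorem 9.21 are both easy to see numerically. With the orthonormal pair from the previous sketch as the columns of Q, we have QᵗQ = I, and for a vector in the span the coordinates are literally the scalar products qᵢ·x:

```python
import numpy as np

q1 = np.array([1.0, 1.0, -1.0]) / np.sqrt(3.0)
q2 = np.array([-1.0, 1.0, 0.0]) / np.sqrt(2.0)
Q = np.column_stack([q1, q2])

print(np.allclose(Q.T @ Q, np.eye(2)))   # True: Q^t Q = I (Proposition 9.20)

x = 2.0 * q1 - 5.0 * q2                  # a vector known to lie in the span
print(Q.T @ x)                           # its coordinates (2, -5), by Theorem 9.21
```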

Recall our previous goal of wanting to understand how to project onto a subspace. So, given a vector x ∈ Rⁿ and a subspace U of Rⁿ, we want to find a vector u in U and a vector v orthogonal to U such that x = u + v. For v to be orthogonal to a subspace means that v is orthogonal to each vector in U. We define the orthogonal complement of U, written U⊥, as the set of all vectors of Rⁿ that are orthogonal to every vector of U. Specifically,

    U⊥ = { y : y ⊥ x for all x ∈ U }

Example 9.23. In fact, you might have seen an example of this in MAT1341. Let A be some m × n matrix of rank k, and denote by U its rowspace. So U is the set of all vectors that are linear combinations of the rows of A. Then U⊥ is exactly the kernel (null space) of A. That is, U⊥ is the set of all vectors x where Ax = 0. You might recall that the dimension of the rowspace is the rank of A and the dimension of the kernel is the number of free variables of A, meaning that dim(U) = k and dim(U⊥) = n − k. In fact this is true for orthogonal complements in general.

The theory of orthogonal subspaces gives the following.

Theorem 9.24. Let U be a subspace of Rⁿ and let U⊥ be its orthogonal complement. Then:
- U ∩ U⊥ = {0};
- dim(U) + dim(U⊥) = n;
- every vector x can be written uniquely as x = u + v, where u ∈ U and v ∈ U⊥.
This is exactly the idea of orthogonal projection onto a subspace. This might make more sense in terms of orthogonal bases.

Proposition 9.25. Let U be a subspace of Rⁿ and U⊥ its orthogonal complement. If B is any basis of U and B′ is any basis of U⊥, then B ∪ B′ is a basis of Rⁿ. Furthermore, if B and B′ are both orthogonal bases, then so is B ∪ B′.

Proof. Let B = {u₁, …, uₖ} and B′ = {u_{k+1}, …, uₙ}. This requires that |B| + |B′| = n, but we know that is true from Theorem 9.24. We show that B ∪ B′ is linearly independent. Assume that

    0 = α₁u₁ + ⋯ + αₖuₖ + β₁u_{k+1} + ⋯ + β_{n−k}uₙ

Then we have

    α₁u₁ + ⋯ + αₖuₖ = −(β₁u_{k+1} + ⋯ + β_{n−k}uₙ)

This means that α₁u₁ + ⋯ + αₖuₖ is in U and in U⊥, and so α₁u₁ + ⋯ + αₖuₖ = 0 (see Exercise 9.10). But since B is linearly independent, the only way this linear combination can be zero is if each of the αᵢ = 0. Also, β₁u_{k+1} + ⋯ + β_{n−k}uₙ is in U⊥ and in U, and by an analogous argument each of the βᵢ must also be zero.

If B and B′ are both orthogonal bases, then uᵢ ⊥ uⱼ whenever i ≠ j with both indices at most k, or with both greater than k. If i ≤ k < j, then uᵢ ∈ U and uⱼ ∈ U⊥ and so uᵢ ⊥ uⱼ; similarly if j ≤ k < i. So all pairs are orthogonal.

We will see that we can find orthogonal bases for any subspace; and in fact we will see a practical method for extending a basis B of U into a basis B ∪ B′ of Rⁿ, where B′ is a basis for U⊥.
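In the meantime, Example 9.23 and Theorem 9.24 can be checked concretely: scipy computes an (orthonormal) basis of the kernel of A, which is the orthogonal complement of the rowspace. The matrix below is an assumed sample.

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 0.0, 1.0, 2.0],   # U = rowspace of A (sample matrix)
              [0.0, 1.0, 1.0, 1.0]])

N = null_space(A)                     # columns: a basis of ker(A) = U-perp
print(np.allclose(A @ N, 0))          # True: each column is orthogonal to the rows
print(np.linalg.matrix_rank(A) + N.shape[1])   # dim(U) + dim(U-perp) = 4 = n
```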

Putting aside for the moment how exactly we will do this, we see what it is good for.

Theorem 9.26. Let {u₁, u₂, …, uₖ} be an orthogonal basis for U and {v₁, …, v_{n−k}} an orthogonal basis for U⊥. Then we have u ∈ U and v ∈ U⊥ and x = u + v, where

    u = (x·u₁)/(u₁·u₁) u₁ + (x·u₂)/(u₂·u₂) u₂ + ⋯ + (x·uₖ)/(uₖ·uₖ) uₖ
    v = (x·v₁)/(v₁·v₁) v₁ + (x·v₂)/(v₂·v₂) v₂ + ⋯ + (x·v_{n−k})/(v_{n−k}·v_{n−k}) v_{n−k}

Furthermore, u and v are the only vectors such that u ∈ U, v ∈ U⊥ and x = u + v. That is, they do not depend on the particular choice of orthogonal bases.

Proof. Given the two orthogonal bases, the vectors u and v are completely determined, and using Theorem 9.21 applied to Rⁿ we see that u + v = x. On the other hand, assume that u′ ∈ U and v′ ∈ U⊥ with u′ + v′ = x. Then u + v = x = u′ + v′. Subtracting, we see that u − u′ = v′ − v. But then u − u′ is in both U and U⊥, and so by Exercise 9.10 we have u − u′ = 0, or u = u′. Similarly v = v′.

We define the projection of x onto a subspace U of Rⁿ as

    proj_U(x) = (x·u₁)/(u₁·u₁) u₁ + (x·u₂)/(u₂·u₂) u₂ + ⋯ + (x·uₖ)/(uₖ·uₖ) uₖ

where {u₁, …, uₖ} is an orthogonal basis for U. We repeat that even though this definition is given in terms of a particular basis for U, the vector proj_U(x) does not depend on the choice of basis.

Example 9.27. Let B = {…} and B′ = {…} be the two given pairs of vectors in R⁴. Show that these are both orthogonal sets, and hence orthogonal bases for the subspaces they span. Show that these two spaces are orthogonal. Considering (somewhat arbitrarily) the first as U and the second as U⊥, calculate proj_U(x) and proj_{U⊥}(x) for the given vector x = (…)ᵗ.

Solution. We first check that B and B′ are indeed orthogonal (exercise). We also check that each of the vectors of B is orthogonal to each of the vectors of B′; combining this with Exercise 9.7, we see that the two subspaces are indeed orthogonal as claimed. Note that Proposition 9.25 tells us that the union of these two must be an orthogonal basis for R⁴. Now for the projections: proj_U(x) is the sum of the projections of x onto the two vectors of B, and proj_{U⊥}(x) is the sum of the projections of x onto the two vectors of B′. The sum of these two projections is indeed x. So in fact we could have calculated one of them and obtained the other by subtraction.

Problem 9.28. For the bases of Example 9.27, find proj_U(y) and proj_{U⊥}(y) for the given vector y = (…)ᵗ. Then calculate y − proj_U(y) and compare.
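The definition of proj_U(x) turns directly into code, provided the basis we feed it really is orthogonal. A sketch, with an assumed orthogonal pair in R⁴:

```python
import numpy as np

def proj_subspace(x, ortho_basis):
    """proj_U(x), given an *orthogonal* basis of U (the formula above)."""
    return sum((x @ u) / (u @ u) * u for u in ortho_basis)

B = [np.array([1.0, 1.0, 0.0, 0.0]),   # sample orthogonal basis of U
     np.array([1.0, -1.0, 2.0, 0.0])]
x = np.array([4.0, 0.0, 1.0, 5.0])

u = proj_subspace(x, B)
v = x - u                              # equals proj onto U-perp, by Theorem 9.26
print(u, v, [b @ v for b in B])        # v is orthogonal to all of U's basis
```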

There are two technical questions. Given a subspace, how to find an orthogonal basis? Given a subspace, how to find a basis for the orthogonal complement?

Gram-Schmidt

We are inspired by the first picture of the projection of x onto y. The vectors x and y form a basis for R², but not an orthogonal one. But if we consider v = x − proj_y(x), we see that y and v form an orthogonal basis for R². The idea is to construct the orthogonal basis one vector at a time, at each step subtracting from the next vector the projections of it onto the vectors we already have.

Algorithm 9.29. Let S = {v₁, v₂, …, vₖ} be a set of vectors that span a subspace U of Rⁿ. We will build an orthogonal basis as follows. We start with B = {} and we repeat the following steps until all vectors in S have been considered.
1. Choose a vector x from S.
2. Subtract from x its projection onto each element of B.
3. If the result is nonzero, put it into the set B.
At the end, the set B will be an orthogonal basis for the space spanned by S.
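Before tracing Algorithm 9.29 by hand in the next example, here it is as code. This is a minimal numpy sketch; the tolerance and the sample input are assumptions, not part of the algorithm as stated.

```python
import numpy as np

def gram_schmidt(S, tol=1e-12):
    """Orthogonal basis for span(S), following Algorithm 9.29."""
    B = []
    for x in S:
        w = np.array(x, dtype=float)
        for b in B:                       # project onto vectors already in B
            w = w - (b @ w) / (b @ b) * b
        if np.linalg.norm(w) > tol:       # keep only a nonzero remainder
            B.append(w)
    return B

S = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]   # sample input
for b in gram_schmidt(S):
    print(b)                              # three pairwise orthogonal vectors
```

The tolerance is there because, in floating point, a vector that should be exactly zero usually comes out as merely very small.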

Example 9.30. The three given vectors span some subspace. Find an orthogonal basis for this subspace.

Solution. We start with an empty orthogonal set B = {}. We choose some vector of S, say the first. We subtract its projections onto all the vectors currently in B. There is nothing in B (yet!) so there is nothing to subtract. We put the result into B; now B contains one vector. We repeat! Now we choose some other vector from S, say the second. We subtract its projections onto the vectors currently in B, and put this result into B; now B contains two vectors. Now we choose the only remaining vector in S. We subtract its projections onto the vectors currently in B, and put the result into B. There are no more vectors in S, so we are done. The final B is an orthogonal basis for the space spanned by S.

It is important to note that when we are subtracting the projections, we are considering projections onto the new vectors obtained up to now, not the originals. In other words, we project onto the vectors we put in B, not the vectors we took out of S.

Another observation that is perhaps useful is that the final vectors are orthogonal, hence linearly independent. So in Example 9.30 we found a linearly independent set that spanned the span of S. This means that the span of S is of dimension 3, and so (using Theorem 9.14) we have proved that S is actually a basis. Were it not to have been a basis, we would have gotten the zero vector at some step (after subtracting the projections), which we would not have put into B.

Problem 9.31. In Example 9.30 we chose the vectors in a particular order. Repeat this example but with a different order. Do we still get an orthogonal basis? Is it the same basis?

Problem 9.32. In Example 9.30 we found a basis of three orthogonal vectors. Verify directly that the set obtained by multiplying one of them by a nonzero constant is also an orthogonal set. This little trick is sometimes useful. It says that we can multiply a vector in an orthogonal set by a nonzero constant without destroying the orthogonality. It's not the same basis, but it's still an orthogonal basis. This can be especially handy if part way through we start getting ugly fractions: we can rescale the vector before adding it into B.

finding the orthogonal complement

In some sense this section is optional. Given a subspace U and a vector x, we have proj_{U⊥}(x) = x − proj_U(x). So we don't really need to find a basis for U⊥ in order to find proj_{U⊥}(x). But we include the following anyway, for completeness and to show that there are other uses for row-reduction. In fact we'll give two methods.

Algorithm 9.33. Consider a set {v₁, v₂, …, vₖ} that spans some subspace U of Rⁿ. We put these vectors as columns of a matrix A, and we apply Gauss-Jordan elimination to the augmented matrix [A | I]. The result is [R | B]. The columns of R with pivots indicate which columns of the original matrix form a basis for U. The rows of B corresponding to zero rows of R give a basis for U⊥. If we combine the basis of U with the basis of U⊥, we obtain a basis of Rⁿ (Proposition 9.25). We have completed the basis of U to a basis of Rⁿ.
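Algorithm 9.33 can be carried out exactly with sympy's row reduction. The spanning vectors below are sample values (not those of Example 9.34 below); the columns of A span U.

```python
import sympy as sp

A = sp.Matrix([[1, 0, 1],        # columns span U (sample vectors assumed)
               [0, 1, 1],
               [1, 1, 2],
               [2, 1, 3]])
R, pivots = A.row_join(sp.eye(4)).rref()   # Gauss-Jordan on [A | I] = [EA | E]

basis_U = [A.col(j) for j in pivots if j < A.cols]     # pivot columns of A
zero_rows = [i for i in range(A.rows)
             if all(R[i, j] == 0 for j in range(A.cols))]
basis_Uperp = [R[i, A.cols:].T for i in zero_rows]     # rows of E beside zero rows of EA
print(basis_U)
print(basis_Uperp)
```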

Proof Sketch. The method of Algorithm 9.33 is based on elementary matrices. Every row operation on a matrix is equivalent to multiplying that matrix by an invertible matrix on the left. So we can think of the entire row reduction as multiplying on the left by a single invertible matrix (which is a product of many elementary matrices). Call this matrix E. Then the final augmented matrix after the row-reduction is E[A | I] = [EA | E]. A zero row to the left of the vertical bar is a zero row in EA. But a zero row in EA means that the corresponding row of E is orthogonal to every column of A. Those are the rows that span U⊥. Since the matrix E is to be found to the right of the vertical bar, the corresponding rows form the basis for U⊥.

The use of elementary matrices in this proof is closely related to what is happening in one of the exercises of section 8. In that question you were being asked to find the first row of the matrix E, where E corresponds to the set of row operations that transformed one tableau into the other.

In Algorithm 9.33 we are considering this as an augmented matrix, so we only care about pivots to the left of the vertical bar. In other words, we are really row-reducing A, but the I is coming along for the ride. Furthermore, we only need to know where the pivots to the left of the vertical bar are; we do not need to go any further than that.

Example 9.34. A subspace U of R⁴ is spanned by the given vectors. Find a basis for U⊥.

Solution. We row reduce the augmented matrix [A | I], where the columns of A are the given spanning vectors. To the left of the vertical bar we see pivots in the first and second columns, so we take the first and second columns of A as a basis for U. To the left of the vertical bar we also see that the third and fourth rows are zero, so we take the third and fourth rows of the right-hand block as a basis for U⊥. Combining the two gives a basis for R⁴.

Problem 9.35. Check in Example 9.34 that each vector in the basis for U is orthogonal to each vector in the basis for U⊥.

Problem 9.36. In Example 9.34, verify that the basis for U is not orthogonal. Transform it into an orthogonal basis using Algorithm 9.29. Do the same for the basis for U⊥. Verify that these two are indeed orthogonal bases for U and U⊥, and that their union is an orthogonal basis for R⁴. Why was it not necessary to apply Algorithm 9.29 with four vectors in order to get an orthogonal basis for R⁴? What happens if you take the basis for R⁴ from Example 9.34 and apply Algorithm 9.29 to this set?

We have another method, based on the fact that if U is the row space of a matrix then U⊥ is the kernel of that matrix.

Algorithm 9.37. Consider a set {v₁, v₂, …, vₖ} that spans some subspace U of Rⁿ. We put these vectors as rows of a matrix A and row-reduce it. The rows with pivots indicate the rows of the original matrix A that form a basis of U. Since the orthogonal complement of a row space is a kernel, a basis for ker(A) is a basis for U⊥. In other words, a basis for U⊥ is exactly a basis for the solutions to Ax = 0.

Proof Sketch. A vector x ∈ ker(A) is such that Ax = 0, which means that x is orthogonal to every row of A. So ker(A) ⊆ U⊥. Now, we know that the dimension of ker(A) plus the dimension of the rowspace is n, the number of columns. But since U is a subspace of Rⁿ, Theorem 9.24 then tells us that dim(U⊥) = dim(ker(A)). If ker(A) ≠ U⊥, then there is a vector in U⊥ that is not in ker(A). But then this vector is not in the span of ker(A), so dim(U⊥) > dim(ker(A)), which is a contradiction.

Example 9.38. A subspace U of R⁴ is spanned by the given vectors. Find a basis for U⊥.

Solution. We row reduce the matrix whose rows are the given vectors. We find two free variables, x₂ and x₄, so we will have two vectors in the basis for the kernel. For each free variable we set it to 1 and all other free variables to 0, then solve for the pivot variables: first x₂ = 1, x₄ = 0, and then x₂ = 0, x₄ = 1. The two resulting solutions are a basis of U⊥; combining them with a basis of U gives a basis of R⁴.
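Algorithm 9.37 is even shorter in sympy, since nullspace() produces exactly a basis of the solutions to Ax = 0. The rows of the sample matrix (assumed values) span U.

```python
import sympy as sp

A = sp.Matrix([[1, 0, 1, 2],     # rows span U (sample vectors assumed)
               [0, 1, 1, 1],
               [1, 1, 2, 3]])

for v in A.nullspace():          # a basis of ker(A), which is U-perp
    print(v.T, (A * v).T)        # A v = 0: v is orthogonal to every row
```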

exercises

1. Prove Proposition 9.4, using things you know about matrix multiplication.

2. Prove Proposition 9.10, using the formula for projection from Proposition 9.6.

3. Find two different bases B and B′ for a vector space V and a vector x ∈ V such that the coordinates of x with respect to B and B′ are the same.

4. Let B and B′ be bases for some vector space V. Assume that for every x ∈ V the coordinates of x with respect to B and B′ are the same. Prove that B = B′. (hint: what if x ∈ B?)

5. a) Find all values a and b such that the given set {…} is an orthogonal set.
   b) Find all values a, b, c, d such that the given set {…} is an orthogonal set.

6. Consider the space U spanned by B = {…}, and the vector x = …
   a) Verify that B is an orthogonal set.
   b) Determine proj_u(x) for each vector u ∈ B.
   c) Considering the sum of the three projections, is x ∈ U? (hint: Theorem 9.21)

7. Let B and B′ be two orthogonal sets, such that every vector of B is orthogonal to every vector of B′. Show that if u is in the span of B and v is in the span of B′, then u ⊥ v.

8. Let Q be some matrix (not necessarily square). Show that QᵗQ = I if and only if the columns of Q form an orthonormal basis for the subspace that they span.

9. Let U be a subspace of Rⁿ. Show that U ⊆ (U⊥)⊥. Using Theorem 9.24, show that U = (U⊥)⊥. (The former is true in every vector space; the latter can be false in infinite-dimensional vector spaces.)

10. For U a subspace of Rⁿ, show that U ∩ U⊥ = {0}. (hint: let x ∈ U ∩ U⊥ and notice that x·x = 0)

11. The given set {…} is a basis for a subspace U.
    a) Using Gram-Schmidt (Algorithm 9.29), give an orthogonal basis for U.
    b) Adjust this to an orthonormal basis.

12. (not really an exercise for this course: we haven't explicitly talked about the subspace test here, although you would have seen it in MAT1341) Let U be a subspace of Rⁿ. Using the subspace test, show that U⊥ is a subspace of Rⁿ. (This wasn't mentioned explicitly in the text above, but is necessary to know to be sure that things like a basis for U⊥ make sense.)

13. Let U be the subspace of R³ spanned by the two given vectors, and let x be the given vector.
    a) Apply the Gram-Schmidt algorithm to find an orthogonal basis for U.
    b) Redo Gram-Schmidt with the vectors chosen in the other order, to find a second orthogonal basis for U.
    c) Find proj_U(x). Do this twice, using each of your orthogonal bases. Verify that you get the same result.

14. Consider the given set S = {…}.
    a) Show that S is an orthogonal set.
    b) Apply the Gram-Schmidt method (Algorithm 9.29) to S. Explain what happens. Does this seem reasonable?

15. Let U be the space spanned by the three given vectors in R⁴.
    a) Find a (nonzero) vector orthogonal to U (you can use Algorithm 9.33 or Algorithm 9.37, but you shouldn't really need either). Using Theorem 9.24, explain why your vector is a basis for U⊥.
    b) Given x = (a, b, c, d)ᵗ, find proj_U(x) and proj_{U⊥}(x). What happens? Does this seem reasonable?

16. Let U be the space spanned by the given vectors in R⁴.
    a) Find an orthogonal basis for U.
    b) For each vector in the standard basis for R⁴, find its projection onto U.
    c) For each vector in the standard basis for R⁴, find its projection onto U⊥.

17. Let U be the set of points in R³ that satisfy ax₁ + bx₂ + cx₃ = 0. For convenience, assume that a ≠ 0.
    a) Since U = ker([a b c]), we see that U is a subspace of R³. Show that a basis for U is given by the two vectors (−b, a, 0)ᵗ and (−c, 0, a)ᵗ.
    b) Using Algorithm 9.33, give a basis for U⊥.


More information

1 Review of the dot product

1 Review of the dot product Any typographical or other corrections about these notes are welcome. Review of the dot product The dot product on R n is an operation that takes two vectors and returns a number. It is defined by n u

More information

Span and Linear Independence

Span and Linear Independence Span and Linear Independence It is common to confuse span and linear independence, because although they are different concepts, they are related. To see their relationship, let s revisit the previous

More information

ROOTS COMPLEX NUMBERS

ROOTS COMPLEX NUMBERS MAT1341 Introduction to Linear Algebra Mike Newman 1. Complex Numbers class notes ROOTS Polynomials have roots; you probably know how to find the roots of a quadratic, say x 2 5x + 6, and to factor it

More information

Linear Independence Reading: Lay 1.7

Linear Independence Reading: Lay 1.7 Linear Independence Reading: Lay 17 September 11, 213 In this section, we discuss the concept of linear dependence and independence I am going to introduce the definitions and then work some examples and

More information

Chapter 1: Linear Equations

Chapter 1: Linear Equations Chapter : Linear Equations (Last Updated: September, 6) The material for these notes is derived primarily from Linear Algebra and its applications by David Lay (4ed).. Systems of Linear Equations Before

More information

Review problems for MA 54, Fall 2004.

Review problems for MA 54, Fall 2004. Review problems for MA 54, Fall 2004. Below are the review problems for the final. They are mostly homework problems, or very similar. If you are comfortable doing these problems, you should be fine on

More information

Elementary linear algebra

Elementary linear algebra Chapter 1 Elementary linear algebra 1.1 Vector spaces Vector spaces owe their importance to the fact that so many models arising in the solutions of specific problems turn out to be vector spaces. The

More information

GAUSSIAN ELIMINATION AND LU DECOMPOSITION (SUPPLEMENT FOR MA511)

GAUSSIAN ELIMINATION AND LU DECOMPOSITION (SUPPLEMENT FOR MA511) GAUSSIAN ELIMINATION AND LU DECOMPOSITION (SUPPLEMENT FOR MA511) D. ARAPURA Gaussian elimination is the go to method for all basic linear classes including this one. We go summarize the main ideas. 1.

More information

(v, w) = arccos( < v, w >

(v, w) = arccos( < v, w > MA322 Sathaye Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: For all v

More information

LINEAR ALGEBRA W W L CHEN

LINEAR ALGEBRA W W L CHEN LINEAR ALGEBRA W W L CHEN c W W L Chen, 1997, 2008. This chapter is available free to all individuals, on the understanding that it is not to be used for financial gain, and may be downloaded and/or photocopied,

More information

Contents. 2.1 Vectors in R n. Linear Algebra (part 2) : Vector Spaces (by Evan Dummit, 2017, v. 2.50) 2 Vector Spaces

Contents. 2.1 Vectors in R n. Linear Algebra (part 2) : Vector Spaces (by Evan Dummit, 2017, v. 2.50) 2 Vector Spaces Linear Algebra (part 2) : Vector Spaces (by Evan Dummit, 2017, v 250) Contents 2 Vector Spaces 1 21 Vectors in R n 1 22 The Formal Denition of a Vector Space 4 23 Subspaces 6 24 Linear Combinations and

More information

SUMMARY OF MATH 1600

SUMMARY OF MATH 1600 SUMMARY OF MATH 1600 Note: The following list is intended as a study guide for the final exam. It is a continuation of the study guide for the midterm. It does not claim to be a comprehensive list. You

More information