The Gram Schmidt Process
Kristian Hunt
Now we will present a procedure, based on orthogonal projection, that converts any linearly independent set of vectors into an orthogonal set. Let us begin with the simple case of two vectors, u_1 and u_2. We want to convert this into an orthogonal set by keeping u_1 intact, and modifying u_2. So let v_1 = u_1.
Then compute the projection p = proj_{v_1}(u_2) of u_2 onto v_1. If this projection is zero, that means that u_2 is already orthogonal to v_1, so v_2 = u_2.
Otherwise, we correct u_2 by removing the part of it that is parallel to v_1, i.e., we set v_2 = u_2 - p. Note that each of v_1 and v_2 is a linear combination of u_1 and u_2, and vice versa, so span(v_1, v_2) = span(u_1, u_2).
Next, suppose we have a third vector, u_3. If it is already orthogonal to v_1 and v_2, then we can set v_3 = u_3. Otherwise, we need to correct u_3 by removing its projections onto both v_1 and v_2. Let v_3 = u_3 - proj_{v_1}(u_3) - proj_{v_2}(u_3). Then it is clear that v_3 is orthogonal to v_1 and v_2. And again, it is clear that the two sets span the same subspace: span(v_1, v_2, v_3) = span(u_1, u_2, u_3). We can continue this for more independent vectors, u_4, u_5, ..., each time correcting the new vector u_j to make it orthogonal to the previously constructed v_1, v_2, ..., v_{j-1}. Summarizing, we have
Theorem (The Gram-Schmidt Process). Given a set {u_1, u_2, ..., u_k} of linearly independent vectors in an inner product space, perform the following computations:

Step 1. v_1 = u_1
Step 2. v_2 = u_2 - proj_{v_1}(u_2)
Step 3. v_3 = u_3 - proj_{v_1}(u_3) - proj_{v_2}(u_3)
Step 4. v_4 = u_4 - proj_{v_1}(u_4) - proj_{v_2}(u_4) - proj_{v_3}(u_4)
(Continue until Step k.)

Then (i) the set {v_1, v_2, ..., v_k} is orthogonal, and (ii) for each j = 1, 2, ..., k, span(v_1, v_2, ..., v_j) = span(u_1, u_2, ..., u_j).
If we express the projections in terms of the inner product, we have the following form for the process:

Step 1. v_1 = u_1
Step 2. v_2 = u_2 - (⟨v_1, u_2⟩ / ⟨v_1, v_1⟩) v_1
Step 3. v_3 = u_3 - (⟨v_1, u_3⟩ / ⟨v_1, v_1⟩) v_1 - (⟨v_2, u_3⟩ / ⟨v_2, v_2⟩) v_2
Step 4. v_4 = u_4 - (⟨v_1, u_4⟩ / ⟨v_1, v_1⟩) v_1 - (⟨v_2, u_4⟩ / ⟨v_2, v_2⟩) v_2 - (⟨v_3, u_4⟩ / ⟨v_3, v_3⟩) v_3
(Continue until Step k.)
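The steps above translate directly into code. Here is a minimal sketch (the function name and the list-of-tuples representation are my own choices, not from the notes), using the dot product on R^n as the inner product:

```python
def gram_schmidt(vectors):
    """Orthogonalize a linearly independent list of vectors in R^n.

    Each vector is a tuple of numbers. Returns an orthogonal list
    v_1, ..., v_k spanning the same subspace as the input.
    """
    def inner(a, b):
        # Standard dot product; any inner product could be swapped in.
        return sum(x * y for x, y in zip(a, b))

    basis = []
    for u in vectors:
        v = list(u)
        # Subtract the projection of u onto each v_i built so far.
        for w in basis:
            c = inner(w, u) / inner(w, w)
            v = [vi - c * wi for vi, wi in zip(v, w)]
        basis.append(tuple(v))
    return basis
```

Applying it to u_1 = (1, 1, 1, 1), u_2 = (1, 0, 1, 1), u_3 = (0, 1, 1, 0) reproduces the hand computation worked out later in these notes: the second output vector is (1/4, -3/4, 1/4, 1/4).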
The above process applies to any linearly independent set. But if we start with a basis B = {u_1, u_2, ..., u_n} of an inner product space V and the result is B' = {v_1, v_2, ..., v_n}, then the theorem says that span(B') = span(B) = V. So B' is an orthogonal basis of V. The Gram-Schmidt process creates an orthogonal set. But as we have seen before, we can easily normalize the resulting vectors to produce an orthonormal set, if desired. So we have the following theorem.

Theorem. Every finite-dimensional inner product space has an orthonormal basis.
Example: Let V = R^4 with the standard Euclidean inner product (dot product). Let us apply the Gram-Schmidt process to the vectors u_1 = (1, 1, 1, 1), u_2 = (1, 0, 1, 1), u_3 = (0, 1, 1, 0).

Step 1. v_1 = u_1 = (1, 1, 1, 1)
Step 2. v_2 = u_2 - proj_{v_1}(u_2) = u_2 - (⟨v_1, u_2⟩ / ⟨v_1, v_1⟩) v_1 = (1, 0, 1, 1) - (3/4)(1, 1, 1, 1) = (1/4, -3/4, 1/4, 1/4)
Step 3. v_3 = u_3 - proj_{v_1}(u_3) - proj_{v_2}(u_3) = u_3 - (⟨v_1, u_3⟩ / ⟨v_1, v_1⟩) v_1 - (⟨v_2, u_3⟩ / ⟨v_2, v_2⟩) v_2 = (0, 1, 1, 0) - (2/4)(1, 1, 1, 1) - ((-1/2)/(3/4))(1/4, -3/4, 1/4, 1/4) = (0, 1, 1, 0) - (1/2, 1/2, 1/2, 1/2) + (1/6, -1/2, 1/6, 1/6) = (-1/3, 0, 2/3, -1/3)

Let W be the span of {u_1, u_2, u_3}. Then B' = {v_1, v_2, v_3} = {(1, 1, 1, 1), (1/4, -3/4, 1/4, 1/4), (-1/3, 0, 2/3, -1/3)} is an orthogonal basis of the subspace W.
Remember that if we rescale the vectors in a basis by nonzero scalar multiples, the result is still a basis. So we could simplify this a bit by eliminating fractions. Let w_1 = v_1, w_2 = 4v_2 = (1, -3, 1, 1), w_3 = 3v_3 = (-1, 0, 2, -1). Then B'' = {w_1, w_2, w_3} is also an orthogonal basis of W. And now we can find an orthonormal basis by normalizing: ‖w_1‖ = √4 = 2, ‖w_2‖ = √12 = 2√3, ‖w_3‖ = √6, so an orthonormal basis of W is

{w_1/‖w_1‖, w_2/‖w_2‖, w_3/‖w_3‖} = {(1/2, 1/2, 1/2, 1/2), (1/(2√3), -3/(2√3), 1/(2√3), 1/(2√3)), (-1/√6, 0, 2/√6, -1/√6)}.
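The normalization step is just a division by each norm. A small sketch (the function name is my own):

```python
import math

def normalize(vectors):
    """Divide each vector in an orthogonal set by its norm,
    producing an orthonormal set."""
    out = []
    for v in vectors:
        n = math.sqrt(sum(x * x for x in v))   # ||v||
        out.append(tuple(x / n for x in v))
    return out
```

Running it on the rescaled orthogonal basis {(1, 1, 1, 1), (1, -3, 1, 1), (-1, 0, 2, -1)} gives the orthonormal basis above; the first vector comes out as (0.5, 0.5, 0.5, 0.5).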
Rescale-as-you-go. Notice that in constructing each v_j, it is only the direction that counts, not the norm. So the type of rescaling we did in going from B' to B'' could be done earlier, at each step. Here is the calculation again with that simplification. Step 1 and Step 2 are the same, but before proceeding to Step 3, we can clear fractions. We can say it this way:

Step 2. Let v_2' = u_2 - proj_{v_1}(u_2) = (1/4, -3/4, 1/4, 1/4). So define v_2 = 4v_2' = (1, -3, 1, 1).

Step 3. Let v_3' = u_3 - (⟨v_1, u_3⟩ / ⟨v_1, v_1⟩) v_1 - (⟨v_2, u_3⟩ / ⟨v_2, v_2⟩) v_2 = (0, 1, 1, 0) - (2/4)(1, 1, 1, 1) - (-2/12)(1, -3, 1, 1) = (-1/3, 0, 2/3, -1/3). So define v_3 = 3v_3' = (-1, 0, 2, -1). Etc.
There is a way to organize the calculations that may save time and avoid errors. Recall again the sequence of steps:

Step 1. v_1 = u_1
Step 2. v_2 = u_2 - (⟨v_1, u_2⟩ / ⟨v_1, v_1⟩) v_1
Step 3. v_3 = u_3 - (⟨v_1, u_3⟩ / ⟨v_1, v_1⟩) v_1 - (⟨v_2, u_3⟩ / ⟨v_2, v_2⟩) v_2
Step 4. v_4 = u_4 - (⟨v_1, u_4⟩ / ⟨v_1, v_1⟩) v_1 - (⟨v_2, u_4⟩ / ⟨v_2, v_2⟩) v_2 - (⟨v_3, u_4⟩ / ⟨v_3, v_3⟩) v_3

Notice that the inner product ⟨v_1, v_1⟩ is needed in every step after the first, the inner product ⟨v_2, v_2⟩ is needed in every step after the second, etc. Also, for each v_i, the inner product ⟨v_i, u_j⟩ will be needed in the j-th step, for each j > i.
It is useful to compute these inner products as soon as possible, and record them in a table for later use.

v_i    ‖v_i‖²        u_2           u_3           u_4
v_1    ⟨v_1, v_1⟩    ⟨v_1, u_2⟩    ⟨v_1, u_3⟩    ⟨v_1, u_4⟩
v_2    ⟨v_2, v_2⟩                  ⟨v_2, u_3⟩    ⟨v_2, u_4⟩
v_3    ⟨v_3, v_3⟩                                ⟨v_3, u_4⟩
v_4    ⟨v_4, v_4⟩

So as soon as v_i is calculated (and after any rescaling, if desired), we can fill out the i-th row of this table.
Example: Let us redo the previous example in this format. u_1 = (1, 1, 1, 1), u_2 = (1, 0, 1, 1), u_3 = (0, 1, 1, 0).

v_i             ‖v_i‖²   (1, 0, 1, 1)   (0, 1, 1, 0)
(1, 1, 1, 1)    4        3              2

Then v_2' = u_2 - (⟨v_1, u_2⟩ / ⟨v_1, v_1⟩) v_1 = (1, 0, 1, 1) - (3/4)(1, 1, 1, 1) = (1/4)(1, -3, 1, 1), so we take v_2 = (1, -3, 1, 1). Then

v_i             ‖v_i‖²   (1, 0, 1, 1)   (0, 1, 1, 0)
(1, 1, 1, 1)    4        3              2
(1, -3, 1, 1)   12                      -2

Finally, v_3' = u_3 - (2/4)v_1 - (-2/12)v_2 = (1/3)(-1, 0, 2, -1), etc.
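The rescale-as-you-go idea can be automated with exact rational arithmetic: work in fractions and clear denominators after each step, so every v_i has integer entries. A sketch under my own naming, assuming integer input vectors:

```python
from fractions import Fraction
from math import gcd

def gram_schmidt_rescaled(vectors):
    """Gram-Schmidt with exact rational arithmetic, clearing
    denominators after each step ("rescale as you go") so every
    output vector has integer entries."""
    def inner(a, b):
        return sum(x * y for x, y in zip(a, b))

    basis = []
    for u in vectors:
        v = [Fraction(x) for x in u]
        # Subtract projections onto the integer vectors built so far.
        for w in basis:
            c = inner(w, v) / inner(w, w)
            v = [vi - c * wi for vi, wi in zip(v, w)]
        # Multiply by the lcm of the denominators to clear fractions.
        m = 1
        for x in v:
            m = m * x.denominator // gcd(m, x.denominator)
        basis.append([int(x * m) for x in v])
    return basis
```

On the example above this returns [1, 1, 1, 1], [1, -3, 1, 1], [-1, 0, 2, -1] directly, with no fractional intermediate results kept around.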
Example: Let V = R^5 with the standard inner product. We will apply the Gram-Schmidt process to u_1 = (1, 1, 1, 0, 0), u_2 = (0, 1, 0, 0, 1), u_3 = (-1, 1, 0, 1, 0). Set v_1 = u_1. Now we can fill in the first row of the table:

v_i               ‖v_i‖²   (0, 1, 0, 0, 1)   (-1, 1, 0, 1, 0)
(1, 1, 1, 0, 0)   3        1                 0

Then v_2' = u_2 - (⟨v_1, u_2⟩ / ⟨v_1, v_1⟩) v_1 = (0, 1, 0, 0, 1) - (1/3)(1, 1, 1, 0, 0) = (-1/3, 2/3, -1/3, 0, 1). So let v_2 = 3v_2' = (-1, 2, -1, 0, 3).
Now we can fill in the second row:

v_i                 ‖v_i‖²   (0, 1, 0, 0, 1)   (-1, 1, 0, 1, 0)
(1, 1, 1, 0, 0)     3        1                 0
(-1, 2, -1, 0, 3)   15                         3

Then v_3' = u_3 - (⟨v_1, u_3⟩ / ⟨v_1, v_1⟩) v_1 - (⟨v_2, u_3⟩ / ⟨v_2, v_2⟩) v_2 = (-1, 1, 0, 1, 0) - (0/3)v_1 - (3/15)(-1, 2, -1, 0, 3) = (-4/5, 3/5, 1/5, 1, -3/5). So let v_3 = 5v_3' = (-4, 3, 1, 5, -3). At this point, {v_1, v_2, v_3} is an orthogonal basis for the span of {u_1, u_2, u_3}.
But if we want an orthonormal basis we should complete the third row:

v_i                 ‖v_i‖²   (0, 1, 0, 0, 1)   (-1, 1, 0, 1, 0)
(1, 1, 1, 0, 0)     3        1                 0
(-1, 2, -1, 0, 3)   15                         3
(-4, 3, 1, 5, -3)   60

Now the diagonal entries of the table contain the squared norms of the orthogonal basis, so the resulting orthonormal basis is

w_1 = (1/√3) v_1 = (1/√3, 1/√3, 1/√3, 0, 0),
w_2 = (1/√15) v_2 = (-1/√15, 2/√15, -1/√15, 0, 3/√15),
w_3 = (1/√60) v_3 = (-4/√60, 3/√60, 1/√60, 5/√60, -3/√60).
Example: Let V = P_2 with the inner product given by ⟨p, q⟩ = ∫_{-1}^{1} p(x) q(x) dx. Find an orthonormal basis for V.

Solution. First, it will be useful to record that

∫_{-1}^{1} x^k dx = [x^{k+1}/(k+1)]_{-1}^{1} = 2/(k+1) if k is even, and 0 if k is odd.

We start with the standard basis, u_1 = 1, u_2 = x, u_3 = x^2, and apply the Gram-Schmidt process. First, set v_1 = u_1 = 1. Then

⟨v_1, v_1⟩ = ⟨1, 1⟩ = ∫_{-1}^{1} x^0 dx = 2,
⟨v_1, u_2⟩ = ⟨1, x⟩ = ∫_{-1}^{1} x dx = 0,
⟨v_1, u_3⟩ = ⟨1, x^2⟩ = ∫_{-1}^{1} x^2 dx = 2/3.
So we have the first row of the table:

v_i   ‖v_i‖²   x   x^2
1     2        0   2/3

Then v_2 = u_2 - 0·v_1 = u_2 = x. (Note that u_2 is unchanged, because it was already orthogonal to v_1.) Then ⟨v_2, v_2⟩ = ⟨x, x⟩ = ∫_{-1}^{1} x^2 dx = 2/3. And ⟨v_2, u_3⟩ = ⟨x, x^2⟩ = ∫_{-1}^{1} x^3 dx = 0.
So we have

v_i   ‖v_i‖²   x   x^2
1     2        0   2/3
x     2/3          0

Finally, v_3 = u_3 - (⟨v_1, u_3⟩ / ⟨v_1, v_1⟩) v_1 - (⟨v_2, u_3⟩ / ⟨v_2, v_2⟩) v_2 = x^2 - ((2/3)/2)(1) - (0/(2/3))(x) = x^2 - 1/3. And ⟨v_3, v_3⟩ = ∫_{-1}^{1} (x^2 - 1/3)^2 dx = ∫_{-1}^{1} (x^4 - (2/3)x^2 + 1/9) dx = 2/5 - 4/9 + 2/9 = 8/45.
Now we have

v_i         ‖v_i‖²   x   x^2
1           2        0   2/3
x           2/3          0
x^2 - 1/3   8/45

So ‖v_1‖ = √2, ‖v_2‖ = √(2/3), ‖v_3‖ = √(8/45). Then our orthonormal basis is

w_1 = 1/√2,
w_2 = (1/√(2/3)) x = √(3/2) x,
w_3 = (1/√(8/45)) (x^2 - 1/3) = √(45/8) (x^2 - 1/3).
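The same code pattern works for P_2 once we swap in the integral inner product. A sketch (the representation and names are my own): polynomials are coefficient lists [a0, a1, a2, ...], and ∫_{-1}^{1} x^k dx is 2/(k+1) for even k and 0 for odd k, exactly as recorded above.

```python
from fractions import Fraction

def poly_inner(p, q):
    """<p, q> = integral over [-1, 1] of p(x) q(x) dx, for
    polynomials given as coefficient lists [a0, a1, a2, ...]."""
    total = Fraction(0)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            k = i + j
            if k % 2 == 0:               # odd powers integrate to 0
                total += Fraction(2, k + 1) * a * b
    return total

def gram_schmidt_poly(polys):
    """Gram-Schmidt on polynomials under the inner product above."""
    basis = []
    for u in polys:
        v = [Fraction(c) for c in u]
        for w in basis:
            c = poly_inner(w, v) / poly_inner(w, w)
            # Pad w with zeros, since earlier polynomials are shorter.
            v = [vi - c * (w[i] if i < len(w) else 0)
                 for i, vi in enumerate(v)]
        basis.append(v)
    return basis
```

Feeding in the standard basis [[1], [0, 1], [0, 0, 1]] (that is, 1, x, x^2) reproduces the computation above: the third output polynomial is x^2 - 1/3, with ⟨v_3, v_3⟩ = 8/45.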
Extending to an orthogonal basis. Suppose we start with a linearly independent set u_1, ..., u_k with the property that an initial segment u_1, u_2, ..., u_j is already orthogonal. If we perform the Gram-Schmidt process, we find that v_1 = u_1, v_2 = u_2, ..., v_j = u_j. That is, the first j vectors in the list are unaffected. This happens because if u_j is orthogonal to each of u_1, ..., u_{j-1}, then it is orthogonal to v_i for i < j. So the projection proj_{v_i}(u_j) = (⟨v_i, u_j⟩ / ⟨v_i, v_i⟩) v_i is equal to 0. This means that no corrections are made in computing v_j.
Consequently, we have the following theorem.

Theorem. Let S = {b_1, b_2, ..., b_k} be an orthogonal set of nonzero vectors in a finite-dimensional inner product space V. Then S can be extended to an orthogonal basis of V. Furthermore, if S is orthonormal, it can be extended to an orthonormal basis.

Proof. First, extend S to a basis B = {b_1, b_2, ..., b_k, b_{k+1}, ..., b_n} of V. Apply the Gram-Schmidt process to B to get an orthogonal basis B'. According to the remarks above, B' will contain S unchanged.
Example: Extend S = {(1, 1, 0, 1), (1, 0, 0, -1)} to an orthogonal basis of R^4.

Solution: S is clearly orthogonal. First, we need to extend S to a basis of R^4, without worrying about orthogonality. Recall that one way to do this is to add on the standard basis, and then use Gaussian elimination to select a basis from the extended set. So we form the matrix whose columns are the two vectors of S followed by the four standard basis vectors:

1   1   1  0  0  0
1   0   0  1  0  0
0   0   0  0  1  0
1  -1   0  0  0  1

Now we row-reduce this matrix. Since the first two columns are linearly independent, they will contain pivots, and so will be included in the selected basis.
We get

1   1   1   0  0  0
0  -1  -1   1  0  0
0   0   1  -2  0  1
0   0   0   0  1  0

Since the last matrix is in echelon form and has pivots in columns 1, 2, 3, and 5, the selected basis is B = {u_1 = (1, 1, 0, 1), u_2 = (1, 0, 0, -1), u_3 = (1, 0, 0, 0), u_4 = (0, 0, 1, 0)}.
Now we proceed with the Gram-Schmidt process. Since u_1 and u_2 are already orthogonal, we know that v_1 = u_1 and v_2 = u_2, so our table is

v_i             ‖v_i‖²   (1, 0, 0, 0)   (0, 0, 1, 0)
(1, 1, 0, 1)    3        1              0
(1, 0, 0, -1)   2        1              0

So v_3' = u_3 - (1/3)v_1 - (1/2)v_2. The common denominator is 6, so we can rescale and set v_3 = 6v_3' = 6u_3 - 2v_1 - 3v_2 = 6(1, 0, 0, 0) - 2(1, 1, 0, 1) - 3(1, 0, 0, -1) = (1, -2, 0, 1).
Now, we could continue with u_4, but observe that u_4 is already orthogonal to v_1, v_2, and v_3, so v_4 = u_4. So an orthogonal basis containing S is

B' = {(1, 1, 0, 1), (1, 0, 0, -1), (1, -2, 0, 1), (0, 0, 1, 0)}.
The Gram-Schmidt process presented here assumes that we start with a linearly independent set. As a final note, let us consider what would happen if the original set S were linearly dependent, and we proceeded with the Gram-Schmidt process anyway. What would go wrong? The following example illustrates the situation.

Example: Suppose S = {(1, 0, 1, 0), (1, 1, 2, 0), (2, 1, 3, 0), (1, 0, 0, 1)}. Find an orthogonal basis of the subspace W spanned by S.
We begin as usual by setting v_1 = (1, 0, 1, 0) and computing the first row of the table:

v_i             ‖v_i‖²   (1, 1, 2, 0)   (2, 1, 3, 0)   (1, 0, 0, 1)
(1, 0, 1, 0)    2        3              5              1

Then v_2' = (1, 1, 2, 0) - (3/2)(1, 0, 1, 0), which gives v_2 = 2v_2' = (-1, 2, 1, 0). So

v_i             ‖v_i‖²   (1, 1, 2, 0)   (2, 1, 3, 0)   (1, 0, 0, 1)
(1, 0, 1, 0)    2        3              5              1
(-1, 2, 1, 0)   6                       3              -1

But then comes trouble: v_3 = (2, 1, 3, 0) - (5/2)(1, 0, 1, 0) - (3/6)(-1, 2, 1, 0) = (0, 0, 0, 0).
Now we have a problem, because v_3 = 0, and this can never be part of a basis. Furthermore, ⟨v_3, v_3⟩ = 0, and this number occurs in the denominator of future steps! A closer inspection shows that the reason this occurred is that the original set S was linearly dependent. We could start over by reducing the spanning set S to a basis of W, and then use the Gram-Schmidt procedure. But all is not lost! We can actually proceed by ignoring v_3 and continuing.
So our table becomes

v_i             ‖v_i‖²   (1, 1, 2, 0)   (1, 0, 0, 1)
(1, 0, 1, 0)    2        3              1
(-1, 2, 1, 0)   6                       -1

And v_4' = (1, 0, 0, 1) - (1/2)(1, 0, 1, 0) + (1/6)(-1, 2, 1, 0) = (1/3, 1/3, -1/3, 1). So set v_4 = 3v_4' = (1, 1, -1, 3).

Conclusion: B = {(1, 0, 1, 0), (-1, 2, 1, 0), (1, 1, -1, 3)} is an orthogonal basis of W = span{(1, 0, 1, 0), (1, 1, 2, 0), (2, 1, 3, 0), (1, 0, 0, 1)}.

Exercise: Verify this statement directly.
We can summarize the idea here as follows.

Theorem (Gram-Schmidt process for linearly dependent sets). If the Gram-Schmidt process is applied to a set S = {u_1, ..., u_k}, then the set S is linearly independent if and only if each of the vectors v_1, ..., v_k produced is nonzero. If, during the process, any v_j that becomes zero is discarded before continuing, then the resulting set is still orthogonal, and spans the same subspace as S.
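This dependent-set variant is a one-line change to a basic Gram-Schmidt routine: test each new v_j and discard it when it is (numerically) zero. A sketch with names of my own, and a tolerance in place of an exact zero test since floating-point arithmetic rarely produces an exact 0:

```python
def gram_schmidt_dependent(vectors, tol=1e-12):
    """Gram-Schmidt for a possibly dependent list of vectors.

    Any v_j that comes out (numerically) zero is discarded, so the
    result is an orthogonal basis for the span of the input."""
    def inner(a, b):
        return sum(x * y for x, y in zip(a, b))

    basis = []
    for u in vectors:
        v = list(u)
        for w in basis:
            c = inner(w, v) / inner(w, w)
            v = [vi - c * wi for vi, wi in zip(v, w)]
        if inner(v, v) > tol:        # keep only nonzero vectors
            basis.append(v)
    return basis
```

On the dependent set from the example above, the routine silently drops the zero v_3 and returns three orthogonal vectors spanning W, the last being (1/3, 1/3, -1/3, 1).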
More informationorthogonal relations between vectors and subspaces Then we study some applications in vector spaces and linear systems, including Orthonormal Basis,
5 Orthogonality Goals: We use scalar products to find the length of a vector, the angle between 2 vectors, projections, orthogonal relations between vectors and subspaces Then we study some applications
More informationMath 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008
Math 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008 Exam 2 will be held on Tuesday, April 8, 7-8pm in 117 MacMillan What will be covered The exam will cover material from the lectures
More informationMATH 235: Inner Product Spaces, Assignment 7
MATH 235: Inner Product Spaces, Assignment 7 Hand in questions 3,4,5,6,9, by 9:3 am on Wednesday March 26, 28. Contents Orthogonal Basis for Inner Product Space 2 2 Inner-Product Function Space 2 3 Weighted
More informationREVIEW FOR EXAM III SIMILARITY AND DIAGONALIZATION
REVIEW FOR EXAM III The exam covers sections 4.4, the portions of 4. on systems of differential equations and on Markov chains, and..4. SIMILARITY AND DIAGONALIZATION. Two matrices A and B are similar
More informationA Primer in Econometric Theory
A Primer in Econometric Theory Lecture 1: Vector Spaces John Stachurski Lectures by Akshay Shanker May 5, 2017 1/104 Overview Linear algebra is an important foundation for mathematics and, in particular,
More information1 Inner Product and Orthogonality
CSCI 4/Fall 6/Vora/GWU/Orthogonality and Norms Inner Product and Orthogonality Definition : The inner product of two vectors x and y, x x x =.., y =. x n y y... y n is denoted x, y : Note that n x, y =
More informationLinear Algebra Final Exam Study Guide Solutions Fall 2012
. Let A = Given that v = 7 7 67 5 75 78 Linear Algebra Final Exam Study Guide Solutions Fall 5 explain why it is not possible to diagonalize A. is an eigenvector for A and λ = is an eigenvalue for A diagonalize
More information1 0 1, then use that decomposition to solve the least squares problem. 1 Ax = 2. q 1 = a 1 a 1 = 1. to find the intermediate result:
Exercise Find the QR decomposition of A =, then use that decomposition to solve the least squares problem Ax = 2 3 4 Solution Name the columns of A by A = [a a 2 a 3 ] and denote the columns of the results
More information1111: Linear Algebra I
1111: Linear Algebra I Dr. Vladimir Dotsenko (Vlad) Lecture 13 Dr. Vladimir Dotsenko (Vlad) 1111: Linear Algebra I Lecture 13 1 / 8 The coordinate vector space R n We already used vectors in n dimensions
More informationMATH 22A: LINEAR ALGEBRA Chapter 4
MATH 22A: LINEAR ALGEBRA Chapter 4 Jesús De Loera, UC Davis November 30, 2012 Orthogonality and Least Squares Approximation QUESTION: Suppose Ax = b has no solution!! Then what to do? Can we find an Approximate
More informationMath 102, Winter 2009, Homework 7
Math 2, Winter 29, Homework 7 () Find the standard matrix of the linear transformation T : R 3 R 3 obtained by reflection through the plane x + z = followed by a rotation about the positive x-axes by 6
More informationMATH 115A: SAMPLE FINAL SOLUTIONS
MATH A: SAMPLE FINAL SOLUTIONS JOE HUGHES. Let V be the set of all functions f : R R such that f( x) = f(x) for all x R. Show that V is a vector space over R under the usual addition and scalar multiplication
More information1 Dirac Notation for Vector Spaces
Theoretical Physics Notes 2: Dirac Notation This installment of the notes covers Dirac notation, which proves to be very useful in many ways. For example, it gives a convenient way of expressing amplitudes
More informationLinear Algebra. Min Yan
Linear Algebra Min Yan January 2, 2018 2 Contents 1 Vector Space 7 1.1 Definition................................. 7 1.1.1 Axioms of Vector Space..................... 7 1.1.2 Consequence of Axiom......................
More informationMath Linear Algebra
Math 220 - Linear Algebra (Summer 208) Solutions to Homework #7 Exercise 6..20 (a) TRUE. u v v u = 0 is equivalent to u v = v u. The latter identity is true due to the commutative property of the inner
More informationQuizzes for Math 304
Quizzes for Math 304 QUIZ. A system of linear equations has augmented matrix 2 4 4 A = 2 0 2 4 3 5 2 a) Write down this system of equations; b) Find the reduced row-echelon form of A; c) What are the pivot
More informationDiagonalizing Matrices
Diagonalizing Matrices Massoud Malek A A Let A = A k be an n n non-singular matrix and let B = A = [B, B,, B k,, B n ] Then A n A B = A A 0 0 A k [B, B,, B k,, B n ] = 0 0 = I n 0 A n Notice that A i B
More information(v, w) = arccos( < v, w >
MA322 Sathaye Notes on Inner Products Notes on Chapter 6 Inner product. Given a real vector space V, an inner product is defined to be a bilinear map F : V V R such that the following holds: For all v
More informationProblem Set (T) If A is an m n matrix, B is an n p matrix and D is a p s matrix, then show
MTH 0: Linear Algebra Department of Mathematics and Statistics Indian Institute of Technology - Kanpur Problem Set Problems marked (T) are for discussions in Tutorial sessions (T) If A is an m n matrix,
More informationMATH 221: SOLUTIONS TO SELECTED HOMEWORK PROBLEMS
MATH 221: SOLUTIONS TO SELECTED HOMEWORK PROBLEMS 1. HW 1: Due September 4 1.1.21. Suppose v, w R n and c is a scalar. Prove that Span(v + cw, w) = Span(v, w). We must prove two things: that every element
More informationLecture 03. Math 22 Summer 2017 Section 2 June 26, 2017
Lecture 03 Math 22 Summer 2017 Section 2 June 26, 2017 Just for today (10 minutes) Review row reduction algorithm (40 minutes) 1.3 (15 minutes) Classwork Review row reduction algorithm Review row reduction
More informationNo books, no notes, no calculators. You must show work, unless the question is a true/false, yes/no, or fill-in-the-blank question.
Math 304 Final Exam (May 8) Spring 206 No books, no notes, no calculators. You must show work, unless the question is a true/false, yes/no, or fill-in-the-blank question. Name: Section: Question Points
More information
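The Gram–Schmidt steps described above translate directly into code. The following is a minimal sketch in Python using NumPy; the function name `gram_schmidt` and the use of `numpy.dot` as the inner product are our own illustrative choices, not part of the original notes. Each incoming vector is corrected by subtracting its projection onto every previously constructed orthogonal vector.

```python
import numpy as np

def gram_schmidt(vectors):
    """Convert a linearly independent list of vectors into an orthogonal set.

    For each u_j, subtract the projections onto the already-built
    v_1, ..., v_{j-1}, where proj_v(u) = (<v, u> / <v, v>) * v.
    """
    basis = []
    for u in vectors:
        v = np.array(u, dtype=float)
        for w in basis:
            # Remove the component of v parallel to w.
            v = v - (np.dot(w, v) / np.dot(w, w)) * w
        basis.append(v)
    return basis

# Example: three independent vectors in R^3.
v1, v2, v3 = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
```

Note that this loop subtracts projections from the partially corrected vector rather than from the original `u` (the "modified" Gram–Schmidt ordering); in exact arithmetic the result is the same as the classical formulas in the theorem, and in floating point it is numerically more stable.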