Math 416, Spring 2010, March 4, 2010

GRAM-SCHMIDT, THE QR-FACTORIZATION, ORTHOGONAL MATRICES
1 Recap

Yesterday we talked about several new, important concepts. The central theme connecting the ideas was orthogonality, but more specifically we covered:

- orthogonal complements;
- a big theorem that let us write a vector as a sum of two vectors, one from a given vector space and the other from its orthogonal complement; specifically, for $V$ and $\vec x$ there is a unique representation $\vec x = \vec x^{\parallel} + \vec x^{\perp}$ with $\vec x^{\parallel} \in V$ and $\vec x^{\perp} \in V^{\perp}$;
- a formula for computing $\vec x^{\parallel}$ using dot products (once we have an orthonormal basis);
- formally defining the map $\operatorname{proj}_V$ by $\operatorname{proj}_V(\vec x) = \vec x^{\parallel}$; and
- writing down a matrix which gives the transformation $\operatorname{proj}_V$ (this means that $\operatorname{proj}_V$ is a linear transformation); specifically, we said that if $\vec u_1, \ldots, \vec u_s$ is an orthonormal basis for $V$, then
$$\operatorname{proj}_V(\vec x) = \begin{pmatrix} \vec u_1 & \cdots & \vec u_s \end{pmatrix} \begin{pmatrix} \vec u_1 & \cdots & \vec u_s \end{pmatrix}^T \vec x.$$

2 Making orthonormal bases: The Gram-Schmidt Algorithm

We saw yesterday that orthonormal bases are handy to use; for instance, they make writing the formula for $\operatorname{proj}_V$ simple. Our goal today is to produce orthonormal bases. More specifically, if someone gives us a basis $\vec v_1, \ldots, \vec v_m$ of a space $V$, we want to manufacture an orthonormal basis for $V$ from it.

Example. Given the linearly independent collection $\{\vec v_1\}$, construct an orthonormal basis for $\operatorname{Span}(\vec v_1)$.

Solution. First notice that $\operatorname{Span}(\vec v_1)$ is 1-dimensional, since $\{\vec v_1\}$ is a basis. Since we are looking for an orthonormal basis of a 1-dimensional space, we only have to find one vector, and orthonormality just means it should have unit length. So let's just scale the vector we've been given by its magnitude. Our orthonormal basis for $\operatorname{Span}(\vec v_1)$ is the vector
$$\vec u_1 = \frac{\vec v_1}{\|\vec v_1\|}.$$

Example. Given the linearly independent collection $\{\vec v_1, \vec v_2\}$, construct an orthonormal basis for $\operatorname{Span}(\vec v_1, \vec v_2)$.

Solution. Since we are working in a 2-dimensional space, we'll have to do a little more work than last time. We'll begin in the same way, by defining the first vector in our orthonormal basis by scaling the first vector in the given basis:
$$\vec u_1 = \frac{\vec v_1}{\|\vec v_1\|}.$$
The second vector in our orthonormal basis must be orthogonal to $\vec u_1$ and have unit length. To find a vector orthogonal to $\vec u_1$, we will use the decomposition $\vec v_2 = \vec v_2^{\parallel} + \vec v_2^{\perp}$, where the decomposition is relative to $\operatorname{Span}(\vec u_1)$. The vector $\vec v_2^{\perp}$ is orthogonal to $\vec u_1$ like we want, but it isn't quite unit length. If we scale it appropriately, though, it will be:
$$\vec u_2 = \frac{\vec v_2^{\perp}}{\|\vec v_2^{\perp}\|} = \frac{\vec v_2 - \vec v_2^{\parallel}}{\|\vec v_2 - \vec v_2^{\parallel}\|} = \frac{\vec v_2 - (\vec u_1 \cdot \vec v_2)\,\vec u_1}{\|\vec v_2 - (\vec u_1 \cdot \vec v_2)\,\vec u_1\|}.$$
The second equality follows from how we constructed $\vec v_2^{\perp}$ as the difference $\vec v_2 - \vec v_2^{\parallel}$. The third equality comes from expressing $\vec v_2^{\parallel}$ in terms of dot products with the orthonormal basis $\{\vec u_1\}$ of $\operatorname{Span}(\vec u_1) = \operatorname{Span}(\vec v_1)$.
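To see this two-vector construction in one place, here is a minimal NumPy sketch; the input vectors below are hypothetical choices and the helper name orthonormal_pair is ours, not notation from the notes. It scales $\vec v_1$ to unit length, subtracts from $\vec v_2$ its projection onto $\vec u_1$, and normalizes what remains.

```python
import numpy as np

def orthonormal_pair(v1, v2):
    """Gram-Schmidt applied to two linearly independent vectors."""
    u1 = v1 / np.linalg.norm(v1)            # u1 = v1 / ||v1||
    v2_perp = v2 - np.dot(u1, v2) * u1      # remove the component of v2 along u1
    u2 = v2_perp / np.linalg.norm(v2_perp)  # scale what is left to unit length
    return u1, u2

v1 = np.array([3.0, 4.0])                   # hypothetical input vectors
v2 = np.array([1.0, 2.0])
u1, u2 = orthonormal_pair(v1, v2)
print(np.dot(u1, u2))                            # approximately 0: u1 and u2 are orthogonal
print(np.linalg.norm(u1), np.linalg.norm(u2))    # both approximately 1: unit length
```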
Example. Given the linearly independent collection $\{\vec v_1, \vec v_2, \vec v_3\}$, construct an orthonormal basis for $\operatorname{Span}(\vec v_1, \vec v_2, \vec v_3)$.

Solution. We're now in a 3-dimensional space, so we get to do a little more work. We'll begin as before, by defining the first vector in our orthonormal basis by scaling the first vector in the given basis:
$$\vec u_1 = \frac{\vec v_1}{\|\vec v_1\|}.$$
Also as before, we'll construct the second vector by writing $\vec v_2 = \vec v_2^{\parallel} + \vec v_2^{\perp}$ (the decomposition relative to $\operatorname{Span}(\vec u_1)$). Then we define
$$\vec u_2 = \frac{\vec v_2^{\perp}}{\|\vec v_2^{\perp}\|} = \frac{\vec v_2 - \vec v_2^{\parallel}}{\|\vec v_2 - \vec v_2^{\parallel}\|} = \frac{\vec v_2 - (\vec u_1 \cdot \vec v_2)\,\vec u_1}{\|\vec v_2 - (\vec u_1 \cdot \vec v_2)\,\vec u_1\|}.$$
Finally, for the third vector we need to find a vector orthogonal to $\operatorname{Span}(\vec u_1, \vec u_2) = \operatorname{Span}(\vec v_1, \vec v_2)$. For this we will write $\vec v_3 = \vec v_3^{\parallel} + \vec v_3^{\perp}$, the decomposition relative to the space $\operatorname{Span}(\vec u_1, \vec u_2) = \operatorname{Span}(\vec v_1, \vec v_2)$. Then we'll define
$$\vec u_3 = \frac{\vec v_3^{\perp}}{\|\vec v_3^{\perp}\|} = \frac{\vec v_3 - \vec v_3^{\parallel}}{\|\vec v_3 - \vec v_3^{\parallel}\|} = \frac{\vec v_3 - (\vec u_1 \cdot \vec v_3)\,\vec u_1 - (\vec u_2 \cdot \vec v_3)\,\vec u_2}{\|\vec v_3 - (\vec u_1 \cdot \vec v_3)\,\vec u_1 - (\vec u_2 \cdot \vec v_3)\,\vec u_2\|}.$$

Example. Let's apply these formulas to a specific pair of vectors $\vec v_1, \vec v_2 \in \mathbb{R}^2$. From above we write
$$\vec u_1 = \frac{\vec v_1}{\|\vec v_1\|},$$
and for the second vector,
$$\vec u_2 = \frac{\vec v_2 - (\vec u_1 \cdot \vec v_2)\,\vec u_1}{\|\vec v_2 - (\vec u_1 \cdot \vec v_2)\,\vec u_1\|}.$$
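To make the arithmetic concrete, here is one worked instance with an illustrative choice of vectors (these particular entries are an assumption, chosen only to keep the numbers clean): take $\vec v_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ and $\vec v_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$. Then
$$\vec u_1 = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 1 \end{pmatrix}, \qquad \vec u_1 \cdot \vec v_2 = \frac{1}{\sqrt{2}},$$
$$\vec v_2 - (\vec u_1 \cdot \vec v_2)\,\vec u_1 = \begin{pmatrix} 0 \\ 1 \end{pmatrix} - \frac{1}{2}\begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} -1/2 \\ 1/2 \end{pmatrix}, \qquad \vec u_2 = \frac{1}{\sqrt{2}}\begin{pmatrix} -1 \\ 1 \end{pmatrix}.$$
As a quick check, $\vec u_1 \cdot \vec u_2 = 0$ and both vectors have unit length, as required.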
The process we have outlined above can be followed for any number of initial vectors. In fact, by going through the cases we have, we should be able to see how the algorithm works generally. The process we have followed is called the Gram-Schmidt algorithm.

Theorem (Gram-Schmidt Algorithm). Suppose we are given a collection $\{\vec v_1, \ldots, \vec v_s\}$ which is linearly independent. Then if we write $\vec v_i = \vec v_i^{\parallel} + \vec v_i^{\perp}$ (the decomposition relative to the space $\operatorname{Span}(\vec v_1, \ldots, \vec v_{i-1})$), the collection of vectors $\{\vec u_1, \ldots, \vec u_s\}$ defined by
$$\vec u_i = \frac{\vec v_i^{\perp}}{\|\vec v_i^{\perp}\|}$$
is an orthonormal basis for $\operatorname{Span}(\vec v_1, \ldots, \vec v_s)$. We can calculate the $\vec u_i$ iteratively as
$$\vec u_1 = \frac{\vec v_1}{\|\vec v_1\|} \qquad \text{and} \qquad \vec u_i = \frac{\vec v_i - (\vec u_1 \cdot \vec v_i)\,\vec u_1 - \cdots - (\vec u_{i-1} \cdot \vec v_i)\,\vec u_{i-1}}{\|\vec v_i - (\vec u_1 \cdot \vec v_i)\,\vec u_1 - \cdots - (\vec u_{i-1} \cdot \vec v_i)\,\vec u_{i-1}\|}.$$

The algorithm actually provides a factorization of the matrix whose columns are the initial linearly independent vectors. This factorization is called the QR factorization of the matrix.

Theorem (QR factorization). Suppose that $A$ is a matrix whose columns $\vec v_1, \ldots, \vec v_s$ are linearly independent. Then $A = QR$, where $Q$ is the matrix
$$Q = \begin{pmatrix} \vec u_1 & \cdots & \vec u_s \end{pmatrix}$$
and $R$ is the matrix
$$R = \begin{pmatrix}
\|\vec v_1\| & \vec u_1 \cdot \vec v_2 & \vec u_1 \cdot \vec v_3 & \cdots & \vec u_1 \cdot \vec v_s \\
0 & \|\vec v_2^{\perp}\| & \vec u_2 \cdot \vec v_3 & \cdots & \vec u_2 \cdot \vec v_s \\
0 & 0 & \|\vec v_3^{\perp}\| & \cdots & \vec u_3 \cdot \vec v_s \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
0 & 0 & 0 & \cdots & \|\vec v_s^{\perp}\|
\end{pmatrix}
= \begin{pmatrix}
\vec u_1 \cdot \vec v_1 & \vec u_1 \cdot \vec v_2 & \vec u_1 \cdot \vec v_3 & \cdots & \vec u_1 \cdot \vec v_s \\
\vec u_2 \cdot \vec v_1 & \vec u_2 \cdot \vec v_2 & \vec u_2 \cdot \vec v_3 & \cdots & \vec u_2 \cdot \vec v_s \\
\vdots & \vdots & \vdots & & \vdots \\
\vec u_s \cdot \vec v_1 & \vec u_s \cdot \vec v_2 & \vec u_s \cdot \vec v_3 & \cdots & \vec u_s \cdot \vec v_s
\end{pmatrix}.$$

Proof. The fact that we can write the matrix $R$ in two ways just comes from the fact that $\vec u_i \cdot \vec v_j = 0$ when $i > j$ (since $\vec u_i$ is orthogonal to $\operatorname{Span}(\vec v_1, \ldots, \vec v_{i-1})$ by construction) and that $\vec u_i \cdot \vec v_i = \|\vec v_i^{\perp}\|$.

Using the first expression for the matrix $R$, we'll check that the matrices on the left and right hand sides are the same column by column. For this, note that the $i$th column of the product (i.e., the right hand side) is just
$$\begin{pmatrix} \vec u_1 & \cdots & \vec u_s \end{pmatrix}
\begin{pmatrix} \vec u_1 \cdot \vec v_i \\ \vdots \\ \vec u_{i-1} \cdot \vec v_i \\ \|\vec v_i^{\perp}\| \\ 0 \\ \vdots \\ 0 \end{pmatrix}
= \underbrace{(\vec u_1 \cdot \vec v_i)\,\vec u_1 + \cdots + (\vec u_{i-1} \cdot \vec v_i)\,\vec u_{i-1}}_{\vec v_i^{\parallel}} + \underbrace{\|\vec v_i^{\perp}\|\,\vec u_i}_{\vec v_i^{\perp}} = \vec v_i.$$
But this is just the $i$th column of the left hand side, so our matrices must be equal. $\square$
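Here is a minimal NumPy sketch of this factorization, assuming the columns of A are linearly independent; the test matrix below is a hypothetical choice and the function name gram_schmidt_qr is ours. It runs the Gram-Schmidt recursion column by column, records the dot products $\vec u_j \cdot \vec v_i$ and the norms $\|\vec v_i^{\perp}\|$ as the entries of $R$, and then checks that $A = QR$ and that the columns of $Q$ are orthonormal.

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt on the columns of A, returning Q and R with A = QR.

    Assumes the columns of A are linearly independent.
    """
    m, s = A.shape
    Q = np.zeros((m, s))
    R = np.zeros((s, s))
    for i in range(s):
        v = A[:, i].astype(float)               # start from v_i
        for j in range(i):
            R[j, i] = np.dot(Q[:, j], A[:, i])  # entry u_j . v_i of R
            v -= R[j, i] * Q[:, j]              # subtract the projection onto u_j
        R[i, i] = np.linalg.norm(v)             # ||v_i^perp|| on the diagonal
        Q[:, i] = v / R[i, i]                   # u_i = v_i^perp / ||v_i^perp||
    return Q, R

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])                      # hypothetical 3x2 example
Q, R = gram_schmidt_qr(A)
print(np.allclose(A, Q @ R))                    # True: A factors as QR
print(np.allclose(Q.T @ Q, np.eye(2)))          # True: the columns of Q are orthonormal
```

NumPy's built-in np.linalg.qr computes a factorization of the same kind, though its sign conventions for the columns of Q (and rows of R) may differ from the classical Gram-Schmidt output above.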
3 Orthogonal Transformations and Matrices

Definition 3.1. A linear transformation $T : \mathbb{R}^n \to \mathbb{R}^n$ is called orthogonal if it preserves the length of vectors: $\|T(\vec x)\| = \|\vec x\|$ for all $\vec x \in \mathbb{R}^n$. A matrix is called orthogonal if it corresponds to an orthogonal linear transformation.

Notice that we have used the adjective "orthogonal" before, in the context of a pair of vectors. Saying that a matrix is orthogonal, though, means something different, so be careful to keep these two concepts separate in your mind!

Example. Any rotation in $\mathbb{R}^2$ is an orthogonal transformation, since
$$\left\| \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \right\|^2 = (\cos\theta\, x_1 - \sin\theta\, x_2)^2 + (\sin\theta\, x_1 + \cos\theta\, x_2)^2$$
$$= \cos^2\theta\, x_1^2 - 2\sin\theta\cos\theta\, x_1 x_2 + \sin^2\theta\, x_2^2 + \sin^2\theta\, x_1^2 + 2\sin\theta\cos\theta\, x_1 x_2 + \cos^2\theta\, x_2^2 = x_1^2 + x_2^2 = \left\| \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \right\|^2.$$
This isn't surprising given our intuition of what rotations do to vectors in $\mathbb{R}^2$.

Example. If $V$ is a subspace of $\mathbb{R}^n$, then we define $\operatorname{Ref}_V(\vec x) = \vec x^{\parallel} - \vec x^{\perp}$. This is orthogonal because
$$\|\operatorname{Ref}_V(\vec x)\|^2 = \|\vec x^{\parallel} - \vec x^{\perp}\|^2 = \|\vec x^{\parallel}\|^2 + \|\vec x^{\perp}\|^2 = \|\vec x^{\parallel} + \vec x^{\perp}\|^2 = \|\vec x\|^2,$$
where the middle equalities hold because $\vec x^{\parallel}$ is orthogonal to both $-\vec x^{\perp}$ and $\vec x^{\perp}$.

Example. If $V$ is a subspace of $\mathbb{R}^n$ that isn't all of $\mathbb{R}^n$, then $\operatorname{proj}_V$ is not an orthogonal transformation. To see why this is true, choose a nonzero vector $\vec x \in V^{\perp}$ (such a vector exists since $V \neq \mathbb{R}^n$). Then
$$\|\operatorname{proj}_V(\vec x)\| = \|\vec 0\| = 0 \neq \|\vec x\|.$$
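These examples are easy to test numerically. The sketch below, with an arbitrary angle and test vector (both hypothetical choices), checks that a rotation preserves length while projection onto a proper subspace (here, the x-axis in $\mathbb{R}^2$) does not.

```python
import numpy as np

theta = 0.7                                        # hypothetical rotation angle
Rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])  # rotation: an orthogonal matrix
P = np.array([[1.0, 0.0],
              [0.0, 0.0]])                         # projection onto the x-axis: not orthogonal

x = np.array([2.0, -3.0])                          # hypothetical test vector
print(np.isclose(np.linalg.norm(Rot @ x), np.linalg.norm(x)))  # True: length preserved
print(np.isclose(np.linalg.norm(P @ x), np.linalg.norm(x)))    # False: projection shrinks x
```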
The projection example above points out an important feature of orthogonal matrices: they must have trivial kernel. This gives the following result.

Theorem 3.1. An orthogonal matrix is invertible.

Proof. Since an orthogonal matrix is automatically square, we only have to check that $\ker(A) = \{\vec 0\}$. So choose $\vec b \in \ker(A)$; we want to show $\vec b = \vec 0$. Using the definition of kernel we have $A\vec b = \vec 0$, so that $\|A\vec b\| = \|\vec 0\| = 0$. But $A$ is orthogonal, so that $\|A\vec b\| = \|\vec b\|$. Hence we have $\|\vec b\| = 0$, and so $\vec b = \vec 0$. Hence $\ker(A) = \{\vec 0\}$ as desired. $\square$

This proof only tells us that $A$ is invertible, but doesn't tell us what the inverse is. We'll compute the inverse of an orthogonal matrix by the end of the class period.

Theorem 3.2 (Operations Preserving Orthogonality in Matrices). The product of two orthogonal transformations is orthogonal. The inverse of an orthogonal matrix is orthogonal.

Proof. To prove the first statement, suppose that $A$ and $B$ are two orthogonal $n \times n$ matrices. Then for any $\vec x \in \mathbb{R}^n$ we have
$$\|AB\vec x\| = \|A(B\vec x)\| \stackrel{(*)}{=} \|B\vec x\| \stackrel{(*)}{=} \|\vec x\|$$
(in the equalities labeled $(*)$ I have used the orthogonality of $A$ and $B$). This means that $AB$ is orthogonal.

For the second statement, let $A$ be an orthogonal $n \times n$ matrix and choose a vector $\vec x \in \mathbb{R}^n$; our goal is to show $\|A^{-1}\vec x\| = \|\vec x\|$. To do so, notice
$$\|A^{-1}\vec x\| \stackrel{(*)}{=} \|AA^{-1}\vec x\| = \|\vec x\|,$$
where the equality labeled $(*)$ uses the orthogonality of $A$. $\square$

In class there was a very reasonable question asked: why is an orthogonal matrix called orthogonal? One good answer to this question is the following.

Theorem 3.3. If $\vec v$ and $\vec w$ are orthogonal elements of $\mathbb{R}^n$, then for any orthogonal transformation $T : \mathbb{R}^n \to \mathbb{R}^n$ the vectors $T(\vec v)$ and $T(\vec w)$ are orthogonal.

We'll prove this result at the beginning of class next time. To do this, we'll need the following result (which you'll prove for homework):

Lemma 3.4. For two vectors $\vec x, \vec y \in \mathbb{R}^n$, we have
$$\|\vec x + \vec y\|^2 = \|\vec x\|^2 + \|\vec y\|^2$$
if and only if $\vec x \cdot \vec y = 0$.
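Both of these last statements can be spot-checked numerically. The sketch below uses a rotation as the orthogonal transformation and a hypothetical pair of orthogonal vectors; the specific numbers are illustrative only.

```python
import numpy as np

theta = 1.2                                        # hypothetical rotation angle
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])    # an orthogonal transformation (rotation)

v = np.array([1.0, 2.0])                           # hypothetical pair with v . w = 0
w = np.array([-2.0, 1.0])
print(np.isclose(np.dot(v, w), 0.0))               # True: v and w are orthogonal
print(np.isclose(np.dot(T @ v, T @ w), 0.0))       # True: T(v) and T(w) stay orthogonal

# Lemma 3.4: ||x + y||^2 = ||x||^2 + ||y||^2 exactly when x . y = 0
print(np.isclose(np.linalg.norm(v + w)**2,
                 np.linalg.norm(v)**2 + np.linalg.norm(w)**2))  # True for this pair
```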