Section 6.4: The Gram–Schmidt Process


Motivation

The procedures in §§6.2–6.3 start with an orthogonal basis {u_1, u_2, ..., u_m}.

Find the B-coordinates of a vector x using dot products:

$x = \sum_{i=1}^{m} \frac{x \cdot u_i}{u_i \cdot u_i}\, u_i$

Find the orthogonal projection of a vector x onto the span W of u_1, u_2, ..., u_m:

$\operatorname{proj}_W(x) = \sum_{i=1}^{m} \frac{x \cdot u_i}{u_i \cdot u_i}\, u_i.$

Problem: What if your basis isn't orthogonal?

Solution: The Gram–Schmidt process: take any basis and make it orthogonal.
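Both formulas above use nothing but dot products, so they are easy to try numerically. Here is a minimal sketch in Python with NumPy; the function name proj and the sample vectors are my own choices for illustration, not from the slides:

```python
# Orthogonal projection onto W = Span{u_1, ..., u_m}, assuming the u_i
# are orthogonal: proj_W(x) = sum_i ((x . u_i) / (u_i . u_i)) u_i.
import numpy as np

def proj(x, orthogonal_basis):
    return sum((x @ u) / (u @ u) * u for u in orthogonal_basis)

# Example: W = the xy-plane in R^3, with orthogonal basis e_1, e_2.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
x = np.array([3.0, 4.0, 5.0])

print(proj(x, [u1, u2]))  # [3. 4. 0.]: the component of x in the plane
```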

The Gram–Schmidt Process

Procedure (The Gram–Schmidt Process)

Let {v_1, v_2, ..., v_m} be a basis for a subspace W of R^n. Define:

1. $u_1 = v_1$

2. $u_2 = v_2 - \operatorname{proj}_{\operatorname{Span}\{u_1\}}(v_2) = v_2 - \frac{v_2 \cdot u_1}{u_1 \cdot u_1}\, u_1$

3. $u_3 = v_3 - \operatorname{proj}_{\operatorname{Span}\{u_1, u_2\}}(v_3) = v_3 - \frac{v_3 \cdot u_1}{u_1 \cdot u_1}\, u_1 - \frac{v_3 \cdot u_2}{u_2 \cdot u_2}\, u_2$

...

m. $u_m = v_m - \operatorname{proj}_{\operatorname{Span}\{u_1, u_2, \ldots, u_{m-1}\}}(v_m) = v_m - \sum_{i=1}^{m-1} \frac{v_m \cdot u_i}{u_i \cdot u_i}\, u_i$

Then {u_1, u_2, ..., u_m} is an orthogonal basis for the same subspace W.

Remark: In fact, for every i between 1 and m, the set {u_1, u_2, ..., u_i} is an orthogonal basis for Span{v_1, v_2, ..., v_i}.
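The procedure above is short enough to implement directly. Below is a sketch in Python with NumPy (the function name gram_schmidt is mine); it subtracts each projection from the running vector rather than from the original v, which is equivalent in exact arithmetic because the u_i are mutually orthogonal, and is numerically more stable:

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors v_1, ..., v_m into
    the orthogonal (not yet normalized) vectors u_1, ..., u_m."""
    us = []
    for v in vectors:
        u = np.array(v, dtype=float)
        for w in us:                      # subtract the projection onto each u_i
            u -= (u @ w) / (w @ w) * w
        us.append(u)
    return us

# Example basis of R^3, chosen for illustration.
u1, u2, u3 = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
print(u1 @ u2, u1 @ u3, u2 @ u3)          # all (numerically) zero
```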

Example 1: Two vectors

Find an orthogonal basis {u_1, u_2} for W = Span{v_1, v_2}.

Run Gram–Schmidt:

1. $u_1 = v_1$

2. $u_2 = v_2 - \frac{v_2 \cdot u_1}{u_1 \cdot u_1}\, u_1$

Why does this work? First we take u_1 = v_1. Because u_1 is not orthogonal to v_2, we can't take u_2 = v_2. Fix: let L = Span{u_1}, and let $u_2 = (v_2)_{L^\perp} = v_2 - \operatorname{proj}_L(v_2)$. By construction, u_1 · u_2 = 0, because u_2 ⊥ L.

[Figure: v_2 decomposed as proj_L(v_2) + u_2, where L = Span{u_1}.]

Remember: This is an orthogonal basis for the same subspace: Span{u_1, u_2} = Span{v_1, v_2}.

Example 2: Three vectors

Find an orthogonal basis {u_1, u_2, u_3} for W = Span{v_1, v_2, v_3} = R^3.

Run Gram–Schmidt:

1. $u_1 = v_1$

2. $u_2 = v_2 - \frac{v_2 \cdot u_1}{u_1 \cdot u_1}\, u_1$

3. $u_3 = v_3 - \frac{v_3 \cdot u_1}{u_1 \cdot u_1}\, u_1 - \frac{v_3 \cdot u_2}{u_2 \cdot u_2}\, u_2$

Remember: This is an orthogonal basis for the same subspace W.

Example 2, continued

Gram–Schmidt turns the basis v_1, v_2, v_3 into the orthogonal basis u_1, u_2, u_3.

Why does this work? Check: Once we have u_1 and u_2 orthogonal, let W_2 = Span{u_1, u_2}, and $u_3 = (v_3)_{W_2^\perp} = v_3 - \operatorname{proj}_{W_2}(v_3)$. By construction, u_3 ⊥ W_2, so u_1 · u_3 = 0 = u_2 · u_3.

u_1 · u_2 = 0 ✓   u_1 · u_3 = 0 ✓   u_2 · u_3 = 0 ✓

[Figure: v_3 decomposed as proj_{W_2}(v_3) + u_3, with u_3 orthogonal to the plane W_2 = Span{u_1, u_2}.]

Example 3: Vectors in R^4

Find an orthogonal basis {u_1, u_2, u_3} for W = Span{v_1, v_2, v_3}, where v_1, v_2, v_3 are vectors in R^4.

Run Gram–Schmidt:

1. $u_1 = v_1$

2. $u_2 = v_2 - \frac{v_2 \cdot u_1}{u_1 \cdot u_1}\, u_1$

3. $u_3 = v_3 - \frac{v_3 \cdot u_1}{u_1 \cdot u_1}\, u_1 - \frac{v_3 \cdot u_2}{u_2 \cdot u_2}\, u_2$

QR Factorization

Recall: A set of vectors {v_1, v_2, ..., v_m} is orthonormal if the vectors are orthogonal unit vectors: v_i · v_j = 0 when i ≠ j, and v_i · v_i = 1.

Fact: A matrix Q has orthonormal columns if and only if Q^T Q = I.

Theorem (QR Factorization): Let A be a matrix with linearly independent columns. Then A = QR, where Q has orthonormal columns and R is upper-triangular with positive diagonal entries.

The columns of A are a basis for W = Col A. The columns of Q are the equivalent basis coming from Gram–Schmidt (as applied to the columns of A), after normalizing to unit vectors. The entries of R come from the steps in Gram–Schmidt.
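The theorem is easy to check numerically. The sketch below uses NumPy's built-in QR routine; note that np.linalg.qr is free to return R with negative diagonal entries, so we flip the signs of those rows (and of the matching columns of Q) to match the positive-diagonal convention above. The sample matrix is my own, not from the slides:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])              # any matrix with independent columns

Q, R = np.linalg.qr(A)                  # reduced QR: A = QR
s = np.sign(np.diag(R))
Q, R = Q * s, s[:, None] * R            # make the diagonal of R positive

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: orthonormal columns
print(np.allclose(Q @ R, A))            # True: A = QR
```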

QR Factorization: Through an example

Find the QR factorization of the matrix A whose columns are the vectors v_1, v_2, v_3 from Example 2.

Step 1: Run Gram–Schmidt and solve for v_1, v_2, v_3 in terms of u_1, u_2, u_3:

$u_1 = v_1 \implies v_1 = u_1$

$u_2 = v_2 - \frac{v_2 \cdot u_1}{u_1 \cdot u_1}\, u_1 \implies v_2 = \frac{v_2 \cdot u_1}{u_1 \cdot u_1}\, u_1 + u_2$

$u_3 = v_3 - \frac{v_3 \cdot u_1}{u_1 \cdot u_1}\, u_1 - \frac{v_3 \cdot u_2}{u_2 \cdot u_2}\, u_2 \implies v_3 = \frac{v_3 \cdot u_1}{u_1 \cdot u_1}\, u_1 + \frac{v_3 \cdot u_2}{u_2 \cdot u_2}\, u_2 + u_3$

QR Factorization: Through an example, continued

Step 2: Write $A = \tilde Q \tilde R$, where $\tilde Q$ has orthogonal columns u_1, u_2, u_3 and $\tilde R$ is upper-triangular (with 1s on the diagonal): the jth column of $\tilde R$ holds the coefficients expressing v_j in terms of u_1, ..., u_j from Step 1.

$A = \begin{pmatrix} | & | & | \\ v_1 & v_2 & v_3 \\ | & | & | \end{pmatrix} = \underbrace{\begin{pmatrix} | & | & | \\ u_1 & u_2 & u_3 \\ | & | & | \end{pmatrix}}_{\tilde Q} \tilde R$

Check, column by column:

first column of A = u_1 = v_1
second column of A = $\frac{v_2 \cdot u_1}{u_1 \cdot u_1}\, u_1 + u_2$ = v_2
third column of A = $\frac{v_3 \cdot u_1}{u_1 \cdot u_1}\, u_1 + \frac{v_3 \cdot u_2}{u_2 \cdot u_2}\, u_2 + u_3$ = v_3

QR Factorization: Through an example, continued

Step 3: To get Q and R, scale the columns of $\tilde Q$ to get unit vectors, and scale the rows of $\tilde R$ by the opposite factor: divide the ith column of $\tilde Q$ by $\|u_i\|$ and multiply the ith row of $\tilde R$ by $\|u_i\|$. This doesn't change the product, because the entries in the ith column of $\tilde Q$ multiply against the entries in the ith row of $\tilde R$. The final QR decomposition is

$A = QR, \qquad Q = \begin{pmatrix} | & | & | \\ u_1/\|u_1\| & u_2/\|u_2\| & u_3/\|u_3\| \\ | & | & | \end{pmatrix},$

with R upper-triangular and $\|u_i\|$ as its (i, i) entry.
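Steps 1–3 translate directly into code. Here is a sketch in Python with NumPy (the function name qr_by_gram_schmidt and the sample matrix are mine) that builds $\tilde Q$ and $\tilde R$ from Gram–Schmidt and then rescales, exactly as above:

```python
import numpy as np

def qr_by_gram_schmidt(A):
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Qt = np.zeros((m, n))                # columns u_1, ..., u_n (Q tilde)
    Rt = np.eye(n)                       # 1s on the diagonal (R tilde)
    for j in range(n):
        u = A[:, j].copy()
        for i in range(j):
            c = (A[:, j] @ Qt[:, i]) / (Qt[:, i] @ Qt[:, i])
            Rt[i, j] = c                 # coefficient of u_i in v_j
            u -= c * Qt[:, i]            # subtract the projection
        Qt[:, j] = u
    norms = np.linalg.norm(Qt, axis=0)   # Step 3: rescale columns and rows
    return Qt / norms, norms[:, None] * Rt

A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])          # an illustration, not the slides' A
Q, R = qr_by_gram_schmidt(A)
print(np.allclose(Q @ R, A))             # True: A = QR
```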

QR Factorization: Through a second example

Find the QR factorization of the 4×3 matrix A whose columns are the vectors v_1, v_2, v_3 from Example 3.

Step 1: Run Gram–Schmidt and solve for v_1, v_2, v_3 in terms of u_1, u_2, u_3:

$u_1 = v_1 \implies v_1 = u_1$

$u_2 = v_2 - \frac{v_2 \cdot u_1}{u_1 \cdot u_1}\, u_1 = v_2 - \frac{3}{2}\, u_1 \implies v_2 = \frac{3}{2}\, u_1 + u_2$

$u_3 = v_3 - \frac{v_3 \cdot u_1}{u_1 \cdot u_1}\, u_1 - \frac{v_3 \cdot u_2}{u_2 \cdot u_2}\, u_2 = v_3 - \frac{v_3 \cdot u_1}{u_1 \cdot u_1}\, u_1 + \frac{4}{5}\, u_2 \implies v_3 = \frac{v_3 \cdot u_1}{u_1 \cdot u_1}\, u_1 - \frac{4}{5}\, u_2 + u_3$

QR Factorization: Through a second example, continued

From Step 1: $v_1 = u_1$, $v_2 = \frac{3}{2} u_1 + u_2$, $v_3 = \frac{v_3 \cdot u_1}{u_1 \cdot u_1} u_1 - \frac{4}{5} u_2 + u_3$.

Step 2: Write $A = \tilde Q \tilde R$, where $\tilde Q = \begin{pmatrix} | & | & | \\ u_1 & u_2 & u_3 \\ | & | & | \end{pmatrix}$ has orthogonal columns and

$\tilde R = \begin{pmatrix} 1 & 3/2 & \frac{v_3 \cdot u_1}{u_1 \cdot u_1} \\ 0 & 1 & -4/5 \\ 0 & 0 & 1 \end{pmatrix}$

is upper-triangular with 1s on the diagonal.

QR Factorization: Through a second example, continued

Step 3: To get Q and R, normalize the columns of $\tilde Q$ and scale the rows of $\tilde R$ by the opposite factor:

$Q = \begin{pmatrix} | & | & | \\ u_1/\|u_1\| & u_2/\|u_2\| & u_3/\|u_3\| \\ | & | & | \end{pmatrix}, \qquad R = \begin{pmatrix} \|u_1\| & \frac{3}{2}\|u_1\| & \frac{v_3 \cdot u_1}{u_1 \cdot u_1}\|u_1\| \\ 0 & \|u_2\| & -\frac{4}{5}\|u_2\| \\ 0 & 0 & \|u_3\| \end{pmatrix}$

The final QR decomposition is A = QR.

Extra: Computing determinants

Consider the QR factorization of an invertible n×n matrix: A = QR.

det(R) is easy to compute because R is upper-triangular.
det(Q) = ±1 (see below).

Why: Q has orthonormal columns, so Q^T Q = I_n, hence Q^T = Q^{-1}. Also det(Q^T) = det(Q), so

$1 = \det(I_n) = \det(Q^T Q) = \det(Q^T)\det(Q) = \det(Q)^2;$

so det(Q) can take only two values: ±1.

Determinant (up to sign): If v_1, v_2, ..., v_n are the columns of A, and u_1, u_2, ..., u_n are the vectors obtained by applying Gram–Schmidt, then

$\det(A) = \det(Q)\det(R) = \pm \|u_1\|\, \|u_2\| \cdots \|u_n\|,$

because the (i, i) entry of R is $\|u_i\|$. Moreover, det(R) > 0, so det(Q) has the same sign as det(A).
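This fact gives a quick numerical way to read |det A| off a QR factorization. A small NumPy check, with a matrix chosen for illustration:

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q, R = np.linalg.qr(A)

# |det A| = product of the diagonal entries of R = ||u_1|| ... ||u_n||.
print(abs(np.linalg.det(A)))        # 2.0
print(np.prod(np.abs(np.diag(R))))  # 2.0 (up to rounding)
```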

Extra: Computing eigenvalues

The QR algorithm: Let A be an n×n matrix with real eigenvalues. Here is the algorithm:

A = A_1 = Q_1 R_1   (QR factorization)
A_2 = R_1 Q_1       (swap the Q and R)
    = Q_2 R_2       (find its QR factorization)
A_3 = R_2 Q_2       (swap the Q and R)
    = Q_3 R_3       (find its QR factorization)
et cetera.

Theorem: The matrices A_k converge to an upper-triangular matrix whose diagonal entries are the eigenvalues of A. Moreover, the convergence is fast!

The QR algorithm gives a computationally efficient way to find the eigenvalues of a matrix.
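The iteration above is only a few lines of code. Below is a bare-bones sketch in Python with NumPy (no shifts or deflation, so it is a toy version rather than how production libraries implement it); the sample matrix is my own:

```python
import numpy as np

def qr_algorithm(A, iterations=100):
    Ak = np.array(A, dtype=float)
    for _ in range(iterations):
        Q, R = np.linalg.qr(Ak)   # factor A_k = Q_k R_k
        Ak = R @ Q                # swap: A_{k+1} = R_k Q_k
    return Ak

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # symmetric, so real eigenvalues (3 and 1)
print(np.diag(qr_algorithm(A)))   # approximately [3. 1.]
```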