AMS526: Numerical Analysis I (Numerical Linear Algebra)
Lecture 5: Projectors and QR Factorization
Xiangmin Jiao, SUNY Stony Brook

Outline
1. Projectors
2. QR Factorization

Projectors

A projector is a square matrix P that satisfies P² = P; such matrices are also said to be idempotent. Projectors are either orthogonal or oblique.

Example: the matrix

    [ 0  0 ]
    [ α  1 ]

is an oblique projector if α ≠ 0 and an orthogonal projector if α = 0.
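
As a quick numerical sanity check (a minimal NumPy sketch, not from the slides), we can verify the idempotence of this 2×2 example and see that it is self-adjoint exactly when α = 0:

    import numpy as np

    def example_projector(alpha):
        # The 2x2 projector from the slide: rows (0, 0) and (alpha, 1).
        return np.array([[0.0, 0.0], [alpha, 1.0]])

    P = example_projector(2.0)            # oblique projector (alpha != 0)
    print(np.allclose(P @ P, P))          # True: P^2 = P
    print(np.allclose(P, P.T))            # False: not self-adjoint, so oblique

    P0 = example_projector(0.0)           # orthogonal projector (alpha = 0)
    print(np.allclose(P0 @ P0, P0), np.allclose(P0, P0.T))  # True True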

Complementary Projectors

Complementary projectors: P vs. I − P. What space does I − P project onto?

Answer: null(P).
- range(I − P) ⊇ null(P), because Pv = 0 implies (I − P)v = v.
- range(I − P) ⊆ null(P), because for any v, (I − P)v = v − Pv ∈ null(P).

A projector separates C^m into two complementary subspaces, its range and its null space: range(P) + null(P) = C^m and range(P) ∩ null(P) = {0} for a projector P ∈ C^{m×m}. It projects onto its range along its null space; in other words, x = Px + r, where r ∈ null(P).

Question: Are the range space and null space of a projector orthogonal to each other?
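
A small NumPy illustration (my own sketch) of the decomposition x = Px + r with r ∈ null(P), using the oblique 2×2 projector from the previous slide:

    import numpy as np

    P = np.array([[0.0, 0.0], [2.0, 1.0]])   # a projector: P^2 = P
    I = np.eye(2)
    x = np.array([1.0, 3.0])

    Px = P @ x                          # component in range(P)
    r = (I - P) @ x                     # component in null(P)
    print(np.allclose(x, Px + r))       # True: x = Px + r
    print(np.allclose(P @ r, 0.0))      # True: r lies in null(P)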

Orthogonal Projector

An orthogonal projector is one that projects onto a subspace S_1 along a space S_2, where S_1 and S_2 are orthogonal.

Theorem. A projector P is orthogonal if and only if P = P*.

Proof.
"If" direction: if P = P*, then (Px)*(I − P)y = x*(P − P²)y = 0, so range(P) ⊥ null(P).
"Only if" direction: use the SVD. Suppose P projects onto S_1 along S_2, where S_1 ⊥ S_2 and S_1 has dimension n. Let {q_1, ..., q_n} be an orthonormal basis of S_1 and {q_{n+1}, ..., q_m} an orthonormal basis of S_2. Let Q be the unitary matrix whose jth column is q_j. Then PQ = (q_1, q_2, ..., q_n, 0, ..., 0), so Q*PQ = diag(1, ..., 1, 0, ..., 0) = Σ, and hence P = QΣQ*, which is self-adjoint.

Question: Are orthogonal projectors orthogonal matrices?
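
The "only if" argument says an orthogonal projector has the form QΣQ* with Σ = diag(1, ..., 1, 0, ..., 0). A quick sketch (example data mine, NumPy) that builds such a P and checks the claimed properties:

    import numpy as np
    rng = np.random.default_rng(0)

    # Random unitary Q via QR of a random matrix; Sigma with two 1s and one 0.
    Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
    Sigma = np.diag([1.0, 1.0, 0.0])
    P = Q @ Sigma @ Q.T                     # an orthogonal projector

    print(np.allclose(P @ P, P))            # idempotent
    print(np.allclose(P, P.T))              # self-adjoint: P = P*
    print(np.linalg.svd(P, compute_uv=False).round(12))  # singular values 1, 1, 0
    # Note: P is not an orthogonal (unitary) matrix unless P = I,
    # which answers the question on the slide.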

Basis of Projections

Projection with an orthonormal basis:
- Given any matrix Q̂ ∈ C^{m×n} whose columns are orthonormal, P = Q̂Q̂* is an orthogonal projector, and so is I − P.
- We write I − P as P⊥.
- In particular, if Q̂ is a single column q, we write P_q = qq* and P⊥_q = I − P_q.
- For an arbitrary nonzero vector a, we write P_a = aa*/(a*a) and P⊥_a = I − P_a.

Projection with an arbitrary basis:
- Given any matrix A ∈ C^{m×n} of full rank with m ≥ n, P = A(A*A)^{−1}A* is an orthogonal projector.
- What does P project onto? Answer: range(A).
- (A*A)^{−1}A* is called the pseudoinverse of A, denoted A⁺.
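
A minimal NumPy sketch (example data mine) comparing the two constructions: P = Q̂Q̂* from an orthonormal basis of range(A), and P = A(A*A)^{−1}A* from an arbitrary full-rank basis. Both should give the same orthogonal projector onto range(A):

    import numpy as np
    rng = np.random.default_rng(1)

    A = rng.standard_normal((5, 3))              # full rank with probability 1
    Qhat, _ = np.linalg.qr(A)                    # orthonormal basis of range(A)

    P1 = Qhat @ Qhat.T                           # projector from orthonormal basis
    P2 = A @ np.linalg.solve(A.T @ A, A.T)       # A (A* A)^{-1} A*
    print(np.allclose(P1, P2))                   # True: same orthogonal projector

    Aplus = np.linalg.pinv(A)                    # pseudoinverse (A* A)^{-1} A*
    print(np.allclose(P2, A @ Aplus))            # True: P = A A+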

Outline
1. Projectors
2. QR Factorization

Motivation

Question: Given a linear system Ax = b, where A ∈ C^{m×n} (m ≥ n) has full rank, how do we solve it?

Answer: One possible solution is to use the SVD: A = UΣV*, so x = VΣ^{−1}U*b (using the reduced SVD when m > n, which gives the least-squares solution).

Another solution is to use the QR factorization, which decomposes A into a product of two simple matrices, A = QR, where the columns of Q are orthonormal and R is upper triangular.
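
A small sketch (NumPy, example data mine) solving the same full-rank system both ways; with m > n both approaches return the least-squares solution:

    import numpy as np
    rng = np.random.default_rng(2)

    A = rng.standard_normal((6, 3))
    b = rng.standard_normal(6)

    # Via the reduced SVD: x = V Sigma^{-1} U* b
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    x_svd = Vt.T @ ((U.T @ b) / s)

    # Via the reduced QR: A = QR, then solve R x = Q* b (back substitution)
    Q, R = np.linalg.qr(A)
    x_qr = np.linalg.solve(R, Q.T @ b)

    print(np.allclose(x_svd, x_qr))   # True: both give the least-squares solution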

Two Different Versions of QR

Analogous to the SVD, there are two versions of QR:
- Full QR factorization: for A ∈ C^{m×n} (m ≥ n), A = QR, where Q ∈ C^{m×m} is unitary and R ∈ C^{m×n} is upper triangular.
- Reduced QR factorization: for A ∈ C^{m×n} (m ≥ n), A = Q̂R̂, where Q̂ ∈ C^{m×n} has orthonormal columns and R̂ ∈ C^{n×n} is upper triangular.

Question: What space do {q_1, q_2, ..., q_j}, j ≤ n, span?

Answer: For full-rank A, the same space as the first j columns of A, i.e., ⟨q_1, q_2, ..., q_j⟩ = ⟨a_1, a_2, ..., a_j⟩.
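
NumPy exposes both versions (a quick sketch; mode='reduced' is the default, mode='complete' gives the full factorization):

    import numpy as np
    rng = np.random.default_rng(3)
    A = rng.standard_normal((4, 3))

    Qhat, Rhat = np.linalg.qr(A, mode='reduced')   # Qhat: 4x3, Rhat: 3x3
    Q, R = np.linalg.qr(A, mode='complete')        # Q: 4x4 unitary, R: 4x3
    print(Qhat.shape, Rhat.shape, Q.shape, R.shape)
    print(np.allclose(Qhat @ Rhat, A), np.allclose(Q @ R, A))  # True True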

Gram-Schmidt Orthogonalization

One method to construct a QR factorization is to orthogonalize the column vectors of A. Basic idea:
- Take the first column a_1 and normalize it to obtain q_1.
- Take the second column a_2, subtract its orthogonal projection onto q_1, and normalize to obtain q_2.
- ...
- Take the jth column a_j, subtract its orthogonal projections onto q_1, ..., q_{j−1}, and normalize to obtain q_j:

    v_j = a_j − Σ_{i=1}^{j−1} (q_i* a_j) q_i,    q_j = v_j / ‖v_j‖.

This idea is called Gram-Schmidt orthogonalization.

Gram-Schmidt Projections

The orthogonal vectors produced by Gram-Schmidt can be written in terms of projectors:

    q_j = P_j a_j / ‖P_j a_j‖,  where P_j = I − Q̂_{j−1}Q̂_{j−1}* with Q̂_{j−1} = [q_1 q_2 ... q_{j−1}].

P_j projects orthogonally onto the space orthogonal to q_1, q_2, ..., q_{j−1}, and the rank of P_j is m − (j − 1).
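
A sketch (mine) checking the projector form against NumPy's QR for one column; the result matches up to sign, since np.linalg.qr does not fix the signs of the columns:

    import numpy as np
    rng = np.random.default_rng(4)

    A = rng.standard_normal((5, 4))
    Q, _ = np.linalg.qr(A)

    j = 3                                   # zero-based index: fourth column
    Qprev = Q[:, :j]                        # Qhat_{j-1} = [q_1 ... q_{j-1}]
    Pj = np.eye(5) - Qprev @ Qprev.T        # P_j = I - Qhat Qhat*
    v = Pj @ A[:, j]
    qj = v / np.linalg.norm(v)
    print(np.allclose(qj, Q[:, j]) or np.allclose(qj, -Q[:, j]))  # True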

Algorithm of Gram-Schmidt Orthogonalization

Classical Gram-Schmidt method:

    for j = 1 to n
        v_j = a_j
        for i = 1 to j − 1
            r_ij = q_i* a_j
            v_j = v_j − r_ij q_i
        r_jj = ‖v_j‖_2
        q_j = v_j / r_jj

Classical Gram-Schmidt (CGS) is unstable: in floating-point arithmetic its results are highly sensitive to rounding errors, so the computed q_j can lose orthogonality.
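
A direct transliteration of the pseudocode into NumPy (a sketch, real arithmetic only), followed by an ill-conditioned test case of my own construction in which CGS visibly loses orthogonality:

    import numpy as np

    def cgs(A):
        """Classical Gram-Schmidt: reduced QR factors of a full-rank A."""
        m, n = A.shape
        Q = np.zeros((m, n))
        R = np.zeros((n, n))
        for j in range(n):
            v = A[:, j].copy()
            for i in range(j):
                R[i, j] = Q[:, i] @ A[:, j]   # r_ij = q_i* a_j
                v -= R[i, j] * Q[:, i]
            R[j, j] = np.linalg.norm(v)
            Q[:, j] = v / R[j, j]
        return Q, R

    # Ill-conditioned test matrix (condition number about 1e10)
    rng = np.random.default_rng(5)
    U, _ = np.linalg.qr(rng.standard_normal((50, 50)))
    V, _ = np.linalg.qr(rng.standard_normal((50, 50)))
    A = U @ np.diag(np.logspace(0, -10, 50)) @ V
    Q, R = cgs(A)
    print(np.linalg.norm(Q.T @ Q - np.eye(50)))   # far above machine epsilon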

Existence of QR

Theorem. Every A ∈ C^{m×n} (m ≥ n) has a full QR factorization, and hence also a reduced QR factorization.

Key idea of proof: If A has full rank, the Gram-Schmidt algorithm itself provides a construction of the reduced QR factorization. If A does not have full rank, then at some step v_j = 0; we can set q_j to be any unit vector orthogonal to q_i, i < j, and continue. To construct a full QR factorization from a reduced one, just continue Gram-Schmidt for an additional m − n steps, appending arbitrary vectors orthogonal to the existing columns.
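
Even for a rank-deficient A the factorization exists; a quick sketch (example mine) with an exactly rank-2 matrix:

    import numpy as np
    rng = np.random.default_rng(6)

    A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))  # rank 2, 5x4
    Q, R = np.linalg.qr(A)                   # reduced QR still exists
    print(np.allclose(Q @ R, A))             # True
    print(np.abs(np.diag(R)).round(10))      # trailing diagonal entries ~ 0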

Uniqueness of QR

Theorem. Every A ∈ C^{m×n} (m ≥ n) of full rank has a unique reduced QR factorization A = Q̂R̂ with r_jj > 0.

Key idea of proof: The proof is provided by the Gram-Schmidt iteration itself. Once the signs of the r_jj are fixed (here, all positive), each r_ij and q_j is determined.

Question: Why do we require r_jj > 0?
Question: Is the full QR factorization unique?
Question: What if A does not have full rank?
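
NumPy's qr does not promise positive diagonal entries in R; the unique factorization with r_jj > 0 can be recovered by a sign normalization (a sketch; the helper name qr_positive is mine):

    import numpy as np

    def qr_positive(A):
        """Reduced QR normalized so that diag(R) > 0 (the unique choice)."""
        Q, R = np.linalg.qr(A)
        signs = np.sign(np.diag(R))
        signs[signs == 0] = 1.0            # guard for rank-deficient columns
        # A = (Q D)(D R) with D = diag(signs) and D^2 = I
        return Q * signs, signs[:, None] * R

    rng = np.random.default_rng(7)
    A = rng.standard_normal((5, 3))
    Q, R = qr_positive(A)
    print(np.allclose(Q @ R, A), np.all(np.diag(R) > 0))  # True True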