Lecture 4 Orthonormal vectors and QR factorization


EE263 Autumn 2004

Outline:
- orthonormal vectors
- Gram-Schmidt procedure, QR factorization
- orthogonal decomposition induced by a matrix

Orthonormal set of vectors

A set of vectors $u_1, \ldots, u_k \in \mathbb{R}^n$ is
- normalized if $\|u_i\| = 1$, $i = 1, \ldots, k$ (the $u_i$ are called unit vectors or direction vectors)
- orthogonal if $u_i \perp u_j$ for $i \neq j$
- orthonormal if both

In terms of $U = [u_1 \cdots u_k]$, orthonormal means $U^T U = I_k$.

Orthonormal vectors are independent (multiply $\alpha_1 u_1 + \alpha_2 u_2 + \cdots + \alpha_k u_k = 0$ by $u_i^T$); hence $u_1, \ldots, u_k$ is an orthonormal basis for $\mathrm{span}(u_1, \ldots, u_k) = \mathcal{R}(U)$.
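As a quick numerical illustration (a minimal NumPy sketch, not part of the original slides), we can manufacture an orthonormal set and verify $U^T U = I_k$:

```python
import numpy as np

# Build an orthonormal set: take Q from the QR factorization of a random matrix.
n, k = 5, 3
A = np.random.randn(n, k)
U, _ = np.linalg.qr(A)  # columns of U are orthonormal

# Orthonormality in matrix form: U^T U = I_k
print(np.allclose(U.T @ U, np.eye(k)))  # True
```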

Geometric properties

Suppose the columns of $U = [u_1 \cdots u_k]$ are orthonormal.
- If $w = Uz$, then $\|w\| = \|z\|$: multiplication by $U$ does not change norm.
- The mapping $w = Uz$ is isometric: it preserves distances.

Simple derivation using matrices:
$$\|w\|^2 = \|Uz\|^2 = (Uz)^T (Uz) = z^T U^T U z = z^T z = \|z\|^2$$

Inner products are also preserved: $\langle Uz, U\tilde{z} \rangle = \langle z, \tilde{z} \rangle$. If $w = Uz$ and $\tilde{w} = U\tilde{z}$, then
$$\langle w, \tilde{w} \rangle = \langle Uz, U\tilde{z} \rangle = (Uz)^T (U\tilde{z}) = z^T U^T U \tilde{z} = \langle z, \tilde{z} \rangle$$

Since norms and inner products are preserved, angles are preserved: $\angle(Uz, U\tilde{z}) = \angle(z, \tilde{z})$.

In other words, multiplication by $U$ preserves inner products, angles, and distances.
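A small check of the isometry claims (again a NumPy sketch, not from the slides):

```python
import numpy as np

# Multiplication by a matrix with orthonormal columns preserves
# norms and inner products.
n, k = 6, 3
U, _ = np.linalg.qr(np.random.randn(n, k))

z, zt = np.random.randn(k), np.random.randn(k)
w, wt = U @ z, U @ zt

print(np.isclose(np.linalg.norm(w), np.linalg.norm(z)))  # norm preserved
print(np.isclose(w @ wt, z @ zt))                        # inner product preserved
```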

Orthonormal basis for $\mathbb{R}^n$

Suppose $u_1, \ldots, u_n$ is an orthonormal basis for $\mathbb{R}^n$. Then $U = [u_1 \cdots u_n]$ is called orthogonal: it is square and satisfies $U^T U = I$.

(You'd think such matrices would be called orthonormal, not orthogonal...)

It follows that $U^{-1} = U^T$, and hence also $UU^T = I$, i.e.,
$$\sum_{i=1}^n u_i u_i^T = I$$

Expansion in orthonormal basis

Suppose $U$ is orthogonal, so $x = UU^T x$, i.e.,
$$x = \sum_{i=1}^n (u_i^T x) u_i$$
- $u_i^T x$ is called the component of $x$ in the direction $u_i$
- $a = U^T x$ resolves $x$ into the vector of its $u_i$ components
- $x = Ua$ reconstitutes $x$ from its $u_i$ components
- $x = Ua = \sum_{i=1}^n a_i u_i$ is called the ($u_i$-) expansion of $x$

The identity $I = UU^T = \sum_{i=1}^n u_i u_i^T$ is sometimes written as $I = \sum_{i=1}^n |u_i\rangle\langle u_i|$ since $x = \sum_{i=1}^n |u_i\rangle\langle u_i|x\rangle$ (but we won't use this notation).
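The resolve/reconstitute pair in code (a NumPy sketch, not from the slides):

```python
import numpy as np

# Resolve x into components along an orthonormal basis, then reconstitute it.
n = 4
U, _ = np.linalg.qr(np.random.randn(n, n))  # square, so U is orthogonal

x = np.random.randn(n)
a = U.T @ x      # a_i = u_i^T x, the component of x in direction u_i
x_rec = U @ a    # expansion: x = sum_i a_i u_i

print(np.allclose(x_rec, x))  # True: x = U U^T x
```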

Geometric interpretation

If $U$ is orthogonal, then the transformation $w = Uz$
- preserves norms of vectors, i.e., $\|Uz\| = \|z\|$
- preserves angles between vectors, i.e., $\angle(Uz, U\tilde{z}) = \angle(z, \tilde{z})$

Examples:
- rotations (about some axis)
- reflections (through some plane)

In fact, the converse holds in $\mathbb{R}^2$: if $U$ is orthogonal, then the mapping $w = Uz$ is either a rotation or a reflection.

Example: rotation by $\theta$ in $\mathbb{R}^2$ is given by $y = U_\theta x$,
$$U_\theta = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$$
(since $e_1 \mapsto [\cos\theta \;\; \sin\theta]^T$, $e_2 \mapsto [-\sin\theta \;\; \cos\theta]^T$)

Reflection across the line $x_2 = x_1 \tan(\theta/2)$ is given by $y = R_\theta x$,
$$R_\theta = \begin{bmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{bmatrix}$$
(since $e_1 \mapsto [\cos\theta \;\; \sin\theta]^T$, $e_2 \mapsto [\sin\theta \;\; -\cos\theta]^T$)

[Figure: the unit vectors $e_1, e_2$ and their images under the rotation (left) and the reflection (right).]

One can check that $U_\theta$ and $R_\theta$ are orthogonal.
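The check suggested on the slide, in NumPy (a sketch, not from the slides); the determinant also distinguishes the two cases:

```python
import numpy as np

# Rotation and reflection matrices in R^2; both are orthogonal.
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation by theta
R = np.array([[np.cos(theta),  np.sin(theta)],
              [np.sin(theta), -np.cos(theta)]])   # reflection across x2 = x1*tan(theta/2)

for M in (U, R):
    print(np.allclose(M.T @ M, np.eye(2)))  # True: M^T M = I
print(np.linalg.det(U), np.linalg.det(R))   # +1 (rotation), -1 (reflection)
```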

Gram-Schmidt procedure

Given independent vectors $a_1, \ldots, a_k \in \mathbb{R}^n$, the Gram-Schmidt procedure finds orthonormal vectors $q_1, \ldots, q_k$ such that
$$\mathrm{span}(a_1, \ldots, a_r) = \mathrm{span}(q_1, \ldots, q_r) \quad \text{for } r \leq k$$

One method: the $q_i$'s are given recursively as
- $\tilde{q}_1 := a_1$; $\quad q_1 := \tilde{q}_1 / \|\tilde{q}_1\|$ (normalize)
- $\tilde{q}_2 := a_2 - (q_1^T a_2) q_1$ (remove $q_1$ component from $a_2$); $\quad q_2 := \tilde{q}_2 / \|\tilde{q}_2\|$ (normalize)
- $\tilde{q}_3 := a_3 - (q_1^T a_3) q_1 - (q_2^T a_3) q_2$ (remove $q_1$, $q_2$ components from $a_3$); $\quad q_3 := \tilde{q}_3 / \|\tilde{q}_3\|$ (normalize)
- etc.

[Figure: $a_2$ decomposed as $(q_1^T a_2) q_1$ along $q_1 = a_1 / \|a_1\|$ plus the orthogonal remainder $\tilde{q}_2 = a_2 - (q_1^T a_2) q_1$.]
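A direct NumPy transcription of this recursion (classical Gram-Schmidt; fine for illustration, though see page 4-9 on why it is not used numerically):

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt on the (independent) columns of A.
    Returns Q with orthonormal columns spanning the same space."""
    n, k = A.shape
    Q = np.zeros((n, k))
    for i in range(k):
        q = A[:, i].copy()
        for j in range(i):                  # remove q_1, ..., q_{i-1} components
            q -= (Q[:, j] @ A[:, i]) * Q[:, j]
        Q[:, i] = q / np.linalg.norm(q)     # normalize
    return Q

A = np.random.randn(5, 3)
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: columns are orthonormal
```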

For $i = 1, 2, \ldots, k$ we have
$$a_i = r_{1i} q_1 + r_{2i} q_2 + \cdots + r_{ii} q_i$$
(the $r_{ij}$ are easily obtained from the procedure) with $r_{ii} \neq 0$ (if not, $a_i$ could be written as a linear combination of $a_1, \ldots, a_{i-1}$).

Written in matrix form:
$$\underbrace{[a_1 \; a_2 \; \cdots \; a_k]}_{A} = \underbrace{[q_1 \; q_2 \; \cdots \; q_k]}_{Q} \underbrace{\begin{bmatrix} r_{11} & r_{12} & \cdots & r_{1k} \\ 0 & r_{22} & \cdots & r_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & r_{kk} \end{bmatrix}}_{R}$$

i.e., $A = QR$ where $Q^T Q = I_k$ and $R$ is upper triangular and invertible (full rank, $\det R = \prod_{i=1}^k r_{ii} \neq 0$).
- This is called the QR decomposition of $A$.
- In practice it is never computed using Gram-Schmidt (due to propagation of numerical error).
- The columns of $Q$ are an orthonormal basis for $\mathcal{R}(A)$.
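In practice one calls a library routine: NumPy's `np.linalg.qr` computes the factorization via Householder reflections (through LAPACK), the numerically stable alternative alluded to above. A sketch of checking the stated properties:

```python
import numpy as np

A = np.random.randn(6, 4)

# Reduced QR: Q is 6x4 with orthonormal columns, R is 4x4 upper triangular.
Q, R = np.linalg.qr(A)

print(np.allclose(Q.T @ Q, np.eye(4)))  # Q^T Q = I_k
print(np.allclose(np.triu(R), R))       # R is upper triangular
print(np.allclose(Q @ R, A))            # A = QR
```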

General Gram-Schmidt procedure

On page 4-8 we assumed $a_1, \ldots, a_k$ are independent. If $a_1, \ldots, a_k$ are dependent, we find $\tilde{q}_j = 0$ for some $j$, which means $a_j$ is linearly dependent on $a_1, \ldots, a_{j-1}$.

Modified algorithm: when we encounter $\tilde{q}_j = 0$, skip to the next vector $a_{j+1}$ and continue:

$r = 0$;
for $i = 1, \ldots, k$ {
&nbsp;&nbsp;&nbsp;&nbsp;$\tilde{a} = a_i - \sum_{j=1}^{r} q_j q_j^T a_i$;
&nbsp;&nbsp;&nbsp;&nbsp;if $\tilde{a} \neq 0$ { $r = r + 1$; $q_r = \tilde{a}/\|\tilde{a}\|$; }
}

On exit:
- $r = \mathrm{Rank}(A)$
- $q_1, \ldots, q_r$ is an orthonormal basis for $\mathcal{R}(A)$
- each $a_i$ is a linear combination of previously generated $q_j$'s
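A NumPy sketch of the modified algorithm; in floating point the exact test $\tilde{a} \neq 0$ must be replaced by a small tolerance, which is an assumption added here:

```python
import numpy as np

def general_gram_schmidt(A, tol=1e-10):
    """Gram-Schmidt that skips dependent columns; the columns of the
    returned Q are an orthonormal basis for R(A), so Q has rank(A) columns."""
    qs = []
    for i in range(A.shape[1]):
        q = A[:, i].astype(float)            # copy of a_i
        for qj in qs:                        # remove components along earlier q's
            q = q - (qj @ A[:, i]) * qj
        if np.linalg.norm(q) > tol:          # tol stands in for the exact test q != 0
            qs.append(q / np.linalg.norm(q))
    return np.column_stack(qs)

# Example: the third column is a combination of the first two, so it is skipped.
B = np.random.randn(5, 2)
A = np.column_stack([B, B @ np.array([1.0, -2.0]), np.random.randn(5)])
Q = general_gram_schmidt(A)
print(Q.shape[1], np.linalg.matrix_rank(A))      # both print 3
print(np.allclose(Q.T @ Q, np.eye(Q.shape[1])))  # orthonormal columns
```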

In matrix notation we have $A = QR$ with $Q^T Q = I_r$ and $R \in \mathbb{R}^{r \times k}$ in upper staircase form.

[Figure: an upper staircase matrix; entries below the staircase are zero, the corner entries of the staircase are nonzero, and the remaining entries are possibly nonzero.]

- entries below the staircase are zero
- corner entries are nonzero, hence $R$ is full rank ($r$)

We can permute the columns containing the corner entries to the front of the matrix:
$$A = Q [\tilde{R} \;\; S] P$$
where:
- $Q^T Q = I_r$
- $\tilde{R} \in \mathbb{R}^{r \times r}$ is upper triangular and invertible
- $P \in \mathbb{R}^{k \times k}$ is a permutation matrix

Applications

- directly yields an orthonormal basis for $\mathcal{R}(A)$
- yields a factorization $A = BC$ with $B \in \mathbb{R}^{n \times r}$, $C \in \mathbb{R}^{r \times k}$, $r = \mathrm{Rank}(A)$
- to check if $b \in \mathrm{span}(a_1, \ldots, a_k)$: apply Gram-Schmidt to $[a_1 \; \cdots \; a_k \; b]$ (see the sketch after this list)
- more generally, the staircase pattern in $R$ shows which columns of $A$ are dependent on previous ones
- works incrementally: one Gram-Schmidt procedure yields QR factorizations of $[a_1 \; \cdots \; a_p]$ for $p = 1, \ldots, k$:
$$[a_1 \; \cdots \; a_p] = [q_1 \; \cdots \; q_s] R_p$$
where $s = \mathrm{Rank}([a_1 \; \cdots \; a_p])$ and $R_p$ is the leading $s \times p$ submatrix of $R$
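The span-membership test in code (a sketch, not from the slides): rather than literally re-running Gram-Schmidt on $[a_1 \; \cdots \; a_k \; b]$, this uses the equivalent rank test (Gram-Schmidt produces no new $q$ for $b$ iff appending $b$ leaves the rank unchanged), with `np.linalg.matrix_rank` as the numerical stand-in:

```python
import numpy as np

def in_span(b, A):
    """b is in span(a_1, ..., a_k) iff Gram-Schmidt applied to
    [a_1 ... a_k b] produces no new direction, i.e. appending b
    does not increase the rank."""
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

A = np.random.randn(5, 3)
print(in_span(A @ np.array([1.0, 2.0, 3.0]), A))  # True: b is a combination
print(in_span(np.random.randn(5), A))             # False (almost surely)
```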

Full QR factorization

With $A = Q_1 R_1$ the QR factorization as above, write
$$A = [Q_1 \; Q_2] \begin{bmatrix} R_1 \\ 0 \end{bmatrix}$$
where $[Q_1 \; Q_2]$ is orthogonal, i.e., the columns of $Q_2 \in \mathbb{R}^{n \times (n-r)}$ are orthonormal and orthogonal to $Q_1$.

To find $Q_2$:
- find any matrix $\tilde{A}$ such that $[A \; \tilde{A}]$ is full rank (e.g., $\tilde{A} = I$)
- apply general Gram-Schmidt to $[A \; \tilde{A}]$
- $Q_1$ are the orthonormal vectors obtained from the columns of $A$
- $Q_2$ are the orthonormal vectors obtained from the extra columns ($\tilde{A}$)

i.e., any set of orthonormal vectors can be extended to an orthonormal basis for $\mathbb{R}^n$.

$Q_1$ and $Q_2$ are called complementary: their ranges are orthogonal and together they span $\mathbb{R}^n$.
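For a full-column-rank $A$, NumPy returns the full factorization directly (a sketch; `mode='complete'` produces the square orthogonal $[Q_1 \; Q_2]$ of this slide):

```python
import numpy as np

n, r = 6, 3
A = np.random.randn(n, r)  # full column rank (almost surely)

Qfull, Rfull = np.linalg.qr(A, mode='complete')  # Qfull: n x n, Rfull: n x r
Q1, Q2 = Qfull[:, :r], Qfull[:, r:]

print(np.allclose(Qfull.T @ Qfull, np.eye(n)))  # [Q1 Q2] is orthogonal
print(np.allclose(Q1.T @ Q2, 0))                # Q2 is orthogonal to Q1
print(np.allclose(Q1 @ Rfull[:r], A))           # A = Q1 R1
```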

Orthogonal decomposition induced by A

Interpretation of $Q_2$: from the full QR factorization,
$$A^T = [R_1^T \;\; 0] \begin{bmatrix} Q_1^T \\ Q_2^T \end{bmatrix}$$
so
$$z \in \mathcal{N}(A^T) \iff Q_1^T z = 0 \iff z \in \mathcal{R}(Q_2)$$
i.e., the columns of $Q_2$ are an orthonormal basis for $\mathcal{N}(A^T)$ (recall the columns of $Q_1$ are an orthonormal basis for $\mathcal{R}(A)$).

We conclude: $\mathcal{R}(A)$ and $\mathcal{N}(A^T)$ are complementary subspaces:
- they are orthogonal: $\mathcal{R}(A) \perp \mathcal{N}(A^T)$
- together they span $\mathbb{R}^n$

Sometimes this is written $\mathcal{R}(A) \oplus \mathcal{N}(A^T) = \mathbb{R}^n$.

Every $y \in \mathbb{R}^n$ can be written uniquely as $y = z + w$, with $z \in \mathcal{R}(A)$, $w \in \mathcal{N}(A^T)$.
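The unique decomposition $y = z + w$, computed from the full QR factorization (a NumPy sketch under the same full-column-rank assumption as above):

```python
import numpy as np

# Decompose y uniquely as y = z + w with z in R(A), w in N(A^T).
n, r = 6, 3
A = np.random.randn(n, r)
Qfull, _ = np.linalg.qr(A, mode='complete')
Q1, Q2 = Qfull[:, :r], Qfull[:, r:]

y = np.random.randn(n)
z = Q1 @ (Q1.T @ y)   # component of y in R(A)
w = Q2 @ (Q2.T @ y)   # component of y in N(A^T)

print(np.allclose(z + w, y))    # y = z + w
print(np.allclose(A.T @ w, 0))  # w lies in N(A^T)
```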