
EECS 275 Matrix Computation

Ming-Hsuan Yang
Electrical Engineering and Computer Science
University of California at Merced
Merced, CA 95344
http://faculty.ucmerced.edu/mhyang

Lecture 12

Overview

- QR decomposition by Householder transformation
- QR decomposition by Givens rotation

Reading

- Chapter 10 of Numerical Linear Algebra by Lloyd Trefethen and David Bau
- Chapter 5 of Matrix Computations by Gene Golub and Charles Van Loan
- Chapter 5 of Matrix Analysis and Applied Linear Algebra by Carl Meyer

Householder triangularization

- In the Gram-Schmidt process, a succession of upper triangular matrices $R_k$ is applied on the right of $A$ so that
  $$A R_1 R_2 \cdots R_n = \hat{Q}$$
  has orthonormal columns. The product $R = R_n^{-1} \cdots R_2^{-1} R_1^{-1}$ is upper triangular
- In Householder triangularization, a series of elementary orthogonal matrices $Q_k$ is applied on the left of $A$:
  $$Q_n \cdots Q_2 Q_1 A = R$$
  is upper triangular
- The product $Q = Q_1^\top Q_2^\top \cdots Q_n^\top$ is orthogonal, and therefore $A = QR$ is a QR factorization of $A$

Geometry of elementary projectors

- For $u, x \in \mathbb{R}^n$ with $u \neq 0$, the orthogonal projectors onto $\mathrm{span}\{u\}$ and $u^\perp$ are
  $$P_u = \frac{u u^\top}{u^\top u}, \qquad P_{u^\perp} = I - \frac{u u^\top}{u^\top u}$$
- For $u \neq 0$, the Householder transformation (or elementary reflector) about $u^\perp$ is
  $$R = I - 2\,\frac{u u^\top}{u^\top u}$$
  or $R = I - 2 u u^\top$ when $\|u\| = 1$, and $R^\top = R = R^{-1}$
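A quick numerical check of these identities, as a minimal NumPy sketch (the vector $u$ here is an arbitrary choice):

```python
import numpy as np

# Verify the projector and reflector identities for an arbitrary u != 0.
u = np.array([1.0, -3.0, 2.0])

P_u = np.outer(u, u) / (u @ u)                  # projector onto span{u}
P_perp = np.eye(3) - P_u                        # projector onto u-perp
R = np.eye(3) - 2 * np.outer(u, u) / (u @ u)    # elementary reflector

print(np.allclose(P_u @ P_u, P_u))                    # True: idempotent
print(np.allclose(P_u @ P_perp, np.zeros((3, 3))))    # True: complementary
print(np.allclose(R, R.T), np.allclose(R @ R, np.eye(3)))  # True True: R^T = R = R^(-1)
```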

Triangularization by introducing zeros

- The matrix $Q_k$ is chosen to introduce zeros below the diagonal in the $k$-th column while preserving all the zeros previously introduced:
$$
\begin{bmatrix}
\times & \times & \times \\
\times & \times & \times \\
\times & \times & \times \\
\times & \times & \times
\end{bmatrix}
\xrightarrow{Q_1}
\begin{bmatrix}
\boldsymbol{\times} & \boldsymbol{\times} & \boldsymbol{\times} \\
0 & \boldsymbol{\times} & \boldsymbol{\times} \\
0 & \boldsymbol{\times} & \boldsymbol{\times} \\
0 & \boldsymbol{\times} & \boldsymbol{\times}
\end{bmatrix}
\xrightarrow{Q_2}
\begin{bmatrix}
\times & \times & \times \\
 & \boldsymbol{\times} & \boldsymbol{\times} \\
 & 0 & \boldsymbol{\times} \\
 & 0 & \boldsymbol{\times}
\end{bmatrix}
\xrightarrow{Q_3}
\begin{bmatrix}
\times & \times & \times \\
 & \times & \times \\
 & & \boldsymbol{\times} \\
 & & 0
\end{bmatrix}
$$
  (changed entries are denoted by boldface and blank entries are zero)
- $Q_k$ operates on rows $k, \ldots, m$
- At the beginning of step $k$, there is a block of zeros in the first $k-1$ columns of these rows
- The application of $Q_k$ forms linear combinations of these rows, and linear combinations of the zero entries remain zero
- After $n$ steps, all the entries below the diagonal have been eliminated, and $Q_n \cdots Q_2 Q_1 A = R$ is upper triangular

Householder reflectors

- Each $Q_k$ is chosen to be
  $$Q_k = \begin{bmatrix} I & 0 \\ 0 & F \end{bmatrix}$$
  where $I$ is the $(k-1) \times (k-1)$ identity matrix and $F$ is an $(m-k+1) \times (m-k+1)$ orthogonal matrix
- Multiplication by $F$ has to introduce zeros into the $k$-th column
- The Householder algorithm chooses $F$ to be a particular matrix called a Householder reflector
- At step $k$, the entries $k, \ldots, m$ of the $k$-th column are given by a vector $x \in \mathbb{R}^{m-k+1}$
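A small sketch of this block structure (the sizes $m$, $k$ and the vector $x$ are assumptions for illustration): embedding a reflector $F$ in the lower-right block of an identity matrix yields an orthogonal $Q_k$.

```python
import numpy as np

# Q_k = [[I, 0], [0, F]]: identity in the leading (k-1) x (k-1) block,
# a Householder reflector F acting on rows k..m (here m = 5, k = 3).
m, k = 5, 3
x = np.array([3.0, 0.0, 4.0])                  # entries k..m of column k
e1 = np.eye(m - k + 1)[0]
u = x + np.sign(x[0]) * np.linalg.norm(x) * e1
F = np.eye(m - k + 1) - 2 * np.outer(u, u) / (u @ u)

Q_k = np.eye(m)
Q_k[k-1:, k-1:] = F                            # embed F in the trailing block
print(np.allclose(Q_k @ Q_k.T, np.eye(m)))     # True: Q_k is orthogonal
print(np.round(F @ x, 10))                     # [-5, 0, 0]: zeros introduced
```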

Householder transformation (cont'd)

- To introduce zeros into the $k$-th column ($x \in \mathbb{R}^{m-k+1}$), the Householder transformation $F$ should give
$$
Fx = \begin{bmatrix} \|x\| \\ 0 \\ \vdots \\ 0 \end{bmatrix} = \|x\|\, e_1 = \alpha e_1
$$
- The reflector $F$ reflects the space $\mathbb{R}^{m-k+1}$ across the hyperplane $H$ orthogonal to $u = \|x\| e_1 - x$
- The hyperplane $H$ is characterized by the vector $u = \|x\| e_1 - x$

Householder transformation (cont'd)

- Every point $x \in \mathbb{R}^m$ is mapped to its mirror point across $H$, and hence
$$
Fx = \left(I - 2\,\frac{u u^\top}{u^\top u}\right)x = x - 2u\,\frac{u^\top x}{u^\top u}, \qquad F = I - 2\,\frac{u u^\top}{u^\top u}
$$
- Will fix the $\pm$ sign in the next slide
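A concrete one-step sketch (using the first column of the example matrix that appears later in the lecture): building $F$ from $u = \|x\| e_1 - x$ sends $x$ to $\|x\| e_1$.

```python
import numpy as np

# Build F from u = ||x|| e1 - x and check that F x = ||x|| e1.
x = np.array([12.0, 6.0, -4.0])
e1 = np.zeros_like(x); e1[0] = 1.0

u = np.linalg.norm(x) * e1 - x
F = np.eye(len(x)) - 2 * np.outer(u, u) / (u @ u)

print(np.round(F @ x, 10))   # [14, 0, 0] = ||x|| e1
```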

The better of two Householder reflectors

- There are two possible Householder reflectors (transformations)
- For numerical stability, pick the one that reflects $x$ to the vector $\pm\|x\| e_1$ that is not too close to $x$ itself, i.e., $-\mathrm{sign}(x_1)\,\|x\| e_1$
- In other words, the better of the two reflectors uses
  $$u = \mathrm{sign}(x_1)\,\|x\|\, e_1 + x$$
  where $x_1$ is the first element of $x$ (with $\mathrm{sign}(x_1) = 1$ if $x_1 = 0$)
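A contrived illustration of why this matters (the input vector is our assumption): when $x$ is nearly parallel to $e_1$, forming $u = \|x\| e_1 - x$ subtracts nearly equal numbers, so $u$ is dominated by rounding error, while the recommended choice is well scaled.

```python
import numpy as np

# x nearly parallel to e1: the "close" reflector cancels catastrophically.
x = np.array([1.0, 1e-9, 1e-9])
e1 = np.array([1.0, 0.0, 0.0])

u_bad = np.linalg.norm(x) * e1 - x      # u_bad[0] suffers cancellation
u_good = np.sign(x[0]) * np.linalg.norm(x) * e1 + x

print(np.linalg.norm(u_bad))    # ~1.4e-09: tiny, mostly rounding error
print(np.linalg.norm(u_good))   # ~2.0: well-conditioned reflector
```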

Householder QR factorization

Algorithm:

for $k = 1$ to $n$ do
  $x = A_{k:m,\,k}$
  $u_k = \mathrm{sign}(x_1)\,\|x\|_2\, e_1 + x$
  $u_k = u_k / \|u_k\|_2$
  $A_{k:m,\,k:n} = (I - 2 u_k u_k^\top)\, A_{k:m,\,k:n}$
end for

- Recall $Q_k = \begin{bmatrix} I & 0 \\ 0 & F \end{bmatrix}$
- Upon completion, $A$ has been reduced to upper triangular form, i.e., $R$ in $A = QR$
- $Q^\top = Q_n \cdots Q_2 Q_1$, i.e., $Q = Q_1 Q_2 \cdots Q_n$ (each $Q_k$ is symmetric)
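A direct NumPy translation of this algorithm, as a teaching sketch (the function name `householder_qr` and the explicit accumulation of $Q$ are our additions; production codes store the reflectors instead of forming $Q$):

```python
import numpy as np

def householder_qr(A):
    """Householder QR per the algorithm above: returns Q, R with A = Q R."""
    A = np.array(A, dtype=float)
    m, n = A.shape
    Q = np.eye(m)
    for k in range(n):
        x = A[k:, k]
        sign = 1.0 if x[0] >= 0 else -1.0     # sign(0) = 1, as above
        u = x.copy()
        u[0] += sign * np.linalg.norm(x)      # u_k = sign(x1) ||x||_2 e1 + x
        if np.linalg.norm(u) == 0.0:          # column already zero below diagonal
            continue
        u /= np.linalg.norm(u)
        # A_{k:m,k:n} = (I - 2 u_k u_k^T) A_{k:m,k:n}
        A[k:, k:] -= 2.0 * np.outer(u, u @ A[k:, k:])
        # Accumulate Q = Q_1 Q_2 ... Q_n by right-multiplying with Q_k
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ u, u)
    return Q, A                               # A now holds R
```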

QR decomposition with Householder transformation

Want to compute the QR decomposition of $A$ with Householder transformations:
$$
A = \begin{bmatrix} 12 & -51 & 4 \\ 6 & 167 & -68 \\ -4 & 24 & -41 \end{bmatrix}
$$
Need a reflector taking the first column of $A$, $x = [12, 6, -4]^\top$, to $\|x\| e_1 = [14, 0, 0]^\top$:
$$
u = \|x\| e_1 - x = [2, -6, 4]^\top = 2\,[1, -3, 2]^\top
$$
$$
F_1 = I - 2\,\frac{u u^\top}{u^\top u} = \begin{bmatrix} 6/7 & 3/7 & -2/7 \\ 3/7 & -2/7 & 6/7 \\ -2/7 & 6/7 & 3/7 \end{bmatrix},
\qquad
F_1 A = \begin{bmatrix} 14 & 21 & -14 \\ 0 & -49 & -14 \\ 0 & 168 & -77 \end{bmatrix}
$$
Next we need to zero out the $(3,2)$ entry by applying the same process to the submatrix
$$
A' = \begin{bmatrix} -49 & -14 \\ 168 & -77 \end{bmatrix}
$$

QR decomposition with Householder (cont'd)

With the same process,
$$
F_2 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & -7/25 & 24/25 \\ 0 & 24/25 & 7/25 \end{bmatrix}
$$
Thus, with $Q_1 = F_1$ and $Q_2 = F_2$, we have
$$
Q = Q_1 Q_2 = \begin{bmatrix} 6/7 & -69/175 & 58/175 \\ 3/7 & 158/175 & -6/175 \\ -2/7 & 6/35 & 33/35 \end{bmatrix}
$$
$$
R = Q_2 Q_1 A = Q^\top A = \begin{bmatrix} 14 & 21 & -14 \\ 0 & 175 & -70 \\ 0 & 0 & -35 \end{bmatrix}
$$
The matrix $Q$ is orthogonal and $R$ is upper triangular
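The example can be checked numerically, e.g., with `numpy.linalg.qr` (a quick sanity check; LAPACK uses the stable sign convention from the earlier slide, so its factors may differ from the hand computation here by a sign in each row of $R$ and column of $Q$):

```python
import numpy as np

A = np.array([[12.0, -51.0, 4.0],
              [6.0, 167.0, -68.0],
              [-4.0, 24.0, -41.0]])
Q, R = np.linalg.qr(A)
print(np.round(R, 4))                    # magnitudes match [[14,21,14],[0,175,70],[0,0,35]]
print(np.allclose(Q @ R, A))             # True: A = QR
print(np.allclose(Q.T @ Q, np.eye(3)))   # True: Q is orthogonal
```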

Givens rotations

- Givens rotation: an orthogonal transformation that zeros out elements selectively
$$
G(i, k, \theta) =
\begin{bmatrix}
1 & \cdots & 0 & \cdots & 0 & \cdots & 0 \\
\vdots & \ddots & \vdots & & \vdots & & \vdots \\
0 & \cdots & c & \cdots & -s & \cdots & 0 \\
\vdots & & \vdots & \ddots & \vdots & & \vdots \\
0 & \cdots & s & \cdots & c & \cdots & 0 \\
\vdots & & \vdots & & \vdots & \ddots & \vdots \\
0 & \cdots & 0 & \cdots & 0 & \cdots & 1
\end{bmatrix}
$$
  where $c = \cos(\theta)$ and $s = \sin(\theta)$ for some $\theta$, and the $c$, $s$ entries sit in rows and columns $i$ and $k$
- Pre-multiplying by $G(i, k, \theta)$ amounts to a counterclockwise rotation by $\theta$ in the $(i, k)$ coordinate plane: $y = G(i, k, \theta)\,x$ with
$$
y_j = \begin{cases}
c\,x_i - s\,x_k & j = i \\
s\,x_i + c\,x_k & j = k \\
x_j & j \neq i, k
\end{cases}
$$

Givens rotations (cont'd)

- Can zero out $y_k$, i.e., $s\,x_i + c\,x_k = 0$, by setting
$$
c = \frac{x_i}{\sqrt{x_i^2 + x_k^2}}, \qquad s = \frac{-x_k}{\sqrt{x_i^2 + x_k^2}}, \qquad \theta = \arctan(-x_k / x_i)
$$
- A QR decomposition can be computed by a series of Givens rotations
- Each rotation zeros an element in the subdiagonal of the matrix, forming the $R$ matrix; the product of the rotations gives $Q^\top = G_t \cdots G_2 G_1$, the transpose of the orthogonal $Q$ matrix
- Useful for zeroing out a few elements off the diagonal (e.g., in a sparse matrix)
- Example:
$$
A = \begin{bmatrix} 12 & -51 & 4 \\ 6 & 167 & -68 \\ -4 & 24 & -41 \end{bmatrix}
$$
  Want to zero out $A_{31} = -4$ by rotating the vector $(6, -4)$ to point along the $x$-axis, i.e., $\theta = \arctan(4/6)$ (continued on the next slide, after the sketch below)
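Before continuing the example, here is a sketch of these formulas in NumPy (the helper names `givens` and `givens_qr` are ours; for clarity each $G$ is formed explicitly, and the elimination order pairs each subdiagonal entry with its diagonal pivot, which need not match the ordering used in the example):

```python
import numpy as np

def givens(xi, xk):
    """Return (c, s) so that the rotation zeros the x_k component."""
    r = np.hypot(xi, xk)
    return xi / r, -xk / r

def givens_qr(A):
    """QR by Givens rotations: zero each subdiagonal entry in turn."""
    A = np.array(A, dtype=float)
    m, n = A.shape
    QT = np.eye(m)                      # accumulates G_t ... G_1 = Q^T
    for j in range(n):                  # columns left to right
        for i in range(j + 1, m):       # zero A[i, j] against pivot A[j, j]
            if A[i, j] == 0.0:
                continue
            c, s = givens(A[j, j], A[i, j])
            G = np.eye(m)
            G[j, j], G[j, i] = c, -s    # counterclockwise rotation in the
            G[i, j], G[i, i] = s, c     # (j, i) coordinate plane
            A = G @ A
            QT = G @ QT
    return QT.T, A                      # Q = (G_t ... G_1)^T, R = A
```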

QR factorization with Givens rotation

With $\theta$ we have the orthogonal Givens rotation $G_1$:
$$
G_1 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos(\theta) & -\sin(\theta) \\ 0 & \sin(\theta) & \cos(\theta) \end{bmatrix}
= \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0.83205 & -0.55470 \\ 0 & 0.55470 & 0.83205 \end{bmatrix}
$$
Pre-multiplying $A$ by $G_1$:
$$
G_1 A = \begin{bmatrix} 12 & -51 & 4 \\ 7.2111 & 125.6396 & -33.8367 \\ 0 & 112.6041 & -71.8337 \end{bmatrix}
$$
Continue to zero out $A_{21}$ and $A_{32}$ to form a triangular matrix $R$. The orthogonal matrix satisfies $Q^\top = G_3 G_2 G_1$, and $G_3 G_2 G_1 A = Q^\top A = R$ gives the QR decomposition
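The first rotation can be reproduced numerically (a small check; `arctan2` stands in for $\arctan$ to handle signs robustly):

```python
import numpy as np

theta = np.arctan2(4.0, 6.0)            # arctan(4/6) from the example
c, s = np.cos(theta), np.sin(theta)
G1 = np.array([[1.0, 0.0, 0.0],
               [0.0, c, -s],
               [0.0, s, c]])
A = np.array([[12.0, -51.0, 4.0],
              [6.0, 167.0, -68.0],
              [-4.0, 24.0, -41.0]])
print(np.round(G1 @ A, 4))              # the (3,1) entry is now zero
```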

Gram-Schmidt, Householder and Givens

- Householder QR is numerically more stable
- Gram-Schmidt computes an orthonormal basis incrementally
- Givens rotations are more useful for zeroing out a few selected elements

Eigendecomposition

- Also known as spectral decomposition
- For a square matrix $A$,
  $$A = Q D Q^{-1}$$
  where $Q$ is a square matrix whose columns are eigenvectors and $D$ is a diagonal matrix whose elements are the corresponding eigenvalues
- With the eigendecomposition, $AQ = QD$, i.e., $A q_i = \lambda_i q_i$, where $\lambda_i$ and $q_i$ are the eigenvalues and eigenvectors satisfying $Ax = \lambda x$
- The eigenvectors are usually normalized, but they need not be
- If $A$ can be eigendecomposed and all of its eigenvalues are non-zero, then
  $$A^{-1} = Q D^{-1} Q^{-1}$$
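A minimal NumPy sketch of these relations (the symmetric test matrix is an arbitrary choice):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
evals, Q = np.linalg.eig(A)      # columns of Q are normalized eigenvectors
D = np.diag(evals)
Qinv = np.linalg.inv(Q)

print(np.allclose(A, Q @ D @ Qinv))       # True: A = Q D Q^{-1}
print(np.allclose(np.linalg.inv(A),
                  Q @ np.diag(1.0 / evals) @ Qinv))  # True: A^{-1} = Q D^{-1} Q^{-1}
```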