Matrix decompositions

Zdeněk Dvořák
May 19, 2015

Lemma 1 (Schur decomposition). If A is a symmetric real matrix, then there exists an orthogonal matrix Q and a diagonal matrix D such that A = QDQ^T. The diagonal entries of D are the eigenvalues of A.

Lemma 2 (Cholesky decomposition). If A is a positive definite n × n matrix, then there exists a unique lower-triangular matrix L with positive entries on the diagonal such that A = LL^T.

1 LU(P) decomposition

Definition 1. Let A be an n × m matrix. Then A = LU is an LU decomposition of A if L is a lower-triangular n × n matrix with ones on the diagonal, and U is an upper-triangular n × m matrix.

This is similar to the Cholesky decomposition, but it does not require positive semidefiniteness of A. An LU decomposition does not always exist. However:

Lemma 3. For every square matrix A, there exists a permutation matrix P such that PA has an LU decomposition.

Proof. Reorder the rows of A so that the Gaussian elimination algorithm does not need to exchange rows; the reordering is described by the permutation matrix P. Run Gaussian elimination on PA, only allowing addition of a multiple of a row to a row with higher index; the resulting matrix is U. The matrix L is obtained by performing the inverse operations on the identity matrix in the reverse order.
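As a quick numerical check of Lemma 3, here is a minimal sketch assuming SciPy is available (the matrix A is the one from Example 1 below; scipy.linalg.lu uses partial pivoting, so it may choose a different P than a hand computation):

    import numpy as np
    from scipy.linalg import lu

    A = np.array([[1., 1., 1., 1.],
                  [2., 2., 0., 0.],
                  [0., 1., 1., 1.],
                  [2., 2., 1., 1.]])

    # SciPy returns P, L, U with A = P L U, i.e., P^T A = L U
    # in the notation of Lemma 3.
    P, L, U = lu(A)
    assert np.allclose(A, P @ L @ U)
    print(L)  # lower-triangular with ones on the diagonal
    print(U)  # upper-triangular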

Example 1. Find an LUP decomposition of

    A = [ 1  1  1  1 ]
        [ 2  2  0  0 ]
        [ 0  1  1  1 ]
        [ 2  2  1  1 ].

In the Gaussian elimination algorithm, we select the first pivot in the 1st row, the second pivot in the 3rd row, and the third pivot in the 2nd row (and the remaining rows are zero afterwards). Hence, we set

    P = [ 1  0  0  0 ]
        [ 0  0  1  0 ]
        [ 0  1  0  0 ]
        [ 0  0  0  1 ],

so that

    PA = [ 1  1  1  1 ]
         [ 0  1  1  1 ]
         [ 2  2  0  0 ]
         [ 2  2  1  1 ]

and the Gaussian elimination no longer needs to exchange rows. In the Gaussian elimination, we in order add an α-multiple of the i-th row to the j-th row:

    α     i   j
    -2    1   3
    -2    1   4
    -1/2  3   4

ending up with

    U = [ 1  1  1  1 ]
        [ 0  1  1  1 ]
        [ 0  0 -2 -2 ]
        [ 0  0  0  0 ].

This corresponds to

    L = [ 1  0   0   0 ]
        [ 0  1   0   0 ]
        [ 2  0   1   0 ]
        [ 2  0  1/2  1 ].

Hence,

    A = P^T L U = [ 1 0 0 0 ] [ 1 0  0  0 ] [ 1 1  1  1 ]
                  [ 0 0 1 0 ] [ 0 1  0  0 ] [ 0 1  1  1 ]
                  [ 0 1 0 0 ] [ 2 0  1  0 ] [ 0 0 -2 -2 ]
                  [ 0 0 0 1 ] [ 2 0 1/2 1 ] [ 0 0  0  0 ].

To solve a system Ax = b, we can equivalently solve PAx = LUx = Pb by forward and backward substitution (efficient when solving repeatedly with different right-hand sides). The decomposition can also be used for computing the inverse and the determinant (slower than direct methods).
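A minimal sketch of this "factor once, solve repeatedly" pattern, assuming SciPy (the matrix and right-hand sides here are arbitrary invertible examples, since the singular A of Example 1 does not determine x uniquely):

    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    A = np.array([[2., 1., 1.],
                  [4., 3., 3.],
                  [8., 7., 9.]])
    lu, piv = lu_factor(A)  # one O(n^3) factorization, PA = LU

    for b in (np.array([1., 2., 3.]), np.array([0., 1., 0.])):
        x = lu_solve((lu, piv), b)  # each solve costs only O(n^2)
        assert np.allclose(A @ x, b)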

2 QR decomposition

Definition 2. Let A be an n × m matrix. Then A = QR is a QR decomposition of A if Q is an orthogonal n × n matrix and R is an upper-triangular n × m matrix.

Lemma 4. Every matrix has a QR decomposition.

Proof. Run the Gram-Schmidt orthonormalization process on the columns v_1, ..., v_m of A to obtain an orthonormal basis u_1, ..., u_k of span(v_1, ..., v_m) such that for i = 1, ..., k, we have v_i ∈ span(u_1, ..., u_i). Let u_{k+1}, ..., u_n be arbitrary vectors extending u_1, ..., u_k to an orthonormal basis, and let Q = (u_1 ... u_n). Then R = Q^{-1}A = Q^T A is upper-triangular.

Example 2. Compute a QR decomposition of

    A = [ 1  1  1  1 ]
        [ 2  2  0  0 ]
        [ 0  1  1  1 ]
        [ 2  2  1  1 ].

The Gram-Schmidt orthonormalization process for the columns of A gives

    u_1 = (1/3)(1, 2, 0, 2)^T,  u_2 = (0, 0, 1, 0)^T,  u_3 = (1/3)(2, -2, 0, 1)^T.

This can be extended to an orthonormal basis by adding the vector u_4 = (1/3)(2, 1, 0, -2)^T; hence we can set

    Q = [ 1/3  0   2/3   2/3 ]
        [ 2/3  0  -2/3   1/3 ]
        [  0   1    0     0  ]
        [ 2/3  0   1/3  -2/3 ].

Hence,

    R = Q^T A = [ 3  3  1  1 ]
                [ 0  1  1  1 ]
                [ 0  0  1  1 ]
                [ 0  0  0  0 ].

Remark: there exist better (numerically more stable) ways of computing a QR decomposition.

Definition 3. Let A = QR be a QR decomposition of an n × m matrix A of rank k, where the last n - k rows of R are equal to 0. Let R' be the k × m matrix consisting of the first k rows of R, and let Q = (Q' Q''), where Q' has k columns. Then A = Q'R' is a reduced QR decomposition of A.

To solve a system Ax = b, solve Rx = Q^T b by substitution (numerically more stable than Gaussian elimination, but slower). If A is an n × m matrix with m ≤ n, the rank of A is m, and A = Q'R' is a reduced QR decomposition of A, then the solution to R'x = (Q')^T b is the least-squares approximate solution to Ax = b.
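A minimal least-squares sketch along these lines, assuming NumPy and SciPy (the tall matrix A and the vector b are arbitrary examples with A of full column rank):

    import numpy as np
    from scipy.linalg import solve_triangular

    A = np.array([[1., 0.],
                  [1., 1.],
                  [1., 2.]])
    b = np.array([1., 0., 2.])

    Q, R = np.linalg.qr(A)  # reduced QR: Q is 3x2 with orthonormal columns, R is 2x2
    x = solve_triangular(R, Q.T @ b)  # back substitution on R'x = (Q')^T b

    # Agrees with the standard least-squares routine:
    assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])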

QR algorithm to compute eigenvalues (basic idea): let A_0 = A and, for i ≥ 0, let A_i = Q_iR_i be a QR decomposition of A_i and let A_{i+1} = R_iQ_i. Note that A_{i+1} = R_iQ_i = Q_i^{-1}Q_iR_iQ_i = Q_i^{-1}A_iQ_i, and thus A_0, A_1, ... all have the same eigenvalues. The sequence typically converges to an upper-triangular matrix, whose diagonal entries give the eigenvalues.
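A minimal sketch of this iteration, assuming NumPy (the symmetric test matrix and the fixed iteration count are arbitrary choices; practical implementations add shifts and a reduction to Hessenberg form):

    import numpy as np

    A = np.array([[2., 1., 0.],
                  [1., 3., 1.],
                  [0., 1., 4.]])

    Ai = A.copy()
    for _ in range(100):
        Q, R = np.linalg.qr(Ai)  # A_i = Q_i R_i
        Ai = R @ Q               # A_{i+1} = R_i Q_i, similar to A_i

    # The diagonal now approximates the eigenvalues of A.
    print(np.sort(np.diag(Ai)))
    print(np.sort(np.linalg.eigvalsh(A)))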

3 Singular value decomposition (SVD)

Definition 4. Let A be an n × m matrix. Then A = Q_1DQ_2^T is an SVD decomposition of A if Q_1 is an orthogonal n × n matrix, D is a diagonal n × m matrix with non-negative entries on the diagonal, and Q_2 is an orthogonal m × m matrix.

Let r = rank(A). The non-zero entries d_1, ..., d_r of the diagonal matrix D are positive, and we call them the singular values of A.

Lemma 5. If d_1, ..., d_r are the singular values of A, then d_1^2, ..., d_r^2 are the non-zero eigenvalues of A^T A.

Proof. We have A^T A = (Q_2D^T Q_1^T)Q_1DQ_2^T = Q_2(D^T D)Q_2^T. Note that D^T D is diagonal with d_1^2, ..., d_r^2 on the diagonal, and that A^T A and D^T D have the same eigenvalues.

To construct an SVD decomposition, first find a Schur decomposition Q_2D'Q_2^T of A^T A, thus determining Q_2 and the singular values d_1, ..., d_r as the square roots of the diagonal entries of D'; this also determines D and r = rank(A). Let Q_2' consist of the first r columns of Q_2, let D'' be the r × r diagonal matrix with d_1^{-1}, ..., d_r^{-1} on the diagonal, and let Q_1' = AQ_2'D''. Extend the n × r matrix Q_1' to an orthogonal matrix Q_1 arbitrarily.

Example 3. Compute an SVD decomposition of

    A = [ 1  1  1  1 ]
        [ 2  2  0  0 ]
        [ 0  1  1  1 ]
        [ 2  2  1  1 ].

We have

    A^T A = [ 9   9  3  3 ]
            [ 9  10  4  4 ]
            [ 3   4  3  3 ]
            [ 3   4  3  3 ]

with Schur decomposition Q_2D'Q_2^T for

    Q_2 ≈ [ 0.615   0.473  -0.631   0.000 ]
          [ 0.673   0.101   0.732   0.000 ]
          [ 0.290  -0.619  -0.181  -0.707 ]
          [ 0.290  -0.619  -0.181   0.707 ]

and D' ≈ diag(21.67, 3.059, 0.272, 0).

Thus, the singular values are √21.670 ≈ 4.655, √3.059 ≈ 1.749, and √0.272 ≈ 0.522. With D'' ≈ diag(1/4.655, 1/1.749, 1/0.522) ≈ diag(0.215, 0.572, 1.916), we get

    Q_1' = AQ_2'D'' ≈ [ 0.402  -0.380  -0.500 ]
                      [ 0.554   0.657   0.387 ]
                      [ 0.269  -0.650   0.709 ]
                      [ 0.679  -0.051  -0.307 ].

Extending Q_1' to an orthogonal matrix, we get

    Q_1 ≈ [ 0.402  -0.380  -0.500   0.666 ]
          [ 0.554   0.657   0.387   0.334 ]
          [ 0.269  -0.650   0.709   0.050 ]
          [ 0.679  -0.051  -0.307  -0.665 ].

The SVD decomposes the map f(x) = Ax into isometries and a scaling (the singular values describe the deformation). For a regular matrix A, the ratio d_1/d_n of the largest and the smallest singular value estimates the loss of precision in matrix computations (inverse, solution of systems of equations, ...). It is used in signal processing and compression (replacing small singular values by 0 does not change the matrix too much).

The Frobenius norm of a matrix A is ‖A‖ = √(Σ_{i,j} A_{i,j}^2) = √(Trace(A^T A)) = √(d_1^2 + ... + d_r^2), where d_1, ..., d_r are the singular values of A. Hence the smallest singular value of A is the distance in the Frobenius norm from A to a singular matrix (and in fact, there exists no closer singular matrix).
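A minimal sketch tying the section together, assuming NumPy (A is the rank-3 matrix from the examples above):

    import numpy as np

    A = np.array([[1., 1., 1., 1.],
                  [2., 2., 0., 0.],
                  [0., 1., 1., 1.],
                  [2., 2., 1., 1.]])

    Q1, d, Q2T = np.linalg.svd(A)  # A = Q1 @ diag(d) @ Q2T
    print(d)                       # approx. 4.655, 1.749, 0.522, 0

    # Lemma 5: the squared singular values are the eigenvalues of A^T A.
    assert np.allclose(np.sort(d**2), np.sort(np.linalg.eigvalsh(A.T @ A)))

    # Frobenius norm identity: ||A||^2 equals the sum of squared singular values.
    assert np.isclose(np.linalg.norm(A, 'fro'), np.sqrt(np.sum(d**2)))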