The Singular Value Decomposition


The Singular Value Decomposition
An Important Topic in NLA

Radu Tiberiu Trîmbiţaş
Babeş-Bolyai University

February 23, 2009

The SVD - The Main Idea

Motivation: the image of the unit sphere under any $m \times n$ matrix is a hyperellipse.

The SVD - Brief Description

Suppose (for the moment) that A is $m \times n$ with $m \ge n$ and full rank n. Choose orthonormal bases
- $v_1, \ldots, v_n$ for the row space,
- $u_1, \ldots, u_n$ for the column space,
such that $Av_i$ is in the direction of $u_i$: $Av_i = \sigma_i u_i$. (In MATLAB: eigshow.) The singular values satisfy $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_n > 0$.

The SVD - Brief Description

In matrix form, $Av_i = \sigma_i u_i$ becomes
$$AV = \hat{U}\hat{\Sigma}, \quad \text{that is,} \quad A = \hat{U}\hat{\Sigma}V^*,$$
where $\hat{\Sigma} = \operatorname{diag}(\sigma_1, \sigma_2, \ldots, \sigma_n)$. This is the reduced singular value decomposition. Extend $\hat{U}$ with orthonormal columns and add rows of zeros to $\hat{\Sigma}$ to obtain the full singular value decomposition $A = U\Sigma V^*$.

Reduced SVD

A compact representation is the reduced SVD, for $m \ge n$:
$$A = \hat{U}\hat{\Sigma}V^*,$$
where $\hat{U}$ is $m \times n$, $V$ is $n \times n$, and $\hat{\Sigma}$ is $n \times n$.

Full SVD

Let A be an $m \times n$ matrix. The singular value decomposition of A is the factorization $A = U\Sigma V^*$, where
- U is $m \times m$ unitary (its columns are the left singular vectors of A),
- V is $n \times n$ unitary (its columns are the right singular vectors of A),
- $\Sigma$ is $m \times n$ diagonal (it carries the singular values of A).
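As a numerical aside (not on the original slides), the two factorizations are easy to compare directly. A minimal NumPy sketch, with a random tall matrix as an illustrative stand-in:

```python
# Compare the full and reduced SVD of a tall matrix and verify A = U Sigma V*.
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 3
A = rng.standard_normal((m, n))

# Full SVD: U is m x m, Vh is n x n, s holds the min(m, n) singular values.
U, s, Vh = np.linalg.svd(A, full_matrices=True)
Sigma = np.zeros((m, n))
Sigma[:n, :n] = np.diag(s)
assert np.allclose(A, U @ Sigma @ Vh)

# Reduced SVD: U_hat is m x n, Sigma_hat is n x n.
U_hat, s_hat, Vh_hat = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, U_hat @ np.diag(s_hat) @ Vh_hat)
```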


Formal Definition

Let m and n be arbitrary; we do not require $m \ge n$.

Definition. Given $A \in \mathbb{C}^{m \times n}$ (not necessarily of full rank), a singular value decomposition (SVD) of A is a factorization
$$A = U\Sigma V^* \qquad (1)$$
where $U \in \mathbb{C}^{m \times m}$ is unitary, $V \in \mathbb{C}^{n \times n}$ is unitary, and $\Sigma \in \mathbb{R}^{m \times n}$ is diagonal. In addition, it is assumed that the diagonal entries $\sigma_j$ of $\Sigma$ are nonnegative and in nonincreasing order; that is, $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_p \ge 0$, where $p = \min(m, n)$.

The SVD and the Eigenvalue Decomposition

The eigenvalue decomposition $A = X\Lambda X^{-1}$
- uses the same basis X for row and column space, whereas the SVD uses two different bases V and U;
- generally does not use an orthonormal basis, whereas the SVD does;
- is defined only for square matrices, whereas the SVD exists for all matrices.

For symmetric positive definite matrices A, the eigenvalue decomposition and the SVD are equal.
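A quick NumPy check of the last claim; the random SPD test matrix is an illustrative assumption:

```python
# For a symmetric positive definite matrix, eigenvalues and singular values
# coincide (and the eigenvector/singular-vector bases agree up to order/sign).
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)      # symmetric positive definite by construction

w, X = np.linalg.eigh(A)          # eigenvalues ascending, X orthogonal
U, s, Vh = np.linalg.svd(A)       # singular values descending

assert np.allclose(np.sort(s), w)
```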

Existence and Uniqueness

Theorem. Every matrix $A \in \mathbb{C}^{m \times n}$ has a singular value decomposition (1). Furthermore, the singular values $\{\sigma_j\}$ are uniquely determined, and, if A is square and the $\sigma_j$ are distinct, the left and right singular vectors $\{u_j\}$ and $\{v_j\}$ are uniquely determined up to complex signs (i.e., complex scalar factors of modulus 1).

Existence and Uniqueness - Proof of Existence

Proof. To prove the existence of the SVD, we isolate the direction of the largest action of A, and then proceed by induction on the dimension of A. Set $\sigma_1 = \|A\|_2$. By a compactness argument, there must be vectors $v_1 \in \mathbb{C}^n$ and $u_1 \in \mathbb{C}^m$ with $\|v_1\|_2 = \|u_1\|_2 = 1$ and $Av_1 = \sigma_1 u_1$. Consider any extension of $v_1$ to an orthonormal basis $\{v_j\}$ of $\mathbb{C}^n$ and of $u_1$ to an orthonormal basis $\{u_j\}$ of $\mathbb{C}^m$, and let $U_1$ and $V_1$ denote the unitary matrices with columns $u_j$ and $v_j$, respectively. Then we have
$$U_1^* A V_1 = S = \begin{bmatrix} \sigma_1 & w^* \\ 0 & B \end{bmatrix}, \qquad (2)$$
where 0 is a column vector of dimension $m - 1$, $w^*$ is a row vector of dimension $n - 1$, and B has dimensions $(m-1) \times (n-1)$.

Existence and Uniqueness - Proof of Existence (continued)

Furthermore,
$$\left\| \begin{bmatrix} \sigma_1 & w^* \\ 0 & B \end{bmatrix} \begin{bmatrix} \sigma_1 \\ w \end{bmatrix} \right\|_2 \ge \sigma_1^2 + w^* w = (\sigma_1^2 + w^* w)^{1/2} \left\| \begin{bmatrix} \sigma_1 \\ w \end{bmatrix} \right\|_2,$$
implying $\|S\|_2 \ge (\sigma_1^2 + w^* w)^{1/2}$. Since $U_1$ and $V_1$ are unitary, we know that $\|S\|_2 = \|A\|_2 = \sigma_1$, so this implies $w = 0$. If $n = 1$ or $m = 1$, we are done. Otherwise, the submatrix B describes the action of A on the subspace orthogonal to $v_1$. By the induction hypothesis, B has an SVD $B = U_2 \Sigma_2 V_2^*$. Now it is easily verified that
$$A = U_1 \begin{bmatrix} 1 & 0 \\ 0 & U_2 \end{bmatrix} \begin{bmatrix} \sigma_1 & 0 \\ 0 & \Sigma_2 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & V_2 \end{bmatrix}^* V_1^*$$
is an SVD of A, completing the proof of existence.

Existence and Uniqueness - Proof of Uniqueness

First we note that $\sigma_1$ is uniquely determined by the condition that it is equal to $\|A\|_2$, as follows from (1). Now suppose that, in addition to $v_1$, there is another linearly independent vector w with $\|w\|_2 = 1$ and $\|Aw\|_2 = \sigma_1$. Define a unit vector $v_2$, orthogonal to $v_1$, as a linear combination of $v_1$ and w,
$$v_2 = \frac{w - (v_1^* w)v_1}{\|w - (v_1^* w)v_1\|_2}.$$
Since $\|A\|_2 = \sigma_1$, $\|Av_2\|_2 \le \sigma_1$; but this must be an equality, for otherwise, since $w = v_1 c + v_2 s$ for some constants c and s with $|c|^2 + |s|^2 = 1$, we would have $\|Aw\|_2 < \sigma_1$. This vector $v_2$ is a second right singular vector of A corresponding to the singular value $\sigma_1$; it will lead to the appearance of a vector y (equal to the last $n - 1$ components of $V_1^* v_2$) with $\|y\|_2 = 1$ and $\|By\|_2 = \sigma_1$. We conclude that, if the singular vector $v_1$ is not unique, then the corresponding singular value $\sigma_1$ is not simple.

Existence and Uniqueness - Proof of Uniqueness (continued)

To complete the uniqueness proof we note that, as indicated above, once $\sigma_1$, $v_1$, and $u_1$ are determined, the remainder of the SVD is determined by the action of A on the space orthogonal to $v_1$. Since $v_1$ is unique up to sign, this orthogonal space is uniquely defined, and the uniqueness of the remaining singular values and vectors now follows by induction.

Matrix Properties via the SVD

Theorem. The rank of A is r, the number of nonzero singular values.

Proof. The rank of a diagonal matrix equals the number of its nonzero entries, and in the decomposition $A = U\Sigma V^*$, U and V are of full rank. Therefore $\operatorname{rank}(A) = \operatorname{rank}(\Sigma) = r$.
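In floating point one counts singular values above a tolerance. A minimal NumPy sketch; the tolerance mirrors the default criterion of np.linalg.matrix_rank:

```python
# Numerical rank by thresholding singular values.
import numpy as np

rng = np.random.default_rng(2)
# Build a 6 x 4 matrix of rank 2 via an outer-product construction.
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 4))

s = np.linalg.svd(A, compute_uv=False)
tol = max(A.shape) * np.finfo(A.dtype).eps * s[0]
print(np.sum(s > tol))              # 2
print(np.linalg.matrix_rank(A))     # 2, using the same criterion internally
```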

Matrix Properties via the SVD

Theorem. $\operatorname{range}(A) = \langle u_1, \ldots, u_r \rangle$ and $\operatorname{null}(A) = \langle v_{r+1}, \ldots, v_n \rangle$.

Proof. This is a consequence of the fact that $\operatorname{range}(\Sigma) = \langle e_1, \ldots, e_r \rangle \subseteq \mathbb{C}^m$ and $\operatorname{null}(\Sigma) = \langle e_{r+1}, \ldots, e_n \rangle \subseteq \mathbb{C}^n$.
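A minimal NumPy illustration of reading these bases off the full SVD, assuming a randomly generated rank-2 test matrix:

```python
# Orthonormal bases for range(A) and null(A) from the columns of U and V.
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))   # rank 2

U, s, Vh = np.linalg.svd(A)              # full SVD
r = np.sum(s > 1e-10)                    # numerical rank

range_basis = U[:, :r]                   # columns u_1, ..., u_r
null_basis = Vh[r:, :].conj().T          # columns v_{r+1}, ..., v_n

assert np.allclose(A @ null_basis, 0)    # null space check
```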

Matrix Properties via the SVD

Theorem. $\|A\|_2 = \sigma_1$ and $\|A\|_F = \sqrt{\sigma_1^2 + \sigma_2^2 + \cdots + \sigma_r^2}$.

Proof. The first result was already established in the existence proof: since $A = U\Sigma V^*$ with unitary U and V, $\|A\|_2 = \|\Sigma\|_2 = \max_j \sigma_j = \sigma_1$. For the second, note that the Frobenius norm is invariant under unitary multiplication, so $\|A\|_F = \|\Sigma\|_F$, and the conclusion follows.
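Both identities are easy to confirm numerically; a NumPy sketch on a random test matrix:

```python
# ||A||_2 = sigma_1 and ||A||_F = sqrt(sigma_1^2 + ... + sigma_r^2).
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))
s = np.linalg.svd(A, compute_uv=False)

assert np.isclose(np.linalg.norm(A, 2), s[0])
assert np.isclose(np.linalg.norm(A, 'fro'), np.sqrt(np.sum(s**2)))
```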

Matrix Properties via the SVD

Theorem. The nonzero singular values of A are the square roots of the nonzero eigenvalues of $A^*A$ or $AA^*$. (These matrices have the same nonzero eigenvalues.)

Proof. From the calculation
$$A^*A = (U\Sigma V^*)^*(U\Sigma V^*) = V\Sigma^* U^* U\Sigma V^* = V(\Sigma^*\Sigma)V^*,$$
we see that $A^*A$ is similar to $\Sigma^*\Sigma$ and hence has the same n eigenvalues. The eigenvalues of the diagonal matrix $\Sigma^*\Sigma$ are $\sigma_1^2, \sigma_2^2, \ldots, \sigma_p^2$, with $n - p$ additional zero eigenvalues if $n > p$. A similar calculation applies to the m eigenvalues of $AA^*$.
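A NumPy sketch of this correspondence (the test matrix is real, so $A^* = A^T$):

```python
# Singular values of A are the square roots of the eigenvalues of A^T A.
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((5, 3))

s = np.linalg.svd(A, compute_uv=False)   # descending
w = np.linalg.eigvalsh(A.T @ A)          # ascending, real and nonnegative

assert np.allclose(np.sqrt(w[::-1]), s)
```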

Matrix Properties via the SVD

Theorem. If $A = A^*$, then the singular values of A are the absolute values of the eigenvalues of A.

Proof. Since A is Hermitian, it has a complete set of orthogonal eigenvectors and real eigenvalues, so $A = X\Lambda X^{-1}$ holds with $X = Q$ unitary and $\Lambda$ real diagonal. We can then write
$$A = Q\Lambda Q^* = Q|\Lambda| \operatorname{sign}(\Lambda) Q^*. \qquad (3)$$
Since $\operatorname{sign}(\Lambda)Q^*$ is unitary whenever Q is unitary, (3) is an SVD of A, with singular values equal to the diagonal entries of $|\Lambda|$, namely $|\lambda_j|$. If desired, these numbers can be put into nonincreasing order by inserting suitable permutation matrices as factors in the left-hand unitary matrix of (3), Q, and the right-hand unitary matrix, $\operatorname{sign}(\Lambda)Q^*$.
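A NumPy check on a random real symmetric (hence Hermitian) matrix, generally indefinite so the absolute values matter:

```python
# For a Hermitian matrix, singular values = |eigenvalues|.
import numpy as np

rng = np.random.default_rng(7)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                        # symmetric, indefinite in general

s = np.linalg.svd(A, compute_uv=False)
lam = np.linalg.eigvalsh(A)

assert np.allclose(np.sort(np.abs(lam))[::-1], s)
```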

Matrix Properties via the SVD

Theorem. For $A \in \mathbb{C}^{m \times m}$, $|\det(A)| = \prod_{i=1}^m \sigma_i$.

Proof. The determinant of a product of square matrices is the product of the determinants of the factors. Furthermore, the determinant of a unitary matrix is always 1 in absolute value; this follows from the formula $UU^* = I$ and the property $\det(U^*) = \overline{\det(U)}$. Therefore,
$$|\det(A)| = |\det(U\Sigma V^*)| = |\det(U)| \, |\det(\Sigma)| \, |\det(V^*)| = \det(\Sigma) = \prod_{i=1}^m \sigma_i.$$
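A quick NumPy verification on a random square matrix:

```python
# |det(A)| equals the product of the singular values.
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((4, 4))

s = np.linalg.svd(A, compute_uv=False)
assert np.isclose(abs(np.linalg.det(A)), np.prod(s))
```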

Low-Rank Approximations

The SVD can be written as a sum of rank-one matrices:
$$A = \sum_{j=1}^{r} \sigma_j u_j v_j^*.$$
The best rank-$\nu$ approximation of A in the 2-norm is
$$A_\nu = \sum_{j=1}^{\nu} \sigma_j u_j v_j^*, \qquad \text{with } \|A - A_\nu\|_2 = \sigma_{\nu+1}.$$
This is also true in the Frobenius norm, with $\|A - A_\nu\|_F = \sqrt{\sigma_{\nu+1}^2 + \cdots + \sigma_r^2}$.
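A NumPy sketch of the truncation and both error identities, on a random test matrix:

```python
# Truncated SVD as the best rank-nu approximation (Eckart-Young).
import numpy as np

rng = np.random.default_rng(9)
A = rng.standard_normal((6, 5))
nu = 2

U, s, Vh = np.linalg.svd(A, full_matrices=False)
A_nu = U[:, :nu] @ np.diag(s[:nu]) @ Vh[:nu, :]

# 2-norm error is sigma_{nu+1} (index nu, since s is 0-indexed).
assert np.isclose(np.linalg.norm(A - A_nu, 2), s[nu])
# Frobenius-norm error is sqrt(sigma_{nu+1}^2 + ... + sigma_r^2).
assert np.isclose(np.linalg.norm(A - A_nu, 'fro'),
                  np.sqrt(np.sum(s[nu:]**2)))
```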

Applications of the SVD

Calculation of matrix properties:
- rank of a matrix (counting the $\sigma_j$'s above a tolerance);
- bases for range and nullspace (in U and V);
- induced matrix 2-norm ($= \sigma_1$);
- low-rank approximations (optimal in both the 2-norm and the Frobenius norm);
- least squares fitting (more later; another option is QR).

Signal and image processing:
- compression (see next slide);
- noise removal (noise tends to have small $\sigma_j$).

Application: Image Compression

View an $m \times n$ image as a (real) matrix A and find its best rank-$\nu$ approximation by the SVD. Storage: $\nu(m + n)$ numbers instead of $mn$.
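A minimal NumPy sketch of the idea; the synthetic array below stands in for real grayscale pixel data:

```python
# SVD image compression: keep only the top-nu singular triplets.
import numpy as np

rng = np.random.default_rng(5)
m, n, nu = 64, 48, 8
image = rng.random((m, n))               # placeholder for a grayscale image

U, s, Vh = np.linalg.svd(image, full_matrices=False)
compressed = U[:, :nu] @ np.diag(s[:nu]) @ Vh[:nu, :]

original_storage = m * n
compressed_storage = nu * (m + n)        # as on the slide
print(compressed_storage / original_storage)           # compression ratio
print(np.linalg.norm(image - compressed, 2), s[nu])    # error = sigma_{nu+1}
```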
