The Singular Value Decomposition


The Singular Value Decomposition
Philippe B. Laval (KSU) SVD Fall 2015 1 / 13

Review of Key Concepts

We review some key definitions and results about matrices that will be used in this section. The transpose of a matrix A, denoted A^T, is the matrix obtained from A by switching its rows and columns. In other words, if A = (a_ij) then A^T = (a_ji). The conjugate transpose of A, denoted A^*, is obtained from A by switching its rows and columns and taking the complex conjugate of its entries. In other words, if A = (a_ij) then A^* = (conj(a_ji)). A matrix A is said to be symmetric if A = A^T. Symmetric matrices have the following properties:
- Their eigenvalues are always real.
- They are always diagonalizable.
- Eigenvectors corresponding to distinct eigenvalues are orthogonal.
- They are orthogonally diagonalizable; that is, if A is such a matrix then there exists an orthogonal matrix P such that P^(-1) A P is diagonal.
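These properties are easy to check numerically. A brief NumPy sketch (an illustrative addition, not part of the original slides): np.linalg.eigh diagonalizes a symmetric matrix, returning real eigenvalues and an orthogonal eigenvector matrix P, so that P^T A P is diagonal.

```python
import numpy as np

# A small symmetric matrix; its eigenvalues are real (here 1 and 3).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, P = np.linalg.eigh(A)

D = P.T @ A @ P                                # orthogonal diagonalization
print(np.allclose(D, np.diag(eigenvalues)))    # True: P^T A P is diagonal
print(np.allclose(P.T @ P, np.eye(2)))         # True: P is orthogonal
```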

Review of Key Concepts

A matrix A is said to be Hermitian if A^* = A. For matrices with real entries, being Hermitian is the same as being symmetric. An n x n matrix A is said to be normal if A^* A = A A^*. Obviously, Hermitian matrices are also normal. A matrix A is said to be unitary if A A^* = A^* A = I. Unitary matrices have the following properties:
- They preserve the inner product; that is, <Ax, Ay> = <x, y>.
- Their columns and rows are orthonormal.
- They are always diagonalizable.
- |det A| = 1.
- A^(-1) = A^*.
A matrix A is said to be orthogonal if A A^T = A^T A = I. Orthogonal matrices have the following properties:
- They preserve the dot product; that is, <Ax, Ay> = <x, y>.
- Their columns and rows are orthonormal.
- det A = +1 or -1.
- A^(-1) = A^T.
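A rotation matrix is a simple orthogonal matrix, and the listed properties can be verified on it directly (a numerical aside, not from the slides):

```python
import numpy as np

# A 2x2 rotation matrix is orthogonal: Q Q^T = Q^T Q = I.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

print(np.isclose((Q @ x) @ (Q @ y), x @ y))    # True: dot product preserved
print(np.allclose(Q.T @ Q, np.eye(2)))         # True: columns orthonormal
print(np.isclose(abs(np.linalg.det(Q)), 1.0))  # True: det Q = +/- 1
print(np.allclose(np.linalg.inv(Q), Q.T))      # True: inverse is transpose
```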

Review of Key Concepts

A quadratic form on R^n is a function Q defined on R^n by Q(x) = x^T A x for some n x n matrix A. Here are a few important facts about quadratic forms:
- In the case A is symmetric, there exists a change of variable x = Py that transforms x^T A x into y^T D y, where D is a diagonal matrix.
- In the case A is symmetric, the maximum value of x^T A x subject to ||x|| = 1 is the largest eigenvalue lambda_1 of A, and it is attained in the direction of u_1, the corresponding eigenvector.
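The constrained-maximum fact can be illustrated numerically (again, an added sketch rather than part of the slides): the maximum of x^T A x over unit vectors equals the largest eigenvalue, attained at the corresponding eigenvector.

```python
import numpy as np

# Symmetric A with eigenvalues 2 and 4.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
eigenvalues, P = np.linalg.eigh(A)            # ascending order
lam_max, u1 = eigenvalues[-1], P[:, -1]

print(np.isclose(u1 @ A @ u1, lam_max))       # True: maximum attained at u1

# No random unit vector exceeds lam_max.
rng = np.random.default_rng(0)
xs = rng.normal(size=(1000, 2))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)
values = np.einsum('ij,jk,ik->i', xs, A, xs)  # x^T A x for each row x
print(np.all(values <= lam_max + 1e-12))      # True
```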

Introduction to the SVD of a Matrix

Recall that if A is symmetric, then its eigenvalues are real. Moreover, if Ax = lambda x and ||x|| = 1, then ||Ax|| = ||lambda x|| = |lambda| ||x|| = |lambda|. Hence, |lambda| measures the amount by which A stretches (or shrinks) vectors which have the same direction as the eigenvector x. If lambda_1 is the eigenvalue with largest magnitude and v_1 is its corresponding eigenvector, then v_1 gives the direction in which the stretching effect of A is the greatest. This description of v_1 and lambda_1 has an analogue for rectangular matrices that will lead to the SVD. We begin with an example.

Example
Let A = [ 4 11 14
          8  7 -2 ].
Find a unit vector x at which the length of Ax is maximized and compute this maximum length.
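The example can be checked numerically. In this sketch (an addition; the sign of the bottom-right entry of A is taken to be -2, as it appears to have been lost in transcription), the maximizing unit vector is the top eigenvector of A^T A and the maximum length is the square root of its top eigenvalue:

```python
import numpy as np

# Example matrix; bottom-right entry assumed to be -2.
A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])

eigenvalues, V = np.linalg.eigh(A.T @ A)   # ascending order
lam1, v1 = eigenvalues[-1], V[:, -1]       # top eigenpair

print(round(float(lam1)))                  # 360
# Maximum length ||A v1|| = sqrt(360) = 6*sqrt(10)
print(np.isclose(np.linalg.norm(A @ v1), np.sqrt(360.0)))  # True
```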

The Singular Values of a Matrix

Let A be an m x n matrix. Then, as noted in the example, A^T A is an n x n symmetric matrix, hence orthogonally diagonalizable. Let {v_1, v_2, ..., v_n} be an orthonormal basis for R^n consisting of eigenvectors of A^T A and let lambda_1, lambda_2, ..., lambda_n be the corresponding eigenvalues of A^T A. Then, for 1 <= i <= n, we have
||A v_i||^2 = (A v_i) . (A v_i) = v_i^T A^T A v_i = lambda_i v_i^T v_i = lambda_i.
Since ||A v_i||^2 >= 0, we see that all the eigenvalues of A^T A are nonnegative. By renumbering, we may assume that lambda_1 >= lambda_2 >= ... >= lambda_n.

Definition
The singular values of A are the square roots of the eigenvalues lambda_i of A^T A, denoted sigma_i, and they are arranged in decreasing order. In other words, sigma_i = sqrt(lambda_i).

Since ||A v_i||^2 = lambda_i, we see that sigma_i is the length of the vector A v_i, where v_i is the i-th eigenvector of A^T A.
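The definition can be verified by computing the singular values two ways, via the eigenvalues of A^T A and via the library SVD routine (an illustrative addition; the example matrix's last entry is assumed to be -2):

```python
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])    # last entry's sign assumed

# sigma_i = sqrt(lambda_i), with eigenvalues in decreasing order
# (clipped at 0 to guard against tiny negative roundoff).
lams = np.linalg.eigvalsh(A.T @ A)[::-1]           # descending
sigmas_from_eigs = np.sqrt(np.clip(lams, 0.0, None))

# Direct computation: svd returns min(m, n) singular values.
sigmas_from_svd = np.linalg.svd(A, compute_uv=False)

print(np.allclose(sigmas_from_eigs[:2], sigmas_from_svd))  # True
```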

The Singular Values of a Matrix

Example
Find the singular values of A, the matrix of the previous example.

We have the following important theorem.

Theorem
Suppose that {v_1, v_2, ..., v_n} is an orthonormal basis for R^n consisting of eigenvectors of A^T A, arranged so that the corresponding eigenvalues of A^T A satisfy lambda_1 >= lambda_2 >= ... >= lambda_n, and suppose that A has r nonzero singular values. Then {A v_1, A v_2, ..., A v_r} is an orthogonal basis for Col A, and rank A = r.
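The theorem's conclusions can be checked on the running example (a numerical aside, with the last entry of A again assumed to be -2): the nonzero A v_i are mutually orthogonal and their number equals rank A.

```python
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])    # last entry's sign assumed

lams, V = np.linalg.eigh(A.T @ A)
V = V[:, ::-1]                       # reorder eigenvectors so eigenvalues descend
AV = A @ V                           # columns A v_1, A v_2, A v_3

print(np.isclose(AV[:, 0] @ AV[:, 1], 0.0))   # True: A v_1 and A v_2 orthogonal
print(np.linalg.norm(AV[:, 2]) < 1e-5)        # True: A v_3 is (numerically) zero
print(np.linalg.matrix_rank(A))               # 2, the number of nonzero sigmas
```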

The SVD of an m x n Matrix

Let A be an m x n matrix with r nonzero singular values, where r <= min(m, n). Define D to be the r x r diagonal matrix whose diagonal entries are these r nonzero singular values of A, with sigma_1 >= sigma_2 >= ... >= sigma_r. Let

Sigma = [ D 0
          0 0 ]    (1)

be an m x n matrix. The SVD of A will involve Sigma. More specifically, we have the following theorem.

Theorem
Let A be an m x n matrix with rank r. Then there exist an m x n matrix Sigma as in (1), an m x m orthogonal matrix U, and an n x n orthogonal matrix V such that A = U Sigma V^T.
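The block structure of Sigma can be made concrete in code (an added sketch, using the running example with the last entry taken as -2): build the m x n matrix Sigma from the vector of singular values and verify the factorization.

```python
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])    # last entry's sign assumed
m, n = A.shape

U, s, Vt = np.linalg.svd(A)          # s holds the singular values

# Sigma is m x n with D = diag(sigma_1, ..., sigma_r) in the
# upper-left block and zeros elsewhere.
Sigma = np.zeros((m, n))
Sigma[:len(s), :len(s)] = np.diag(s)

print(np.allclose(A, U @ Sigma @ Vt))     # True: A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(m)))    # True: U orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(n)))  # True: V orthogonal
```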

The SVD of an m x n Matrix

Definition
Any decomposition A = U Sigma V^T, with U and V orthogonal, Sigma as in (1), and positive diagonal entries for D, is called a singular value decomposition (SVD) of A. The matrices U and V are not uniquely determined, but the diagonal entries of Sigma are necessarily the singular values of A. The columns of U are called the left singular vectors of A and the columns of V are called the right singular vectors of A.

Example
Find the SVD of the matrix A in the examples above.

The SVD of an m x n Matrix

We outline the proof. Let lambda_i and v_i be as above. Then {A v_1, A v_2, ..., A v_r} is an orthogonal basis for Col A. We normalize each A v_i to obtain an orthonormal basis {u_1, u_2, ..., u_r}, where

u_i = (1/||A v_i||) A v_i = (1/sigma_i) A v_i,   thus   A v_i = sigma_i u_i   (1 <= i <= r).

Next, we extend {u_1, u_2, ..., u_r} to an orthonormal basis {u_1, u_2, ..., u_m} of R^m and let U = [u_1, u_2, ..., u_m] and V = [v_1, v_2, ..., v_n]. By construction, both U and V are orthogonal matrices. Also,

AV = [sigma_1 u_1  sigma_2 u_2  ...  sigma_r u_r  0  ...  0].

The SVD of an m x n Matrix

Let D and Sigma be as above. Multiplying U by Sigma scales the first r columns of U by sigma_1, ..., sigma_r and zeros out the remaining columns:

U Sigma = [u_1, u_2, ..., u_m] Sigma = [sigma_1 u_1  sigma_2 u_2  ...  sigma_r u_r  0  ...  0] = AV.

Therefore, U Sigma V^T = A V V^T = A, since V is orthogonal (V V^T = I).
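The constructive proof can be replayed step by step on the example matrix (an added sketch; last entry assumed -2): diagonalize A^T A, set sigma_i = sqrt(lambda_i), form u_i = A v_i / sigma_i, and verify the factorization. Here m = r = 2, so no extension of the u_i is needed.

```python
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])    # last entry's sign assumed

# Step 1: orthonormal eigenvectors of A^T A, eigenvalues descending.
lams, V = np.linalg.eigh(A.T @ A)
order = np.argsort(lams)[::-1]
lams, V = lams[order], V[:, order]
sigmas = np.sqrt(np.clip(lams, 0.0, None))

# Step 2: u_i = A v_i / sigma_i for the r nonzero singular values.
r = 2                                 # rank of A in this example
U = (A @ V[:, :r]) / sigmas[:r]       # m = r = 2, so U is already m x m

# Step 3: assemble Sigma and check A = U Sigma V^T.
Sigma = np.zeros(A.shape)
Sigma[:r, :r] = np.diag(sigmas[:r])

print(np.allclose(U.T @ U, np.eye(r)))   # True: the u_i are orthonormal
print(np.allclose(U @ Sigma @ V.T, A))   # True: A = U Sigma V^T
```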

The SVD and MATLAB

The MATLAB command is svd. Two useful forms of this command are:
1. X = svd(A) returns a vector X containing the singular values of A.
2. [U,S,V] = svd(A) produces a diagonal matrix S, of the same dimension as A and with nonnegative diagonal elements in decreasing order, and unitary matrices U and V such that A = U*S*V'.
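For readers working in Python, np.linalg.svd plays the same role (an added note, not from the slides); the main difference is that NumPy returns the singular values as a vector and V transposed:

```python
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0, -2.0]])    # last entry's sign assumed

# Like X = svd(A) in MATLAB: just the singular values, in decreasing order.
sv = np.linalg.svd(A, compute_uv=False)

# Like [U,S,V] = svd(A): note NumPy returns s as a vector and V^T, not V.
U, s, Vt = np.linalg.svd(A)

print(np.allclose(sv, s))             # True: same singular values
print(np.all(s[:-1] >= s[1:]))        # True: decreasing order
```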

Exercises

See the problems at the end of the notes on the basics of SVD.