Singular Value Decomposition

Singular Value Decomposition

Motivation

The diagonalization theorem plays a part in many interesting applications. Unfortunately, not all matrices can be factored as A = PDP^-1. However, a factorization A = QDP^T is possible for any m × n matrix A. A special factorization of this type, called the singular value decomposition, is one of the most useful matrix factorizations in applied linear algebra. As we will see, one application is minimizing ‖Ax‖ in order to solve Ax = 0.

Singular Value Decomposition

Let A be an m × n matrix of rank r. We cannot find eigenvalues and eigenvectors of a non-square matrix. However, A can be diagonalized in the sense that

Av_1 = σ_1 u_1, Av_2 = σ_2 u_2, ..., Av_r = σ_r u_r

The singular vectors v_1, ..., v_r are orthogonal and form a basis for the row space of A. The output vectors u_1, ..., u_r are orthogonal and lie in the column space of A. The singular values σ_1 ≥ σ_2 ≥ ... ≥ σ_r are all positive numbers, listed in non-increasing order.

Writing Av_i = σ_i u_i for all i at once leads to AV = UΣ:

A [v_1 ... v_r] = [u_1 ... u_r] diag(σ_1, ..., σ_r)

Note that in the above equation the dimensions are (m × n)(n × r) = (m × r)(r × r). We can always find an orthonormal basis for Nul(A) and one for Nul(A^T) and append those vectors to the columns of V and U, respectively, to make them square orthogonal matrices (recall that the row space is orthogonal to the null space). We fill the rest of the diagonal of Σ with zeros. The dimensions in the equation AV = UΣ then become (m × n)(n × n) = (m × m)(m × n).

After adding the null-space vectors, the equation AV = UΣ becomes

A [v_1 ... v_n] = [u_1 ... u_m] Σ, where Σ is the m × n matrix diag(σ_1, ..., σ_r) padded with zeros.

How do we find U and V? From AV = UΣ we get A = UΣV^-1 = UΣV^T, since V is orthogonal. Then

AA^T = UΣV^T (UΣV^T)^T = UΣV^T VΣ^T U^T = UΣ^2 U^T, since V^T V = I and ΣΣ^T = Σ^2 (m × m)
A^T A = (UΣV^T)^T UΣV^T = VΣ^T U^T UΣV^T = VΣ^2 V^T, since U^T U = I and Σ^T Σ = Σ^2 (n × n)

A = UΣV^T, AA^T = UΣ^2 U^T, A^T A = VΣ^2 V^T

Recall that AA^T and A^T A are symmetric matrices of sizes m × m and n × n, respectively. Recall also that symmetric matrices are diagonalizable and their eigenvector matrices can be chosen orthogonal. The nonzero eigenvalues of AA^T are the same as the nonzero eigenvalues of A^T A, since the nonzero eigenvalues of AB are the same as those of BA. In other words, the diagonal entries of Σ^2 are the eigenvalues of A^T A (and of AA^T), the columns of V are the eigenvectors of A^T A, and the columns of U are the eigenvectors of AA^T.
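Since the derivation above says exactly how U, Σ, and V come out of the eigendecomposition of A^T A, a short numerical sketch can make it concrete. This is a minimal illustration in NumPy (the random test matrix and tolerance are my own choices, not from the slides); in practice np.linalg.svd is the right tool, being more accurate and robust to repeated or zero singular values.

    import numpy as np

    # Illustration only: build U, Sigma, V from the eigendecomposition of A^T A.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 3))
    m, n = A.shape

    evals, V = np.linalg.eigh(A.T @ A)          # eigh returns ascending eigenvalues
    order = np.argsort(evals)[::-1]             # reorder so sigma_1 >= sigma_2 >= ...
    evals, V = evals[order], V[:, order]
    sigmas = np.sqrt(np.clip(evals, 0, None))   # sigma_i = sqrt(lambda_i)

    # Setting u_i = A v_i / sigma_i fixes the sign of each u_i consistently with
    # v_i (eigenvectors on their own are only determined up to sign).
    r = int(np.sum(sigmas > 1e-12))
    U = (A @ V[:, :r]) / sigmas[:r]

    # Rank-r (reduced) form; appending null-space bases as described above
    # would make U and V square orthogonal matrices.
    print(np.allclose(U @ np.diag(sigmas[:r]) @ V[:, :r].T, A))   # True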

Example: Find the singular value decomposition of the matrix A = [3 1 1; −1 3 1].

First find AA^T = UDU^T:

AA^T = [3 1 1; −1 3 1] [3 −1; 1 3; 1 1] = [11 1; 1 11]

det(AA^T − λI) = (11 − λ)^2 − 1 = λ^2 − 22λ + 120 = 0, which gives λ_1 = 12, λ_2 = 10.

Example (cont'd):

For λ_1 = 12: (AA^T − 12I)u = [−1 1; 1 −1]u = 0 gives u_1 = [1/√2; 1/√2].

For λ_2 = 10: (AA^T − 10I)u = [1 1; 1 1]u = 0 gives u_2 = [1/√2; −1/√2].

U = [1/√2 1/√2; 1/√2 −1/√2], D = [12 0; 0 10]

Example (cont'd): Find A^T A = VDV^T:

A^T A = [3 −1; 1 3; 1 1] [3 1 1; −1 3 1] = [10 0 2; 0 10 4; 2 4 2]

det(A^T A − λI) = −λ(λ − 12)(λ − 10) = 0, which gives λ_1 = 12, λ_2 = 10, λ_3 = 0.

Example (cont'd):

For λ_1 = 12: (A^T A − 12I)v = [−2 0 2; 0 −2 4; 2 4 −10]v = 0 gives v_1 = [1/√6; 2/√6; 1/√6].

For λ_2 = 10: (A^T A − 10I)v = [0 0 2; 0 0 4; 2 4 −8]v = 0 gives v_2 = [2/√5; −1/√5; 0].

For λ_3 = 0: (A^T A)v = [10 0 2; 0 10 4; 2 4 2]v = 0 gives v_3 = [1/√30; 2/√30; −5/√30].

Example (cont'd):

V = [1/√6 2/√5 1/√30; 2/√6 −1/√5 2/√30; 1/√6 0 −5/√30], D = [12 0 0; 0 10 0; 0 0 0]

Assembling A = UΣV^T:

[3 1 1; −1 3 1] = [1/√2 1/√2; 1/√2 −1/√2] [√12 0 0; 0 √10 0] V^T
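As a quick numerical check of the worked example (not part of the original slides): NumPy's SVD reproduces these factors. Note the library may flip the signs of paired singular vectors u_i, v_i, which leaves UΣV^T unchanged.

    import numpy as np

    # The singular values should be sqrt(12) and sqrt(10).
    A = np.array([[3.0, 1.0, 1.0],
                  [-1.0, 3.0, 1.0]])
    U, s, Vt = np.linalg.svd(A)

    print(s**2)                              # [12. 10.]
    Sigma = np.zeros(A.shape)
    Sigma[:2, :2] = np.diag(s)
    print(np.allclose(U @ Sigma @ Vt, A))    # True: A = U Sigma V^T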

Geometric interpretation

Here is a geometric interpretation of the SVD of a matrix M = UΣV^T: V^T first rotates the unit vectors, Σ then scales them along the coordinate axes, and U performs the final rotation.

[Figure: the unit circle mapped step by step through V^T (rotation), Σ (scaling), and U (rotation).]
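To see the rotate-scale-rotate picture numerically, here is a small sketch of my own (an arbitrary 2 × 2 matrix, not from the slides): the orthogonal factors leave lengths unchanged, so all the stretching happens in Σ.

    import numpy as np

    M = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    U, s, Vt = np.linalg.svd(M)

    x = np.array([0.6, -0.8])                   # a unit vector
    print(np.linalg.norm(Vt @ x))               # 1.0: V^T only rotates/reflects
    print(np.linalg.norm(U @ (s * (Vt @ x))))   # equals ||Mx||: U preserves length too
    print(np.linalg.norm(M @ x))                # same value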

Note that we can also write the SVD of A as a sum of rank-one matrices:

A = Σ_{i=1}^{r} σ_i u_i v_i^T = σ_1 u_1 v_1^T + ... + σ_r u_r v_r^T

where r is the rank of the matrix. Each term σ_i u_i v_i^T has size m × n, and the larger σ_i is, the more that term contributes to reconstructing A, as the sketch below shows.
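The rank-one expansion is easy to verify directly; a small sketch using the example matrix from above:

    import numpy as np

    # Rebuild A term by term from its rank-one pieces sigma_i * u_i v_i^T.
    A = np.array([[3.0, 1.0, 1.0],
                  [-1.0, 3.0, 1.0]])
    U, s, Vt = np.linalg.svd(A)

    A_rebuilt = np.zeros_like(A)
    for i in range(len(s)):
        A_rebuilt += s[i] * np.outer(U[:, i], Vt[i, :])   # each term is m x n

    print(np.allclose(A_rebuilt, A))    # True after all r terms are added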

Solve Ax = 0

What unit vector x minimizes ‖Ax‖, and why? As defined previously, we can write A = UΣV^T, where σ_r is the smallest singular value. Then

Ax = Σ_{i=1}^{r} σ_i u_i v_i^T x

If we choose x = v_r, then since v_i^T v_r = 0 for all i ≠ r, we have

Ax = Av_r = σ_r u_r v_r^T v_r = σ_r u_r, so ‖Ax‖ = σ_r.

The smallest value of ‖Ax‖ is therefore attained at x = v_r, and if σ_r = 0 then v_r is a solution of Ax = 0.
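A sketch of this recipe (the rank-deficient test matrix is my own construction, not from the slides): the minimizer of ‖Ax‖ over unit vectors is the right singular vector for the smallest singular value, i.e. the last row of V^T returned by np.linalg.svd.

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((6, 4))
    A[:, 3] = A[:, 0] + A[:, 1]      # force rank deficiency so min ||Ax|| = 0

    U, s, Vt = np.linalg.svd(A)
    x = Vt[-1, :]                    # unit vector achieving the minimum

    print(s[-1])                     # smallest singular value (~0 here)
    print(np.linalg.norm(A @ x))     # equals s[-1] up to rounding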

Pseudo-inverse

Assume Ax = b. Then by the singular value decomposition of A we have

Ax = b
UΣV^T x = b
ΣV^T x = U^T b
V^T x = Σ^+ U^T b
x = VΣ^+ U^T b

where Σ^+ inverts the nonzero entries on the diagonal of Σ. The matrix VΣ^+U^T is called the pseudo-inverse of A. It plays the role of an inverse for non-square matrices and gives the least-squares solution of Ax = b.
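A sketch of the pseudo-inverse recipe on a small overdetermined system (the numbers are chosen only for illustration, and this version assumes all retained singular values are nonzero); it matches NumPy's np.linalg.pinv and the least-squares solution.

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([1.0, 2.0, 2.0])

    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    x = Vt.T @ ((U.T @ b) / s)       # x = V Sigma^+ U^T b (all sigma_i > 0 here)

    print(x)
    print(np.allclose(x, np.linalg.pinv(A) @ b))                  # same answer
    print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))   # least squares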

Applications

One application of the SVD is dimensionality reduction. An m × n matrix can be thought of as the gray levels of a digital image. If the image A is decomposed into its singular values and vectors, we can keep only the K most significant u_i, σ_i, and v_i, compressing the information in the image. Suppose the image A is m × n; its rank-K approximation is

A ≈ Σ_{i=1}^{K} σ_i u_i v_i^T

On the next slide you will see the original image and its compression using the K most significant singular values.
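A minimal compression sketch (the random array stands in for a real grayscale image; the commented PIL line and the file name "photo.png" are only hypothetical):

    import numpy as np

    # Keep only the K largest singular triplets of a grayscale image.
    # img = np.asarray(Image.open("photo.png").convert("L"), float)  # e.g. via PIL
    rng = np.random.default_rng(2)
    img = rng.random((64, 64))              # stand-in for a real image

    def compress(img, K):
        """Rank-K approximation via truncated SVD."""
        U, s, Vt = np.linalg.svd(img, full_matrices=False)
        return U[:, :K] @ np.diag(s[:K]) @ Vt[:K, :]

    approx = compress(img, K=8)
    # Storage drops from m*n numbers to K*(m + n + 1).
    print(np.linalg.norm(img - approx))     # reconstruction error shrinks as K grows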

Image compression using SVD

[Figure: the original image alongside its rank-K reconstructions for several values of K.]