UNIT 6: The singular value decomposition.

María Barbero Liñán, Universidad Carlos III de Madrid. Bachelor in Statistics and Business, Mathematical Methods II, 2011-2012.

A square matrix A is symmetric if A^T = A. A square matrix A is orthogonally diagonalizable if there exist an orthogonal matrix P and a diagonal matrix D such that A = PDP^T = PDP^{-1}, where P, D and A all have the same size. Remember that an orthogonal matrix satisfies P^T = P^{-1}.
Theorems.
1. An n × n matrix A is orthogonally diagonalizable if and only if A is a symmetric matrix.
2. Let A be a symmetric matrix. Eigenvectors corresponding to different eigenvalues are orthogonal.

Steps to orthogonally diagonalize a matrix A.
1. Find the eigenvalues of A, which are the zeroes of the characteristic polynomial P_A(λ) = det(A − λI_n).
2. Find the eigenvectors of each eigenvalue.
   1. If an eigenvalue has more than one linearly independent eigenvector, check whether they are orthogonal.
   2. If they are not orthogonal, obtain an orthogonal basis by the Gram-Schmidt method.
3. Normalize those eigenvectors that are not unit vectors.
4. The columns of P are these eigenvectors, which form an orthonormal set.
5. The diagonal matrix D has the eigenvalues on the main diagonal, in the same order as the corresponding eigenvectors appear in P.
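The steps above can be sketched numerically. This is a minimal illustration (the matrix is an assumption chosen for the example); NumPy's `eigh` already returns orthonormal eigenvectors for a symmetric matrix, so it performs steps 2-4 for us:

```python
import numpy as np

# Orthogonally diagonalize a symmetric matrix: A = P D P^T.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])  # symmetric example matrix (an assumption)

eigvals, P = np.linalg.eigh(A)  # eigenvalues ascending, columns of P orthonormal
D = np.diag(eigvals)

# P is orthogonal (P^T P = I) and A = P D P^T, as the theorem guarantees.
assert np.allclose(P.T @ P, np.eye(2))
assert np.allclose(P @ D @ P.T, A)
print(eigvals)  # [2. 4.]
```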

A VECTOR NORM is a function || · || : V → R that assigns a real-valued length to each vector in V and satisfies the following conditions:
1. ||x|| ≥ 0, and ||x|| = 0 if and only if x = 0.
2. ||x + y|| ≤ ||x|| + ||y||.
3. ||λx|| = |λ| ||x||.
Examples of norms in R^n.
1. Euclidean norm or 2-norm (the usual one): ||x||_2 = √(x_1^2 + ··· + x_n^2).
2. 1-norm: ||x||_1 = |x_1| + ··· + |x_n|.
3. Infinity norm: ||x||_∞ = max{ |x_1|, ..., |x_n| }.
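The three example norms can be computed directly with `np.linalg.norm` (the sample vector is an assumption):

```python
import numpy as np

x = np.array([3.0, -4.0])

two_norm = np.linalg.norm(x, 2)       # sqrt(3^2 + (-4)^2) = 5
one_norm = np.linalg.norm(x, 1)       # |3| + |-4| = 7
inf_norm = np.linalg.norm(x, np.inf)  # max(|3|, |-4|) = 4

print(two_norm, one_norm, inf_norm)  # 5.0 7.0 4.0
```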

The matrix norm of A ∈ M_{m×n} is defined as follows:
||A||_{(m,n)} = max_{x ∈ R^n, x ≠ 0} ||Ax||_{(m)} / ||x||_{(n)} = max_{||x||_{(n)} = 1} ||Ax||_{(m)}.
Examples of matrix norms.
1. ||A||_1 = max_{||x||_1 = 1} ||Ax||_1 = max_{1 ≤ k ≤ n} Σ_{j=1}^{m} |a_{jk}|. It is the maximum of the 1-norms of the columns of the matrix.
2. ||A||_∞ = max_{||x||_∞ = 1} ||Ax||_∞ = max_{1 ≤ k ≤ m} Σ_{j=1}^{n} |a_{kj}|. It is the maximum of the 1-norms of the rows of the matrix.
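A quick numerical check of the column-sum and row-sum characterizations, on an example matrix of our choosing:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

one_norm = np.linalg.norm(A, 1)       # induced 1-norm
inf_norm = np.linalg.norm(A, np.inf)  # induced infinity-norm

# ||A||_1 is the largest column sum of |a_jk|; ||A||_inf the largest row sum.
assert one_norm == np.abs(A).sum(axis=0).max()  # column sums 4 and 6 -> 6
assert inf_norm == np.abs(A).sum(axis=1).max()  # row sums 3 and 7 -> 7
print(one_norm, inf_norm)  # 6.0 7.0
```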

The singular values σ of A ∈ M_{m×n} are the square roots of the eigenvalues λ of the square matrix A^T A, that is, σ = √λ, where the λ are the zeroes of the characteristic polynomial of A^T A, P_{A^T A}(λ) = det(A^T A − λI_n).
REMARK: The matrix A^T A is square and symmetric. It has size n × n.
||A||_2 = max{ σ_1, ..., σ_r }, where r is the number of singular values of A.
The rank of A is equal to the number of nonzero singular values.
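These three facts can be verified numerically (the 3 × 2 matrix is an assumption for illustration):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# Singular values = square roots of the eigenvalues of A^T A.
eigvals = np.linalg.eigvalsh(A.T @ A)   # eigenvalues of A^T A, ascending
sigmas = np.sqrt(eigvals)[::-1]         # singular values, decreasing
svd_sigmas = np.linalg.svd(A, compute_uv=False)

assert np.allclose(sigmas, svd_sigmas)
assert np.isclose(np.linalg.norm(A, 2), sigmas.max())     # ||A||_2 = largest sigma
assert np.linalg.matrix_rank(A) == np.sum(sigmas > 1e-12)  # rank = #nonzero sigmas
print(sigmas)  # [2. 1.]
```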

The image of the unit sphere under any matrix A of size m × n is a hyperellipse. A hyperellipse is a surface obtained by stretching the unit sphere in R^m by some factors σ_1, ..., σ_r in some orthogonal directions. The principal semiaxes of the hyperellipse are the vectors Av_i, where the v_i are the eigenvectors of A^T A, and the semiaxes have lengths σ_i.

Singular value decomposition theorem (SVD). Let A be an m × n matrix with rank r. Then there exists a unique m × n matrix
Σ = ( D 0 ; 0 0 ),
whose r × r diagonal block D contains the r singular values of A, σ_1 ≥ σ_2 ≥ ··· ≥ σ_r > 0; and there exist an m × m orthogonal matrix U and an n × n orthogonal matrix V such that A = UΣV^T.
REMARK: The matrices U and V are not uniquely determined. The columns of U are called left singular vectors of A. The columns of V are called right singular vectors of A.
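The theorem can be checked with NumPy's full SVD (the 3 × 2 matrix is an assumption):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
m, n = A.shape

U, s, Vt = np.linalg.svd(A)           # full SVD: U is m x m, Vt is n x n
Sigma = np.zeros((m, n))
Sigma[:len(s), :len(s)] = np.diag(s)  # embed the singular values in m x n Sigma

assert np.allclose(U @ Sigma @ Vt, A)        # A = U Sigma V^T
assert np.allclose(U.T @ U, np.eye(m))       # U orthogonal
assert np.allclose(Vt @ Vt.T, np.eye(n))     # V orthogonal
print(np.round(s, 4))  # singular values in decreasing order
```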

Steps to find the SVD of an m × n matrix A with m ≥ n:
1. Find the eigenvalues λ of A^T A and a set of ORTHONORMAL eigenvectors. (Use the Gram-Schmidt method to orthogonalize and normalize whenever necessary.)
2. Find Σ by writing the singular values σ = √λ in decreasing order on the main diagonal. The remaining entries are zero, so that Σ is an m × n matrix, the same size as A.
3. Find V, whose columns are the orthonormal eigenvectors written in the same order as the corresponding singular values in Σ. The size of V is n × n.
4. Find U, whose size is m × m and whose columns are obtained as follows:
   1. u_i = (1/σ_i) Av_i for each nonzero singular value σ_i.
   2. Complete {u_1, ..., u_n} to an orthonormal basis of R^m by computing the orthogonal complement of span{u_1, ..., u_n} and normalizing a spanning set of it.
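The steps above can be carried out numerically. This is a sketch under simplifying assumptions: the example matrix is full rank, and the basis completion in step 4.2 is done with a QR factorization rather than by hand:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
m, n = A.shape

# Step 1: orthonormal eigenvectors of A^T A (eigh returns them directly).
lam, V = np.linalg.eigh(A.T @ A)
order = np.argsort(lam)[::-1]        # reorder eigenvalues decreasingly
lam, V = lam[order], V[:, order]

# Step 2: Sigma is m x n with sigma_i = sqrt(lambda_i) on the diagonal.
sigma = np.sqrt(lam)
Sigma = np.zeros((m, n))
Sigma[:n, :n] = np.diag(sigma)

# Step 4.1: u_i = (1/sigma_i) A v_i for the nonzero singular values.
U_cols = [A @ V[:, i] / sigma[i] for i in range(n) if sigma[i] > 1e-12]

# Step 4.2: complete to an orthonormal basis of R^m (QR of [u_1 ... | I]).
Q, _ = np.linalg.qr(np.column_stack(U_cols + [np.eye(m)]))
U = Q[:, :m].copy()
U[:, :len(U_cols)] = np.column_stack(U_cols)  # keep the exact u_i in front

assert np.allclose(U @ Sigma @ V.T, A)
assert np.allclose(U.T @ U, np.eye(m))
```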

Example of SVD of A.
A = ( 2 0 ; 0 1 ) = ( 1 0 ; 0 1 ) ( 2 0 ; 0 1 ) ( 1 0 ; 0 1 ) = UΣV^T (rotate, stretch, rotate).
The image of the unit sphere in R^2 under A is an ellipse in R^2 with center at (0, 0) and semiaxes along the coordinate axes of lengths 2 and 1.
[Figure: the unit circle is mapped by V^T (rotate), then Σ (stretch), then U (rotate) onto this ellipse.]
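For this diagonal example the SVD is immediate, and NumPy confirms that the singular values are the semiaxis lengths of the ellipse:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 1.0]])

U, s, Vt = np.linalg.svd(A)
assert np.allclose(s, [2.0, 1.0])          # semiaxes of the image ellipse
assert np.allclose(U @ np.diag(s) @ Vt, A)  # A = U Sigma V^T
```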

Reduced SVD. Let A be an m × n matrix with m ≥ n and SVD given by UΣV^T. Since the last m − n rows of Σ are zero, Σ = ( D_{n×n} ; 0_{(m−n)×n} ), and the last m − n columns of U are multiplied only by that zero block:
A = ( u_1 ... u_n  u_{n+1} ... u_m ) ( D_{n×n} ; 0_{(m−n)×n} ) V^T = ( u_1 ... u_n ) D_{n×n} V^T.
The factorization A = ( u_1 ... u_n ) D_{n×n} V^T is the reduced singular value decomposition of A.
Remark: Note that to obtain the reduced singular value decomposition of A we do not need to complete the left singular vectors to an m × m matrix U; we only use the leftmost m × n submatrix of U.
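NumPy computes the reduced SVD directly with `full_matrices=False` (the example matrix is an assumption):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])  # m = 3, n = 2

U_r, s, Vt = np.linalg.svd(A, full_matrices=False)

assert U_r.shape == (3, 2)                   # only the leftmost n columns of U
assert np.allclose(U_r @ np.diag(s) @ Vt, A)  # A = (u_1 ... u_n) D V^T
```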

Moore-Penrose pseudoinverse A^+ of A.
If A is an invertible matrix, then A^+ = A^{-1}.
If A is an m × n matrix with m ≥ n and SVD given by UΣV^T, then the Moore-Penrose pseudoinverse is the n × m matrix
A^+ = (V^T)^{-1} Σ^+ U^{-1} = VΣ^+ U^T,
where
Σ^+ = ( D^{-1} 0 ; 0 0 )   with   D^{-1} = diag( 1/σ_1, 1/σ_2, ..., 1/σ_r ).
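A sketch of the formula A^+ = VΣ^+U^T, checked against NumPy's built-in `pinv` (the full-rank example matrix is an assumption):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [0.0, 0.0]])
m, n = A.shape

U, s, Vt = np.linalg.svd(A)
Sigma_plus = np.zeros((n, m))
Sigma_plus[:len(s), :len(s)] = np.diag(1.0 / s)  # invert nonzero sigmas (full rank here)

A_plus = Vt.T @ Sigma_plus @ U.T  # A^+ = V Sigma^+ U^T
assert np.allclose(A_plus, np.linalg.pinv(A))
```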

Properties of the Moore-Penrose pseudoinverse:
1. A A^+ A = A.
2. A^+ A A^+ = A^+.
3. (A A^+)^T = A A^+.
4. (A^+ A)^T = A^+ A.
Application. When m > n, the Moore-Penrose pseudoinverse gives a least-squares solution of Ax = b.
If A has full rank, A^+ = (A^T A)^{-1} A^T. (Note: A^+ b is a least-squares solution of Ax = b, and it is exactly the solution of the normal equations in UNIT 5.)
If A does not have full rank, we need the SVD to obtain A^+. A least-squares solution of Ax = b is then given by A^+ b.
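In the full-rank case the four properties and the normal-equations connection can be checked directly (the data are assumptions for illustration):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])       # full column rank, m > n
b = np.array([1.0, 2.0, 2.0])

A_plus = np.linalg.inv(A.T @ A) @ A.T  # A^+ = (A^T A)^{-1} A^T
x = A_plus @ b

# The four Moore-Penrose conditions:
assert np.allclose(A @ A_plus @ A, A)
assert np.allclose(A_plus @ A @ A_plus, A_plus)
assert np.allclose((A @ A_plus).T, A @ A_plus)
assert np.allclose((A_plus @ A).T, A_plus @ A)
# x solves the normal equations A^T A x = A^T b:
assert np.allclose(A.T @ A @ x, A.T @ b)
```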

More details on the previous application when A does not have full rank.
1. Find the SVD of A: A = UΣV^T.
2. A least-squares solution of Ax = b is given by A^+ b, because
   (UΣV^T)x = Ax = b,
   ΣV^T x = U^T b          (multiplying by U^T, since U^T U = I),
   Σ^+ ΣV^T x = Σ^+ U^T b,
   V^T x = Σ^+ U^T b        (since Σ^+Σ = I_{n×n} when rank(A) = n),
   x = VΣ^+ U^T b = A^+ b   (multiplying by V, since VV^T = I).
   When rank(A) < n, Σ^+Σ is no longer the identity, but x = A^+ b is still a least-squares solution of Ax = b, namely the one of minimum norm.
Remark: Remember that the SVD is not unique, because the matrices U and V are not unique. The pseudoinverse A^+ is nevertheless the same for every choice of SVD, so the least-squares solution A^+ b does not depend on which SVD we find.
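A sketch of the rank-deficient case (the rank-1 matrix and right-hand side are assumptions): `pinv` still returns a least-squares solution even though (A^T A)^{-1} does not exist:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0],
              [1.0, 1.0]])       # rank 1, not full column rank
b = np.array([1.0, 2.0, 3.0])

x = np.linalg.pinv(A) @ b        # x = A^+ b, the minimum-norm solution
residual = A @ x - b

# Least-squares condition: the residual is orthogonal to the columns of A.
assert np.allclose(A.T @ residual, 0.0)
print(x)  # [1. 1.]
```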