Computational Methods: Eigenvalues and Singular Values


Manfred Huber, 2010

Eigenvalues and Singular Values
Eigenvalues and singular values describe important aspects of linear transformations and of data relations. Eigenvalues determine the degree to which a linear transformation changes the length of transformed vectors, and eigenvectors indicate the directions in which these principal changes happen. Eigenvalues are important for many problems in computer science and engineering, including dimensionality reduction and compression.

Eigenvalues
An eigenvector x and eigenvalue λ of a matrix A satisfy

Ax = λx

Eigenvectors characterize directions that are purely stretched by a given linear transformation, and eigenvalues characterize the degree to which the transformation stretches input vectors along those directions. The spectrum of A is the set of its eigenvalues; the spectral radius of A is the magnitude of the largest of its eigenvalues. Eigenvalues are also important for sensitivity analysis of linear problems.
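
As a quick numerical illustration (a minimal numpy sketch; the matrix is an arbitrary example, not one from the slides), the defining relation Ax = λx can be checked directly:

```python
import numpy as np

# Arbitrary symmetric example matrix (chosen for illustration only)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# numpy returns all eigenvalues and the matching eigenvectors as columns
eigvals, eigvecs = np.linalg.eig(A)

for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ x, lam * x)   # the defining relation A x = lambda x

print(eigvals)            # e.g. [3. 1.] -- the spectrum of A
print(max(abs(eigvals)))  # 3.0 -- the spectral radius of A
```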

Eigenvalues
An n-dimensional linear transformation has n eigenvalues (counted with multiplicity). Eigenvalues might be repeated, and they might be complex. When the eigenvectors form a full basis (as for symmetric matrices, for example), any data point (vector) can be written as a linear combination of eigenvectors, which allows an efficient decomposition of vectors.

Power Iteration
The eigenvalue equation Ax = λx is related to a fixed point equation (except for the scaling). The simplest solution method for finding eigenvectors (and eigenvalues) is power iteration:

x_{t+1} = A x_t

Power iteration converges to a scaled version of the eigenvector with the dominant eigenvalue.

Power Iteration
Power iteration converges, except if x_0 has no component in the direction of the dominant eigenvector, or if more than one eigenvalue shares the dominant magnitude. Normalized power iteration renormalizes the result after each iteration:

y_{k+1} = A x_k,  x_{k+1} = y_{k+1} / ‖y_{k+1}‖

It converges to the dominant eigenvector and the dominant eigenvalue: ‖y_k‖ → |λ_d| and x_k → v_d / ‖v_d‖.
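
A minimal sketch of normalized power iteration (assuming a fixed iteration count rather than a proper convergence test; the example matrix is arbitrary):

```python
import numpy as np

def power_iteration(A, num_iters=100):
    """Normalized power iteration: converges to the dominant eigenpair,
    provided x_0 has a component along the dominant eigenvector and no
    other eigenvalue has the same magnitude."""
    x = np.random.rand(A.shape[0])
    for _ in range(num_iters):
        y = A @ x                      # y_{k+1} = A x_k
        x = y / np.linalg.norm(y)      # renormalize after each iteration
    return x @ A @ x, x                # Rayleigh quotient estimate of lambda_d

A = np.array([[2.0, 1.0], [1.0, 2.0]])
lam, v = power_iteration(A)
print(lam)   # approx 3.0, the dominant eigenvalue
```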

Inverse Iteration
Inverse iteration is used to find the smallest eigenvalue. It corresponds to power iteration with the inverse matrix A⁻¹ and converges under the same conditions:

A y_{k+1} = x_k,  x_{k+1} = y_{k+1} / ‖y_{k+1}‖

Together, inverse iteration and power iteration can only find the smallest and the largest eigenvalues; we need a way to determine the other eigenvalues and eigenvectors.
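
A corresponding sketch of inverse iteration (again with a fixed iteration count; for simplicity each step calls a dense solver, whereas a practical implementation would factor A once and reuse the factorization):

```python
import numpy as np

def inverse_iteration(A, num_iters=50):
    """Power iteration applied to A^{-1}: converges to the eigenvector
    with the smallest-magnitude eigenvalue."""
    x = np.random.rand(A.shape[0])
    for _ in range(num_iters):
        y = np.linalg.solve(A, x)      # solve A y_{k+1} = x_k
        x = y / np.linalg.norm(y)
    return x @ A @ x, x                # Rayleigh quotient estimate, eigenvector

A = np.array([[2.0, 1.0], [1.0, 2.0]])
print(inverse_iteration(A)[0])         # approx 1.0, the smallest eigenvalue
```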

Characteristic Polynomial
The determination of eigenvalues and eigenvectors can be transformed into a root finding problem. The equation

(A − λI)x = 0

has a nonzero solution for the eigenvector x if and only if (A − λI) is singular, so the eigenvalues of A are the roots of the characteristic polynomial

det(A − λI) = 0

The characteristic polynomial of an n×n matrix is a polynomial of degree n; for real matrices, complex eigenvalues occur in conjugate pairs. Computing the characteristic polynomial is expensive, although the determinant evaluation can be accelerated by first performing an LU factorization.

Characteristic Polynomial
The roots of a polynomial of degree larger than 4 cannot in general be computed directly and require an iterative solution. Computing eigenvalues via the characteristic polynomial is therefore numerically unstable and expensive: computing the coefficients requires evaluating a determinant, root finding requires an iterative solution process, and the coefficients of the characteristic polynomial are very sensitive to perturbations. The characteristic polynomial is a powerful theoretical tool but not a practical computational approach.
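
For illustration only (the warning above stands: this is not a practical method), numpy can form the characteristic polynomial of a small matrix and find its roots:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

coeffs = np.poly(A)      # coefficients of det(lambda*I - A)
print(coeffs)            # [ 1. -4.  3.]  i.e.  lambda^2 - 4*lambda + 3
print(np.roots(coeffs))  # [3. 1.] -- the eigenvalues, via iterative root finding
```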

Eigenvalue Problems
The characteristics of an eigenvalue problem influence the choice of algorithm: whether all or only some eigenvalues are needed, whether only eigenvalues or also eigenvectors are needed, whether the matrix is dense or sparse, whether its values are real or complex, and other properties of the matrix A.

Problem Transformations
A number of transformations either preserve or have a predictable effect on the eigenvalues. If Ax = λx, then:

Shift: for any scalar σ, (A − σI)x = (λ − σ)x
Inversion: A⁻¹x = (1/λ)x
Powers: Aᵏx = λᵏx
Polynomial: for any polynomial p(t), p(A)x = p(λ)x
Similarity: for any similar matrix B = T⁻¹AT, Bx = λx implies A(Tx) = λ(Tx)
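
These properties are easy to verify numerically (a small sketch; the matrix and the shift σ = 0.5 are arbitrary choices):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # eigenvalues 1 and 3
sigma = 0.5

print(np.sort(np.linalg.eigvals(A)))                      # [1. 3.]
print(np.sort(np.linalg.eigvals(A - sigma * np.eye(2))))  # shift:     [0.5 2.5]
print(np.sort(np.linalg.eigvals(np.linalg.inv(A))))       # inversion: [0.333... 1.]
print(np.sort(np.linalg.eigvals(A @ A)))                  # power k=2: [1. 9.]
```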

Problem Transformations
Eigenvalues and eigenvectors of diagonal matrices are easy to determine: the eigenvalues are the values on the diagonal, and the eigenvectors are the columns of the identity matrix. However, not all matrices are diagonalizable using similarity transformations. The eigenvalues of triangular matrices can also be determined easily: the eigenvalues are the diagonal entries of the matrix, and the eigenvectors can be computed from (A − λI)x = 0.

Convergence of Iterations
The speed of convergence of power iteration and inverse iteration depends on the ratio of two eigenvalues. For power iteration, convergence is faster the larger the ratio of the largest to the second largest eigenvalue; for inverse iteration, convergence is faster the smaller the ratio of the smallest to the second smallest eigenvalue. A shift transformation changes this ratio:

λ₁ / λ₂  →  (λ₁ − σ) / (λ₂ − σ)

Knowing the eigenvalue of the sought-after eigenvector would allow lowering this ratio to 0, which makes it possible to increase the convergence rate of inverse iteration.

Rayleigh Quotient Iteration
Rayleigh quotient iteration uses the Rayleigh quotient as the shift parameter:

σ_k = x_kᵀ A x_k / (x_kᵀ x_k)

This makes the relevant ratio of eigenvalues close to 0 and thus accelerates the convergence of inverse iteration:

(A − σ_k I) y_{k+1} = x_k,  x_{k+1} = y_{k+1} / ‖y_{k+1}‖

Rayleigh quotient iteration usually converges very fast, but each iteration requires a new matrix factorization and is therefore O(n³).
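
A minimal sketch of Rayleigh quotient iteration (assuming a symmetric A; since the starting vector is random, which eigenpair it converges to depends on the start):

```python
import numpy as np

def rayleigh_quotient_iteration(A, max_iters=20, tol=1e-12):
    """Inverse iteration with the Rayleigh quotient as the shift.
    Converges very fast, but each step solves a freshly shifted
    system, i.e. O(n^3) work per iteration for dense A."""
    n = A.shape[0]
    x = np.random.rand(n)
    x /= np.linalg.norm(x)
    sigma = x @ A @ x                          # Rayleigh quotient shift
    for _ in range(max_iters):
        try:
            y = np.linalg.solve(A - sigma * np.eye(n), x)
        except np.linalg.LinAlgError:
            break                              # shift hit an eigenvalue exactly
        x = y / np.linalg.norm(y)
        sigma_new = x @ A @ x
        if abs(sigma_new - sigma) < tol:
            return sigma_new, x
        sigma = sigma_new
    return sigma, x

A = np.array([[2.0, 1.0], [1.0, 2.0]])
print(rayleigh_quotient_iteration(A)[0])       # one eigenvalue of A (1 or 3)
```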

Computing All Eigenvalues
Power iteration and inverse iteration can compute only the largest and the smallest eigenvalues and eigenvectors. To compute the other eigenvalues we need to either remove an already found eigenvector (and eigenvalue) from the matrix so that power or inverse iteration can be reapplied, or find a way to determine all the eigenvectors simultaneously. Removing eigenvectors from the space spanned by a transformation A is called deflation.

Deflation
To remove an eigenvalue (and the corresponding eigenvector) we have to find a transformation that preserves all other eigenvalues. A Householder transform can be used to derive such a transformation H with Hx₁ = αe₁. The similarity transform described by H yields the matrix

HAH⁻¹ = [ λ₁  bᵀ ]
        [ 0   B  ]

Since a similarity transform was used, this matrix has the same eigenvalues as A, and B has all the eigenvalues of A with the exception of λ₁. Power iteration can then be applied to the new matrix B.
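
A sketch of this deflation step, assuming the dominant eigenpair (λ₁, x₁) has already been found (e.g. by power iteration); the helper name deflate is mine:

```python
import numpy as np

def deflate(A, x1):
    """Householder deflation: build H with H x1 = alpha*e1.  Since H is
    symmetric and orthogonal (H^{-1} = H), H A H is a similarity
    transform of the form [[lambda_1, b^T], [0, B]]; the trailing
    block B carries the remaining eigenvalues of A."""
    x1 = x1 / np.linalg.norm(x1)
    u = x1.copy()
    u[0] += np.copysign(1.0, x1[0])          # u = x1 + sign(x1[0]) * e1
    H = np.eye(len(x1)) - 2.0 * np.outer(u, u) / (u @ u)
    HAH = H @ A @ H
    return HAH[1:, 1:]                        # the deflated matrix B

A = np.array([[2.0, 1.0], [1.0, 2.0]])        # eigenvalues 3 and 1
x1 = np.array([1.0, 1.0]) / np.sqrt(2)        # eigenvector for lambda_1 = 3
print(deflate(A, x1))                         # approx [[1.]], the remaining eigenvalue
```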

Deflation
Power iteration with deflation can compute all eigenvalues, but it requires determining the eigenvector in each iteration. An eigenvector y₂ of B (with eigenvalue λ₂) can be used to compute the corresponding eigenvector of A:

x₂ = H⁻¹ [ α ]     with α = bᵀy₂ / (λ₂ − λ₁)
         [ y₂ ]

Alternatively, the eigenvalue could be used directly in A to determine the eigenvector, which is computationally more complex.

Simultaneous Iteration
Simultaneous iteration iterates multiple vectors at once:

X_{k+1} = A X_k

X converges to the space spanned by the p dominant eigenvectors (subspace iteration). But X becomes ill-conditioned, since all columns of X ultimately converge to the dominant eigenvector. We need a normalization that keeps the vectors well conditioned and distinct: orthogonal iteration using QR factorization.
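
A sketch of orthogonal (subspace) iteration, re-orthonormalizing with a reduced QR factorization in each step (the example matrix and subspace size p = 2 are arbitrary):

```python
import numpy as np

def orthogonal_iteration(A, p, num_iters=200):
    """Simultaneous iteration on p vectors; the QR step keeps the block
    orthonormal so the columns stay well conditioned instead of all
    collapsing onto the dominant eigenvector."""
    X = np.random.rand(A.shape[0], p)
    for _ in range(num_iters):
        Q, _ = np.linalg.qr(A @ X)     # orthonormal basis of A X_k
        X = Q
    return X                           # basis of the dominant p-dim subspace

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
Q = orthogonal_iteration(A, 2)
print(Q.T @ A @ Q)   # approx diagonal 2x2 holding the two largest eigenvalues
```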

QR Iteration
As for least squares (and equation solving), QR factorization splits the matrix into components that stay well conditioned:

Q_{k+1} R_{k+1} = X_k,  X_{k+1} = A Q_{k+1}

By using Q (an orthogonal basis, which yields a similarity transform) for the iteration, the eigenvalues are preserved and the iteration converges to block triangular form, or to triangular form if all eigenvalues are real and distinct.

QR Iteration
To find eigenvalues, QR iteration can be applied directly to A:

A_k = Q_kᴴ A_{k−1} Q_k

It converges to a triangular or block triangular matrix containing all eigenvalues, as diagonal elements or as eigenvalues of the diagonal blocks. The iterates can be computed without explicitly forming this product:

Q_{k+1} R_{k+1} = A_k,  A_{k+1} = R_{k+1} Q_{k+1} (= Q_{k+1}ᴴ A_k Q_{k+1})

Convergence can be accelerated using a shift transformation.
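
A sketch of the unshifted QR iteration (practical implementations first reduce A to Hessenberg form and add shifts; this plain version is just the bare recurrence):

```python
import numpy as np

def qr_iteration(A, num_iters=500):
    """Unshifted QR iteration: Q_{k+1} R_{k+1} = A_k, then
    A_{k+1} = R_{k+1} Q_{k+1} = Q_{k+1}^T A_k Q_{k+1}.  Each step is a
    similarity transform, so the eigenvalues are preserved while A_k
    drifts toward (block) triangular form."""
    Ak = A.copy()
    for _ in range(num_iters):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q
    return Ak

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
print(np.diag(qr_iteration(A)))   # approximates the eigenvalues of A
```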

Singular Values
Singular values are related to eigenvalues and characterize important aspects of the space described by a transformation, such as its nullspace and span. The singular value decomposition divides a transformation A into a sequence of three transformations, where the second is a pure rescaling. The scaling parameters are the singular values, and the columns of the other two transformations are the left and right singular vectors, respectively.

Singular Values
Singular values exist for all transformations A, whether A is square or not. The right singular vectors span the orthogonal input basis that is being scaled; the left singular vectors are the vectors that the scaled internal basis vectors are transformed into in the output. Singular values are directly related to eigenvalues: the singular values are the nonnegative square roots of the eigenvalues of AAᵀ or AᵀA, the left singular vectors are eigenvectors of AAᵀ, and the right singular vectors are eigenvectors of AᵀA.
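
This relation is easy to check numerically (a sketch with an arbitrary non-square example):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])            # a 3x2 (non-square) matrix

svals = np.linalg.svd(A, compute_uv=False)
eig_AtA = np.linalg.eigvalsh(A.T @ A)  # eigenvalues of A^T A, ascending

# Singular values are the nonnegative square roots of eig(A^T A)
print(np.sort(svals))
print(np.sqrt(eig_AtA))
```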

Singular Value Decomposition
The singular value decomposition (SVD) factorizes A as

A = UΣVᵀ

where U is an m×m orthogonal matrix of left singular vectors, V is an n×n orthogonal matrix of right singular vectors, and Σ is an m×n diagonal matrix of singular values. Usually Σ is arranged such that the singular values are ordered by magnitude. Left and right singular vectors are related through the singular values:

Av_i = σ_i u_i,  Aᵀu_i = σ_i v_i
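
A short numpy check of the factorization and of the relations between the singular vectors (same arbitrary 3×2 example as above):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])

U, s, Vt = np.linalg.svd(A)             # U: 3x3, s: singular values, Vt: 2x2
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)    # embed the values in an mxn Sigma

assert np.allclose(A, U @ Sigma @ Vt)   # A = U Sigma V^T

for i, sigma in enumerate(s):
    assert np.allclose(A @ Vt[i], sigma * U[:, i])     # A v_i = sigma_i u_i
    assert np.allclose(A.T @ U[:, i], sigma * Vt[i])   # A^T u_i = sigma_i v_i
print("all SVD relations hold")
```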

Singular Value Decomposition
The SVD can be computed in different ways. One approach uses an eigenvalue computation on AAᵀ: compute the eigenvalues of AAᵀ, determine the left singular vectors as eigenvectors of AAᵀ, and determine the right singular vectors as eigenvectors of AᵀA. This leads to some conditioning issues due to the need for the matrix multiplication. Alternatively, the SVD can be computed directly from A by performing Householder transformations and Givens rotations until a diagonal matrix is reached: perform a QR factorization to obtain a triangular matrix, use Householder transforms to reach bidiagonal shape, and use Givens rotations to reach diagonal form. This is usually better conditioned.

Singular Value Decomposition
The SVD can be used for a range of applications:

Compute the least squares solution of Ax ≅ b: x = sum over σ_i ≠ 0 of (u_iᵀ b / σ_i) v_i
Compute the pseudoinverse: A⁺ = VΣ⁺Uᵀ
Euclidean matrix norm: ‖A‖₂ = σ_max
Condition number of a matrix: cond(A) = σ_max / σ_min
The matrix rank is equal to the number of nonzero singular values.
The nullspace of the matrix is spanned by the right singular vectors corresponding to singular values of 0.
The span of the matrix is spanned by the left singular vectors corresponding to nonzero singular values.
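
A sketch deriving these quantities from one SVD (assuming a full-rank A so all σ_i are nonzero; the matrix and right-hand side are arbitrary):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 1.0])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Least squares solution: x = sum_i (u_i^T b / sigma_i) v_i
x = Vt.T @ ((U.T @ b) / s)
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])

A_pinv = Vt.T @ np.diag(1.0 / s) @ U.T        # pseudoinverse A+ = V Sigma+ U^T
assert np.allclose(A_pinv, np.linalg.pinv(A))

print(np.max(s))               # ||A||_2 = sigma_max
print(np.max(s) / np.min(s))   # cond(A) = sigma_max / sigma_min
print(np.sum(s > 1e-12))       # rank = number of nonzero singular values
```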

Singular Value Decomposition
The SVD is useful in a number of applications. In data compression, the right singular vectors transform the data into a basis in which it is only scaled; data dimensions with zero or very small scaling factors are not important for the overall data. This supports a wide range of applications: image compression, dimensionality reduction for data, and dimensionality reduction for matrix operations. In filtering and noise reduction, most of the time the data has only a few important dimensions, and noise is most apparent in the additional dimensions (those with smaller singular values); ignoring dimensions with small singular values can lead to less noisy data.

Compression Example
Image compression is an area where the SVD was used relatively early on. Given an image, can we reduce the amount of data that has to be transmitted without losing too much information? The SVD can be used to find a lower rank approximation of the image that incurs only limited loss.

Compression Example
In the SVD, the magnitude of the singular values often decreases rapidly after the first few. To compress the image, keep only the k largest singular values (and the corresponding singular vectors) to reconstruct the image:

A ≈ U_k Σ_k V_kᵀ
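
A minimal sketch of this rank-k reconstruction (a random array stands in for real pixel data here, so the reconstruction error is larger than it would be for a natural image):

```python
import numpy as np

def rank_k_approximation(A, k):
    """Keep only the k largest singular values/vectors: the best rank-k
    approximation of A in the 2-norm and Frobenius norm."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(0)
img = rng.random((64, 64))                 # stand-in for an image (assumption)

img_k = rank_k_approximation(img, 10)
# Storage drops from m*n values to k*(m + n + 1)
print(np.linalg.norm(img - img_k) / np.linalg.norm(img))   # relative error
```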

Compression Example
Different compression levels incur different amounts of loss.

Eigenvalues and Singular Values
Eigenvalues and eigenvectors capture important properties of a linear transformation A. Eigenvalues and singular values indicate the importance of particular dimensions of the space and can be used for compression. Singular values can also capture noise characteristics, so they can be used for filtering data and for removing noise from data before transformations are applied. Singular values are also important for analyzing problems such as conditioning and sensitivity.