Invariant Subspace Perturbations or: How I Learned to Stop Worrying and Love Eigenvectors


Invariant Subspace Perturbations or: How I Learned to Stop Worrying and Love Eigenvectors

Alexander Cloninger
Norbert Wiener Center, Department of Mathematics
University of Maryland, College Park
http://www.norbertwiener.umd.edu

Outline

1. Introduction to Eigenvector Perturbation
2. …
3. …


Error in A Matrix

Eigenvector decomposition is ubiquitous in mathematics:
- Principal Component Analysis
- Quantum States
- Fourier Analysis
- Spectral Graph Theory

The most common problem in mathematics: find (λ, v) such that Av = λv. One imagines computers made this problem trivial: [U, S] = eig(A).

Question: what happens to the eigenpairs if the matrix has tiny errors, Ã = A + E?
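As a quick sketch of the setup, here is the eigenproblem and a tiny perturbation in numpy (`eigh` is the symmetric analogue of `eig`; the matrix values are illustrative):

```python
import numpy as np

# A small symmetric matrix and its eigenpairs.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigvals, eigvecs = np.linalg.eigh(A)  # eigh: for symmetric/Hermitian matrices

# Check the defining relation A v = lambda v for each eigenpair.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

# The perturbed problem: what happens to (lambda, v) when A picks up a tiny error E?
E = 1e-8 * np.array([[0.0, 1.0],
                     [1.0, 0.0]])
eigvals_t, eigvecs_t = np.linalg.eigh(A + E)
print(np.abs(eigvals_t - eigvals))  # eigenvalue shifts are on the order of ||E||
```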

Error in A Matrix

Answer for eigenvalues: you're fine; they behave rather continuously. By Weyl's inequality, |λ_k(A + E) − λ_k(A)| ≤ ‖E‖ for every k.

Answer for eigenvectors: problem!
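Weyl's inequality is easy to check numerically; a minimal sketch with a random symmetric matrix and perturbation (sizes and scales are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric A and a small symmetric perturbation E.
n = 50
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
F = rng.standard_normal((n, n))
E = 1e-3 * (F + F.T) / 2

lam = np.linalg.eigvalsh(A)        # eigenvalues sorted ascending
lam_t = np.linalg.eigvalsh(A + E)

# Weyl: |lambda_k(A + E) - lambda_k(A)| <= ||E||_2 for every k.
spectral_norm_E = np.linalg.norm(E, 2)
assert np.all(np.abs(lam_t - lam) <= spectral_norm_E + 1e-12)
print(np.abs(lam_t - lam).max(), spectral_norm_E)
```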

Eigenvector Perturbations

Eigenvectors under perturbation require careful treatment, dependent on the separation of the spectrum.

Example. Let A = [1−ε 0; 0 1+ε], so σ(A) = {1−ε, 1+ε} and V = [1 0; 0 1].
Let Ã = [1 ε; ε 1], so σ(Ã) = {1−ε, 1+ε} and Ṽ = (1/√2)[1 1; −1 1].

The spectra are identical, yet V and Ṽ are as far apart as possible: each eigenvector of Ã sits at 45° to the corresponding eigenvector of A.
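This example can be verified numerically; a sketch with the matrices above (ε fixed to a small illustrative value):

```python
import numpy as np

eps = 1e-3
A = np.diag([1 - eps, 1 + eps])
At = np.array([[1.0, eps],
               [eps, 1.0]])

lam, V = np.linalg.eigh(A)        # V is the identity
lam_t, Vt = np.linalg.eigh(At)

# Identical spectra {1 - eps, 1 + eps} ...
assert np.allclose(lam, lam_t)

# ... but the eigenvector bases are a full 45-degree rotation apart,
# even though ||E|| = ||At - A||_2 = eps * sqrt(2) is tiny.
angle = np.degrees(np.arccos(abs(V[:, 0] @ Vt[:, 0])))
assert np.isclose(angle, 45.0)
print(angle)
```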

Eigenvector Perturbations Geometrically

The problem is that the image of A is (nearly) rotationally symmetric: when the eigenvalues coincide, no direction is preferred, so an arbitrarily small perturbation can swing the eigenvectors arbitrarily far.

Stable Eigenvector Geometrically

Lack of symmetry allows for robust perturbations: when the image is a genuine ellipse, the axes (the eigenvector directions) are well determined.

Eigenvector Perturbations

Separation of the spectrum creates stable perturbations.

Example. Let B = [1 0; 0 2], so σ(B) = {1, 2} and V = [1 0; 0 1].
Let B̃ = [1 ε; ε 2], so σ(B̃) = { (3 − √(1 + 4ε²))/2, (3 + √(1 + 4ε²))/2 } and

Ṽ ≈ [0.995 0.099; −0.099 0.995] for ε = 0.1.

The rotation between V and Ṽ is only about 5.7°.
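A numerical check of this example, with the eigenvector angle computed from the inner product:

```python
import numpy as np

eps = 0.1
B = np.diag([1.0, 2.0])
Bt = np.array([[1.0, eps],
               [eps, 2.0]])

lam_t, Vt = np.linalg.eigh(Bt)

# Eigenvalues of Bt are (3 -/+ sqrt(1 + 4 eps^2)) / 2.
expected = [(3 - np.sqrt(1 + 4 * eps**2)) / 2,
            (3 + np.sqrt(1 + 4 * eps**2)) / 2]
assert np.allclose(lam_t, expected)

# Angle between the first eigenvector of B (namely e1) and that of Bt:
e1 = np.array([1.0, 0.0])
angle = np.degrees(np.arccos(abs(e1 @ Vt[:, 0])))
assert 5.0 < angle < 6.0   # roughly 5.7 degrees for eps = 0.1
print(angle)
```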


Eigenvalue Separation

Consider symmetric matrices A, E ∈ R^{n×n}. The eigenvectors of A + E depend on the separation of the spectrum σ(A).

Definition (Separation of Spectrum). The separation of the spectrum σ(A) at λ is

sep(λ, σ(A) \ {λ}) = min{ |λ − γ| : γ ∈ σ(A) \ {λ} }.
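The definition transcribes directly into code; `sep` is a hypothetical helper name and the spectrum below is an illustrative example:

```python
import numpy as np

def sep(lam, spectrum, tol=1e-12):
    """Separation of lam from the rest of the spectrum:
    the minimum of |lam - gamma| over gamma in the spectrum, gamma != lam."""
    others = [g for g in spectrum if abs(g - lam) > tol]
    return min(abs(lam - g) for g in others)

spectrum = np.array([1.0, 1.5, 3.0, 3.1])
assert sep(1.0, spectrum) == 0.5          # nearest other eigenvalue is 1.5
assert np.isclose(sep(3.0, spectrum), 0.1)  # nearest other eigenvalue is 3.1
```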

Invariant Subspace Perturbations I

Theorem (Davis, 1963). Let A, E ∈ C^{n×n} be Hermitian. Let (λ, x) be an eigenpair of A such that sep(λ, σ(A) \ {λ}) = δ. Let
- P be a spectral projector of A such that Px = x,
- P̃ be the corresponding spectral projector of A + E, and
- P̃⊥ be the orthogonal complement, P̃⊥ z = z − P̃ z.

Then if ‖E‖ ≤ ε ≤ δ/2,

‖P̃⊥ P‖ ≤ ε / (δ − ε).
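A numerical sanity check of this bound; the spectrum, perturbation size, and rank-one projector construction are illustrative assumptions, not part of the theorem:

```python
import numpy as np

rng = np.random.default_rng(1)

# Symmetric A whose eigenvalue lambda = 1 has separation delta = 1.
lam = np.array([0.0, 1.0, 5.0, 6.0])
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = Q @ np.diag(lam) @ Q.T
delta = 1.0                                  # sep(1, {0, 5, 6})

# Small symmetric perturbation scaled so that ||E|| = 0.2 <= delta / 2.
F = rng.standard_normal((4, 4))
E = (F + F.T) / 2
E *= 0.2 / np.linalg.norm(E, 2)
eps = np.linalg.norm(E, 2)

w, V = np.linalg.eigh(A)
i = np.argmin(np.abs(w - 1.0))
P = np.outer(V[:, i], V[:, i])               # spectral projector for lambda = 1

wt, Vt = np.linalg.eigh(A + E)
j = np.argmin(np.abs(wt - 1.0))
Pt = np.outer(Vt[:, j], Vt[:, j])            # corresponding projector of A + E
Pt_perp = np.eye(4) - Pt

lhs = np.linalg.norm(Pt_perp @ P, 2)
rhs = eps / (delta - eps)
assert lhs <= rhs                            # the Davis bound
print(lhs, rhs)
```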

Invariant Subspace Perturbations II

Side Note on Advisers: [images not captured in transcription]

Similar Perturbation Theorems

- Davis, Kahan (1970): clustered subspaces are preserved.
- Stewart (1973): takes the direction of the error into account.


Concentration As Opposed to Angle Similarity

Previous theory defined similarity by the angle Θ[V, Ṽ]. This is not the only way to consider similarity: one can also consider eigenvector localization. This is important when:
- σ(A) has high density in an interval,
- A is the adjacency matrix of a network graph.

[Plot of eigenvectors for A ∈ R^{1000×1000}: λ20 = 0.0287, λ25 = 0.0371, λ30 = 0.0479, λ31 = 0.0481, λ32 = 0.0494]

Concentration Perturbation

[Plot of eigenvectors for A, Ã ∈ R^{1000×1000}, Ã = A + E; panels show v25, v31, ṽ25, ṽ31]

Eigenvector Localization

Theorem (C., 2014). Let A ∈ R^{n×n} be symmetric with eigendecomposition A = VΣV^T and eigenvalues ordered λ1 ≤ ... ≤ λn, and let (λi, vi) be an eigenpair of A. Partition V = [V1, V2, vi, V3, V4], where V2, V3 ∈ R^{n×s}. Assume there exists C ⊂ {1, ..., n} such that supp(vi) ⊆ C and supp(vj) ⊆ C for every column vj of V2 and V3. Let (λ̃, x) be an eigenpair of the perturbed matrix Ã = A + E, where x = [x1, ..., xn]^T. Then

∑_{j ∈ C^c} |xj|² ≤ ‖(λ̃ − λi)x − Ex‖₂² / min(λi − λ_{i−s}, λ_{i+s} − λi)².
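A sketch checking this bound on a toy block-diagonal matrix; the block sizes, eigenvalues, support set C, and choice s = 1 are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Block-diagonal A: eigenvalues {1, 2, 3} with eigenvectors supported on
# C = {0, 1, 2}, and eigenvalues {10, 11} supported on C^c = {3, 4}.
Q1, _ = np.linalg.qr(rng.standard_normal((3, 3)))
Q2, _ = np.linalg.qr(rng.standard_normal((2, 2)))
A = np.zeros((5, 5))
A[:3, :3] = Q1 @ np.diag([1.0, 2.0, 3.0]) @ Q1.T
A[3:, 3:] = Q2 @ np.diag([10.0, 11.0]) @ Q2.T

lam_i = 2.0   # eigenpair of interest; its s = 1 neighbours are 1 and 3
gap = 1.0     # min(lam_i - lam_{i-s}, lam_{i+s} - lam_i)

# Small symmetric perturbation.
F = rng.standard_normal((5, 5))
E = 0.01 * (F + F.T) / 2

wt, Vt = np.linalg.eigh(A + E)
j = np.argmin(np.abs(wt - lam_i))
lam_t, x = wt[j], Vt[:, j]                  # perturbed eigenpair near lam_i

# Mass of x outside C versus the theorem's bound.
mass_outside_C = np.sum(x[3:] ** 2)
bound = np.linalg.norm((lam_t - lam_i) * x - E @ x) ** 2 / gap ** 2
assert mass_outside_C <= bound
print(mass_outside_C, bound)
```

Note that (λ̃ − λi)x − Ex = (A − λi)x exactly, since Ãx = λ̃x, which is what makes the right-hand side computable here.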

Conclusions

- Eigenvector perturbation depends on the inverse of the spectrum separation.
- Localization phenomena led to a Nobel Prize in Physics for particle localization (Anderson, 1977).
- Concentration / localization is a lesser restriction on eigenvectors; it depends on a cluster of eigenvectors being concentrated in a similar area.
- Applications: eigenstate localization on data-dependent graphs.