Iterative methods for symmetric eigenvalue problems


Iterative methods for symmetric eigenvalue problems. PhD, McMaster University, School of Computational Engineering and Science, February 11, 2008

Outline
1. The power method and its variants: inverse power method, Rayleigh quotient iteration, subspace iteration
2. Krylov subspace methods: Rayleigh-Ritz, Lanczos, implicitly restarted Lanczos, band Lanczos
3. Jacobi-Davidson

The power method
Find the dominating eigenvalue/eigenvector:
  v_{k+1} = y_k / ||y_k||_2
  y_{k+1} = A v_{k+1}
  λ_{k+1} = v_{k+1}^T y_{k+1}
- Only matrix-vector multiplication is involved
- Converges unless v_0 ⊥ v_max
- Convergence rate: |λ_{max-1} / λ_max|
Problems:
- multiple or close largest eigenvalues
- only the largest eigenvalue is computed
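As a concrete illustration, a minimal NumPy sketch of the iteration above; the function name, tolerance and small diagonal test matrix are made-up examples, not part of the talk.

import numpy as np

def power_method(A, tol=1e-10, max_iter=1000):
    """Power iteration: dominant eigenvalue/eigenvector of a symmetric matrix A."""
    n = A.shape[0]
    rng = np.random.default_rng(0)
    y = rng.standard_normal(n)          # random start, almost surely not orthogonal to v_max
    lam = 0.0
    for _ in range(max_iter):
        v = y / np.linalg.norm(y)       # v_{k+1} = y_k / ||y_k||_2
        y = A @ v                       # y_{k+1} = A v_{k+1}
        lam_new = v @ y                 # lambda_{k+1} = v_{k+1}^T y_{k+1}
        if abs(lam_new - lam) < tol * abs(lam_new):
            return lam_new, v
        lam = lam_new
    return lam, v

A = np.diag([1.0, 2.0, 5.0])            # dominant eigenvalue is 5
lam, v = power_method(A)
print(lam)                              # ~ 5.0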

Inverse power method
Inner eigenvalues: λ((A - σI)^{-1}) = 1 / (λ(A) - σ)
Apply the power method to (A - σI)^{-1}:
  v_{k+1} = y_k / ||y_k||_2
  y_{k+1} = (A - σI)^{-1} v_{k+1}
  λ_{k+1} = v_{k+1}^T y_{k+1}
- Converges to the dominating eigenvalue of (A - σI)^{-1}, i.e. to the eigenvalue of A closest to σ
- Converges unless v_0 ⊥ v_max
- Convergence rate: |(λ_max - σ) / (λ_{max-1} - σ)|, linear
- Viable only if (A - σI) y = v is easily solvable
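A corresponding sketch of shifted inverse iteration, factoring A - σI once and reusing the factorization in every solve; the shift, tolerance and test matrix are illustrative assumptions.

import numpy as np
from scipy.linalg import lu_factor, lu_solve

def inverse_iteration(A, sigma, tol=1e-10, max_iter=200):
    """Inverse power iteration: converges to the eigenvalue of A closest to sigma."""
    n = A.shape[0]
    lu_piv = lu_factor(A - sigma * np.eye(n))   # factor (A - sigma*I) once
    rng = np.random.default_rng(0)
    y = rng.standard_normal(n)
    mu = 0.0
    for _ in range(max_iter):
        v = y / np.linalg.norm(y)
        y = lu_solve(lu_piv, v)                 # y_{k+1} = (A - sigma*I)^{-1} v_{k+1}
        mu_new = v @ y                          # Rayleigh quotient for (A - sigma*I)^{-1}
        if abs(mu_new - mu) < tol * abs(mu_new):
            mu = mu_new
            break
        mu = mu_new
    return sigma + 1.0 / mu, v                  # map back: lambda = sigma + 1/mu

A = np.diag([1.0, 2.0, 5.0])
print(inverse_iteration(A, sigma=1.8)[0])       # ~ 2.0 (eigenvalue closest to 1.8)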

Rayleigh quotient iteration
Change the shift in each iteration:
  v_{k+1} = y_k / ||y_k||_2
  σ_{k+1} = v_{k+1}^T A v_{k+1} / ||v_{k+1}||_2^2
  y_{k+1} = (A - σ_{k+1} I)^{-1} v_{k+1}
  λ_{k+1} = v_{k+1}^T y_{k+1}
- Convergence properties are unclear: it does not necessarily find λ_max, and may not converge to an eigenvalue at all
- When it converges, it finds an eigenvalue faster than inverse iteration: cubic convergence
- A - σ_{k+1} I becomes (nearly) singular as σ_{k+1} approaches an eigenvalue
- A new factorization is needed in every iteration
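A short NumPy sketch of Rayleigh quotient iteration; note the fresh solve with a new shift in every step. The starting vector, tolerance and test matrix are made-up.

import numpy as np

def rayleigh_quotient_iteration(A, v0, tol=1e-12, max_iter=50):
    """Rayleigh quotient iteration for a symmetric matrix A."""
    n = A.shape[0]
    v = v0 / np.linalg.norm(v0)
    lam = v @ (A @ v)
    for _ in range(max_iter):
        sigma = v @ (A @ v)                                 # sigma_{k+1}: Rayleigh quotient of v
        try:
            y = np.linalg.solve(A - sigma * np.eye(n), v)   # new factorization every iteration
        except np.linalg.LinAlgError:
            break                                           # A - sigma*I numerically singular: stop
        v = y / np.linalg.norm(y)
        lam = v @ (A @ v)
        if np.linalg.norm(A @ v - lam * v) < tol:
            break
    return lam, v

A = np.diag([1.0, 2.0, 5.0]) + 0.01 * np.ones((3, 3))       # small symmetric test matrix
lam, v = rayleigh_quotient_iteration(A, np.array([0.0, 1.0, 0.2]))
print(lam)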

Subspace iteration
Invariant subspaces are robust, eigenvectors are not:

  A = [ 2  0  0 ]        B = [ 2  0  0   ]
      [ 0  1  ε ]            [ 0  1  0   ]
      [ 0  ε  1 ]            [ 0  0  1+ε ]

A and B are O(ε) apart and have nearly the same eigenvalues, but their eigenvectors for the cluster near 1 are completely different; the invariant subspace those eigenvectors span is the same.
It is better to identify the invariant subspace:
  QR factorize Y_k = V_{k+1} R_{k+1}
  Y_{k+1} = A V_{k+1}
  H_{k+1} = V_{k+1}^T Y_{k+1}
with Y, V ∈ R^{n×p}, H ∈ R^{p×p}
- The eigenvalues of H_{k+1} approximate the p largest eigenvalues of A
- Remaining issues: clustered (but not multiple) eigenvalues, and choosing p smartly
- Can also be applied to (A - σI)^{-1}
- Software: EA12 in HSL
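A compact sketch of the iteration above using NumPy's QR factorization; the block size p, the iteration count and the diagonal test matrix are arbitrary illustrative choices.

import numpy as np

def subspace_iteration(A, p, n_iter=200):
    """Subspace (orthogonal) iteration: approximates the p dominant eigenvalues of symmetric A."""
    n = A.shape[0]
    rng = np.random.default_rng(0)
    Y = rng.standard_normal((n, p))
    for _ in range(n_iter):
        V, _ = np.linalg.qr(Y)            # Y_k = V_{k+1} R_{k+1}
        Y = A @ V                         # Y_{k+1} = A V_{k+1}
    H = V.T @ Y                           # H_{k+1} = V_{k+1}^T Y_{k+1}, a p x p matrix
    return np.linalg.eigvalsh(H), V

A = np.diag([1.0, 3.0, 4.0, 10.0])
print(subspace_iteration(A, p=2)[0])      # ~ [4, 10]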

Rayleigh-Ritz
Problems with the power-iteration-based methods:
- extremal eigenvalues only
- internal eigenvalues require the solution of a linear system
- only A^k v is used as the best guess for an eigenvector
Krylov subspace: span{ v, Av, ..., A^k v }
- Find the best approximate eigenvectors in the subspace (Ritz vectors)
- The columns of Q_k are orthogonal and span the Krylov subspace
- λ(Q_k^T A Q_k) approximates λ(A)
- Choose Q_k to simplify the structure of T_k := Q_k^T A Q_k
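An explicit (and numerically naive) Rayleigh-Ritz sketch: orthonormalize the Krylov vectors with a QR factorization, project A, and take the eigenpairs of the small projected matrix. The subspace dimension and the random test matrix are illustrative; a practical code would build the basis with Lanczos instead.

import numpy as np

def krylov_rayleigh_ritz(A, v, k):
    """Rayleigh-Ritz on span{v, Av, ..., A^k v}: returns Ritz values and Ritz vectors."""
    cols = [v]
    for _ in range(k):
        cols.append(A @ cols[-1])                 # build v, Av, ..., A^k v
    K = np.column_stack(cols)
    Q, _ = np.linalg.qr(K)                        # orthonormal basis Q_k of the Krylov space
    T = Q.T @ A @ Q                               # projected matrix Q_k^T A Q_k
    theta, S = np.linalg.eigh(T)                  # Ritz values
    return theta, Q @ S                           # Ritz vectors

rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = (M + M.T) / 2                                 # random symmetric test matrix
theta, X = krylov_rayleigh_ritz(A, rng.standard_normal(50), k=20)
print(theta[-1], np.linalg.eigvalsh(A)[-1])       # largest Ritz value vs true largest eigenvalue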

Lanczos
- Gradually build the Krylov subspace
- Maintain an orthogonal basis Q_k; T_k is tridiagonal
- Find the corresponding Ritz vectors
Algorithm:
  v_0 = 0, β_1 = 0, v_1 random unit vector
  repeat for j = 1, 2, ...
    q_j = A v_j - β_j v_{j-1}
    α_j = q_j^T v_j
    q_j = q_j - α_j v_j
    β_{j+1} = ||q_j||_2
    v_{j+1} = q_j / β_{j+1}
- Extreme eigenvalues converge first
- Can also be applied to (A - σI)^{-1}
- Memory consumption increases with the number of iterations
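A minimal dense NumPy sketch of the three-term recurrence above, with Ritz values taken from the tridiagonal T_k. There is no reorthogonalization, so it is illustrative only; the matrix size and step count are made-up.

import numpy as np

def lanczos(A, k, seed=0):
    """k steps of Lanczos on symmetric A: returns the basis Q and tridiagonal T."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)                  # v_1: random unit vector
    v_prev = np.zeros(n)                    # v_0 = 0
    beta = 0.0                              # beta_1 = 0
    Q = np.zeros((n, k))
    alphas, betas = [], []
    for j in range(k):
        Q[:, j] = v
        q = A @ v - beta * v_prev           # q_j = A v_j - beta_j v_{j-1}
        alpha = q @ v                       # alpha_j = q_j^T v_j
        q = q - alpha * v                   # orthogonalize against v_j
        beta = np.linalg.norm(q)            # beta_{j+1} = ||q_j||_2
        alphas.append(alpha)
        if beta < 1e-14 or j == k - 1:      # invariant subspace found, or last step
            break
        betas.append(beta)
        v_prev, v = v, q / beta             # v_{j+1} = q_j / beta_{j+1}
    T = np.diag(alphas) + np.diag(betas, 1) + np.diag(betas, -1)
    return Q[:, :len(alphas)], T

rng = np.random.default_rng(1)
M = rng.standard_normal((200, 200))
A = (M + M.T) / 2
Q, T = lanczos(A, k=60)
print(np.linalg.eigvalsh(T)[-1], np.linalg.eigvalsh(A)[-1])   # extreme Ritz value vs true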

Implicitly restarted Lanczos
- Prevents k from growing too much
- Applies shifts μ_i to the algorithm; equivalently, changes v_0
- How to choose the shifts?
- Flexible eigenvalue configurations
- Locking/purging of eigenvalues
- Software: ARPACK (also in Matlab)
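For reference, a short usage sketch of ARPACK's implicitly restarted Lanczos through SciPy's eigsh wrapper; the 1D Laplacian test matrix, the value of k and the shift are made-up examples.

import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# 1D Laplacian as a sparse symmetric test matrix
n = 1000
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")

# 5 largest (algebraic) eigenvalues: implicitly restarted Lanczos on A
vals_large, vecs_large = eigsh(A, k=5, which="LA")

# 5 eigenvalues closest to sigma: shift-invert, i.e. Lanczos applied to (A - sigma*I)^{-1}
vals_near, vecs_near = eigsh(A, k=5, sigma=1.0, which="LM")

print(vals_large)
print(vals_near)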

Band Lanczos
- Multiple starting vectors, finds more eigenvalues
- Block Krylov subspace: span{ V, AV, ..., A^k V }
- Suitable for multiple/clustered eigenvalues
- T is block tridiagonal
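A simple block-Krylov Rayleigh-Ritz sketch in the same spirit; it projects onto span{V, AV, ..., A^k V} explicitly rather than building the banded T, just to illustrate why a block of starting vectors captures a multiple eigenvalue. The test matrix and block size are made-up.

import numpy as np

def block_krylov_ritz(A, V, k):
    """Rayleigh-Ritz on the block Krylov space span{V, AV, ..., A^k V}."""
    blocks = [V]
    for _ in range(k):
        blocks.append(A @ blocks[-1])
    K = np.hstack(blocks)
    Q, _ = np.linalg.qr(K)              # orthonormal basis of the block Krylov space
    T = Q.T @ A @ Q
    return np.linalg.eigvalsh(T)

# A has a double eigenvalue 4; a block of 2 starting vectors can capture both copies
A = np.diag([1.0, 2.0, 4.0, 4.0, 7.0])
rng = np.random.default_rng(0)
V = rng.standard_normal((5, 2))
print(block_krylov_ritz(A, V, k=3)[-3:])   # ~ [4, 4, 7]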

Jacobi-Davidson
Problems with Lanczos:
- only efficient if the eigenvalues are well separated
- needs (A - σI)^{-1} y for internal eigenvalues
Jacobi-Davidson instead:
- builds a different set of orthogonal vectors spanning the search space, and extracts approximations with a Galerkin condition
- interior eigenvalues without inversion
- very good if A has multiple eigenvalues
- Software: JDQR (Matlab)
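To make the idea concrete, a small dense Jacobi-Davidson sketch in which the correction equation is solved exactly through a bordered linear system; real implementations such as JDQR solve it only approximately with a preconditioned iterative solver. The function name, iteration limits and random test matrix are assumptions for illustration.

import numpy as np

def jacobi_davidson(A, target_largest=True, m_max=30, tol=1e-10, seed=0):
    """Basic Jacobi-Davidson sketch with an exact solve of the correction equation."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(n)
    V = (v / np.linalg.norm(v)).reshape(n, 1)       # search space, grown each iteration
    for _ in range(m_max):
        H = V.T @ A @ V                             # projected (Galerkin) matrix
        theta_all, S = np.linalg.eigh(H)
        idx = -1 if target_largest else 0
        theta, s = theta_all[idx], S[:, idx]        # selected Ritz pair
        u = V @ s
        r = A @ u - theta * u                       # residual, orthogonal to u
        if np.linalg.norm(r) < tol:
            return theta, u
        # Correction equation (I - u u^T)(A - theta I)(I - u u^T) t = -r with t orthogonal to u,
        # solved exactly via the bordered system [[A - theta I, u], [u^T, 0]].
        B = np.block([[A - theta * np.eye(n), u[:, None]],
                      [u[None, :], np.zeros((1, 1))]])
        t = np.linalg.solve(B, np.concatenate([-r, [0.0]]))[:n]
        t -= V @ (V.T @ t)                          # orthogonalize against the search space
        t /= np.linalg.norm(t)
        V = np.hstack([V, t[:, None]])              # expand the search space
    return theta, u

rng = np.random.default_rng(1)
M = rng.standard_normal((100, 100))
A = (M + M.T) / 2
theta, u = jacobi_davidson(A)
print(theta, np.linalg.eigvalsh(A)[-1])             # computed vs true largest eigenvalue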

References
Z. Bai, J. Demmel, J. Dongarra, A. Ruhe, and H. van der Vorst, editors. Templates for the Solution of Algebraic Eigenvalue Problems: A Practical Guide. SIAM, Philadelphia, 2000.
James W. Demmel. Applied Numerical Linear Algebra. SIAM, Philadelphia, 1997.