Solving large Hamiltonian eigenvalue problems


David S. Watkins (watkins@math.wsu.edu)
Department of Mathematics, Washington State University
Adaptivity and Beyond, Vancouver, August 2005

Some Collaborators
Volker Mehrmann
Thomas Apel
Peter Benner
Heike Faßbender
...

Problem: Linear Elasticity
Elastic deformation (3D, anisotropic, composite materials)
Singularities at cracks, interfaces
Lamé equations (PDE, spherical coordinates)
Separate the radial variable.
Get a quadratic eigenvalue problem:
(λ^2 M + λG + K)v = 0,   M = M^T > 0,   G = -G^T,   K = K^T < 0
Solve numerically.

Numerical Solution
Discretize using finite elements:
(λ^2 M + λG + K)v = 0,   M^T = M > 0,   G^T = -G,   K^T = K < 0
Matrix quadratic eigenvalue problem (large, sparse)
Find a few smallest eigenvalues (and corresponding eigenvectors).

Other Applications
Gyroscopic systems:  λ^2 M + λG + K
Quadratic regulator (optimal control):
[C^T C  A^T; A  -BB^T] - λ [0  E^T; -E  0]   (symmetric/skew-symmetric)
Higher-order systems:  λ^n A_n + λ^{n-1} A_{n-1} + ... + λ A_1 + A_0

Hamiltonian Structure
Our matrices are real.
λ, -λ, λ̄, -λ̄ occur together.
Spectra of Hamiltonian matrices: [plots omitted]

Hamiltonian Matrices
H ∈ R^{2n×2n}
J = [0  I; -I  0] ∈ R^{2n×2n}
H is Hamiltonian iff JH is symmetric.
H = [A  K; N  -A^T], where K = K^T and N = N^T.
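
To make the definition concrete, here is a small NumPy check (illustrative random data, not from the slides) that a matrix of this block form satisfies JH = (JH)^T:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
K = rng.standard_normal((n, n)); K = K + K.T     # K = K^T
N = rng.standard_normal((n, n)); N = N + N.T     # N = N^T

H = np.block([[A, K], [N, -A.T]])                # Hamiltonian by construction
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n), np.zeros((n, n))]])

JH = J @ H
print(np.allclose(JH, JH.T))                     # True: JH is symmetric
```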

Reduction of Order
λ^2 Mv + λGv + Kv = 0
w = λv,   Mw = λMv
[-K  0; 0  -M] [v; w] - λ [G  M; -M  0] [v; w] = 0
Ax - λNx = 0   (symmetric/skew-symmetric)
Structure has been preserved.
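
A small NumPy/SciPy sketch of this linearization (illustrative random data with the stated symmetries, and one consistent sign convention for the blocks): it builds A and N and checks that every eigenpair of the pencil satisfies the quadratic problem.

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n)); M = M @ M.T + n * np.eye(n)     # M = M^T > 0
G = rng.standard_normal((n, n)); G = G - G.T                      # G = -G^T
K = rng.standard_normal((n, n)); K = -(K @ K.T) - n * np.eye(n)   # K = K^T < 0
Z = np.zeros((n, n))

A = np.block([[-K, Z], [Z, -M]])    # symmetric
N = np.block([[G, M], [-M, Z]])     # skew-symmetric

lam, X = eig(A, N)                  # pencil A x = lambda N x, x = [v; w], w = lambda v
for l, x in zip(lam, X.T):
    v = x[:n]
    res = np.linalg.norm((l**2 * M + l * G + K) @ v) / np.linalg.norm(v)
    assert res < 1e-6
print("every pencil eigenpair satisfies the quadratic eigenvalue problem")
```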

Reduction to Hamiltonian Matrix
A - λN   (symmetric/skew-symmetric)
N = R^T J R   (J = [0  I; -I  0])
Always possible, sometimes easy.
Transform:
A - λ R^T J R
R^{-T} A R^{-1} - λJ
J^{-1} R^{-T} A R^{-1} - λI
H = J^{-1} R^{-T} A R^{-1} is Hamiltonian.

Example (our application)
[-K  0; 0  -M] [v; w] - λ [G  M; -M  0] [v; w] = 0
N = [G  M; -M  0]
N = R^T J R = [I  -(1/2)G; 0  M] [0  I; -I  0] [I  0; (1/2)G  M]
Cost: zero flops.
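
A quick numerical check of this factorization and of the resulting Hamiltonian matrix, again with illustrative random M, G, K carrying the stated symmetries:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
M = rng.standard_normal((n, n)); M = M @ M.T + n * np.eye(n)     # M = M^T > 0
G = rng.standard_normal((n, n)); G = G - G.T                      # G = -G^T
K = rng.standard_normal((n, n)); K = -(K @ K.T) - n * np.eye(n)   # K = K^T < 0

I, Z = np.eye(n), np.zeros((n, n))
J = np.block([[Z, I], [-I, Z]])
R = np.block([[I, Z], [0.5 * G, M]])
N = np.block([[G, M], [-M, Z]])
A = np.block([[-K, Z], [Z, -M]])

print(np.allclose(R.T @ J @ R, N))       # N = R^T J R, obtained at zero cost

H = np.linalg.solve(J, np.linalg.solve(R.T, A) @ np.linalg.inv(R))
print(np.allclose(J @ H, (J @ H).T))     # H = J^{-1} R^{-T} A R^{-1} is Hamiltonian
```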

H = J^{-1} R^{-T} A R^{-1}
After some algebra...
H = [I  0; -(1/2)G  I] [0  M^{-1}; -K  0] [I  0; -(1/2)G  I]
Do not form the product explicitly.
Krylov subspace methods: we just need to apply the operator.
M = LL^T (done once), then backsolve.
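
A sketch (not the authors' code) of how this operator could be applied without ever forming H: factor M once, then each application costs two triangular solves plus multiplications by G and K. Dense Cholesky is used here for simplicity; for the large sparse problem a sparse Cholesky would take its place.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def make_apply_H(M, G, K):
    """Return x -> H x with H = [I 0; -G/2 I] [0 M^{-1}; -K 0] [I 0; -G/2 I]."""
    n = M.shape[0]
    cM = cho_factor(M)                       # M = L L^T, done once
    def apply_H(x):
        v, w = x[:n], x[n:]
        w = w - 0.5 * (G @ v)                # rightmost factor
        v, w = cho_solve(cM, w), -(K @ v)    # middle factor: [M^{-1} w; -K v]
        w = w - 0.5 * (G @ v)                # leftmost factor
        return np.concatenate([v, w])
    return apply_H
```

Wrapped in a scipy.sparse.linalg.LinearOperator, this matrix-vector product is all a Krylov subspace method needs.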

However,...
... we really want H^{-1}.
H = [I  0; -(1/2)G  I] [0  M^{-1}; -K  0] [I  0; -(1/2)G  I]
H^{-1} = [I  0; (1/2)G  I] [0  (-K)^{-1}; M  0] [I  0; (1/2)G  I]
-K = LL^T
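
The same pattern gives H^{-1} directly, now factoring -K (positive definite since K < 0); again only a sketch with our own function name.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def make_apply_Hinv(M, G, K):
    """Return x -> H^{-1} x with
    H^{-1} = [I 0; G/2 I] [0 (-K)^{-1}; M 0] [I 0; G/2 I]."""
    n = M.shape[0]
    cK = cho_factor(-K)                      # -K = L L^T, done once
    def apply_Hinv(x):
        v, w = x[:n], x[n:]
        w = w + 0.5 * (G @ v)                # rightmost factor
        v, w = cho_solve(cK, w), M @ v       # middle factor: [(-K)^{-1} w; M v]
        w = w + 0.5 * (G @ v)                # leftmost factor
        return np.concatenate([v, w])
    return apply_Hinv
```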

Shift and Invert?
(H - τI)^{-1} is not Hamiltonian.
Structure is lost. How can we recover it?

Exploitable Structures
symplectic (first idea):      (H - τI)^{-1}(H + τI)
skew-Hamiltonian (easiest?):  H^2,   (H - τI)^{-1}(H + τI)^{-1}
Hamiltonian:                  H^{-1},   H^{-1}(H - τI)^{-1}(H + τI)^{-1}
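
One way the shifted skew-Hamiltonian operator in this list might be set up, assuming H is available as a sparse matrix (with the factored form of H the solves would be organized differently); the shift tau and the LU backend are illustrative choices.

```python
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def make_skew_hamiltonian_op(H, tau):
    """x -> (H - tau I)^{-1} (H + tau I)^{-1} x.  This equals (H^2 - tau^2 I)^{-1},
    which is skew-Hamiltonian whenever H is Hamiltonian."""
    n = H.shape[0]
    I = sp.identity(n, format="csc")
    lu_minus = spla.splu((H - tau * I).tocsc())   # factor once
    lu_plus = spla.splu((H + tau * I).tocsc())    # factor once
    return spla.LinearOperator((n, n), dtype=H.dtype,
                               matvec=lambda x: lu_minus.solve(lu_plus.solve(x)))
```

Composing with an application of H^{-1} gives the Hamiltonian operator H^{-1}(H - τI)^{-1}(H + τI)^{-1} in the same way.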

Structured Lanczos Processes
Unsymmetric Lanczos process:
u_{k+1} b_k d_k = B u_k - u_k a_k d_k - u_{k-1} b_{k-1} d_k
w_{k+1} d_{k+1} b_k = B^T w_k - w_k d_k a_k - w_{k-1} d_{k-1} b_{k-1}
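
For reference, here is a plain (unstructured) two-sided Lanczos process in a textbook normalization; the structured Hamiltonian and symplectic variants on the next slides follow the same short-recurrence idea but use different scalings, and are not reproduced here.

```python
import numpy as np

def two_sided_lanczos(B, v1, w1, m):
    """Plain two-sided (unsymmetric) Lanczos: biorthogonal bases V, W and a
    tridiagonal T with W^T B V ~ T.  Textbook normalization, not the slides'."""
    n = v1.size
    V = np.zeros((n, m + 1)); W = np.zeros((n, m + 1))
    alpha = np.zeros(m); beta = np.zeros(m + 1); delta = np.zeros(m + 1)
    s = w1 @ v1
    if abs(s) < 1e-14:
        raise ValueError("need w1^T v1 != 0")
    V[:, 0] = v1 / np.sqrt(abs(s))
    W[:, 0] = np.sign(s) * w1 / np.sqrt(abs(s))
    for j in range(m):
        r = B @ V[:, j]; q = B.T @ W[:, j]
        alpha[j] = W[:, j] @ r
        r -= alpha[j] * V[:, j] + beta[j] * V[:, j - 1]
        q -= alpha[j] * W[:, j] + delta[j] * W[:, j - 1]
        s = q @ r
        if abs(s) < 1e-14:                   # (serious) breakdown
            m = j + 1
            break
        delta[j + 1] = np.sqrt(abs(s))
        beta[j + 1] = s / delta[j + 1]
        V[:, j + 1] = r / delta[j + 1]
        W[:, j + 1] = q / beta[j + 1]
    T = np.diag(alpha[:m]) + np.diag(delta[1:m], -1) + np.diag(beta[1:m], 1)
    return V[:, :m], W[:, :m], T             # eigenvalues of T approximate those of B
```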

Hamiltonian Lanczos Process
u_{k+1} b_{k+1} = H v_k - u_k a_k - u_{k-1} b_{k-1}
v_{k+1} d_{k+1} = H u_{k+1}

Symplectic Lanczos Process
v_{k+1} d_{k+1} b_k = S v_k - v_k d_k a_k - v_{k-1} d_{k-1} b_{k-1} + u_k d_k
u_{k+1} d_{k+1} = S^{-1} v_{k+1}
Many interesting relationships.

Implicit Restarts
Short Lanczos runs (breakdowns!!, no look-ahead)
Restart implicitly as in IRA (Sorensen 1991), ARPACK
Restart with HR, not QR

Remarks on Stability
Both the Hamiltonian and the symplectic Lanczos process are potentially unstable.
Breakdowns can occur.
Are the answers worth anything? Check:
  right and left eigenvectors
  residuals
  condition numbers for eigenvalues
Don't skip these tests.

Example
λ^2 Mv + λGv + Kv = 0,   n = 3423
H = [I  0; -(1/2)G  I] [0  M^{-1}; -K  0] [I  0; -(1/2)G  I]
H^{-1} = [I  0; (1/2)G  I] [0  (-K)^{-1}; M  0] [I  0; (1/2)G  I]

Compare various approaches:
Hamiltonian(1):   H^{-1}
Hamiltonian(3):   H^{-1}(H - τI)^{-1}(H + τI)^{-1}
symplectic:       (H - τI)^{-1}(H + τI)
unstructured:     (H - τI)^{-1} + ordinary Lanczos with implicit restarts
Get 6 smallest eigenvalues in the right half-plane.
Tolerance = 10^{-8}
Take 20 steps and restart with 10.
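
For orientation, an unstructured shift-and-invert baseline of this general kind can be put together with ARPACK through SciPy; the slides' unstructured comparison uses an implicitly restarted Lanczos code, and ARPACK's implicitly restarted Arnoldi plays the analogous role here. This is only a sketch with placeholder names (H given as a sparse matrix, shift tau), not the code behind the tables that follow.

```python
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def unstructured_smallest(H, tau, k=6, tol=1e-8):
    """Shift-and-invert Arnoldi (ARPACK) on (H - tau I)^{-1}; returns the k
    eigenvalues of H closest to tau and the corresponding vectors."""
    n = H.shape[0]
    lu = spla.splu((H - tau * sp.identity(n, format="csc")).tocsc())
    op = spla.LinearOperator((n, n), dtype=H.dtype, matvec=lu.solve)
    mu, X = spla.eigs(op, k=k, tol=tol, ncv=20)   # subspace size 20, cf. "20 steps, restart with 10"
    return 1.0 / mu + tau, X                      # map back: lambda = 1/mu + tau
```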

No-Clue Case (τ = 0)

Method           Solves   Eigvals found   Max. resid.
Hamiltonian(1)     78          9          2 x 10^{-10}
Unstructured      158        7 + 7        5 x 10^{-7}

The unstructured code must find everything twice.
[plot omitted]

Conservative Shift (τ = 0.5)

Method           Solves   Eigvals found   Max. resid.
Hamiltonian(1)     78          9          2 x 10^{-10}
Unstructured      138        7 + 2        3 x 10^{-5}
Hamiltonian(3)    174         11          3 x 10^{-13}
Symplectic        156         11          2 x 10^{-8}

[plot omitted]

Aggressive Shift (τ = 1.5)

Method           Solves   Eigvals found   Max. resid.
Hamiltonian(1)     78          9          2 x 10^{-10}
Unstructured       96          9          1 x 10^{-7}
Hamiltonian(3)    120          9          2 x 10^{-12}
Symplectic        156         11          2 x 10^{-11}

The Last Slide
We have developed structure-preserving, implicitly restarted Lanczos methods for Hamiltonian and symplectic eigenvalue problems.
The structure-preserving methods are more accurate than a comparable non-structured method.
By exploiting structure we can solve our problems more economically.
Thank you for your attention.