Structured Krylov Subspace Methods for Eigenproblems with Spectral Symmetries


Structured Krylov Subspace Methods for Eigenproblems with Spectral Symmetries
Peter Benner, Fakultät für Mathematik, TU Chemnitz, Germany (benner@mathematik.tu-chemnitz.de)
Joint work with Heike Faßbender (TU Braunschweig) and Hongguo Xu (University of Kansas)
Theoretical and Computational Aspects of Matrix Algorithms, Dagstuhl, October 12-17, 2003

Overview
- The Hamiltonian eigenproblem
- Applications
- The symplectic Lanczos method
- Choosing the free parameters
- Numerical examples
- Conclusions

The Hamiltonian Eigenproblem
Hx = λx, where H ∈ R^{2n×2n} is a Hamiltonian matrix, i.e., (HJ)^T = HJ with J = [0, I_n; -I_n, 0].
Hamiltonian matrices
- have an explicit block structure implied by the symmetry of HJ: H = [A, B; C, -A^T] with B = B^T, C = C^T;
- are skew-selfadjoint w.r.t. the indefinite inner product ⟨x, y⟩_J := x^T J y;
- form a Lie algebra with corresponding Lie group S_{2n} := { S ∈ R^{2n×2n} : S J S^T = J } (the symplectic matrices), hence S^{-1} H S is Hamiltonian for all S ∈ S_{2n}.
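A minimal NumPy sketch of these definitions: assemble a random Hamiltonian matrix from its block structure, check (HJ)^T = HJ, and verify that a simple, illustrative symplectic similarity preserves the structure (all names are ours, not from the talk).

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n)); B = B + B.T          # B = B^T
C = rng.standard_normal((n, n)); C = C + C.T          # C = C^T
H = np.block([[A, B], [C, -A.T]])                     # Hamiltonian block structure

Z, I = np.zeros((n, n)), np.eye(n)
J = np.block([[Z, I], [-I, Z]])

print(np.allclose((H @ J).T, H @ J))                  # (HJ)^T = HJ  ->  True

# S = [I, W; 0, I] with W = W^T is symplectic (S J S^T = J) ...
W = rng.standard_normal((n, n)); W = W + W.T
S = np.block([[I, W], [Z, I]])
print(np.allclose(S @ J @ S.T, J))                    # True
# ... and S^{-1} H S is again Hamiltonian
Hs = np.linalg.solve(S, H @ S)
print(np.allclose((Hs @ J).T, Hs @ J))                # True
```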

Spectral Properties
Spectral symmetry with respect to the real and imaginary axes: λ ∈ Λ(H) ⟹ -λ, ±λ̄ ∈ Λ(H).
Hence,
- λ ∈ R ⟹ ±λ ∈ Λ(H),
- λ = jω ∈ jR ⟹ ±jω ∈ Λ(H),
- λ ∈ C \ (R ∪ jR) ⟹ ±λ, ±λ̄ ∈ Λ(H) (eigenvalue quadruples).
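A quick numerical illustration of this symmetry with a randomly generated Hamiltonian matrix (variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n)); B = B + B.T
C = rng.standard_normal((n, n)); C = C + C.T
H = np.block([[A, B], [C, -A.T]])                     # Hamiltonian

lam = np.linalg.eigvals(H)
# if lambda is an eigenvalue, so are -lambda and the complex conjugates
dist_minus = max(np.min(np.abs(lam + l)) for l in lam)          # -lambda in spectrum
dist_conj  = max(np.min(np.abs(lam - np.conj(l))) for l in lam) # conj(lambda) in spectrum
print(dist_minus, dist_conj)    # both on the order of machine precision * ||H||
```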

Applications
Systems and control:
- Solution methods for algebraic and differential Riccati equations.
- Design of LQR/LQG/H_2/H_∞ controllers and filters for continuous-time linear control systems.
- Stability radii and system norm computations; optimization of system norms.
- Passivity-preserving model reduction based on balancing.
- Reduced-order control for infinite-dimensional systems based on inertial manifolds.
Computational physics: exponential integrators for Hamiltonian dynamics.
Quantum chemistry: computing excitation energies in many-particle systems using the random phase approximation (RPA).
Quadratic eigenvalue problems ...

Quadratic Eigenvalue Problems
Consider λ²Mx + λGx + Kx = 0, where M = M^T spd, K = K^T, G = -G^T.
Motivation:
- linear elasticity: computation of corner singularities in 3D anisotropic elastic structures [Apel/Mehrmann/Watkins 01]
- FE discretization in structural analysis [H.R. Schwarz 84]
- acoustic simulation of poro-elastic materials [Meerbergen 99]
- gyroscopic systems, rotor systems [Lancaster 99]

Spectral Symmetry of the QEP
The spectrum of the matrix polynomial λ²Mx + λGx + Kx = 0 with G = -G^T looks like Hamiltonian symmetry: it is symmetric with respect to both the real and the imaginary axis.

Linearization of the QEP
Setting y = λx yields the linear (generalized) eigenvalue problem
( [0, -K; M, 0] - λ [M, G; 0, M] ) [y; x] = 0.
H = [0, -K; M, 0] is Hamiltonian: (HJ)^T = HJ;
N = [M, G; 0, M] is skew-Hamiltonian: (NJ)^T = -NJ.
⟹ H - λN is a Hamiltonian/skew-Hamiltonian pencil [Mehrmann/Watkins 00].
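A NumPy sketch of this linearization, assuming the sign conventions reconstructed above (M spd, K symmetric, G skew-symmetric); the pencil eigenvalues are compared against a standard companion-form linearization of the same QEP:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
M = rng.standard_normal((n, n)); M = M @ M.T + n * np.eye(n)   # M = M^T spd
K = rng.standard_normal((n, n)); K = K + K.T                    # K = K^T
G = rng.standard_normal((n, n)); G = G - G.T                    # G = -G^T

Z, I = np.zeros((n, n)), np.eye(n)
H = np.block([[Z, -K], [M, Z]])         # Hamiltonian
N = np.block([[M,  G], [Z, M]])         # skew-Hamiltonian
J = np.block([[Z, I], [-I, Z]])
print(np.allclose((H @ J).T,  H @ J))   # True
print(np.allclose((N @ J).T, -N @ J))   # True

# pencil eigenvalues vs. QEP eigenvalues from a companion linearization
pencil = np.linalg.eigvals(np.linalg.solve(N, H))
comp   = np.block([[Z, I], [-np.linalg.solve(M, K), -np.linalg.solve(M, G)]])
qep    = np.linalg.eigvals(comp)
print(max(np.min(np.abs(qep - l)) for l in pencil))   # ~ machine precision
```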

Re-formulation of H - λN
Note: N = Z_1 Z_2 = [I, (1/2)G; 0, M] [M, (1/2)G; 0, I].
Hence, as M is spd,
Z_1^{-1} (H - λN) Z_2^{-1} = W - λI,
where
W = [I, -(1/2)G; 0, I] [0, -K; M^{-1}, 0] [I, -(1/2)G; 0, I]
⟹ W is Hamiltonian!
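A corresponding sketch for the reformulation, forming W explicitly for a small example (in practice one would only use products with W); again the sign conventions are the ones assumed above:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
M = rng.standard_normal((n, n)); M = M @ M.T + n * np.eye(n)   # spd
K = rng.standard_normal((n, n)); K = K + K.T
G = rng.standard_normal((n, n)); G = G - G.T

Z, I = np.zeros((n, n)), np.eye(n)
H = np.block([[Z, -K], [M, Z]])
N = np.block([[M,  G], [Z, M]])
J = np.block([[Z, I], [-I, Z]])

L = np.block([[I, -G / 2], [Z, I]])
W = L @ np.block([[Z, -K], [np.linalg.inv(M), Z]]) @ L

print(np.allclose((W @ J).T, W @ J))                    # W is Hamiltonian
eig_W = np.linalg.eigvals(W)
eig_p = np.linalg.eigvals(np.linalg.solve(N, H))
print(max(np.min(np.abs(eig_p - l)) for l in eig_W))    # spectra agree
```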

Structured Lanczos Method for Hamiltonian Matrices
Observations:
- Preservation of the spectral symmetry requires a structured eigensolver.
- Hamiltonian structure is preserved under symplectic similarity transformations.
- For large, sparse problems we want a Krylov subspace method.
⟹ Need a symplectic basis for the Krylov subspace K(H, 2k, v) := span{v, Hv, H²v, ..., H^{2k-1}v}.
H is nonsymmetric ⟹ apply the Arnoldi method or the nonsymmetric Lanczos method.
The Arnoldi method can only produce a symplectic Krylov basis in very particular instances! [Ammar/Mehrmann 91]
⟹ Need a variant of the nonsymmetric Lanczos method!

Symplectic Lanczos Transformation
For every Hamiltonian matrix H there exists a symplectic similarity transformation to Hamiltonian J-tridiagonal (J-Hessenberg) form:
H̃ = S^{-1} H S = [D, T; N, -D],
where D = diag(δ_1, ..., δ_n), N = diag(ν_1, ..., ν_n), and T is symmetric tridiagonal with diagonal entries β_1, ..., β_n and off-diagonal entries ζ_2, ..., ζ_n.
⟹ Need 4n - 1 parameters to represent H̃.
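A small helper, with names of our choosing, that assembles such a J-tridiagonal matrix from its 4n - 1 parameters and confirms that it is Hamiltonian:

```python
import numpy as np

def j_tridiagonal(delta, beta, nu, zeta):
    """Return [D, T; N, -D] with D = diag(delta), N = diag(nu) and
    T symmetric tridiagonal (diagonal beta, off-diagonal zeta)."""
    D, Nd = np.diag(delta), np.diag(nu)
    T = np.diag(beta) + np.diag(zeta, 1) + np.diag(zeta, -1)
    return np.block([[D, T], [Nd, -D]])

n = 5
rng = np.random.default_rng(4)
Ht = j_tridiagonal(rng.standard_normal(n), rng.standard_normal(n),
                   rng.standard_normal(n), rng.standard_normal(n - 1))
J = np.block([[np.zeros((n, n)), np.eye(n)], [-np.eye(n), np.zeros((n, n))]])
print(np.allclose((Ht @ J).T, Ht @ J))   # True: the J-tridiagonal form is Hamiltonian
```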

Relation of the Reduced Form and the Krylov Subspace
Theorem: If H̃ = S^{-1} H S is in Hamiltonian J-tridiagonal form, then K(H, 2n-1, v) = SR with s_1 = v is an SR decomposition of the Krylov matrix
K(H, 2n-1, v) := [v, Hv, ..., H^{2n-1} v].

Some Notation
Permutation with P := [e_1, e_3, ..., e_{2n-1}, e_2, e_4, ..., e_{2n}] (perfect shuffle) yields
H_P := P H P^T, S_P := P S P^T, J_P := P J P^T = diag([0, 1; -1, 0], ..., [0, 1; -1, 0]),
and H̃_P := S_P^{-1} H_P S_P is block tridiagonal with 2×2 diagonal blocks [δ_k, β_k; ν_k, -δ_k] and off-diagonal blocks [0, ζ_{k+1}; 0, 0] both above and below the diagonal, i.e., a banded matrix with lower bandwidth 1 and upper bandwidth 3.
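The perfect shuffle viewed as an index permutation; a short NumPy check that it maps J to the block diagonal J_P (the indexing details are our own):

```python
import numpy as np

n = 4
idx = np.empty(2 * n, dtype=int)
idx[0::2] = np.arange(n)            # positions of the v-type indices 1, ..., n
idx[1::2] = np.arange(n, 2 * n)     # positions of the w-type indices n+1, ..., 2n

J = np.block([[np.zeros((n, n)), np.eye(n)], [-np.eye(n), np.zeros((n, n))]])
J_P = J[np.ix_(idx, idx)]           # = P J P^T for the perfect-shuffle permutation
print(J_P[:4, :4])                  # repeating 2x2 blocks [0, 1; -1, 0]
```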

Derivation of the Symplectic Lanczos Algorithm via J-Tridiagonalization
Compute the (permuted) J-tridiagonal form of a (permuted) Hamiltonian matrix H (H_P) by column-wise evaluation of HS = SH̃ (H_P S_P = S_P H̃_P, where S_P = [v_1, w_1, v_2, w_2, ..., v_n, w_n] is a permuted symplectic matrix):
H_P v_m = δ_m v_m + ν_m w_m
  ⟹ ν_m w_m = H_P v_m - δ_m v_m =: w̃_m,
H_P w_m = ζ_m v_{m-1} + β_m v_m - δ_m w_m + ζ_{m+1} v_{m+1}
  ⟹ ζ_{m+1} v_{m+1} = H_P w_m - ζ_m v_{m-1} - β_m v_m + δ_m w_m =: ṽ_{m+1}.
⟹ Choose the parameters δ_m, β_m, ν_m, ζ_m such that the resulting algorithm computes a J_P-orthogonal basis of the Krylov subspace
K(H_P, v_1, l) = span{v_1, H_P v_1, ..., H_P^{2l-1} v_1}.

Choice of Parameters
Want S_P^T J_P S_P = J_P
⟹ ν_{m+1} = v_{m+1}^T J_P H_P v_{m+1}, β_m = w_m^T J_P H_P w_m.
ζ_m ≠ 0 and δ_m are free!
Choose ζ_m = ‖ṽ_m‖_2 ⟹ ‖v_m‖_2 = 1.
Possible choices for δ_m:
- δ_m = 1 [B./Faßbender 97]
- δ_m = v_m^T H_P v_m ⟹ v_m ⊥ w_m [Lin/Ferng/Wang 97]
- δ_m = 0 ⟹ the algorithm is equivalent to HR/HZ tridiagonalization [B./Faßbender/Watkins 97, Watkins 02]

Generic Symplectic Lanczos Algorithm
Choose an initial vector ṽ_1 ∈ R^{2n}, ṽ_1 ≠ 0.
m = 1; v_0 = 0; ζ_1 = ‖ṽ_1‖_2; ν_0 = 1
while ζ_m ≠ 0 and ν_{m-1} ≠ 0
    v_m = ṽ_m / ζ_m
    w̃_m = H_P v_m - δ_m v_m
    ν_m = v_m^T J_P H_P v_m
    if ν_m ≠ 0 then
        w_m = w̃_m / ν_m
        β_m = w_m^T J_P H_P w_m
        ṽ_{m+1} = H_P w_m - ζ_m v_{m-1} - β_m v_m + δ_m w_m
        ζ_{m+1} = ‖ṽ_{m+1}‖_2
    end if
    m = m + 1
end while
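A minimal NumPy sketch of this algorithm (no restarts, no breakdown recovery). It assumes the normalization ‖v_m‖_2 = 1 and v_m^T J w_m = 1 and uses the local-orthogonality choice δ_m = v_m^T H v_m discussed later in the talk; with this particular normalization β_m picks up a minus sign, which may differ from the sign convention on the slide. All function and variable names are ours.

```python
import numpy as np

def symplectic_lanczos(H, J, u, k, tol=1e-12):
    """Minimal symplectic Lanczos sketch: ||v_m||_2 = 1, v_m^T J w_m = 1,
    delta_m = v_m^T H v_m (local orthogonality condition)."""
    n2 = H.shape[0]
    V = np.zeros((n2, 2 * k))                   # columns v_1, w_1, ..., v_k, w_k
    delta, beta, nu, zeta = (np.zeros(k) for _ in range(4))
    v_prev = np.zeros(n2)
    vt = u.astype(float).copy()
    for m in range(k):
        zeta[m] = np.linalg.norm(vt)
        if zeta[m] < tol:
            raise RuntimeError("lucky breakdown: invariant subspace found")
        v = vt / zeta[m]
        delta[m] = v @ H @ v                    # free parameter delta_m
        wt = H @ v - delta[m] * v
        nu[m] = v @ J @ (H @ v)                 # nu_m = v_m^T J H v_m
        if abs(nu[m]) < tol:
            raise RuntimeError("breakdown: nu_m = 0")
        w = wt / nu[m]
        beta[m] = -(w @ J @ (H @ w))            # sign follows from our normalization
        V[:, 2 * m], V[:, 2 * m + 1] = v, w
        vt = H @ w - zeta[m] * v_prev - beta[m] * v + delta[m] * w
        v_prev = v
    return V, (delta, beta, nu, zeta), vt

# try it on a small random Hamiltonian matrix
rng = np.random.default_rng(7)
n = 20
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n)); B = B + B.T
C = rng.standard_normal((n, n)); C = C + C.T
H = np.block([[A, B], [C, -A.T]])
J = np.block([[np.zeros((n, n)), np.eye(n)], [-np.eye(n), np.zeros((n, n))]])

k = 6
V, coeffs, residual = symplectic_lanczos(H, J, rng.standard_normal(2 * n), k)

# J-orthogonality of the basis: V^T J V = diag([0, 1; -1, 0], ...)
J_small = np.kron(np.eye(k), np.array([[0.0, 1.0], [-1.0, 0.0]]))
print(np.linalg.norm(V.T @ J @ V - J_small))   # small (0 in exact arithmetic)
```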

Lanczos Recursion and Breakdown
Define S_P^{2k} = [v_1, w_1, v_2, w_2, ..., v_k, w_k] ∈ R^{2n×2k} and let H̃_P^{2k} be the leading 2k×2k principal submatrix of H̃_P; then the symplectic Lanczos recursion is
H_P S_P^{2k} = S_P^{2k} H̃_P^{2k} + ζ_{k+1} v_{k+1} e_{2k}^T.
Breakdown is possible:
- ṽ_m = 0 ⟺ ζ_m = 0: lucky breakdown! A symplectic invariant subspace of H is detected.
- w̃_m = 0 ⟹ ν_m = 0: lucky breakdown! An invariant subspace of H_P of dimension 2m-1 is detected.
- ν_m = 0 but v_m ≠ 0 and w̃_m ≠ 0 ⟹ serious breakdown!
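To see this identity concretely, one can assemble the 2k×2k matrix H̃_P^{2k} from the coefficients returned by the symplectic_lanczos sketch above (this snippet assumes that sketch has already been run) and check that the residual is confined to the last column:

```python
import numpy as np

# assumes H, V, coeffs, residual from the symplectic_lanczos sketch above
delta, beta, nu, zeta = coeffs
k = len(delta)
Ht = np.zeros((2 * k, 2 * k))
for m in range(k):
    Ht[2 * m:2 * m + 2, 2 * m:2 * m + 2] = [[delta[m], beta[m]], [nu[m], -delta[m]]]
    if m + 1 < k:
        Ht[2 * m, 2 * m + 3] = zeta[m + 1]       # coupling zeta_{m+1} above the diagonal
        Ht[2 * m + 2, 2 * m + 1] = zeta[m + 1]   # coupling zeta_{m+1} below the diagonal

R = H @ V - V @ Ht
print(np.linalg.norm(R[:, :-1]))        # ~0: residual lives in the last column only
print(np.allclose(R[:, -1], residual))  # last column equals zeta_{k+1} v_{k+1}
```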

Uniqueness of the Lanczos Vectors
Theorem: If there are two symplectic Lanczos recurrences with the same starting vector v_1 = v_1',
H V_k = V_k H̃_k + ζ_{k+1} ṽ_{k+1} e_{2k}^T,  H V_k' = V_k' H̃_k' + ζ_{k+1}' ṽ_{k+1}' e_{2k}^T,
then there exists a trivial symplectic matrix
D_k = [D̂_k, F̂_k; 0, D̂_k^{-1}], D̂_k, F̂_k diagonal,
such that V_k' = V_k D_k.
Proof: Consequence of the uniqueness of the SR decomposition up to trivial symplectic factors and the relation of the SR decomposition to the Krylov subspace.

Condition of the Lanczos Basis
The accuracy of the computed eigenvalues is affected by the condition of the Lanczos basis.
⟹ keep cond_2(V_n) small (i.e., choose the best trivial symplectic factor D_k).
V_n symplectic ⟹ V_n^{-1} = -J V_n^T J ⟹ cond_2(V_n) = ‖V_n‖_2².
‖V_n‖_2 can be bounded by
max_{1≤k≤n} {‖v_k‖_2, ‖w_k‖_2} ≤ ‖V_n‖_2 ≤ 2n · max_{1≤k≤n} {‖v_k‖_2, ‖w_k‖_2}.
⟹ Minimize τ := max_{1≤k≤n} {‖v_k‖_2, ‖w_k‖_2}.
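A quick check of cond_2 = ‖·‖_2² on a symplectic matrix, built for illustration as a product of two elementary symplectic factors (the singular values of a symplectic matrix come in reciprocal pairs):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
W1 = rng.standard_normal((n, n)); W1 = W1 + W1.T
W2 = rng.standard_normal((n, n)); W2 = W2 + W2.T
Z, I = np.zeros((n, n)), np.eye(n)
S = np.block([[I, W1], [Z, I]]) @ np.block([[I, Z], [W2, I]])   # symplectic

J = np.block([[Z, I], [-I, Z]])
print(np.allclose(S @ J @ S.T, J))                               # S is symplectic
print(np.isclose(np.linalg.cond(S), np.linalg.norm(S, 2) ** 2))  # cond_2(S) = ||S||_2^2
```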

Optimizing One Free Parameter
Fix ζ_k such that ‖v_k‖_2 = 1 for all k
⟹ τ = max_{1≤k≤n} {‖w_k‖_2} = max_{1≤k≤n} { (1/|ν_k|) ‖H v_k - δ_k v_k‖_2 }.
Requiring a symplectic Lanczos basis ⟹ ν_k is fixed.
The choice of δ_k is independent of δ_1, ..., δ_{k-1}
⟹ δ_k = argmin_{δ ∈ R} ‖H v_k - δ v_k‖_2.

Minimizing τ
Consider the quadratic form
f_k(δ) = ‖H v_k - δ v_k‖_2² = (H v_k - δ v_k)^T (H v_k - δ v_k) = v_k^T H^T H v_k - 2δ v_k^T H v_k + δ² v_k^T v_k, where v_k^T v_k = 1.
The first-order necessary condition for a minimum is
f_k'(δ) = -2 v_k^T H v_k + 2δ = 0.
Hence,
δ_k = v_k^T H v_k, i.e., v_k ⊥ w_k (local orthogonality condition).
Same choice as in [Ferng/Lin/Wang 97].
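A small numerical sanity check of this minimizer (the argument needs no Hamiltonian structure, only ‖v‖_2 = 1):

```python
import numpy as np

rng = np.random.default_rng(6)
H = rng.standard_normal((12, 12))
v = rng.standard_normal(12); v /= np.linalg.norm(v)

delta_star = v @ H @ v                                   # stationary point of f_k
grid = np.linspace(delta_star - 1.0, delta_star + 1.0, 201)
f = np.array([np.linalg.norm(H @ v - d * v) for d in grid])
print(abs(grid[np.argmin(f)] - delta_star) <= 0.011)     # True: minimum at v^T H v
```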

Numerical Examples I
Control of a string of high-speed vehicles: n = 398, v = e, different choices for δ_j.
[Figure: condition of the Lanczos basis versus number of Lanczos steps (0 to 50) for δ = 1, δ = 0, the orthogonality condition, and random δ.]

Numerical Examples II
Random stable gyroscopic system: n = 200, different choices for δ_j.
[Figures: computed eigenvalues of the structured eigensolver and the QR algorithm in the complex plane; condition of the Lanczos basis versus number of Lanczos steps (0 to 50) for δ = 1, δ = 0, the orthogonality condition, and random δ.]

Conclusions
- An optimal choice of the free parameters in the symplectic Lanczos method can improve the condition of the Lanczos basis and may improve Ritz value accuracy.
- Optimization of two free parameters is possible, but ...
- A thorough comparison is necessary.
- For practical applications, use implicitly restarted variants [B./Faßbender 97, Watkins 02], but more details are still necessary: shift-and-invert, deflation, locking, purging, ...
- Variants for the symplectic eigenproblem are available [B./Faßbender 00].
- A two-sided symplectic (implicitly restarted) Arnoldi method based on the symplectic URV decomposition is under development [B./Kreßner/Mehrmann/Xu].