Perturbation theory for eigenvalues of Hermitian pencils. Christian Mehl Institut für Mathematik TU Berlin, Germany. 9th Elgersburg Workshop


Perturbation theory for eigenvalues of Hermitian pencils. Christian Mehl, Institut für Mathematik, TU Berlin, Germany. 9th Elgersburg Workshop, Elgersburg, March 3, 2014. Joint work with Shreemayee Bora, Michael Karow, and Punit Sharma.

Why Hermitian pencils? Question: Why are eigenvalues of Hermitian pencils of interest in system theory? Answer: Because Hermitian pencils are related to Hamiltonian matrices. A matrix H ∈ C^{2n×2n} is called Hamiltonian if

H = [A, C; D, −A*],

where A, C, D ∈ C^{n×n} and C, D are Hermitian. Observation: The spectrum of a Hamiltonian matrix is symmetric with respect to the imaginary axis: if λ ∈ C is an eigenvalue of H, then so is −λ̄, with the same multiplicities.
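This spectral symmetry is easy to verify numerically. A minimal numpy sketch (the random A and the Hermitian C, D are placeholder data, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
C0 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)); C = C0 + C0.conj().T  # Hermitian
D0 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)); D = D0 + D0.conj().T  # Hermitian

# Hamiltonian matrix H = [A, C; D, -A*]
H = np.block([[A, C], [D, -A.conj().T]])

ev = np.linalg.eigvals(H)
# symmetry wrt the imaginary axis: the spectrum is closed under lambda -> -conj(lambda)
assert np.allclose(np.sort_complex(ev), np.sort_complex(-ev.conj()))
```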

Application 1: Optimal Control. The Linear Quadratic Optimal Control Problem: minimize the cost functional

∫₀^∞ [x(t); u(t)]ᵀ [Q, S; Sᵀ, R] [x(t); u(t)] dt

subject to the dynamics ẋ = Ax + Bu, x(0) = x₀, t ∈ [0, ∞), where x(t), x₀ ∈ Rⁿ, u(t) ∈ Rᵐ, A, Q ∈ R^{n×n}, S ∈ R^{n×m}, R ∈ R^{m×m}, [Q, S; Sᵀ, R] ≥ 0, R > 0. The solution can be obtained by solving the eigenvalue problem for the Hamiltonian matrix

H := [A − BR⁻¹Sᵀ, −BR⁻¹Bᵀ; SR⁻¹Sᵀ − Q, −Aᵀ + SR⁻¹Bᵀ].
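As a sanity check, one can assemble this H for random data and confirm it is Hamiltonian in the sense of the previous slide (JH symmetric). A sketch in numpy; for simplicity the cross term S is set to zero, and Q ≥ 0, R > 0 are manufactured by squaring random matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
Q0 = rng.standard_normal((n, n)); Q = Q0 @ Q0.T                 # Q >= 0
S = np.zeros((n, m))                                            # illustration: no cross term
R0 = rng.standard_normal((m, m)); R = R0 @ R0.T + m * np.eye(m)  # R > 0

Ri = np.linalg.inv(R)
F = A - B @ Ri @ S.T
H = np.block([[F,                 -B @ Ri @ B.T],
              [S @ Ri @ S.T - Q,  -F.T         ]])

# H is Hamiltonian: J H is symmetric for J = [0, I; -I, 0]
J = np.block([[np.zeros((n, n)), np.eye(n)], [-np.eye(n), np.zeros((n, n))]])
assert np.allclose((J @ H).T, J @ H)
```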

Application 1: Optimal Control. More general problem: minimize the cost functional

∫₀^∞ [x(t); u(t)]ᵀ [Q, S; Sᵀ, R] [x(t); u(t)] dt

subject to the dynamics Eẋ = Ax + Bu, x(0) = x₀, t ∈ [0, ∞), where x(t), x₀ ∈ Rⁿ, u(t) ∈ Rᵐ, E, A, Q ∈ R^{n×n}, S ∈ R^{n×m}, R ∈ R^{m×m}, [Q, S; Sᵀ, R] ≥ 0, R ≥ 0. The solution can be obtained by solving the generalized eigenvalue problem for the matrix pencil

λE − A := λ [0, E, 0; −Eᵀ, 0, 0; 0, 0, 0] − [Q, Aᵀ, S; A, 0, B; Sᵀ, Bᵀ, R].

Application 1: Optimal Control. Observations on the generalized eigenvalue problem

λE − A := λ [0, E, 0; −Eᵀ, 0, 0; 0, 0, 0] − [Q, Aᵀ, S; A, 0, B; Sᵀ, Bᵀ, R]:

L(λ) := λE − A is an even matrix pencil, that is, L(λ)* = L(−λ). The eigenvalues of L(λ) are symmetric wrt the imaginary axis.
L(iλ) = iλE − A is a Hermitian matrix pencil. The eigenvalues of L(iλ) are symmetric wrt the real axis.
If R is invertible, then the generalized eigenvalue problem can be reduced to the form

λ [0, E; −Eᵀ, 0] − [Q − SR⁻¹Sᵀ, (A − BR⁻¹Sᵀ)ᵀ; A − BR⁻¹Sᵀ, −BR⁻¹Bᵀ].

Application 2: Algebraic Riccati Equations. Algebraic Riccati Equation (ARE): find a solution X ∈ C^{n×n} such that

D + XA + A*X − XCX = 0,

where A, C, D ∈ C^{n×n} and C, D are Hermitian. The solution can be obtained by solving the eigenvalue problem for the Hamiltonian matrix

H := [A, −C; −D, −A*]:

If the columns of [X₁; X₂] ∈ C^{2n×n} form a basis of the invariant subspace of H associated with the eigenvalues in the left half plane and if X₁ is invertible, then X = X₂X₁⁻¹ is a solution of the ARE such that σ(A − CX) ⊂ C⁻.
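A small numerical sketch of this recipe (numpy). It assumes the sign convention H = [A, −C; −D, −Aᵀ] for the ARE D + XA + AᵀX − XCX = 0, and uses LQR-type data C = BBᵀ ≥ 0, D = I > 0, for which a stabilizing solution exists generically:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, 1))
C = B @ B.T                 # Hermitian, C >= 0
D = np.eye(n)               # Hermitian, D > 0

H = np.block([[A, -C], [-D, -A.T]])

# basis of the invariant subspace for the n eigenvalues in the open left half plane
ev, V = np.linalg.eig(H)
idx = np.argsort(ev.real)[:n]            # the n eigenvalues with smallest real part
X1, X2 = V[:n, idx], V[n:, idx]
X = (X2 @ np.linalg.inv(X1)).real        # for this real data the solution is real

# X solves the ARE  D + X A + A^T X - X C X = 0, and A - CX is stable
res = D + X @ A + A.T @ X - X @ C @ X
assert np.linalg.norm(res) < 1e-8
assert max(np.linalg.eigvals(A - C @ X).real) < 0
```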

Application 3: Gyroscopic Systems. Gyroscopic systems have the form

Iẍ + Gẋ + Kx = 0,

where G, K ∈ C^{n×n}, G* = −G, and K* = K. Introducing the new variables y₁ = ẋ + ½Gx and y₂ = x, this system can be reduced to the first order system ẏ = Hy with the Hamiltonian matrix

H = [−½G, ¼G² − K; I_n, −½G].

Consequently, the gyroscopic system is stable if all eigenvalues of H are on the imaginary axis and semisimple.
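One can verify numerically that the reduced first-order coefficient matrix is indeed Hamiltonian. A sketch in numpy, assuming the block form H = [−½G, ¼G² − K; I_n, −½G] with random real skew-symmetric G and symmetric K:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
G0 = rng.standard_normal((n, n)); G = G0 - G0.T       # G* = -G
K0 = rng.standard_normal((n, n)); K = K0 + K0.T       # K* = K

H = np.block([[-0.5 * G, 0.25 * G @ G - K],
              [np.eye(n), -0.5 * G]])

# H is Hamiltonian: J H is Hermitian for J = [0, I; -I, 0]
J = np.block([[np.zeros((n, n)), np.eye(n)], [-np.eye(n), np.zeros((n, n))]])
assert np.allclose((J @ H).conj().T, J @ H)
```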

Interesting observation. Experiment: Consider two Hamiltonian systems:

ẋ = H₁x, H₁ = [0, 1, 0, 0; −1, 0, 0, 0; 0, 0, 0, 1; 0, 0, −1, 0],
ẏ = H₂y, H₂ = [0, 0, 1, 0; 0, 0, 0, 1; −1, 0, 0, 0; 0, −1, 0, 0].

Observation: Both matrices have the semisimple eigenvalues +i and −i with algebraic multiplicity 2 each. Consequently both systems are stable. Question: How do the matrices (systems) behave under perturbations, i.e., what happens to the eigenvalues of H₁ + ΔH₁ and H₂ + ΔH₂?
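The experiment can be reproduced in a few lines of numpy (H₁, H₂ as reconstructed here: H₁ = diag(J₂, J₂) with J₂ = [0, 1; −1, 0], and H₂ = [0, I₂; −I₂, 0]; random Hamiltonian perturbations are generated as ΔH = −JS with S symmetric):

```python
import numpy as np

J2 = np.array([[0., 1.], [-1., 0.]])
Z2 = np.zeros((2, 2))
H1 = np.block([[J2, Z2], [Z2, J2]])                    # two planar rotation blocks
H2 = np.block([[Z2, np.eye(2)], [-np.eye(2), Z2]])     # H2 = J

for H in (H1, H2):
    ev = np.linalg.eigvals(H)
    # semisimple eigenvalues +-i with multiplicity 2
    assert np.allclose(sorted(ev.imag), [-1, -1, 1, 1]) and np.allclose(ev.real, 0)

# repeat the experiment: random Hamiltonian perturbations dH = -J S of norm 1/4
rng = np.random.default_rng(5)
J = np.block([[Z2, np.eye(2)], [-np.eye(2), Z2]])
off1 = off2 = 0.0
for _ in range(50):
    S0 = rng.standard_normal((4, 4)); S = S0 + S0.T    # J dH = S symmetric => dH Hamiltonian
    dH = -J @ S
    dH *= 0.25 / np.linalg.norm(dH, 2)
    off1 = max(off1, max(abs(np.linalg.eigvals(H1 + dH).real)))
    off2 = max(off2, max(abs(np.linalg.eigvals(H2 + dH).real)))

# H1: eigenvalues typically leave the imaginary axis -> instability
# H2: eigenvalues stay on the imaginary axis -> stability persists
assert off2 < 1e-6 < off1
```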

Interesting observation. Result 1: Eigenvalue distribution of 1000 perturbations, where ΔHᵢ was a random matrix of norm 1/4. The perturbed systems are not stable.

Interesting observation. Result 2: Eigenvalue distribution of 1000 perturbations, where ΔHᵢ was a random Hamiltonian matrix of norm 1/4. The second perturbed system remains stable; the first one typically does not.

Why? To understand this, we need...

Indefinite Linear Algebra. The name Indefinite Linear Algebra was coined by Gohberg, Lancaster, Rodman in 2005. An invertible matrix H ∈ F^{n×n} with H^⋆ = ±H defines an inner product on Fⁿ:

[x, y]_H := y^⋆Hx for all x, y ∈ Fⁿ.

Here ⋆ denotes either the transpose T or the conjugate transpose *:

H* = H: Hermitian sesquilinear form
H* = −H: skew-Hermitian sesquilinear form
Hᵀ = H: symmetric bilinear form
Hᵀ = −H: skew-symmetric bilinear form

The inner product may be indefinite (it need not be positive definite).

Indefinite Linear Algebra. The adjoint: For X ∈ F^{n×n}, let X^⋆ be the matrix satisfying

[v, Xw]_H = [X^⋆v, w]_H for all v, w ∈ Fⁿ.

We have X^⋆ = H⁻¹XᵀH resp. X^⋆ = H⁻¹X*H. Matrices with symmetries in indefinite inner products:

                   adjoint        yᵀHx          y*Hx
H-selfadjoint      A^⋆ = A        AᵀH = HA      A*H = HA
H-skew-adjoint     S^⋆ = −S       SᵀH = −HS     S*H = −HS
H-unitary          U^⋆ = U⁻¹      UᵀHU = H      U*HU = H

Indefinite Linear Algebra. A matrix H ∈ F^{2n×2n} is Hamiltonian if and only if HᵀJ = −JH, where J = [0, I_n; −I_n, 0]. Hamiltonian matrices are skew-adjoint with respect to the skew-symmetric bilinear form induced by J.

J-selfadjoint: NᵀJ = JN (skew-Hamiltonian)
J-skew-adjoint: HᵀJ = −JH (Hamiltonian)
J-unitary: SᵀJS = J (symplectic)

Symplectic matrices occur in discrete optimal control problems.

Indefinite Linear Algebra. Hermitian matrix pencil ≙ H-selfadjoint matrix:
Observation 1: A ∈ C^{n×n} is H-selfadjoint ⟺ A*H = HA ⟺ (HA)* = HA.
Observation 2: If H is invertible: Ax = λx ⟺ HAx = λHx.
Conversely: If λH − G is a Hermitian pencil (i.e., H* = H and G* = G) and if H is invertible, then A := H⁻¹G is H-selfadjoint:

A*H = GH⁻¹H = G = HH⁻¹G = HA.
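The converse statement in code: for random Hermitian G and (generically invertible) Hermitian H, the matrix A = H⁻¹G satisfies A*H = HA. A minimal numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 4
H0 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)); H = H0 + H0.conj().T  # H* = H
G0 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)); G = G0 + G0.conj().T  # G* = G

A = np.linalg.solve(H, G)          # A = H^{-1} G
# A is H-selfadjoint: A* H = H A (equivalently, HA = G is Hermitian)
assert np.allclose(A.conj().T @ H, H @ A)
```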

Why bother? Forget structure and use general methods? No! We prefer structure-preserving methods, because...
structure-preserving algorithms are often faster;
structure-preserving algorithms respect spectral symmetries;
structure-preserving algorithms produce physically meaningful results;
perturbation theory is different if structure is preserved, because some eigenvalues occur in pairs and because there is a sign characteristic.
What is sign characteristic?

Canonical forms. Definite Linear Algebra: Any Hermitian matrix is unitarily diagonalizable and all its eigenvalues are real. Indefinite Linear Algebra: Selfadjoint matrices with respect to indefinite inner products may have nonreal eigenvalues and need not be diagonalizable. Example:

H = [0, 1; 1, 0], A₁ = [2, 1; 0, 2], A₂ = [i, 0; 0, −i].

A₁ and A₂ are H-selfadjoint, i.e., Aᵢ*H = HAᵢ.
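The example can be checked directly in numpy: both matrices satisfy A*H = HA, yet A₁ is a nondiagonalizable Jordan block and A₂ has the nonreal eigenvalues ±i:

```python
import numpy as np

H = np.array([[0., 1.], [1., 0.]])
A1 = np.array([[2., 1.], [0., 2.]])
A2 = np.array([[1j, 0.], [0., -1j]])

for A in (A1, A2):
    assert np.allclose(A.conj().T @ H, H @ A)      # A is H-selfadjoint

# A1 is a single 2x2 Jordan block for eigenvalue 2: not diagonalizable
assert np.linalg.matrix_rank(A1 - 2 * np.eye(2)) == 1
# A2 has the nonreal eigenvalues +-i
assert set(np.round(np.linalg.eigvals(A2), 9)) == {1j, -1j}
```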

Canonical forms. Transformations that preserve structure:
for bilinear forms: (H, A) ↦ (PᵀHP, P⁻¹AP), P invertible;
for sesquilinear forms: (H, A) ↦ (P*HP, P⁻¹AP), P invertible.
A is H-selfadjoint / H-skew-adjoint / H-unitary ⟺ P⁻¹AP is P^⋆HP-selfadjoint / P^⋆HP-skew-adjoint / P^⋆HP-unitary. Here P^⋆ = Pᵀ or P^⋆ = P*, respectively.

Canonical forms. Theorem (Gohberg, Lancaster, Rodman 1983; Thompson 1976). Let A ∈ C^{n×n} be H-selfadjoint. Then there exists an invertible P ∈ C^{n×n} such that

P⁻¹AP = A₁ ⊕ ⋯ ⊕ A_k, P*HP = H₁ ⊕ ⋯ ⊕ H_k,

where either
1) Aᵢ = J_{nᵢ}(λ) and Hᵢ = εR_{nᵢ}, where λ ∈ R and ε = ±1; or
2) Aᵢ = [J_{nᵢ}(μ), 0; 0, J_{nᵢ}(μ̄)] and Hᵢ = [0, R_{nᵢ}; R_{nᵢ}, 0], where μ ∉ R.

Here J_m(λ) ∈ C^{m×m} denotes the upper triangular Jordan block with eigenvalue λ, and R_m ∈ C^{m×m} the reverse identity, with ones on the antidiagonal and zeros elsewhere.

Canonical forms. There are similar results for H-skew-adjoint and H-unitary matrices and for real or complex bilinear forms. Spectral symmetries:

                     yᵀHx, F = C     y*Hx, F = C     yᵀHx, F = R
H-selfadjoints       λ               λ, λ̄            λ, λ̄
H-skew-adjoints      λ, −λ           λ, −λ̄           λ, λ̄, −λ, −λ̄
H-unitaries          λ, λ⁻¹          λ, λ̄⁻¹          λ, λ̄, λ⁻¹, λ̄⁻¹

The sign characteristic. There are additional invariants attached to the real eigenvalues of H-selfadjoint matrices: signs ε = ±1. Example:

A = [1, 0; 0, 2], H_ε = [ε, 0; 0, 1], ε = ±1.

There is no transformation with P⁻¹AP = A and P*H₊₁P = H₋₁, because of Sylvester's Law of Inertia; each Jordan block associated with a real eigenvalue of A has a corresponding sign ε ∈ {+1, −1}; the collection of all these signs is called the sign characteristic of A.

The sign characteristic. Interpretation of the sign characteristic for simple eigenvalues: let (λ, v) be an eigenpair of the H-selfadjoint matrix A, where λ ∈ R, and let ε be the sign corresponding to λ. Then the inner product [v, v]_H is positive if ε = +1, and negative if ε = −1. Analogously: purely imaginary eigenvalues of H-skew-adjoint matrices have signs; unimodular eigenvalues of H-unitary matrices have signs.
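For simple real eigenvalues this interpretation gives a direct way to read off signs numerically: compute an eigenvector v and take the sign of v*Hv. A sketch (the helper sign_characteristic is ours, not from the talk), applied to the example A = diag(1, 2), H_ε = diag(ε, 1) from the previous slide:

```python
import numpy as np

def sign_characteristic(A, H):
    """Signs of the (simple, real) eigenvalues of an H-selfadjoint matrix A:
    the sign of [v, v]_H = v* H v for an eigenvector v."""
    ev, V = np.linalg.eig(A)
    signs = {}
    for k, lam in enumerate(ev):
        v = V[:, k]
        signs[round(lam.real, 9)] = int(np.sign((v.conj() @ H @ v).real))
    return signs

A = np.diag([1.0, 2.0])
for eps in (+1, -1):
    Heps = np.diag([eps, 1.0])
    assert np.allclose(A.conj().T @ Heps, Heps @ A)          # A is Heps-selfadjoint
    assert sign_characteristic(A, Heps) == {1.0: eps, 2.0: 1}
```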

What happens under structured perturbations? Example: symplectic matrices S R 2n 2n ; consider a slightly perturbed matrix S that is still symplectic; the behavior of the unimodular eigenvalues under perturbation depends on the sign characteristic; if two unimodular eigenvalues meet, the behavior is different if the corresponding signs are opposite or equal.

What happens under structured perturbations? let S have two close unimodular eigenvalues with opposite signs; if S is perturbed and the two eigenvalues meet, they generically form a Jordan block; then they may split off as a pair of nonunimodular reciprocal eigenvalues;

What happens under structured perturbations? let S have two close unimodular eigenvalues with equal signs; if S is perturbed and the two eigenvalues meet, they cannot form a Jordan block, and they must remain on the unit circle;

Interesting observation revisited. Experiment: Consider two Hamiltonian systems:

ẋ = H₁x, H₁ = [0, 1, 0, 0; −1, 0, 0, 0; 0, 0, 0, 1; 0, 0, −1, 0],
ẏ = H₂y, H₂ = [0, 0, 1, 0; 0, 0, 0, 1; −1, 0, 0, 0; 0, −1, 0, 0].

Observation 1: Both matrices have the semisimple eigenvalues +i and −i with algebraic multiplicity 2 each. Consequently both systems are stable.
Observation 2: One can check:
the sign characteristic of the eigenvalue i of H₁ consists of the mixed signs +1 and −1 (same story for −i);
the sign characteristic of the eigenvalue i of H₂ consists of the identical signs +1 and +1 (similar story for −i: signs −1 and −1).

Interesting observation revisited. Result 2: Eigenvalue distribution of 1000 perturbations, where ΔHᵢ was a random Hamiltonian matrix of norm 1/4. Left: mixed sign characteristic; right: definite sign characteristic.

Generalization to matrix pencils.

matrix case:         H-selfadjoint      H-skew-adjoint     H-unitary
pencil case:         Hermitian          *-even             *-palindromic
pencil:              λG − H             λK − H             λA* + A
structure:           G* = G, H* = H     K* = −K, H* = H    A ∈ C^{n×n}
spectral symmetry:   λ, λ̄               λ, −λ̄              λ, 1/λ̄
critical curve:      real line          imaginary axis     unit circle

Sign characteristic: eigenvalues on the critical curve carry signs ε = ±1. Generalization to matrix polynomials is possible.
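The spectral symmetries in this table can be checked numerically; e.g., for a *-palindromic pencil λA* + A built from a random A, the eigenvalues come in pairs (λ, 1/λ̄). A minimal numpy sketch (the reduction to a standard eigenvalue problem assumes A* is invertible, which holds generically):

```python
import numpy as np

rng = np.random.default_rng(10)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# (lambda A* + A) x = 0  <=>  lambda is an eigenvalue of T = -(A*)^{-1} A
T = np.linalg.solve(A.conj().T, -A)
ev = np.linalg.eigvals(T)

# symmetry wrt the unit circle: the spectrum is closed under lambda -> 1/conj(lambda)
assert np.allclose(np.sort_complex(ev), np.sort_complex(1 / ev.conj()))
```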

Finally: The Task

Application: passivity of systems. Consider a linear time-invariant control system

ẋ = Ax + Bu, x(0) = 0,
y = Cx + Du.

Assume σ(A) ⊂ C⁻ and that R := D + Dᵀ is invertible. The system is called passive if the Hamiltonian matrix

H = [F, G; H, −Fᵀ] := [A − BR⁻¹C, −BR⁻¹Bᵀ; −CᵀR⁻¹C, −(A − BR⁻¹C)ᵀ]

has no purely imaginary eigenvalues. Often an approximation H̃ of H is computed, and often in this approximation process the passivity is lost. Aim: modify H̃ by a Hamiltonian perturbation of small norm and small rank to a nearby Hamiltonian matrix having no purely imaginary eigenvalues. Solved by Alam, Bora, Karow, Mehrmann, Moro 2010.

The generalized problem. Task: find the smallest perturbation that moves eigenvalues of a structured matrix pencil off the critical curve. We start with Hermitian pencils, i.e., we move eigenvalues from the real line into the complex plane. First step: find the smallest structured perturbation that makes λ ∈ C an eigenvalue of a given Hermitian pencil A₁ + zA₂.

Problem: Let A₁, A₂ ∈ C^{n×n} be Hermitian and let λ ∈ C. Calculate

η_S(A₁, A₂, λ) = inf { (‖Δ₁‖² + ‖Δ₂‖²)^{1/2} : Δ₁, Δ₂ ∈ S, det((A₁ − Δ₁) + λ(A₂ − Δ₂)) = 0 }

and construct the corresponding Δ₁ and Δ₂ that attain the infimum.

The generalized problem. Problem: Let A₁, A₂ ∈ C^{n×n} be Hermitian and let λ ∈ C. Calculate

η_S(A₁, A₂, λ) = inf { (‖Δ₁‖² + ‖Δ₂‖²)^{1/2} : Δ₁, Δ₂ ∈ S, det((A₁ − Δ₁) + λ(A₂ − Δ₂)) = 0 }

and construct the corresponding Δ₁ and Δ₂ that attain the infimum. We call η_S(A₁, A₂, λ) the structured backward error of the Hermitian pencil A₁ + zA₂ with respect to λ and S.

The generalized problem. Case 1: S = C^{n×n} and λ ∈ C: easy!

η(A₁, A₂, λ) = σ_min(A₁ + λA₂) / (1 + |λ|²)^{1/2},

because

Δ₁(x) = (1/(1 + |λ|²)) (A₁ + λA₂)xx*/(x*x), Δ₂(x) = (λ̄/(1 + |λ|²)) (A₁ + λA₂)xx*/(x*x)

is the smallest perturbation that makes the pair (λ, x) an eigenpair of the pencil, and

η(A₁, A₂, λ) = min_{x ≠ 0} (‖Δ₁(x)‖² + ‖Δ₂(x)‖²)^{1/2}.

Case 2: S = Herm(n) and λ ∈ R: easy!

η(A₁, A₂, λ) = σ_min(A₁ + λA₂) / (1 + λ²)^{1/2},

because for λ ∈ R the above perturbations (evaluated at a singular vector x associated with σ_min) are Hermitian.
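Case 1 can be verified numerically: build Δ₁, Δ₂ from the right singular vector of σ_min(A₁ + λA₂) and check that λ becomes an eigenvalue of the perturbed pencil and that the norms match the formula. A sketch in numpy (random Hermitian test data):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 4
A1r = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)); A1 = A1r + A1r.conj().T
A2r = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)); A2 = A2r + A2r.conj().T
lam = 0.3 + 0.7j                        # target eigenvalue

P = A1 + lam * A2
U, s, Vh = np.linalg.svd(P)
eta = s[-1] / np.sqrt(1 + abs(lam)**2)  # unstructured backward error
x = Vh[-1].conj()                       # right singular vector for sigma_min (unit norm)

# optimal unstructured perturbations Delta1, Delta2 from the slide's formulas
D1 = (P @ np.outer(x, x.conj())) / ((1 + abs(lam)**2) * (x.conj() @ x))
D2 = np.conj(lam) * D1

# lambda is now an eigenvalue: the perturbed pencil is singular at lambda
pert = (A1 - D1) + lam * (A2 - D2)
assert np.linalg.svd(pert, compute_uv=False)[-1] < 1e-10
assert np.isclose(np.sqrt(np.linalg.norm(D1, 2)**2 + np.linalg.norm(D2, 2)**2), eta)
```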

The generalized problem. Case 3: S = Herm(n) and λ ∈ C \ R: difficult! Adhikari, Alam 2009: complicated formulas for the structured backward error of eigenpairs (λ, x) of Hermitian pencils when λ ∉ R; η(A₁, A₂, λ) could be obtained by minimizing these formulas over all x ∈ Cⁿ \ {0}, but this minimization problem is not feasible. Consequence: we need a different strategy!

Reformulating the problem.

det((A₁ − Δ₁) + λ(A₂ − Δ₂)) = 0
⟺ (A₁ + λA₂)x = (Δ₁ + λΔ₂)x for some x ≠ 0
⟺ x = (A₁ + λA₂)⁻¹(Δ₁ + λΔ₂)x
⟺ v₁ = Δ₁M(v₁ + λv₂) and v₂ = Δ₂M(v₁ + λv₂),

abbreviating M := (A₁ + λA₂)⁻¹, v₁ := Δ₁x, and v₂ := Δ₂x.

Consequence: det((A₁ − Δ₁) + λ(A₂ − Δ₂)) = 0 if and only if there exist v₁, v₂ satisfying v₁ + λv₂ ≠ 0, v₁ = Δ₁M(v₁ + λv₂), and v₂ = Δ₂M(v₁ + λv₂).

Reformulating the problem. This leads to two structured mapping problems: for x = M(v₁ + λv₂) and y = v_k, find H = Δ_k ∈ Herm(n) such that y = Hx, for k = 1, 2.

Theorem (see Mackey, Mackey, Tisseur 2008). Let x, y ∈ Cⁿ, x ≠ 0. Then there exists a Hermitian matrix H ∈ Herm(n) such that Hx = y if and only if Im(x*y) = 0. If the latter condition is satisfied, then

min { ‖H‖ : H ∈ Herm(n), Hx = y } = ‖y‖/‖x‖,

and the minimum is attained for

H₀ = (‖y‖/‖x‖) [y/‖y‖, x/‖x‖] [c, 1; 1, c]⁻¹ [y/‖y‖, x/‖x‖]*, where c := x*y/(‖x‖‖y‖),

if x and y are linearly independent, and for H₀ = yx*/(x*x) otherwise.
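A sketch of this minimal Hermitian mapping in numpy (min_hermitian_mapping is our own illustrative implementation of the formula, with c := x*y/(‖x‖‖y‖); the test vector y is projected so that x*y becomes real):

```python
import numpy as np

def min_hermitian_mapping(x, y):
    """Minimal spectral-norm Hermitian H with Hx = y (requires Im(x*y) = 0)."""
    nx, ny = np.linalg.norm(x), np.linalg.norm(y)
    assert abs(np.imag(np.vdot(x, y))) < 1e-12
    if ny == 0:
        return np.zeros((len(x), len(x)), dtype=complex)
    c = np.real(np.vdot(x, y)) / (nx * ny)
    if abs(abs(c) - 1) < 1e-12:                   # x and y linearly dependent
        return np.outer(y, x.conj()) / nx**2
    W = np.column_stack([y / ny, x / nx])
    Gamma = np.array([[c, 1.0], [1.0, c]])
    return (ny / nx) * W @ np.linalg.inv(Gamma) @ W.conj().T

rng = np.random.default_rng(9)
x = rng.standard_normal(5) + 1j * rng.standard_normal(5)
y0 = rng.standard_normal(5) + 1j * rng.standard_normal(5)
y = y0 - 1j * np.imag(np.vdot(x, y0)) * x / np.linalg.norm(x)**2   # enforce Im(x*y) = 0

H = min_hermitian_mapping(x, y)
assert np.allclose(H, H.conj().T)                                   # Hermitian
assert np.allclose(H @ x, y)                                        # maps x to y
assert np.isclose(np.linalg.norm(H, 2), np.linalg.norm(y) / np.linalg.norm(x))  # minimal norm
```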

Reformulating the problem. We need v₁*M(v₁ + λv₂) and v₂*M(v₁ + λv₂) to be real. This is equivalent to requiring v*H₁v = 0 and v*H₂v = 0, where v = [v₁; v₂] and

H₁ = i [M* − M, −λM; λ̄M*, 0], H₂ = i [0, −M*; M, λM − λ̄M*].

Moreover, the minimal norm ‖Δ₁‖² + ‖Δ₂‖² of a pair (Δ₁, Δ₂) satisfying v₁ = Δ₁M(v₁ + λv₂) and v₂ = Δ₂M(v₁ + λv₂) is given by

‖Δ₁‖² + ‖Δ₂‖² = v*v / (v*H₀v), where H₀ = [M*M, λM*M; λ̄M*M, |λ|²M*M].

Reformulating the problem. Consequence:

η(A₁, A₂, λ)² = inf { v*v / (v*H₀v) : v ∈ C^{2n}, v*H₀v ≠ 0, v*H₁v = 0, v*H₂v = 0 }
              = [ sup { v*H₀v / (v*v) : v ∈ C^{2n} \ {0}, v*H₁v = 0, v*H₂v = 0 } ]⁻¹.

Idea: Compute the maximal eigenvalue λ_max of the matrix H₀ + t₁H₁ + t₂H₂ and minimize it over t₁, t₂ ∈ R.

The main results. Theorem (Bora, Karow, M., Sharma 2012). Let H₀, H₁, H₂ ∈ C^{n×n} be Hermitian. Assume that H₀ is nonzero and positive semidefinite, and that every linear combination α₁H₁ + α₂H₂ with (α₁, α₂) ∈ R² \ {0} is indefinite. (Here, indefinite is used in the sense of strictly not semidefinite.) Then the following statements hold.
(1) The function L(t₁, t₂) := λ_max(H₀ + t₁H₁ + t₂H₂) has a minimum λ*_max.
(2) If the minimum λ*_max of L(t₁, t₂) is attained at (t₁*, t₂*) ∈ R², then there exists an associated eigenvector w ∈ Cⁿ \ {0} of H₀ + t₁*H₁ + t₂*H₂ with w*H₁w = 0 = w*H₂w.
(3) We have

λ*_max = sup { w*H₀w / (w*w) : w ∈ Cⁿ, w ≠ 0, w*H₁w = 0, w*H₂w = 0 }.

In particular, the supremum is a maximum and is attained for the eigenvector w from (2).

The main results. Theorem (Bora, Karow, M., Sharma 2012). Let A₁, A₂ ∈ C^{n×n} be Hermitian and let λ ∈ C \ R. Suppose det(A₁ + λA₂) ≠ 0 and let M = (A₁ + λA₂)⁻¹. Then

η(A₁, A₂, λ) = ( min_{t₁,t₂ ∈ R} λ_max(H₀ + t₁H₁ + t₂H₂) )^{−1/2},

where

H₀ = [M*M, λM*M; λ̄M*M, |λ|²M*M], H₁ = i [M* − M, −λM; λ̄M*, 0], H₂ = i [0, −M*; M, λM − λ̄M*].

Consequence: The distance problem is solved; the corresponding perturbation can be constructed from the eigenvector w of the previous result.
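The resulting computational recipe is a two-parameter eigenvalue optimization. A rough sketch with numpy/scipy (Nelder–Mead is used here as a generic minimizer; the final assertion uses the fact that the structured backward error is bounded below by the unstructured one, which equals λ_max(H₀)^{−1/2}):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(8)
n = 3
A1r = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)); A1 = A1r + A1r.conj().T
A2r = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)); A2 = A2r + A2r.conj().T
lam = 0.5 + 1.0j                              # nonreal target eigenvalue

M = np.linalg.inv(A1 + lam * A2)
Ms = M.conj().T
MsM = Ms @ M
Z = np.zeros((n, n))

H0 = np.block([[MsM, lam * MsM], [np.conj(lam) * MsM, abs(lam)**2 * MsM]])
H1 = 1j * np.block([[Ms - M, -lam * M], [np.conj(lam) * Ms, Z]])
H2 = 1j * np.block([[Z, -Ms], [M, lam * M - np.conj(lam) * Ms]])

def L(t):
    # largest eigenvalue of the Hermitian matrix H0 + t1 H1 + t2 H2
    return np.linalg.eigvalsh(H0 + t[0] * H1 + t[1] * H2)[-1]

opt = minimize(L, x0=[0.0, 0.0], method="Nelder-Mead")
eta_struct = opt.fun ** -0.5                  # structured backward error

eta_unstruct = np.linalg.svd(A1 + lam * A2, compute_uv=False)[-1] / np.sqrt(1 + abs(lam)**2)
assert eta_struct >= eta_unstruct - 1e-8      # structured error dominates the unstructured one
```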

Experiments (movies): Nr. 1 unstructured/structured; Nr. 2 unstructured/structured; Nr. 3 unstructured/structured; Nr. 4 unstructured/structured; Nr. 5 structured.

Conclusions.
The effects of structured perturbations of Hermitian pencils (and other structured matrices and pencils) may differ significantly from those of unstructured perturbations.
The sign characteristic is crucial in the theory of structured perturbations.
Formulas for the structured backward error for Hermitian pencils are available.
Extension to matrix polynomials is possible (it involves minimization over more than two parameters).
Thank you for your attention!