Perturbation theory for eigenvalues of Hermitian pencils
Christian Mehl, Institut für Mathematik, TU Berlin, Germany
9th Elgersburg Workshop, Elgersburg, March 3, 2014
Joint work with Shreemayee Bora, Michael Karow, and Punit Sharma
Why Hermitian pencils?

Question: Why are eigenvalues of Hermitian pencils of interest in system theory?

Answer: Because Hermitian pencils are related to Hamiltonian matrices.

A matrix $H \in \mathbb{C}^{2n\times 2n}$ is called Hamiltonian if
$$H = \begin{bmatrix} A & C \\ D & -A^* \end{bmatrix},$$
where $A, C, D \in \mathbb{C}^{n\times n}$ and where $C, D$ are Hermitian.

Observation: The spectrum of a Hamiltonian matrix is symmetric with respect to the imaginary axis: if $\lambda \in \mathbb{C}$ is an eigenvalue of $H$, then so is $-\overline{\lambda}$, with the same multiplicities.
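This spectral symmetry is easy to verify numerically; a minimal sketch with hypothetical random data (numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Random complex Hamiltonian matrix H = [[A, C], [D, -A^*]] with C, D Hermitian.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
C = C + C.conj().T
D = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
D = D + D.conj().T
H = np.block([[A, C], [D, -A.conj().T]])

# The spectrum is symmetric w.r.t. the imaginary axis: the multiset of
# eigenvalues coincides with the multiset of mirror images -conj(lambda).
eigs = np.linalg.eigvals(H)
key = lambda z: (round(z.real, 8), round(z.imag, 8))
assert np.allclose(sorted(eigs, key=key), sorted(-eigs.conj(), key=key))
```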
Application 1: Optimal Control

The Linear Quadratic Optimal Control Problem: minimize the cost functional
$$\int_0^\infty \begin{bmatrix} x(t) \\ u(t) \end{bmatrix}^T \begin{bmatrix} Q & S \\ S^T & R \end{bmatrix} \begin{bmatrix} x(t) \\ u(t) \end{bmatrix}\, dt$$
subject to the dynamics
$$\dot{x} = Ax + Bu, \quad x(0) = x_0, \quad t \in [0, \infty),$$
where $x(t), x_0 \in \mathbb{R}^n$, $u(t) \in \mathbb{R}^m$, $A, Q \in \mathbb{R}^{n\times n}$, $S \in \mathbb{R}^{n\times m}$, $R \in \mathbb{R}^{m\times m}$,
$$\begin{bmatrix} Q & S \\ S^T & R \end{bmatrix} \geq 0, \quad R > 0.$$
The solution can be obtained by solving the eigenvalue problem for the Hamiltonian matrix
$$H := \begin{bmatrix} A - BR^{-1}S^T & -BR^{-1}B^T \\ SR^{-1}S^T - Q & -A^T + SR^{-1}B^T \end{bmatrix}.$$
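A sketch of how this Hamiltonian matrix can be assembled from the problem data and its structure checked, with hypothetical random data ($JH$ must be symmetric for $J = \begin{bmatrix} 0 & I \\ -I & 0 \end{bmatrix}$):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2

A = rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
# Make [[Q, S], [S^T, R]] positive semidefinite with R positive definite:
W = rng.standard_normal((n + m, n + m))
QSR = W @ W.T + 1e-3 * np.eye(n + m)
Q, S, R = QSR[:n, :n], QSR[:n, n:], QSR[n:, n:]

Ri = np.linalg.inv(R)
H = np.block([
    [A - B @ Ri @ S.T,  -B @ Ri @ B.T],
    [S @ Ri @ S.T - Q,  -A.T + S @ Ri @ B.T],
])

# H is Hamiltonian: J @ H is symmetric for J = [[0, I], [-I, 0]].
J = np.block([[np.zeros((n, n)), np.eye(n)], [-np.eye(n), np.zeros((n, n))]])
assert np.allclose(J @ H, (J @ H).T)

# Consequently the spectrum is symmetric w.r.t. the imaginary axis.
eigs = np.linalg.eigvals(H)
key = lambda z: (round(z.real, 8), round(z.imag, 8))
assert np.allclose(sorted(eigs, key=key), sorted(-eigs.conj(), key=key))
```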
Application 1: Optimal Control

More general problem: minimize the cost functional
$$\int_0^\infty \begin{bmatrix} x(t) \\ u(t) \end{bmatrix}^T \begin{bmatrix} Q & S \\ S^T & R \end{bmatrix} \begin{bmatrix} x(t) \\ u(t) \end{bmatrix}\, dt$$
subject to the dynamics
$$E\dot{x} = Ax + Bu, \quad x(0) = x_0, \quad t \in [0,\infty),$$
where $x(t), x_0 \in \mathbb{R}^n$, $u(t) \in \mathbb{R}^m$, $E, A, Q \in \mathbb{R}^{n\times n}$, $S \in \mathbb{R}^{n\times m}$, $R \in \mathbb{R}^{m\times m}$,
$$\begin{bmatrix} Q & S \\ S^T & R \end{bmatrix} \geq 0, \quad R \geq 0.$$
The solution can be obtained by solving the generalized eigenvalue problem for the matrix pencil
$$\lambda\mathcal{E} - \mathcal{A} := \lambda \begin{bmatrix} 0 & E & 0 \\ -E^T & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} - \begin{bmatrix} Q & A^T & S \\ A & 0 & B \\ S^T & B^T & R \end{bmatrix}.$$
Application 1: Optimal Control

Observations on the generalized eigenvalue problem
$$\lambda\mathcal{E} - \mathcal{A} := \lambda \begin{bmatrix} 0 & E & 0 \\ -E^T & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} - \begin{bmatrix} Q & A^T & S \\ A & 0 & B \\ S^T & B^T & R \end{bmatrix}:$$
- $L(\lambda) := \lambda\mathcal{E} - \mathcal{A}$ is an even matrix pencil, that is, $L(\lambda)^* = L(-\lambda)$. The eigenvalues of $L(\lambda)$ are symmetric wrt the imaginary axis.
- $L(i\lambda) = i\lambda\mathcal{E} - \mathcal{A}$ is a Hermitian matrix pencil. The eigenvalues of $L(i\lambda)$ are symmetric wrt the real axis.
- If $R$ is invertible, then the generalized eigenvalue problem can be reduced to the form
$$\lambda \begin{bmatrix} 0 & E \\ -E^T & 0 \end{bmatrix} - \begin{bmatrix} Q - SR^{-1}S^T & A^T - SR^{-1}B^T \\ A - BR^{-1}S^T & -BR^{-1}B^T \end{bmatrix}.$$
Application 2: Algebraic Riccati Equations

Algebraic Riccati Equation (ARE): find a solution $X \in \mathbb{C}^{n\times n}$ such that
$$-D + XA + A^*X + XCX = 0,$$
where $A, C, D \in \mathbb{C}^{n\times n}$ and $C, D$ are Hermitian. The solution can be obtained by solving the eigenvalue problem for the Hamiltonian matrix
$$H := \begin{bmatrix} A & C \\ D & -A^* \end{bmatrix}:$$
If the columns of $\begin{bmatrix} X_1 \\ X_2 \end{bmatrix} \in \mathbb{C}^{2n\times n}$ form a basis of an invariant subspace of $H$ associated with the eigenvalues in the left half plane and if $X_1$ is invertible, then $X = X_2X_1^{-1}$ is a solution of the ARE such that $\sigma(A + CX) \subseteq \mathbb{C}^-$.
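A numerical sketch of this invariant-subspace construction. The data below are hypothetical and rigged so that a stabilizing solution exists; the names `W` and `X_true` are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

# Rig the data so that the stabilizing solution is X_true = I:
# pick C and X_true first, then define D so that X_true solves the ARE.
W = rng.standard_normal((n, n))
A = W - (np.linalg.norm(W, 2) + 1.0) * np.eye(n)     # stable, so A + C @ X_true is too
C = -np.eye(n)                                       # Hermitian
X_true = np.eye(n)
D = X_true @ A + A.T @ X_true + X_true @ C @ X_true  # Hermitian by construction

H = np.block([[A, C], [D, -A.T]])

# Basis of the invariant subspace for the n eigenvalues in the left half plane:
w, V = np.linalg.eig(H)
idx = np.argsort(w.real)[:n]
X1, X2 = V[:n, idx], V[n:, idx]
X = X2 @ np.linalg.inv(X1)

# X solves  X A + A^* X + X C X - D = 0  and  sigma(A + C X)  lies in C^-:
assert np.linalg.norm(X @ A + A.T @ X + X @ C @ X - D) < 1e-8
assert np.all(np.linalg.eigvals(A + C @ X).real < 0)
```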
Application 3: Gyroscopic Systems

Gyroscopic systems have the form
$$I\ddot{x} + G\dot{x} + Kx = 0,$$
where $G, K \in \mathbb{C}^{n\times n}$, $G^* = -G$, and $K^* = K$. Introducing the new variables $y_1 = \dot{x} + \frac12 Gx$ and $y_2 = x$, this system can be reduced to the first order system $\dot{y} = Hy$ with the Hamiltonian matrix
$$H = \begin{bmatrix} -\frac12 G & \frac14 G^2 - K \\ I_n & -\frac12 G \end{bmatrix}.$$
Consequently, the gyroscopic system is stable if all eigenvalues of $H$ are on the imaginary axis and semisimple.
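The reduction can be checked numerically; a sketch with hypothetical random data, taking $G$ skew-symmetric and $K$ symmetric (real case for simplicity):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3

# Hypothetical gyroscopic data: G skew-symmetric, K symmetric.
G = rng.standard_normal((n, n)); G = G - G.T
K = rng.standard_normal((n, n)); K = K + K.T

# First-order form y' = H y with y1 = x' + (1/2) G x, y2 = x:
H = np.block([
    [-0.5 * G, 0.25 * (G @ G) - K],
    [np.eye(n), -0.5 * G],
])

# H is Hamiltonian: H^T J = -J H for J = [[0, I], [-I, 0]].
J = np.block([[np.zeros((n, n)), np.eye(n)], [-np.eye(n), np.zeros((n, n))]])
assert np.allclose(H.T @ J, -J @ H)

# Each eigenpair of H yields a solution of the quadratic problem
# (lambda^2 I + lambda G + K) u = 0 via u = y2:
w, V = np.linalg.eig(H)
lam, u = w[0], V[n:, 0]
assert np.linalg.norm((lam ** 2 * np.eye(n) + lam * G + K) @ u) < 1e-8
```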
Interesting observation

Experiment: Consider two Hamiltonian systems:
$$\dot{x} = H_1x = \begin{bmatrix} 0 & 1 & 0 & 0 \\ -1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & -1 & 0 \end{bmatrix}x, \qquad \dot{y} = H_2y = \begin{bmatrix} 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & -1 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}y.$$
Observation: Both matrices have the semisimple eigenvalues $+i$ and $-i$ with algebraic multiplicity 2 each. Consequently both systems are stable.

Question: How do the matrices (systems) behave under perturbations, i.e., what happens to the eigenvalues of $H_1 + \Delta H_1$, $H_2 + \Delta H_2$?
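The experiment can be reproduced in a few lines; a sketch with hypothetical random Hamiltonian perturbations (200 trials instead of 1000 to keep it quick):

```python
import numpy as np

rng = np.random.default_rng(4)

J2 = np.array([[0.0, 1.0], [-1.0, 0.0]])
Z2, I2 = np.zeros((2, 2)), np.eye(2)
H1 = np.block([[J2, Z2], [Z2, J2]])    # eigenvalues +-i, each of multiplicity 2
H2 = np.block([[Z2, -I2], [I2, Z2]])   # eigenvalues +-i, each of multiplicity 2

def random_hamiltonian(n, norm, rng):
    """Random real Hamiltonian matrix [[A, C], [D, -A^T]], C, D symmetric, scaled to `norm`."""
    A = rng.standard_normal((n, n))
    C = rng.standard_normal((n, n)); C = C + C.T
    D = rng.standard_normal((n, n)); D = D + D.T
    P = np.block([[A, C], [D, -A.T]])
    return norm * P / np.linalg.norm(P, 2)

# Perturb both systems and count how often eigenvalues leave the imaginary axis.
off_axis = {"H1": 0, "H2": 0}
for _ in range(200):
    for name, H in (("H1", H1), ("H2", H2)):
        w = np.linalg.eigvals(H + random_hamiltonian(2, 0.25, rng))
        if np.abs(w.real).max() > 1e-6:
            off_axis[name] += 1

# Definite sign characteristic (H2): eigenvalues stay on the axis.
# Mixed sign characteristic (H1): they typically leave it.
assert off_axis["H2"] == 0
assert off_axis["H1"] > 0
```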
Interesting observation

Result 1: Eigenvalue distribution of 1000 perturbations, when $\Delta H_i$ was a random matrix of norm 1/4. The perturbed systems are not stable.
Interesting observation

Result 2: Eigenvalue distribution of 1000 perturbations, when $\Delta H_i$ was a random Hamiltonian matrix of norm 1/4. The second perturbed system remains stable, the first one typically not.
Why? To understand this behavior, we need...
Indefinite Linear Algebra

- The name Indefinite Linear Algebra was coined by Gohberg, Lancaster, and Rodman in 2005.
- An invertible matrix $H \in \mathbb{F}^{n\times n}$ with $H^\star = \pm H$ defines an inner product on $\mathbb{F}^n$:
$$[x, y]_H := y^\star Hx \quad \text{for all } x, y \in \mathbb{F}^n.$$
Here, $\star$ denotes either the transpose $T$ or the conjugate transpose $*$:

  $H^* = H$: Hermitian sesquilinear form
  $H^* = -H$: skew-Hermitian sesquilinear form
  $H^T = H$: symmetric bilinear form
  $H^T = -H$: skew-symmetric bilinear form

- The inner product may be indefinite (it need not be positive definite).
Indefinite Linear Algebra

The adjoint: For $X \in \mathbb{F}^{n\times n}$ let $X^\star$ be the matrix satisfying
$$[v, Xw]_H = [X^\star v, w]_H \quad \text{for all } v, w \in \mathbb{F}^n.$$
We have $X^\star = H^{-1}X^TH$ resp. $X^\star = H^{-1}X^*H$.

Matrices with symmetries in indefinite inner products:

                                         wrt $y^THx$      wrt $y^*Hx$
  H-selfadjoint     $A^\star = A$        $A^TH = HA$      $A^*H = HA$
  H-skew-adjoint    $S^\star = -S$       $S^TH = -HS$     $S^*H = -HS$
  H-unitary         $U^\star = U^{-1}$   $U^THU = H$      $U^*HU = H$
Indefinite Linear Algebra

A matrix $H \in \mathbb{R}^{2n\times 2n}$ is Hamiltonian if and only if
$$H^TJ = -JH, \quad \text{where } J = \begin{bmatrix} 0 & I_n \\ -I_n & 0 \end{bmatrix}.$$
Hamiltonian matrices are skew-adjoint with respect to the skew-symmetric bilinear form induced by $J$.

  J-selfadjoint:   $N^TJ = JN$    skew-Hamiltonian
  J-skew-adjoint:  $H^TJ = -JH$   Hamiltonian
  J-unitary:       $S^TJS = J$    symplectic

Symplectic matrices occur in discrete-time optimal control problems.
Indefinite Linear Algebra

Hermitian matrix pencil $\hat{=}$ H-selfadjoint matrix:

Observation 1: $A \in \mathbb{C}^{n\times n}$ is H-selfadjoint $\iff A^*H = HA \iff (HA)^* = HA$.

Observation 2: If $H$ is invertible: $Ax = \lambda x \iff HAx = \lambda Hx$.

Conversely: If $\lambda H - G$ is a Hermitian pencil (i.e., $H = H^*$ and $G = G^*$) and if $H$ is invertible, then $A := H^{-1}G$ is H-selfadjoint:
$$A^*H = GH^{-1}H = G = HH^{-1}G = HA.$$
Why bother? Forget structure and use general methods?

No! We prefer structure-preserving methods, because...
- structure-preserving algorithms are often faster;
- structure-preserving algorithms respect spectral symmetries;
- structure-preserving algorithms produce physically meaningful results;
- perturbation theory is different if structure is preserved,
  because some eigenvalues occur in pairs,
  and because there is a sign characteristic.

What is sign characteristic?
Canonical forms

Definite Linear Algebra: Any Hermitian matrix is unitarily diagonalizable and all its eigenvalues are real.

Indefinite Linear Algebra: Selfadjoint matrices with respect to indefinite inner products may have complex eigenvalues and need not be diagonalizable.

Example:
$$H = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \quad A_1 = \begin{bmatrix} 2 & 1 \\ 0 & 2 \end{bmatrix}, \quad A_2 = \begin{bmatrix} i & 0 \\ 0 & -i \end{bmatrix}.$$
$A_1$ and $A_2$ are H-selfadjoint, i.e., $A_i^*H = HA_i$.
Canonical forms

Transformations that preserve structure:
- for bilinear forms: $(H, A) \mapsto (P^THP, P^{-1}AP)$, $P$ invertible;
- for sesquilinear forms: $(H, A) \mapsto (P^*HP, P^{-1}AP)$, $P$ invertible.

$A$ is H-selfadjoint / H-skew-adjoint / H-unitary $\iff$ $P^{-1}AP$ is $P^\star HP$-selfadjoint / $P^\star HP$-skew-adjoint / $P^\star HP$-unitary. Here $P^\star = P^T$ or $P^\star = P^*$, respectively.
Canonical forms

Theorem (Gohberg, Lancaster, Rodman 1983; Thompson 1976). Let $A \in \mathbb{C}^{n\times n}$ be H-selfadjoint. Then there exists an invertible $P \in \mathbb{C}^{n\times n}$ such that
$$P^{-1}AP = A_1 \oplus \dots \oplus A_k, \qquad P^*HP = H_1 \oplus \dots \oplus H_k,$$
where either
1) $A_i = J_{n_i}(\lambda)$ and $H_i = \varepsilon R_{n_i}$, where $\lambda \in \mathbb{R}$ and $\varepsilon = \pm 1$; or
2) $A_i = \begin{bmatrix} J_{n_i}(\mu) & 0 \\ 0 & J_{n_i}(\overline{\mu}) \end{bmatrix}$ and $H_i = \begin{bmatrix} 0 & R_{n_i} \\ R_{n_i} & 0 \end{bmatrix}$, where $\mu \notin \mathbb{R}$.

Here $J_m(\lambda) = \begin{bmatrix} \lambda & 1 & & \\ & \ddots & \ddots & \\ & & \ddots & 1 \\ & & & \lambda \end{bmatrix} \in \mathbb{C}^{m\times m}$ is a Jordan block and $R_m = \begin{bmatrix} & & 1 \\ & \iddots & \\ 1 & & \end{bmatrix} \in \mathbb{C}^{m\times m}$ is the flip matrix.
Canonical forms

There are similar results for H-skew-adjoint and H-unitary matrices, and for real or complex bilinear forms.

Spectral symmetries:

                       $y^THx$, $\mathbb{F}=\mathbb{C}$    $y^*Hx$, $\mathbb{F}=\mathbb{C}$     $y^THx$, $\mathbb{F}=\mathbb{R}$
  H-selfadjoints:      $\lambda$                           $\lambda, \overline{\lambda}$        $\lambda, \overline{\lambda}$
  H-skew-adjoints:     $\lambda, -\lambda$                 $\lambda, -\overline{\lambda}$       $\lambda, \overline{\lambda}, -\lambda, -\overline{\lambda}$
  H-unitaries:         $\lambda, \lambda^{-1}$             $\lambda, \overline{\lambda}^{-1}$   $\lambda, \overline{\lambda}, \lambda^{-1}, \overline{\lambda}^{-1}$
The sign characteristic

Sign characteristic: There are additional invariants attached to the real eigenvalues of H-selfadjoint matrices: signs $\varepsilon = \pm 1$.

Example: $A = \begin{bmatrix} 1 & 0 \\ 0 & 2 \end{bmatrix}$, $H_\varepsilon = \begin{bmatrix} \varepsilon & 0 \\ 0 & 1 \end{bmatrix}$, $\varepsilon = \pm 1$:
- there is no transformation with $P^{-1}AP = A$ and $P^*H_{+1}P = H_{-1}$, because of Sylvester's Law of Inertia;
- each Jordan block associated with a real eigenvalue of $A$ has a corresponding sign $\varepsilon \in \{+1, -1\}$;
- the collection of all the signs is called the sign characteristic of $A$.
The sign characteristic

Interpretation of the sign characteristic for simple eigenvalues: let $(\lambda, v)$ be an eigenpair of the H-selfadjoint matrix $A$, where $\lambda \in \mathbb{R}$, and let $\varepsilon$ be the sign corresponding to $\lambda$:
- the inner product $[v, v]_H$ is positive if $\varepsilon = +1$;
- the inner product $[v, v]_H$ is negative if $\varepsilon = -1$.

Analogously: purely imaginary eigenvalues of H-skew-adjoint matrices have signs; unimodular eigenvalues of H-unitary matrices have signs.
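For simple real eigenvalues the signs can therefore be read off numerically from $[v, v]_H$; a sketch starting from a hypothetical 2×2 canonical pair:

```python
import numpy as np

rng = np.random.default_rng(5)

# Canonical pair: A0 = diag(1, 2) is H0-selfadjoint for H0 = diag(1, -1),
# with sign +1 at the eigenvalue 1 and sign -1 at the eigenvalue 2.
A0 = np.diag([1.0, 2.0])
H0 = np.diag([1.0, -1.0])

# The signs are invariant under (H, A) -> (P^* H P, P^{-1} A P):
P = rng.standard_normal((2, 2)) + 3.0 * np.eye(2)
A = np.linalg.inv(P) @ A0 @ P
H = P.T @ H0 @ P
assert np.allclose(A.T @ H, H @ A)       # A is H-selfadjoint

# Read off the sign of each real eigenvalue from [v, v]_H = v^* H v:
w, V = np.linalg.eig(A)
order = np.argsort(w.real)
signs = [float(np.sign((V[:, i].conj() @ H @ V[:, i]).real)) for i in order]
assert signs == [1.0, -1.0]              # sign +1 at lambda = 1, sign -1 at lambda = 2
```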
What happens under structured perturbations?

Example: symplectic matrices $S \in \mathbb{R}^{2n\times 2n}$:
- consider a slightly perturbed matrix $\widetilde{S}$ that is still symplectic;
- the behavior of the unimodular eigenvalues under perturbation depends on the sign characteristic;
- if two unimodular eigenvalues meet, the behavior is different if the corresponding signs are opposite or equal.
What happens under structured perturbations?

- let $S$ have two close unimodular eigenvalues with opposite signs;
- if $S$ is perturbed and the two eigenvalues meet, they generically form a Jordan block;
- then they may split off as a pair of nonunimodular reciprocal eigenvalues.
What happens under structured perturbations?

- let $S$ have two close unimodular eigenvalues with equal signs;
- if $S$ is perturbed and the two eigenvalues meet, they cannot form a Jordan block, and they must remain on the unit circle.
Interesting observation revisited

Experiment: Consider two Hamiltonian systems:
$$\dot{x} = H_1x = \begin{bmatrix} 0 & 1 & 0 & 0 \\ -1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & -1 & 0 \end{bmatrix}x, \qquad \dot{y} = H_2y = \begin{bmatrix} 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & -1 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}y.$$
Observation 1: Both matrices have the semisimple eigenvalues $+i$ and $-i$ with algebraic multiplicity 2 each. Consequently both systems are stable.

Observation 2: One can check:
- the sign characteristic of the eigenvalue $i$ of $H_1$ consists of the mixed signs $+1$ and $-1$ (same story for $-i$);
- the sign characteristic of the eigenvalue $i$ of $H_2$ consists of the identical signs $+1$ and $+1$ (similar story for $-i$: signs $-1$ and $-1$).
Interesting observation revisited

Result 2: Eigenvalue distribution of 1000 perturbations, when $\Delta H_i$ was a random Hamiltonian matrix of norm 1/4. Left: mixed sign characteristic; right: definite sign characteristic.
Generalization to matrix pencils

  matrix case:         H-selfadjoint                   H-skew-adjoint                   H-unitary
  pencil case:         Hermitian                       *-even                           *-palindromic
  pencil:              $\lambda G - H$                 $\lambda K - H$                  $\lambda A^* + A$
  structure:           $G^* = G$, $H^* = H$            $K^* = -K$, $H^* = H$            $A \in \mathbb{C}^{n\times n}$
  spectral symmetry:   $\lambda, \overline{\lambda}$   $\lambda, -\overline{\lambda}$   $\lambda, 1/\overline{\lambda}$
  critical curve:      real line                       imaginary axis                   unit circle

Sign characteristic: eigenvalues on the critical curve carry signs $\varepsilon = \pm 1$.

Generalization to matrix polynomials is possible.
Finally: The Task
Application: passivity of systems

Consider a linear time-invariant control system
$$\dot{x} = Ax + Bu, \quad x(0) = 0, \qquad y = Cx + Du.$$
Assume $\sigma(A) \subseteq \mathbb{C}^-$ and $D$ is invertible. The system is called passive if the Hamiltonian matrix
$$H = \begin{bmatrix} F & G \\ K & -F^T \end{bmatrix} := \begin{bmatrix} A - BR^{-1}C & -BR^{-1}B^T \\ C^TR^{-1}C & -(A - BR^{-1}C)^T \end{bmatrix}$$
has no purely imaginary eigenvalues, where $R = D + D^T$.

Often, an approximation $\widetilde{H}$ of $H$ is computed, and often the passivity is lost in this approximation process.

Aim: modify $\widetilde{H}$ by a Hamiltonian perturbation of small norm and small rank to obtain a nearby Hamiltonian matrix having no purely imaginary eigenvalues. Solved by Alam, Bora, Karow, Mehrmann, Moro 2010.
The generalized problem

Task: find the smallest perturbation that moves eigenvalues of a structured matrix pencil off the critical curve.

- Start with Hermitian pencils, i.e., move eigenvalues from the real line into the complex plane.
- First step: find the smallest structured perturbation that makes $\lambda \in \mathbb{C}$ an eigenvalue of a given Hermitian pencil $A_1 + zA_2$.

Problem: Let $A_1, A_2 \in \mathbb{C}^{n\times n}$ be Hermitian and let $\lambda \in \mathbb{C}$. Calculate
$$\eta_S(A_1, A_2, \lambda) = \inf\left\{ \sqrt{\|\Delta_1\|^2 + \|\Delta_2\|^2} \;\middle|\; \Delta_1, \Delta_2 \in S,\ \det\big((A_1 - \Delta_1) + \lambda(A_2 - \Delta_2)\big) = 0 \right\}$$
and construct the corresponding $\Delta_1$ and $\Delta_2$ that attain the infimum.

We call $\eta_S(A_1, A_2, \lambda)$ the structured backward error of the Hermitian pencil $A_1 + zA_2$ with respect to $\lambda$ and $S$.
The generalized problem

Case 1: $S = \mathbb{C}^{n\times n}$ and $\lambda \in \mathbb{C}$: easy!
$$\eta(A_1, A_2, \lambda) = \frac{\sigma_{\min}(A_1 + \lambda A_2)}{\sqrt{1 + |\lambda|^2}},$$
because
$$\Delta_1(x) = \frac{1}{1 + |\lambda|^2}\,(A_1 + \lambda A_2)\frac{xx^*}{x^*x}, \qquad \Delta_2(x) = \frac{\overline{\lambda}}{1 + |\lambda|^2}\,(A_1 + \lambda A_2)\frac{xx^*}{x^*x}$$
is the smallest perturbation that makes the pair $(\lambda, x)$ an eigenpair of the pencil, and
$$\eta(A_1, A_2, \lambda) = \min_{x \neq 0} \sqrt{\|\Delta_1(x)\|^2 + \|\Delta_2(x)\|^2}.$$

Case 2: $S = \mathrm{Herm}(n)$ and $\lambda \in \mathbb{R}$: easy!
$$\eta(A_1, A_2, \lambda) = \frac{\sigma_{\min}(A_1 + \lambda A_2)}{\sqrt{1 + \lambda^2}},$$
because for $\lambda \in \mathbb{R}$ the above perturbations are Hermitian.
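The unstructured formulas of Case 1 are straightforward to validate numerically; a sketch with hypothetical Hermitian data:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 4

# Hypothetical Hermitian pencil A1 + z*A2 and a target eigenvalue lam.
A1 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)); A1 = A1 + A1.conj().T
A2 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)); A2 = A2 + A2.conj().T
lam = 0.3 + 0.7j

P = A1 + lam * A2
U, s, Vh = np.linalg.svd(P)
eta = s[-1] / np.sqrt(1 + abs(lam) ** 2)      # unstructured backward error

# The minimizing x is the right singular vector for sigma_min; build Delta1, Delta2:
x = Vh.conj().T[:, -1]
Px = P @ x
D1 = (Px[:, None] @ x[None, :].conj()) / ((1 + abs(lam) ** 2) * (x.conj() @ x))
D2 = np.conj(lam) * D1

# (A1 - D1) + lam (A2 - D2) is singular at x, and the norms match eta:
assert np.linalg.norm(((A1 - D1) + lam * (A2 - D2)) @ x) < 1e-10
assert np.isclose(np.sqrt(np.linalg.norm(D1, 2) ** 2 + np.linalg.norm(D2, 2) ** 2), eta)
```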
The generalized problem

Case 3: $S = \mathrm{Herm}(n)$ and $\lambda \in \mathbb{C}\setminus\mathbb{R}$: difficult!

Adhikari, Alam 2009: complicated formulas for the structured backward error of eigenpairs $(\lambda, x)$ of Hermitian pencils when $\lambda \notin \mathbb{R}$; $\eta(A_1, A_2, \lambda)$ could be obtained by minimizing these formulas over all $x \in \mathbb{C}^n \setminus \{0\}$. This minimization problem is not feasible.

Consequence: We need a different strategy!
Reformulating the problem

$$\det\big((A_1 - \Delta_1) + \lambda(A_2 - \Delta_2)\big) = 0$$
$\iff (A_1 + \lambda A_2)x = (\Delta_1 + \lambda\Delta_2)x$ for some $x \neq 0$
$\iff x = (A_1 + \lambda A_2)^{-1}(\Delta_1 + \lambda\Delta_2)x$
$\iff v_1 = \Delta_1 M(v_1 + \lambda v_2)$ and $v_2 = \Delta_2 M(v_1 + \lambda v_2)$,
abbreviating $M := (A_1 + \lambda A_2)^{-1}$, $v_1 = \Delta_1 x$, and $v_2 = \Delta_2 x$.

Consequence: $\det\big((A_1 - \Delta_1) + \lambda(A_2 - \Delta_2)\big) = 0$ if and only if there exist $v_1, v_2$ satisfying $v_1 + \lambda v_2 \neq 0$ and
$$v_1 = \Delta_1 M(v_1 + \lambda v_2) \quad \text{and} \quad v_2 = \Delta_2 M(v_1 + \lambda v_2).$$
Reformulating the problem

This leads to two structured mapping problems: for $x = M(v_1 + \lambda v_2)$ and $y = v_k$, find $H = \Delta_k \in \mathrm{Herm}(n)$ such that $y = Hx$, for $k = 1, 2$.

Theorem (see Mackey, Mackey, Tisseur 2008). Let $x, y \in \mathbb{C}^n$, $x \neq 0$. Then there exists a Hermitian matrix $H \in \mathrm{Herm}(n)$ such that $Hx = y$ if and only if $\mathrm{Im}(x^*y) = 0$. If the latter condition is satisfied, then
$$\min\{\|H\| \mid H \in \mathrm{Herm}(n),\ Hx = y\} = \frac{\|y\|}{\|x\|},$$
and the minimum is attained for
$$H_0 = \frac{\|y\|}{\|x\|} \begin{bmatrix} \dfrac{y}{\|y\|} & \dfrac{x}{\|x\|} \end{bmatrix} \begin{bmatrix} \dfrac{y^*x}{\|x\|\,\|y\|} & 1 \\ 1 & \dfrac{x^*y}{\|x\|\,\|y\|} \end{bmatrix}^{-1} \begin{bmatrix} \dfrac{y}{\|y\|} & \dfrac{x}{\|x\|} \end{bmatrix}^*$$
if $x$ and $y$ are linearly independent, and for $H_0 = \dfrac{yx^*}{x^*x}$ otherwise.
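The minimal-norm Hermitian mapping from the theorem can be checked numerically; a sketch with hypothetical random $x, y$ (the projection step that makes $x^*y$ real is my own device to enforce the existence condition):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5

# Hypothetical x, y; project y so that x^* y becomes real (the existence condition):
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = y - 1j * (x.conj() @ y).imag * x / (x.conj() @ x)

# Minimal-norm Hermitian H0 with H0 x = y (x, y linearly independent here):
nx, ny = np.linalg.norm(x), np.linalg.norm(y)
c = (x.conj() @ y).real / (nx * ny)
B = np.column_stack([y / ny, x / nx])
H0 = (ny / nx) * B @ np.linalg.inv(np.array([[c, 1.0], [1.0, c]])) @ B.conj().T

assert np.allclose(H0, H0.conj().T)                  # Hermitian
assert np.allclose(H0 @ x, y)                        # maps x to y
assert np.isclose(np.linalg.norm(H0, 2), ny / nx)    # minimal norm ||y|| / ||x||
```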
Reformulating the problem

We need $v_1^*M(v_1 + \lambda v_2)$ and $v_2^*M(v_1 + \lambda v_2)$ to be real. This is equivalent to requiring that $v^*H_1v = 0$ and $v^*H_2v = 0$, where
$$v = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix}, \quad H_1 = i\begin{bmatrix} M - M^* & \lambda M \\ -\overline{\lambda}M^* & 0 \end{bmatrix}, \quad H_2 = i\begin{bmatrix} 0 & -M^* \\ M & \lambda M - \overline{\lambda}M^* \end{bmatrix}.$$
Moreover, the minimal norm $\|\Delta_1\|^2 + \|\Delta_2\|^2$ for a pair $(\Delta_1, \Delta_2)$ satisfying $v_1 = \Delta_1 M(v_1 + \lambda v_2)$ and $v_2 = \Delta_2 M(v_1 + \lambda v_2)$ is given by
$$\|\Delta_1\|^2 + \|\Delta_2\|^2 = \frac{v^*v}{v^*H_0v}, \quad \text{where } H_0 = \begin{bmatrix} M^*M & \lambda M^*M \\ \overline{\lambda}M^*M & |\lambda|^2M^*M \end{bmatrix}.$$
Reformulating the problem

Consequence:
$$\eta(A_1, A_2, \lambda)^2 = \inf\left\{ \frac{v^*v}{v^*H_0v} \;\middle|\; v \in \mathbb{C}^{2n},\ v^*H_0v \neq 0,\ v^*H_1v = 0,\ v^*H_2v = 0 \right\}$$
$$= \left( \sup\left\{ \frac{v^*H_0v}{v^*v} \;\middle|\; v \in \mathbb{C}^{2n}\setminus\{0\},\ v^*H_1v = 0,\ v^*H_2v = 0 \right\} \right)^{-1}.$$

Idea: Compute the maximal eigenvalue $\lambda_{\max}$ of the matrix
$$H_0 + t_1H_1 + t_2H_2$$
and minimize this over $t_1, t_2 \in \mathbb{R}$.
The main results

Theorem (Bora, Karow, M., Sharma 2012). Let $H_0, H_1, H_2 \in \mathbb{C}^{n\times n}$ be Hermitian. Assume that $H_0$ is nonzero and positive semidefinite, and that any linear combination $\alpha_1H_1 + \alpha_2H_2$, $(\alpha_1, \alpha_2) \in \mathbb{R}^2\setminus\{0\}$, is indefinite. (Here, indefinite is used in the sense "strictly not semidefinite".) Then the following statements hold.
(1) The function $L(t_1, t_2) := \lambda_{\max}(H_0 + t_1H_1 + t_2H_2)$ has a minimum $\lambda_{\max}^*$.
(2) If the minimum $\lambda_{\max}^*$ of $L(t_1, t_2)$ is attained at $(t_1^*, t_2^*) \in \mathbb{R}^2$, then there exists an associated eigenvector $w \in \mathbb{C}^n\setminus\{0\}$ of $H_0 + t_1^*H_1 + t_2^*H_2$ with
$$w^*H_1w = 0 = w^*H_2w.$$
(3) We have
$$\lambda_{\max}^* = \sup\left\{ \frac{w^*H_0w}{w^*w} \;\middle|\; w \neq 0,\ w^*H_1w = 0,\ w^*H_2w = 0 \right\}.$$
In particular, the supremum is a maximum and is attained for the eigenvector $w$ from (2).
The main results

Theorem (Bora, Karow, M., Sharma 2012). Let $A_1, A_2 \in \mathbb{C}^{n\times n}$ be Hermitian and let $\lambda \in \mathbb{C}\setminus\mathbb{R}$. Suppose $\det(A_1 + \lambda A_2) \neq 0$, and let $M = (A_1 + \lambda A_2)^{-1}$. Then
$$\eta(A_1, A_2, \lambda) = \left( \min_{t_1, t_2 \in \mathbb{R}} \lambda_{\max}(H_0 + t_1H_1 + t_2H_2) \right)^{-1/2},$$
where
$$H_0 = \begin{bmatrix} M^*M & \lambda M^*M \\ \overline{\lambda}M^*M & |\lambda|^2M^*M \end{bmatrix}, \quad H_1 = i\begin{bmatrix} M - M^* & \lambda M \\ -\overline{\lambda}M^* & 0 \end{bmatrix}, \quad H_2 = i\begin{bmatrix} 0 & -M^* \\ M & \lambda M - \overline{\lambda}M^* \end{bmatrix}.$$

Consequence: The distance problem is solved; the corresponding perturbation can be constructed from the eigenvector $w$ of the previous result.
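This turns the backward-error computation into a two-parameter eigenvalue optimization, which a generic optimizer can handle; a sketch using scipy with hypothetical data (Nelder-Mead is my choice of method, not the authors'; the sanity check uses the fact that the structured backward error dominates the unstructured one):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(8)
n = 3

# Hypothetical Hermitian pencil and a nonreal target eigenvalue:
A1 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)); A1 = A1 + A1.conj().T
A2 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)); A2 = A2 + A2.conj().T
lam = 0.5 + 0.5j

M = np.linalg.inv(A1 + lam * A2)
MM = M.conj().T @ M
Z = np.zeros((n, n), dtype=complex)
H0 = np.block([[MM, lam * MM], [np.conj(lam) * MM, abs(lam) ** 2 * MM]])
H1 = 1j * np.block([[M - M.conj().T, lam * M], [-np.conj(lam) * M.conj().T, Z]])
H2 = 1j * np.block([[Z, -M.conj().T], [M, lam * M - np.conj(lam) * M.conj().T]])

# lambda_max(H0 + t1 H1 + t2 H2) is convex in (t1, t2); minimize it numerically.
lam_max = lambda t: np.linalg.eigvalsh(H0 + t[0] * H1 + t[1] * H2)[-1]
res = minimize(lam_max, x0=[0.0, 0.0], method="Nelder-Mead",
               options={"xatol": 1e-10, "fatol": 1e-12})
eta_struct = res.fun ** -0.5

# Sanity check: at t = (0, 0) the formula reproduces the unstructured backward
# error, so the minimization can only increase eta.
eta_unstruct = np.linalg.svd(A1 + lam * A2, compute_uv=False)[-1] / np.sqrt(1 + abs(lam) ** 2)
assert eta_struct >= eta_unstruct - 1e-8
```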
Experiments

Movies: eigenvalue plots for five experiments (Nr. 1-4: unstructured vs. structured perturbations; Nr. 5: structured perturbations only).
Conclusions

- The effects of structured perturbations of Hermitian pencils (and other structured matrices and pencils) may differ significantly from those of unstructured perturbations.
- The sign characteristic is crucial in the theory of structured perturbations.
- Formulas for the structured backward error for Hermitian pencils are available.
- Extension to matrix polynomials is possible (it involves minimization over more than two parameters).

Thank you for your attention!