Contemporary Mathematics

A note on companion pencils

Jared L. Aurentz, Thomas Mach, Raf Vandebril, and David S. Watkins

Abstract. Various generalizations of companion matrices to companion pencils are presented. Companion matrices are linked to monic polynomials, whereas companion pencils do not require monicity of the corresponding polynomial. In the classical companion pencil case (A, B) only the coefficient of the highest degree appears in B's lower right corner. We will show, however, that all coefficients of the polynomial can be distributed over both A and B, creating additional flexibility. Companion matrices admit a so-called Fiedler factorization into essentially 2 x 2 matrices. These Fiedler factors can be reordered without affecting the eigenvalues (which equal the polynomial roots) of the assembled matrix. We will propose a generalization of this factorization and of the reshuffling property for companion pencils. Special examples of the factorizations and extensions to matrix polynomials and product eigenvalue problems are included.

2000 Mathematics Subject Classification. Primary 65F15; Secondary 15A23.
Key words and phrases. Companion pencil, polynomial, matrix polynomial, rootfinding, product eigenvalue problems.
The research was partially supported by the Research Council KU Leuven, projects CREA/13/012 (Can Unconventional Eigenvalue Algorithms Supersede the State of the Art), OT/11/055 (Spectral Properties of Perturbed Normal Matrices and their Applications), and CoE EF/05/006 Optimization in Engineering (OPTEC); by the Fund for Scientific Research -- Flanders (Belgium), project G03422N (Reestablishing Smoothness for Matrix Manifold Optimization via Resolution of Singularities); by the Interuniversity Attraction Poles Programme, initiated by the Belgian State, Science Policy Office, Belgian Network DYSCO (Dynamical Systems, Control, and Optimization); and by the European Research Council under the European Union's Seventh Framework Programme (FP7/2007-2013)/ERC grant agreement no. 291068. The views expressed in this article are not those of the ERC or the European Commission, and the European Union is not liable for any use that may be made of the information contained here.

1. Introduction

It is well known that the polynomial

(1)    p(z) = c_n z^n + c_{n-1} z^{n-1} + ... + c_1 z + c_0

equals the determinant det(zB - A), with

(2)    A = \begin{bmatrix} 0 & & & -c_0 \\ 1 & 0 & & -c_1 \\ & \ddots & \ddots & \vdots \\ & & 1 & -c_{n-1} \end{bmatrix}    and    B = \begin{bmatrix} 1 & & & \\ & \ddots & & \\ & & 1 & \\ & & & c_n \end{bmatrix}.

The matrix pencil (A, B) is called the companion pencil of p(z) and can be used to compute the roots of p(z), as the roots of p(z) coincide with the eigenvalues of the pencil (A, B).

At the moment, generalizing Frobenius companion matrices and/or pencils [3, 7, 9] is an active research topic, as new matrix forms and/or factorizations open up possibilities to develop new, possibly faster or more accurate algorithms for both the classical rootfinding problem as well as matrix polynomial eigenvalue problems [6, 8, 12]. Even though the companion pencil exhibits a favorable numerical behavior [10] with respect to the companion matrix, only few structured QZ algorithms for companion pencils are available [4, 5]. The backward error bound for polynomial rootfinding based on companion matrices is

    ||\hat{c} - c|| / ||c|| < k_1 (c_max / c_n) \epsilon_m + O(\epsilon_m^2),

with \hat{c} the coefficients of a polynomial having the computed roots, k_1 and k_2 constants, c_max the coefficient with the largest absolute value, and \epsilon_m the machine precision. However, the backward error for polynomial rootfinding based on companion pencils is only

    ||\hat{c} - c|| / ||c|| < k_2 ||c|| \epsilon_m,

where a scaling to ||c|| = 1 is possible [8, 10, 12]. Thus polynomial rootfinding with companion pencils is advantageous for polynomials with large c_max / c_n.

In this article no new rootfinders will be proposed; only theoretical extensions of the existing companion pencil are introduced. First, in Section 2 we will generalize the companion pencil (2) by distributing not only the highest order coefficient, but all polynomial coefficients over both A and B. Moreover, one does not have to choose whether to put a specific coefficient c_i in A or in B; one can also write c_i = v_{i+1} + w_i, where the term v_{i+1} will appear in A and w_i in B. For matrix polynomials a related approach has been investigated in [11]. Second, we will adjust Fiedler's factorization of a companion matrix to obtain a similar factorization of the companion pencil, and we will prove that also in this case one can reorder all the factors without altering the pencil's eigenvalues. Finally, in Sections 4 and 5 we link some of the results to matrix polynomials and product eigenvalue problems.
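As a small illustration of the two constructions, the following sketch (assuming NumPy and SciPy; the coefficients are arbitrary illustrative values, not taken from the text) builds both the monic companion matrix and the companion pencil (2) for a polynomial with a small leading coefficient. In the pencil formulation no division by the (possibly tiny) leading coefficient is needed.

```python
# Illustrative sketch: roots via the monic companion matrix vs. the
# companion pencil (2); the polynomial is an arbitrary example with a
# small leading coefficient c_n.
import numpy as np
from scipy.linalg import eig

c = np.array([2.0, -3.0, 5.0, 1e-8])      # c_0, c_1, c_2, c_3
n = len(c) - 1

# Monic companion matrix: requires dividing by c_n.
C = np.diag(np.ones(n - 1), -1)            # ones on the subdiagonal
C[:, -1] = -c[:-1] / c[-1]
roots_matrix = np.linalg.eigvals(C)

# Companion pencil (2): A carries -c_0, ..., -c_{n-1}, B = diag(1, ..., 1, c_n).
A = np.diag(np.ones(n - 1), -1)
A[:, -1] = -c[:-1]
B = np.eye(n)
B[-1, -1] = c[-1]
roots_pencil = eig(A, B, right=False)      # generalized eigenvalues of (A, B)

print(np.sort_complex(roots_matrix))
print(np.sort_complex(roots_pencil))
```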

2. Generalized companion pencils

The following theorem shows that the coefficients can be distributed over the last column of A and the last column of B.

Theorem 2.1. Consider p(z), a polynomial with complex coefficients p(z) = c_n z^n + c_{n-1} z^{n-1} + ... + c_1 z + c_0. Let v, w be vectors of length n with

(3)    v_1 = c_0,    v_{i+1} + w_i = c_i  for i = 1, ..., n-1,    and    w_n = c_n.

Then the determinant det(zB - A) of the pencil (A, B), with

(4)    A = Z - v e_n^T,    B = Z^H Z + w e_n^T,

e_n the nth standard unit vector, and Z the downshift matrix

    Z = \begin{bmatrix} 0 & & & \\ 1 & 0 & & \\ & \ddots & \ddots & \\ & & 1 & 0 \end{bmatrix},

equals p(z).

Proof. We have

    det(zB - A) = det \begin{bmatrix} z & & & & zw_1 + v_1 \\ -1 & z & & & zw_2 + v_2 \\ & \ddots & \ddots & & \vdots \\ & & -1 & z & zw_{n-1} + v_{n-1} \\ & & & -1 & zw_n + v_n \end{bmatrix}.

Applying Laplace's formula to the first row yields

    det(zB - A) = z det \begin{bmatrix} z & & & zw_2 + v_2 \\ -1 & z & & zw_3 + v_3 \\ & \ddots & \ddots & \vdots \\ & & -1 & zw_n + v_n \end{bmatrix} + (-1)^{n+1} (zw_1 + v_1) det \begin{bmatrix} -1 & z & & \\ & -1 & \ddots & \\ & & \ddots & z \\ & & & -1 \end{bmatrix},

where the last determinant equals (-1)^{n-1}. The proof is completed by applying Laplace's formula recursively.

The coefficients c_0 in A and c_n in B are fixed. All other coefficients can be freely distributed between parts in A and parts in B, as long as v_{i+1} and w_i sum up to c_i.
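Theorem 2.1 is easy to check numerically. The following minimal sketch (assuming NumPy and SciPy; the random splitting is an arbitrary choice of ours) builds A and B from an admissible pair (v, w) and compares the generalized eigenvalues with the roots of p:

```python
# Minimal numerical check of Theorem 2.1: any splitting v_1 = c_0,
# v_{i+1} + w_i = c_i (i = 1, ..., n-1), w_n = c_n yields a pencil (A, B)
# whose eigenvalues are the roots of p.  The splitting below is random.
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(0)
c = rng.standard_normal(6) + 1j * rng.standard_normal(6)   # c_0, ..., c_5
n = len(c) - 1

v = np.zeros(n, dtype=complex)
w = np.zeros(n, dtype=complex)
v[0], w[-1] = c[0], c[-1]                  # v_1 = c_0, w_n = c_n
for i in range(1, n):                      # split c_i = v_{i+1} + w_i
    v[i] = rng.standard_normal()
    w[i - 1] = c[i] - v[i]

Z = np.diag(np.ones(n - 1), -1)            # downshift matrix
e_n = np.zeros(n); e_n[-1] = 1.0
A = Z - np.outer(v, e_n)                   # A = Z - v e_n^T
B = Z.T @ Z + np.outer(w, e_n)             # B = Z^H Z + w e_n^T (Z is real)

print(np.sort_complex(eig(A, B, right=False)))
print(np.sort_complex(np.roots(c[::-1])))  # numpy.roots expects c_n first
```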

The following example demonstrates that the additional freedom can be used to create additional structure in A and B.

Example 2.2. Let p(z) = 5z^3 - z^2 + 2z - 1. Then both A and B can be chosen to consist solely of non-negative entries, e.g., taking v = (-1, -1, -3)^T and w = (3, 2, 5)^T gives

    A = \begin{bmatrix} 0 & 0 & 1 \\ 1 & 0 & 1 \\ 0 & 1 & 3 \end{bmatrix}    and    B = \begin{bmatrix} 1 & 0 & 3 \\ 0 & 1 & 2 \\ 0 & 0 & 5 \end{bmatrix}.

Another option is to choose always either w_i = 0 or v_{i+1} = 0:

Corollary 2.3. Consider again p(z) from (1). Take two disjoint subsets I \subseteq {0, ..., n-1} and J \subseteq {1, ..., n}, I \cap J = \emptyset, such that their union I \cup J = {0, ..., n}. Define two vectors v and w as

    v_{i+1} = c_i if i \in I, and 0 if i \notin I;        w_j = c_j if j \in J, and 0 if j \notin J.

Then the eigenvalues of the pencil (A, B), with A = Z - v e_n^T, B = Z^H Z + w e_n^T, and Z the downshift matrix, coincide with the polynomial roots of p(z).

Example 2.4. Let p(z) = c_5 z^5 + c_4 z^4 + ... + c_1 z + c_0 and take I = {0, 1, 2} and J = {3, 4, 5}. Then (A, B), with

    A = \begin{bmatrix} 0 & & & & -c_0 \\ 1 & 0 & & & -c_1 \\ & 1 & 0 & & -c_2 \\ & & 1 & 0 & \\ & & & 1 & 0 \end{bmatrix}    and    B = \begin{bmatrix} 1 & & & & \\ & 1 & & & \\ & & 1 & & c_3 \\ & & & 1 & c_4 \\ & & & & c_5 \end{bmatrix},

is a companion pencil of p(z).

Remark 2.5. When considering the existing QZ-algorithms by Boito, Eidelman, and Gemignani [4, 5], we notice that these algorithms are based on the property that both A and B in (2) are of unitary-plus-rank-one form, a structure maintained under QZ iterations. This property also holds for the new pencil matrices in Theorem 2.1, implying that with modest modifications the existing algorithms could compute the eigenvalues of the new pencils as well.

3. Fiedler companion factorization

The matrix A of the pencil (A, B) is the companion matrix of the monic polynomial whose coefficients are defined by v. As shown by Fiedler [9], companion matrices can be factored into n core transformations. A core transformation X_i is a modified identity matrix, where a 2 x 2 submatrix on the diagonal (for i = 0 and i = n a single entry) is replaced by an arbitrary matrix; for all core transformations the subindex i links to the submatrix X_i(i : i+1, i : i+1) differing from the identity. For the factorization of the companion matrix the core transformations A_0, A_1, ..., A_{n-1}, typically named Fiedler factors, have the form A_0(1, 1) = -v_1 (note that A_0's active part is restricted to the upper left matrix entry; later, B_n's active part is restricted to the lower right matrix entry)

and A_{i-1}(i-1 : i, i-1 : i) = \begin{bmatrix} 0 & 1 \\ 1 & -v_i \end{bmatrix} for i = 2, ..., n. For example, the factorization of A with n = 5 equals

    A = \begin{bmatrix} 0 & & & & -v_1 \\ 1 & 0 & & & -v_2 \\ & 1 & 0 & & -v_3 \\ & & 1 & 0 & -v_4 \\ & & & 1 & -v_5 \end{bmatrix} = A_0 A_1 A_2 A_3 A_4 = \begin{bmatrix} -v_1 \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 1 & -v_2 \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 1 & -v_3 \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 1 & -v_4 \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 1 & -v_5 \end{bmatrix},

where only the essential parts of the core transformations are shown. Moreover, Fiedler also proved [9] that reordering the Fiedler factors arbitrarily has no effect on the eigenvalues of their product matrix. Thus for all permutations σ of {0, ..., n-1} the eigenvalues of A_{σ(0)} ... A_{σ(n-1)} equal those of A_0 ... A_{n-1}. The product A_{σ(0)} ... A_{σ(n-1)} is often referred to as a Fiedler companion matrix. In the setting of companion pencils Fiedler factorizations have been used, e.g., in [1, 2].

A similar factorization exists for B, but since B is upper triangular, two sequences of core transformations are required. For n = 5 we have

    B = \begin{bmatrix} 1 & & & & w_1 \\ & 1 & & & w_2 \\ & & 1 & & w_3 \\ & & & 1 & w_4 \\ & & & & w_5 \end{bmatrix} = B_5 B_4 B_3 B_2 B_1 G_1 G_2 G_3 G_4,

with B_5(5, 5) = w_5, B_i(i : i+1, i : i+1) = \begin{bmatrix} w_i & 1 \\ 1 & 0 \end{bmatrix}, and G_i(i : i+1, i : i+1) = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} for i = 1, ..., 4.

We will now generalize Fiedler's reordering identity of the companion matrix factorization to the companion pencil case.

Theorem 3.1. Let (A, B) be a companion pencil of p(z), with A and B factored into core transformations A = A_0 A_1 ... A_{n-1} and B = B_n B_{n-1} ... B_1 G_1 ... G_{n-1} as in Theorem 2.1. Then, for every permutation σ : {1, ..., n-1} -> {1, ..., n-1}, the pencil (A_σ, B_σ), with

    A_σ = A_0 A_{σ(1)} ... A_{σ(n-1)},
    B_σ = B_n B_{σ(n-1)} ... B_{σ(1)} G_{σ(1)} ... G_{σ(n-1)},

satisfies p(z) = det(zB_σ - A_σ).

Proof. We will form F_σ = A_σ B_σ^{-1} B_n and show that F_σ is a Fiedler companion matrix determined by σ and related to the monic polynomial z^n + c_{n-1} z^{n-1} + ... + c_1 z + c_0, i.e., p(z) with the leading coefficient c_n left out. (Note that all the B_i for i < n are invertible, as is B_n, since otherwise the original polynomial would have had leading coefficient 0; in fact, B_n does not need to be inverted, so the proof still holds even for a singular B_n.) The resulting pencil (F_σ, B_n) is equivalent to the companion pencil (2), as one can prove by using the similarity transformations described in [9, Lemma 2.2] and altering the reasoning slightly so as not to touch the transformation F_{n-1}. The Fiedler companion matrix F_σ can be factored into core transformations, the Fiedler factors, F_0 F_{σ(1)} ... F_{σ(n-1)}, with F_0(1, 1) = -c_0 and F_i(i : i+1, i : i+1) = \begin{bmatrix} 0 & 1 \\ 1 & -c_i \end{bmatrix} for i = 1, ..., n-1.

The matrix F_{n-1} does not commute with B_n; consequently the necessary transformations involve only F_1, ..., F_{n-2}.

To construct F_σ = A_σ B_σ^{-1} B_n we will multiply the pencil (A_σ, B_σ) on the right with factors that eliminate parts in B_σ and move them to A_σ's side. Core transformations X_i and X_j commute if |i - j| > 1, so we can reorder A_σ = A_0 A_{σ(1)} ... A_{σ(n-1)} as

    A_σ = A_0 (A_{j_1} A_{j_1 - 1} ... A_1)(A_{j_2} A_{j_2 - 1} ... A_{j_1 + 1}) ... (A_{n-1} A_{n-2} ... A_{j_s + 1}),

where we define j_0 = 1. This reordering appeared also in [9]; it contains factors (A_{j_{l+1}} A_{j_{l+1} - 1} ... A_{j_l + 1}) in which each core transformation A_{j-1} is positioned to the right of A_j, for j = j_{l+1}, j_{l+1} - 1, ..., j_l + 1. The ordering between these larger factors is, however, different: A_{j_l} is positioned to the left of A_{j_{l+1}}, with l = 0, ..., s - 1. We can also reorder the factors B_i and G_i to get

    B_σ = B_n (B_{j_s + 1} ... B_{n-1}) ... (B_1 ... B_{j_1})(G_{j_1} ... G_1) ... (G_{n-1} ... G_{j_s + 1}).

Let us now form the product of the two inner factors

    S = (B_1 ... B_{j_1})(G_{j_1} ... G_1) = \begin{bmatrix} 1 & w_1 & \cdots & w_{j_1} & & \\ & 1 & & & & \\ & & \ddots & & & \\ & & & 1 & & \\ & & & & \ddots & \\ & & & & & 1 \end{bmatrix}.

Let r be the largest index with j_r = j_1 + r - 1; this is in fact the number of factors (A_{j_{l+1}} ... A_{j_l + 1}) containing only a single matrix. This means that the matrix A_σ actually equals

    A_σ = A_0 (A_{j_1} A_{j_1 - 1} ... A_1) A_{j_2} ... A_{j_r} (A_{j_{r+1}} ... A_{j_r + 1}) ... (A_{n-1} ... A_{j_s + 1}),

where there are now r - 1 factors consisting of a single matrix. One can show that S commutes with (G_{j_2} ... G_{j_1 + 1}) ... (G_{n-1} ... G_{j_s + 1}), but not with G_{j_1}, ..., G_{j_r}. These factors, when applied from the right on S, each move w_{j_1} one column further to the right. The matrix

    S' = ((G_{j_2} ... G_{j_1 + 1}) ... (G_{n-1} ... G_{j_s + 1}))^{-1} S (G_{j_2} ... G_{j_1 + 1}) ... (G_{n-1} ... G_{j_s + 1})

is an elimination matrix; thus inverting S' is equivalent to changing all w_i into -w_i. By multiplying the pencil with S'^{-1} from the right, we can change it to

    ( A_σ S'^{-1},  B_n (B_{j_s + 1} ... B_{n-1}) ... (B_{j_1 + 1} ... B_{j_2})(G_{j_2} ... G_{j_1 + 1}) ... (G_{n-1} ... G_{j_s + 1}) ).

Again one can show that S'^{-1} commutes with (A_{j_2} ... A_{j_1 + 1}) ... (A_{n-1} ... A_{j_s + 1}). Further, A_{j_1}, ..., A_{j_r} each move w_{j_1} one column back to the left, so that

    (A_{j_2} ... A_{j_1 + 1}) ... (A_{n-1} ... A_{j_s + 1}) S'^{-1} ((A_{j_2} ... A_{j_1 + 1}) ... (A_{n-1} ... A_{j_s + 1}))^{-1} = S^{-1}.

We can now combine (A_{j_1} A_{j_1 - 1} ... A_1) with S^{-1}, yielding

    (A_{j_1} ... A_1) S^{-1} = F_{j_1} ... F_1,

since the entries -v_{i+1} and -w_i combine into -(v_{i+1} + w_i) = -c_i. By repeating this procedure the remaining factors in B_σ can be moved to the left, and

    F_σ = A_0 (F_{j_1} F_{j_1 - 1} ... F_1)(F_{j_2} F_{j_2 - 1} ... F_{j_1 + 1}) ... (F_{n-1} F_{n-2} ... F_{j_s + 1}).
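The reordering property can be confirmed numerically. The following sketch (assuming NumPy and SciPy; the helper core, the random splitting, and the random permutation are our own choices) builds the core transformations with the active parts given above and checks that the reordered pencil (A_σ, B_σ) still has the roots of p as its eigenvalues:

```python
# Numerical check of Theorem 3.1: the reordered pencil (A_sigma, B_sigma)
# keeps the roots of p as its eigenvalues.  Splitting and permutation random.
import numpy as np
from scipy.linalg import eig

def core(n, i, block):
    """Identity matrix with a 2x2 block at rows/columns (i, i+1), 1-based i."""
    X = np.eye(n, dtype=complex)
    X[i - 1:i + 1, i - 1:i + 1] = block
    return X

rng = np.random.default_rng(1)
c = rng.standard_normal(7) + 1j * rng.standard_normal(7)    # c_0, ..., c_6
n = len(c) - 1
v = np.concatenate(([c[0]], rng.standard_normal(n - 1)))    # v_1 = c_0
w = np.concatenate((c[1:n] - v[1:], [c[n]]))                # v_{i+1} + w_i = c_i, w_n = c_n

A0 = np.eye(n, dtype=complex); A0[0, 0] = -v[0]
Bn = np.eye(n, dtype=complex); Bn[-1, -1] = w[-1]
Ai = [core(n, i, np.array([[0, 1], [1, -v[i]]])) for i in range(1, n)]
Bi = [core(n, i, np.array([[w[i - 1], 1], [1, 0]])) for i in range(1, n)]
Gi = [core(n, i, np.array([[0, 1], [1, 0]])) for i in range(1, n)]

sigma = rng.permutation(n - 1)               # permutation of the indices 1, ..., n-1
A_sigma = np.linalg.multi_dot([A0] + [Ai[j] for j in sigma])
B_sigma = np.linalg.multi_dot([Bn] + [Bi[j] for j in sigma[::-1]]
                                   + [Gi[j] for j in sigma])
print(np.sort_complex(eig(A_sigma, B_sigma, right=False)))
print(np.sort_complex(np.roots(c[::-1])))
```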

Remark 3.2. We would like to emphasize that in Fiedler factorizations it is possible to reposition the A_0 term, so that it can appear anywhere in the factorization. In the pencil case, however, this is in general not possible. Neither A_0 nor B_n can be moved freely, except in some specific configurations, of which the Fiedler factorization with all B_i equal to the identity is one.

Let us now look at some examples of different shapes for companion pencils.

Example 3.3. Let p(z) = 2z^4 + (sin(t) + cos(t)) z^3 + (3 + i) z^2 - 4z - 2. A standard Hessenberg--upper-triangular pencil (A, B) is obtained, for instance, from the splitting v = (-2, -5, 3, sin(t))^T and w = (1, i, cos(t), 2)^T:

    A = \begin{bmatrix} 0 & & & 2 \\ 1 & 0 & & 5 \\ & 1 & 0 & -3 \\ & & 1 & -\sin(t) \end{bmatrix}    and    B = \begin{bmatrix} 1 & & & 1 \\ & 1 & & i \\ & & 1 & \cos(t) \\ & & & 2 \end{bmatrix}.

Another option would be a CMV-like ordering of the core transformations, e.g., A_σ = A_0 A_2 A_1 A_3, with the B_i and G_i factors reordered accordingly as in Theorem 3.1. The advantage is that A_σ is pentadiagonal and B_σ the product of a pentadiagonal and a permutation matrix, properties that can be exploited when developing QR or QZ algorithms.

Example 3.4. In this example we will demonstrate the effect of the permutation σ. Let p(z) = c_8 z^8 + c_7 z^7 + ... + c_1 z + c_0 and

    σ = \begin{pmatrix} 1 & 2 & 3 & 4 & 5 & 6 & 7 \\ 1 & 5 & 4 & 3 & 2 & 6 & 7 \end{pmatrix}.

Then we have

    A_σ = A_0 A_1 A_5 A_4 A_3 A_2 A_6 A_7    and    B_σ = B_8 B_7 B_6 B_2 B_3 B_4 B_5 B_1 G_1 G_5 G_4 G_3 G_2 G_6 G_7.

When the factors are drawn in a stacked fashion, each represented only by its active part, we see that the staircase shape formed by the A factors reappears in the G factors, but shows up mirrored in the B factors. In this stacking notation one should replace each active matrix part by a full matrix; the notation is unambiguous, as two blocks positioned on top of each other commute, so their ordering does not play any role.

Remark 3.5. We focused on a specific factorization linked to the B-matrix. However, the matrix B could also be factored into two sequences of core transformations in a different way (e.g., for n = 5, with the factors carrying the weights w_1, ..., w_5 and the swap factors arranged in another order). The factorization differs, but results similar to Theorem 3.1 can be proved.

4. Factoring companion pencils and product eigenvalue problems

Even though Theorem 2.1 is probably the most useful in practice, it is still only a special case of a general product setting, where the polynomial rootfinding problem is written as a product eigenvalue problem [13].

Theorem 4.1. Consider p(z), a polynomial with complex coefficients p(z) = c_n z^n + c_{n-1} z^{n-1} + ... + c_1 z + c_0. Define A, B, and F^(k) (for k = 1, ..., m) based on the n-tuples v, w, and f^(k) with

    v_1 = c_0,    v_i = 0  for i = 2, ..., n,
    w_n = c_n,    w_i = 0  for i = 1, ..., n-1,
    f_n^(k) = 0,  and  \sum_{k=1}^{m} f_i^(k) = c_i  for i = 1, ..., n-1,

as follows:

    A = Z - v e_n^T = \begin{bmatrix} 0 & & & -c_0 \\ 1 & 0 & & \\ & \ddots & \ddots & \\ & & 1 & 0 \end{bmatrix},    B = Z^H Z + w e_n^T = \begin{bmatrix} 1 & & & \\ & \ddots & & \\ & & 1 & \\ & & & c_n \end{bmatrix},

and

    F^(k) = I - f^(k) e_n^T = \begin{bmatrix} 1 & & & -f_1^(k) \\ & \ddots & & \vdots \\ & & 1 & -f_{n-1}^(k) \\ & & & 1 \end{bmatrix},

with Z the downshift matrix. Then the product A (\prod_{k=1}^{m} F^(k)) B^{-1} equals the companion matrix associated with the pencil (2), i.e., the companion matrix of the monic polynomial p(z)/c_n.

Proof. Straightforward multiplication of the elimination matrices provides the result.

The factors F^(k) commute, and thus the product notation is unambiguous. Theorem 2.1 is a simple consequence of Theorem 4.1. The inverse of F^(k) is again an elimination matrix, where only a sign change of the elements determined by f^(k) needs to be effected; as a result, we have for all l < m that the pencil

    ( A \prod_{k=1}^{l} F^(k),  B \prod_{k=l+1}^{m} (F^(k))^{-1} )

shares the eigenvalues of the companion matrix.
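The factored form is again easy to verify numerically. A minimal sketch (assuming NumPy; the splitting of the middle coefficients over m = 3 factors is an arbitrary choice of ours, and the statement checked is the reconstruction of Theorem 4.1 given above) compares A (\prod_k F^(k)) B^{-1} with the monic companion matrix:

```python
# Sketch of Theorem 4.1: distribute the middle coefficients over m
# elimination factors F^(k) and compare A (prod_k F^(k)) B^{-1} with the
# companion matrix of the monic polynomial p(z)/c_n.
import numpy as np

rng = np.random.default_rng(2)
c = rng.standard_normal(6); c[-1] = 2.0                 # c_0, ..., c_5
n, m = len(c) - 1, 3

A = np.diag(np.ones(n - 1), -1); A[0, -1] = -c[0]       # A = Z - v e_n^T, v = c_0 e_1
B = np.eye(n); B[-1, -1] = c[-1]                        # B = Z^H Z + w e_n^T, w = c_n e_n

parts = rng.standard_normal((m, n - 1))
parts[-1] = c[1:n] - parts[:-1].sum(axis=0)             # sum_k f_i^(k) = c_i
Fs = []
for k in range(m):
    F = np.eye(n)
    F[:-1, -1] = -parts[k]                              # F^(k) = I - f^(k) e_n^T, f_n^(k) = 0
    Fs.append(F)

prod = np.linalg.multi_dot([A] + Fs + [np.linalg.inv(B)])
Cmonic = np.diag(np.ones(n - 1), -1); Cmonic[:, -1] = -c[:-1] / c[-1]
print(np.allclose(prod, Cmonic))                        # True
```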

Moreover, instead of explicitly computing the product, one could also use the factorization to retrieve the eigenvalues of the pencil.

Remark 4.2. Each of the factors F^(k) can be written in terms of its Fiedler factorization as proposed in Section 3. Moreover, Theorem 3.1 can be generalized to the product setting, where each of the factors F^(k) obeys the ordering imposed by the permutation σ. The proof proceeds similarly to that of Theorem 3.1; instead of having to move only one matrix B from the right to the left entry in the pencil, one now has to move each of the factors, one by one, to the left.

5. Matrix polynomials

In this section we will generalize the results for classical polynomials to matrix polynomials. Let p(z) = C_n z^n + C_{n-1} z^{n-1} + ... + C_1 z + C_0 be a matrix polynomial having coefficients C_i \in C^{m x m}. Then det p(z) matches the determinant det(zB - A), with

(5)    A = \begin{bmatrix} 0_m & & & & -V_1 \\ I_m & 0_m & & & -V_2 \\ & \ddots & \ddots & & \vdots \\ & & I_m & 0_m & -V_{n-1} \\ & & & I_m & -V_n \end{bmatrix}    and    B = \begin{bmatrix} I_m & & & & W_1 \\ & I_m & & & W_2 \\ & & \ddots & & \vdots \\ & & & I_m & W_{n-1} \\ & & & & W_n \end{bmatrix},

where V_{i+1} + W_i = C_i, V_1 = C_0, W_n = C_n, I_m equals the identity matrix of size m x m, and 0_m stands for the zero matrix of size m x m. In the following corollary it is shown that the additional freedom in the splitting can be used to form block matrices A and B with structured blocks.

Corollary 5.1. Let p(z) = C_n z^n + C_{n-1} z^{n-1} + ... + C_1 z + C_0 be a matrix polynomial with coefficients C_i \in C^{m x m}. Let further C_0 be skew-Hermitian (C_0^H = -C_0) and C_n Hermitian (C_n = C_n^H). Then there are V = [V_1 V_2 ... V_n]^T and W = [W_1 W_2 ... W_n]^T as in (5), so that all V_i are skew-Hermitian and all W_i are Hermitian.

Proof. V_1 is skew-Hermitian and W_n is Hermitian. For all other coefficients C_i we can use the splitting V_{i+1} = (C_i - C_i^H)/2 and W_i = (C_i + C_i^H)/2.

In Section 4 we already showed that we can distribute the coefficients over many more factors to obtain a product eigenvalue problem. If C_0 and C_n are of special form, one can even obtain a matrix product with structured matrices.

Corollary 5.2. Let p(z) = C_n z^n + C_{n-1} z^{n-1} + ... + C_1 z + C_0 be a matrix polynomial with coefficients C_i \in C^{m x m}, C_0 unitary-plus-rank-1, and C_n = I_m. Then there exist m upper-triangular and unitary-plus-rank-1 matrices F^(k) and a unitary-plus-rank-1 matrix A, so that the eigenvalues of the matrix polynomial coincide with those of the product eigenvalue problem A (\prod_{k=1}^{m} F^(k)) B^{-1}.

Proof. Adapt the notation from Theorem 4.1 to fit matrix polynomials as in (5), letting V_i and W_i be as in (5) and the F_i^(k) of Theorem 4.1 be matrices. The desired property holds for the m(n-1) matrices F_i^(k), with (k = 1, ..., m; i = 1, ..., n-1; and s and t general indices)

    F_i^(k)(s, t) = C_i(s, t) if t = k, and 0 if t != k,

as every F^(k) takes one column out of the coefficient matrices.
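As an illustration of the block pencil (5) and of the splitting in Corollary 5.1, the following sketch (assuming NumPy and SciPy; the helper pencil, the sizes, and the coefficients are arbitrary choices of ours) builds A and B with skew-Hermitian V_i and Hermitian W_i and compares the pencil eigenvalues with those of the classical block companion pencil:

```python
# Sketch of the block pencil (5) with the splitting of Corollary 5.1:
# C_0 skew-Hermitian, C_n Hermitian, all V_i skew-Hermitian, all W_i Hermitian.
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(3)
m, n = 2, 3                                   # block size and degree
C = [rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
     for _ in range(n + 1)]
C[0] = C[0] - C[0].conj().T                   # make C_0 skew-Hermitian
C[n] = C[n] + C[n].conj().T                   # make C_n Hermitian

V = [C[0]] + [(C[i] - C[i].conj().T) / 2 for i in range(1, n)]   # skew-Hermitian
W = [(C[i] + C[i].conj().T) / 2 for i in range(1, n)] + [C[n]]   # Hermitian

def pencil(last_col_A, last_col_B):
    """Block pencil: I_m on A's block subdiagonal, I_m on B's block diagonal
    (except the last block), given blocks in the last block columns."""
    A = np.zeros((m * n, m * n), dtype=complex)
    B = np.zeros((m * n, m * n), dtype=complex)
    for i in range(n - 1):
        A[m * (i + 1):m * (i + 2), m * i:m * (i + 1)] = np.eye(m)
        B[m * i:m * (i + 1), m * i:m * (i + 1)] = np.eye(m)
    for i in range(n):
        A[m * i:m * (i + 1), -m:] = -last_col_A[i]
        B[m * i:m * (i + 1), -m:] += last_col_B[i]
    return A, B

A, B = pencil(V, W)                                              # pencil (5)
A2, B2 = pencil(C[:n], [np.zeros((m, m))] * (n - 1) + [C[n]])    # classical pencil
print(np.sort_complex(eig(A, B, right=False)))
print(np.sort_complex(eig(A2, B2, right=False)))                 # same eigenvalues
```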

6. Conclusions & Future work

A generalization of Fiedler's factorization of companion matrices to companion pencils was presented. It was shown that the pencil approach can be seen as a specific case of a product eigenvalue problem, and it was noted that all results are applicable to the matrix polynomial case. Forthcoming investigations focus on exploiting these splittings in rootsolvers based on companion pencils to quickly obtain reliable solutions.

Acknowledgments. We'd like to thank Vanni Noferini and Leonardo Robol for the fruitful discussions on the conditioning of these pencils during their visit to the Department of Computer Science, November 2014. Furthermore we'd like to express our gratitude to the referees, who pinpointed some shortcomings and errors; their careful reading has significantly improved the presentation and readability.

References

1. E. N. Antoniou and S. Vologiannidis, A new family of companion forms of polynomial matrices, Electronic Journal of Linear Algebra 11 (2004), 78-87.
2. E. N. Antoniou, S. Vologiannidis, and N. Karampetakis, Linearizations of polynomial matrices with symmetries and their applications, Proceedings of the 2005 IEEE International Symposium on Intelligent Control / Mediterranean Conference on Control and Automation, June 2005, pp. 59-63.
3. R. Bevilacqua, G. M. Del Corso, and L. Gemignani, A CMV-based eigensolver for companion matrices, arXiv preprint arXiv:1406.2820 (2014).
4. P. Boito, Y. Eidelman, and L. Gemignani, Implicit QR for rank-structured matrix pencils, BIT 54 (2013), 85-111.
5. ______, Implicit QR for companion-like pencils, arXiv preprint (2014).
6. F. De Terán, F. M. Dopico, and J. Pérez, Backward stability of polynomial root-finding using Fiedler companion matrices, IMA Journal of Numerical Analysis (2014).
7. B. Eastman, I.-J. Kim, B. L. Shader, and K. N. Vander Meulen, Companion matrix patterns, Linear Algebra and its Applications 463 (2014), 255-272.
8. A. Edelman and H. Murakami, Polynomial roots from companion matrix eigenvalues, Mathematics of Computation 64 (1995), no. 210, 763-776.
9. M. Fiedler, A note on companion matrices, Linear Algebra and its Applications 372 (2003), 325-331.
10. G. F. Jónsson and S. Vavasis, Solving polynomials with small leading coefficients, SIAM Journal on Matrix Analysis and Applications 26 (2004), no. 2, 400-414.
11. D. S. Mackey, N. Mackey, C. Mehl, and V. Mehrmann, Vector spaces of linearizations for matrix polynomials, SIAM Journal on Matrix Analysis and Applications 28 (2006), no. 4, 971-1004.
12. P. Van Dooren and P. Dewilde, The eigenstructure of an arbitrary polynomial matrix: Computational aspects, Linear Algebra and its Applications 50 (1983), 545-579.
13. D. S. Watkins, Product eigenvalue problems, SIAM Review 47 (2005), no. 1, 3-40.

Mathematical Institute, University of Oxford, Andrew Wiles Building, Woodstock Road, OX2 6GG Oxford, UK
E-mail address: aurentz@maths.ox.ac.uk
URL: http://www.maths.ox.ac.uk/people/jared.aurentz

Department of Computer Science, KU Leuven, Celestijnenlaan 200A, 3001 Leuven (Heverlee), Belgium
E-mail address: Thomas.Mach@cs.kuleuven.be
URL: http://people.cs.kuleuven.be/thomas.mach/

Department of Computer Science, KU Leuven, Celestijnenlaan 200A, 3001 Leuven (Heverlee), Belgium
E-mail address: Raf.Vandebril@cs.kuleuven.be
URL: http://people.cs.kuleuven.be/raf.vandebril/

Department of Mathematics, Washington State University, Pullman, WA 99164-3113, USA
E-mail address: watkins@math.wsu.edu
URL: http://www.math.wsu.edu/faculty/watkins/