A Framework for Structured Linearizations of Matrix Polynomials in Various Bases


A Framework for Structured Linearizations of Matrix Polynomials in Various Bases. Leonardo Robol <leonardo.robol@cs.kuleuven.be>. Joint work with Raf Vandebril and Paul Van Dooren (KU Leuven and Université Catholique de Louvain). 9 May 2016, Structured Matrix Days, Limoges.

Introduction / Linearizations, dual bases and structures: I will talk about different mathematical objects that play very well together:
- Linearizations of (matrix) polynomials.
- Dual (minimal) bases, related to linear system theory.
- How to take structures from polynomials and preserve them in linearizations (spoiler alert: using dual bases!).
- How to deal with different bases in the definition of a polynomial.

Introduction / Rational vector spaces: Vector spaces of rational functions. One can consider the vector space F^n(λ) of rational functions in the variable λ:

v(λ) ∈ F^n(λ), v(λ) = [p_1(λ)/q_1(λ), ..., p_n(λ)/q_n(λ)]^T.

F(λ) is a field, so everything is nice. Moreover, every vector can be written as v(λ) = q(λ)^{-1} ṽ(λ) with ṽ(λ) ∈ F^n[λ]. Switching between polynomials and rational functions is useful for theoretical reasons.

Introduction / Rational vector spaces: Bases for rational vector spaces. Let V ⊆ F^n(λ) be an r-dimensional subspace of the vector space over the field of rational functions. We can write a basis for V as A(λ) ∈ F(λ)^{n×r}, such that

f(λ) ∈ V ⇔ f(λ) = A(λ) w_f(λ).

V admits a basis composed only of polynomials. Among these bases, we can look for the ones whose sum of column degrees is minimal. Such a basis is not unique, but its column degrees are. We call these bases minimal, and their column degrees minimal indices.

Introduction / Dual bases: Dual spaces. We can consider the dual space V^⊥, which has dimension n - r:

V^⊥ = { g(λ) ∈ F^n(λ) : g(λ)^T f(λ) = 0 for all f(λ) ∈ V }.

B(λ) is a dual basis to A(λ) if A(λ)^T B(λ) = 0; this is equivalent to saying that B(λ) is a basis of V^⊥. If both A(λ) and B(λ) are minimal, we say that they form a pair of dual minimal bases. For example,

L(λ)^T = [-1, λ; ..., ...; -1, λ], π(λ) = [λ^k, ..., λ, 1]^T, L(λ)^T π(λ) = 0,

form a pair of dual minimal bases (each row of L(λ)^T carries the block [-1, λ] shifted one position to the right).
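As a quick numeric sanity check (my own sketch, not code from the talk), one can evaluate the monomial pair above at a sample point and verify the duality relation L(λ)^T π(λ) = 0:

```python
import numpy as np

def dual_monomial_pair(k, lam):
    """Evaluate L(lam)^T (k x (k+1), rows [-1, lam]) and
    pi(lam) = [lam^k, ..., lam, 1]^T for the monomial basis."""
    LT = np.zeros((k, k + 1))
    for j in range(k):
        LT[j, j], LT[j, j + 1] = -1.0, lam
    pi = lam ** np.arange(k, -1, -1, dtype=float)
    return LT, pi

LT, pi = dual_monomial_pair(4, 0.7)
# row j reads -lam^(k-j) + lam * lam^(k-j-1) = 0
print(np.allclose(LT @ pi, 0))  # -> True
```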

Introduction / Dual bases: Properties of dual minimal bases. Let A(λ)^T B(λ) = 0 with A(λ), B(λ) a pair of dual minimal bases. The sums of the column degrees of a minimal basis and of a minimal basis of the dual space always coincide:

Σ_{i=1}^{r} deg A(λ)e_i = Σ_{i=1}^{n-r} deg B(λ)e_i.

Pairs of minimal bases behave well under perturbations:
- Minimality is preserved for small enough changes.
- Small perturbations of A(λ) correspond to small perturbations of the dual space (in an appropriate sense).
These facts are crucial in proving backward stability results.

Linearizations / Basic examples: Building linearizations. We are interested in computing the eigenvalues and eigenstructure of a matrix polynomial P(λ):

P(λ) = Σ_{i=0}^{n} P_i λ^i, or, more generally, P(λ) = Σ_{i=0}^{n} P_i φ_i(λ),

where Φ = {φ_0(λ), ..., φ_n(λ)} is some polynomial basis (monomials, Chebyshev, Lagrange, Newton, ...). We look for a pencil L(λ) with the same spectral properties. We say that L(λ) is a linearization of P(λ) if and only if

diag(P(λ), I_{m(n-1)}) = E(λ) L(λ) F(λ), with det E(λ), det F(λ) invertible in F[λ].

Linearizations / Basic examples: From dual bases to linearizations. Given a basis Φ := {φ_0(λ), ..., φ_n(λ)}, we say that L_φ(λ) and π_φ(λ) are the dual bases associated with Φ if

L_φ(λ)^T π_φ(λ) = 0, π_φ(λ) = [φ_n(λ), ..., φ_0(λ)]^T.

Theorem. If L_φ(λ) and π_φ(λ) are dual bases associated with Φ, then

L(λ) := [W(λ)^T; L_φ(λ)^T ⊗ I_m], W(λ)^T = [W_n(λ), ..., W_0(λ)],

linearizes P(λ) := W(λ)^T (π_φ(λ) ⊗ I_m) = Σ_{i=0}^{n} W_i(λ) φ_i(λ).

Linearizations / Basic examples: Almost a proof. We check what the eigenvalues of such a pencil are:

L(λ)v = 0 ⇔ W(λ)^T v = 0 and (L_φ(λ)^T ⊗ I_m) v = 0.

Since v is in the dual of L_φ(λ), we can write it as v = (π_φ(λ) ⊗ I_m) w_v with w_v ≠ 0, so that

W(λ)^T v = 0 ⇔ W(λ)^T (π_φ(λ) ⊗ I_m) w_v = 0 ⇔ P(λ) w_v = 0.

Basic idea: the eigenvector must lie in the dual of L_φ(λ), and thus has the correct structure.

Linearizations / Basic examples: A well-known example. One can consider the pair of dual minimal bases

L(λ)^T = [-1, λ; ..., ...; -1, λ], π(λ) = [λ^{n-1}, ..., λ, 1]^T,

so that

L(λ) = [λp_n + p_{n-1}, p_{n-2}, ..., p_1, p_0; -1, λ; ..., ...; -1, λ]

linearizes p(λ) := Σ_{i=0}^{n} p_i λ^i, since p(λ) = [λp_n + p_{n-1}, p_{n-2}, ..., p_0] π(λ).
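As an illustration (my own sketch, with the [-1, λ] sign convention used above; other sign choices work equally well), the eigenvalues of this pencil can be checked against the roots of p:

```python
import numpy as np

p = [1.0, 0.0, -7.0, 6.0]      # p_3, ..., p_0; the roots are 1, 2, -3
n = len(p) - 1

# L(lam) = lam*M1 + M0: first row [lam*p_n + p_{n-1}, p_{n-2}, ..., p_0],
# rows [-1, lam] below.
M1 = np.eye(n)
M1[0, 0] = p[0]
M0 = np.zeros((n, n))
M0[0, :] = p[1:]
for j in range(1, n):
    M0[j, j - 1] = -1.0

# det(lam*M1 + M0) = 0  <=>  lam is an eigenvalue of -M1^{-1} M0
eigs = np.linalg.eigvals(np.linalg.solve(M1, -M0))
print(np.allclose(np.sort(eigs.real), [-3.0, 1.0, 2.0]))  # -> True
```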

Linearizations / Basic examples: Other well-known examples. The previous approach allows for linearizations in arbitrary bases! For example,

L(λ) = [2λp_n + p_{n-1}, p_{n-2} - p_n, p_{n-3}, ..., p_1, p_0; -1, 2λ, -1; ..., ..., ...; -1, 2λ, -1; -1, λ]

is a linearization for a polynomial expressed in the Chebyshev basis of the first kind,

p(λ) = Σ_{i=0}^{n} p_i T_i(λ).

Already in [Amiraslani, Corless, Lancaster, 2009]: all the orthogonal bases, and also Newton, Lagrange, and Hermite, are possible.
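A numeric check of the Chebyshev pencil (my own sketch; the 2λp_n normalization in the first row is the one consistent with the recurrence rows below it). The computed eigenvalues are verified to be roots of p in the Chebyshev basis, with no conversion to monomials:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

c = np.array([0.5, -1.0, 0.25, 1.0])   # p(lam) = sum_i c[i] * T_i(lam)
n = len(c) - 1

# First row: [2*lam*c_n + c_{n-1}, c_{n-2} - c_n, c_{n-3}, ..., c_0];
# middle rows [-1, 2*lam, -1] (three-term recurrence), last row [-1, lam].
M1 = np.zeros((n, n))
M0 = np.zeros((n, n))
M1[0, 0] = 2 * c[n]
M0[0, 0] = c[n - 1]
M0[0, 1] = c[n - 2] - c[n]
M0[0, 2:] = c[n - 3::-1]
for j in range(1, n - 1):
    M0[j, j - 1], M1[j, j], M0[j, j + 1] = -1.0, 2.0, -1.0
M0[n - 1, n - 2], M1[n - 1, n - 1] = -1.0, 1.0

eigs = np.linalg.eigvals(np.linalg.solve(M1, -M0))
# the eigenvalues annihilate p evaluated in the Chebyshev basis
print(np.allclose(C.chebval(eigs, c), 0, atol=1e-8))  # -> True
```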

Linearizations / Basic examples: Just to get the idea. Assume we have an orthogonal basis φ_i(λ) satisfying the three-term recurrence

α φ_{j+1}(λ) = (λ - β) φ_j(λ) - γ φ_{j-1}(λ), α ≠ 0, j > 0. (1)

Then

L_φ(λ)^T := [α, (β - λ), γ; ..., ..., ...; α, (β - λ), γ; φ_0(λ), -φ_1(λ)]

is a basis dual to π_φ(λ) (the last row handles the degree-one relation between φ_1 and φ_0).

Linearizations / Basic examples: Using two dual bases at the same time. The construction of linearizations can be generalized to the case where we have two pairs of dual minimal bases. Let Φ, Ψ be two polynomial bases, and L_φ(λ), π_φ(λ) and L_ψ(λ), π_ψ(λ) the corresponding dual minimal bases.

Theorem. The matrix pencil

L(λ) := [λM_1 + M_0, L_φ(λ); L_ψ(λ)^T, 0]

is a linearization of the polynomial p(λ) := π_φ(λ)^T (λM_1 + M_0) π_ψ(λ).

Linearizations / Fiedler and more bases: Relation with Fiedler linearizations. The family of linearizations that can be built using the previous result is very large:
- The Frobenius linearizations (the classical companion forms) are a particular case.
- Every Fiedler pencil/companion is just a permutation of a pencil in the previous form.
- However, many more linearizations fit in this class!

Linearizations / Fiedler and more bases: Equivalence to Fiedler linearizations. Consider p(λ) = p_4λ^4 + p_3λ^3 + p_2λ^2 + p_1λ + p_0. A Fiedler linearization looks like:

λB - A = [λp_4 + p_3, -1, 0, 0; p_2, λ, p_1, p_0; -1, 0, λ, 0; 0, 0, -1, λ].

Permuting the above pencil in a good way (here, moving the second column to the last position) yields

[λp_4 + p_3, 0, 0, -1; p_2, p_1, p_0, λ; -1, λ, 0, 0; 0, -1, λ, 0] = [λM_1 + M_0, L_φ(λ); L_ψ(λ)^T, 0].

Linearizations / Fiedler and more bases: Exploiting the extra freedom. According to the previous Theorem, we can write the linearized polynomial:

[λp_4 + p_3, 0, 0, -1; p_2, p_1, p_0, λ; -1, λ, 0, 0; 0, -1, λ, 0] linearizes [λ, 1] [λp_4 + p_3, 0, 0; p_2, p_1, p_0] [λ^2; λ; 1].

The latter product corresponds to the sum of all the entries of the Hadamard product

[λp_4 + p_3, 0, 0; p_2, p_1, p_0] ∘ [λ^3, λ^2, λ; λ^2, λ, 1],

which clearly is p(λ). However, reshuffling is allowed!

Linearizations / Fiedler and more bases: Exploiting the extra freedom. We can get symmetric linearizations for odd-degree polynomials, e.g. for degree 5:

[λp_5 + p_4, 0, 0, -1, 0;
 0, λp_3 + p_2, 0, λ, -1;
 0, 0, λp_1 + p_0, 0, λ;
 -1, λ, 0, 0, 0;
 0, -1, λ, 0, 0].

This trivially extends to block-symmetric linearizations for matrix polynomials, which are symmetric whenever the polynomial is.

Main underlying idea: construct the two bases L_φ(λ) and L_ψ(λ) so that they have the same symmetry as the matrix polynomial, and you will get a structured linearization.
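This degree-5 pencil is easy to verify numerically (my own sketch, using the sign conventions above and an arbitrary test polynomial): both matrices of L(λ) = λA + B come out symmetric, and the eigenvalues are the roots of p:

```python
import numpy as np

p = [1.0, -3.0, -5.0, 15.0, 4.0, -12.0]   # p_5..p_0, roots 1, -1, 2, -2, 3
A = np.zeros((5, 5))
B = np.zeros((5, 5))
for i in range(3):                          # diagonal entries lam*p_odd + p_even
    A[i, i], B[i, i] = p[2 * i], p[2 * i + 1]
for i in range(2):                          # dual-basis blocks, mirrored
    B[3 + i, i] = B[i, 3 + i] = -1.0
    A[3 + i, i + 1] = A[i + 1, 3 + i] = 1.0

assert np.allclose(A, A.T) and np.allclose(B, B.T)   # symmetric pencil
eigs = np.linalg.eigvals(np.linalg.solve(A, -B))
print(np.allclose(np.sort(eigs.real), [-2.0, -1.0, 1.0, 2.0, 3.0]))  # -> True
```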

Preserving structures / Symmetries: Why do we care about symmetries?
- They are beautiful.
- They are fun to work with.
But since we need to get research funded, we also notice that:
- They are naturally present in many engineering applications.
- Preserving symmetries in the linearization means preserving the spectral symmetries as well.
- Structured methods can give us approximations to the eigenvalues that make sense in the underlying model and have a lower condition number (structured vs. unstructured).
- We get better results in the end.
Famous example: the simulation of vibrations of high-speed trains leads to palindromic matrix polynomials.

Preserving structures / Symmetries: Matrix and scalar polynomials. Most of the things shown here for scalar polynomials are also true (and useful) for matrix polynomials. Just take some identity matrices I_m and attach them, via Kronecker products, to (almost) every matrix you can see; that is,

[λM_1 + M_0, L_φ(λ); L_ψ(λ)^T, 0] and [λM_1 + M_0, L_φ(λ) ⊗ I_m; L_ψ(λ)^T ⊗ I_m, 0]

are linearizations of

π_φ(λ)^T (λM_1 + M_0) π_ψ(λ) and (π_φ(λ)^T ⊗ I_m)(λM_1 + M_0)(π_ψ(λ) ⊗ I_m),

respectively. I will switch back and forth between the two notations.

Preserving structures / Even odd case: Even/odd matrix polynomials. Another common structure found in applications is that of ⋆-even/odd polynomials, where ⋆ ∈ {T, ∗, 1}:

P(λ)^⋆ = P(-λ) (even), P(λ)^⋆ = -P(-λ) (odd).

The symmetries are reflected in the spectrum:

λ eigenvalue ⇔ -λ eigenvalue, for ⋆ ∈ {T, 1};
λ eigenvalue ⇔ -λ̄ eigenvalue, for ⋆ = ∗.

Given such a polynomial, how do we construct a ⋆-even/odd linearization? Let's construct two adapted L(λ)...

Preserving structures / Even odd case: ⋆-even/odd linearizations!

L_φ(λ)^T = [-1, λ; ..., ...; -1, λ], L_ψ(λ)^T = [-1, -λ; ..., ...; -1, -λ].

Notice that L_φ(λ) = L_ψ(-λ), so

L(λ) = [λM_1 + M_0, L_φ(λ) ⊗ I_m; L_ψ(λ)^T ⊗ I_m, 0], M_1 = -M_1^T, M_0 = M_0^T,

is a T-even linearization for the matrix polynomial

P(λ) = (π_φ(λ)^T ⊗ I_m)(λM_1 + M_0)(π_ψ(λ) ⊗ I_m).

Preserving structures / Even odd case: The final step. We can easily check that

π_φ(λ) = [λ^{n-1}, ..., λ, 1]^T, π_ψ(λ) = [(-1)^{n-1} λ^{n-1}, ..., -λ, 1]^T,

and that a reasonable choice for λM_1 + M_0 is given by the block diagonal

λM_1 + M_0 = diag((-1)^{n-1}(λP_{2n-1} + P_{2n-2}), ..., λP_1 + P_0).

If P(λ) is ⋆-even then λM_1 + M_0 is also ⋆-even, and so we have built a structured linearization.
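The claimed spectral symmetry is easy to observe on a generic T-even pencil (my own illustration, not the talk's construction): take λE + F with E skew-symmetric and F symmetric, so that L(λ)^T = L(-λ):

```python
import numpy as np

rng = np.random.default_rng(0)
E = np.zeros((4, 4))                 # an invertible skew-symmetric matrix
E[:2, 2:] = np.eye(2)
E[2:, :2] = -np.eye(2)
Y = rng.standard_normal((4, 4))
F = Y + Y.T                          # symmetric

# (lam*E + F)^T = -lam*E + F, so det(lam*E + F) vanishes at lam and -lam together
eigs = np.linalg.eigvals(np.linalg.solve(E, -F))
pair_err = max(min(abs(l + m) for m in eigs) for l in eigs)
print(pair_err < 1e-8)               # -> True: eigenvalues come in +/- pairs
```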

Preserving structures / Palindromic linearizations: Palindromic matrix polynomials. Another interesting case:

P(λ)^⋆ = rev P(λ) (palindromic), P(λ)^⋆ = -rev P(λ) (anti-palindromic),

where rev P(λ) := λ^{deg P} P(λ^{-1}). This induces the spectral symmetry

λ eigenvalue ⇔ λ^{-1} eigenvalue, for ⋆ ∈ {T, 1};
λ eigenvalue ⇔ λ̄^{-1} eigenvalue, for ⋆ = ∗.

The ⋆ = 1 case is not interesting, since it is impossible to build structured linearizations. In the other cases, however, we can use more or less the same trick as before.

Preserving structures / Palindromic linearizations: Building palindromic dual bases.

L_φ(λ)^T = [-1, λ; ..., ...; -1, λ], L_ψ(λ)^T = [λ, -1; ..., ...; λ, -1].

We follow the now-usual approach, and we have

π_φ(λ) = [λ^{n-1}, ..., λ, 1]^T, π_ψ(λ) = [1, λ, ..., λ^{n-1}]^T.

Preserving structures / Palindromic linearizations: ...and then a palindromic linearization. When building the linearization we have the correct symmetries, since rev L_φ(λ) = L_ψ(λ), and so

L(λ) = [λM + M^⋆, L_φ(λ); L_ψ(λ)^T, 0]

is a palindromic linearization. How do we choose M? Some computations show that

M = [0_m, ..., 0_m, P_0; ⋮, ⋮, ⋮; 0_m, ..., 0_m, P_{n-1}]

(all blocks zero except the last block column) is the right choice for a palindromic polynomial of degree 2n - 1.
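The same kind of check works for the palindromic case (again my own illustration): for a T-palindromic pencil λZ + Z^T we have rev L(λ)^T = L(λ), so the eigenvalues pair up as (λ, 1/λ):

```python
import numpy as np

rng = np.random.default_rng(1)
Z = rng.standard_normal((4, 4))      # generic, hence invertible

# det(mu*Z + Z^T) = 0 implies, by transposing and dividing by mu^4,
# det((1/mu)*Z + Z^T) = 0, so 1/mu is an eigenvalue as well.
eigs = np.linalg.eigvals(np.linalg.solve(Z, -Z.T))
pair_err = max(min(abs(1.0 / l - m) for m in eigs) for l in eigs)
print(pair_err < 1e-8)               # -> True: (lam, 1/lam) pairs
```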

Preserving structures / Palindromic linearizations: Midway conclusions. We are considering a family of pencils that includes the Fiedler linearizations, and we have found structured linearizations which are also possible in the Fiedler framework. However, we build bases which reflect the structure of the problem, instead of fighting with permutations and combinatorics. I think it is much easier to go this way. Can we do other interesting things with this framework? Yes, let's see something completely different!

Intersections of polynomials / Scalar polynomials: Intersection of polynomials. Consider the problem: find the values of λ such that

p_1(λ) = p_2(λ),

with p_1(λ), p_2(λ) polynomials. That is essentially saying: find the roots of p_1(λ) - p_2(λ). However, what can we do if p_1(λ) and p_2(λ) are represented in different bases? Can we still solve the problem easily?
- Convert them to the same basis, and use a companion matrix for that basis.
- Use our framework in a creative way.

Let's recap

    L(λ) = [ λM_1 + M_0   L_φ(λ)
             L_ψ(λ)^T     0      ]

is a linearization for p(λ) := π_φ^T(λ)(λM_1 + M_0)π_ψ(λ).

Let w_φ, w_ψ be constant vectors such that w_φ^T π_φ(λ) = 1 = w_ψ^T π_ψ(λ) (the coordinates of the constant 1 in our bases). If we set λM_1 + M_0 = w_φ p_1^T − p_2 w_ψ^T we have

    p(λ) = p_1^T π_ψ(λ) − π_φ^T(λ) p_2 = Σ_{i=0}^{η} p_{1,i} ψ_i(λ) − Σ_{i=0}^{ε} p_{2,i} φ_i(λ),

where η and ε are the number of columns of L_ψ(λ) and L_φ(λ).
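This construction can be checked numerically. The sketch below is my own illustration (the example polynomials, the monomial/Chebyshev basis choice, and the sign placement in the dual-basis blocks are assumptions, not taken from the talk): it assembles the pencil for p_1(λ) = λ² − 3 in the monomial basis and p_2(λ) = T_1(λ) = λ in the Chebyshev basis, then reads the roots of p_1 − p_2 off the finite eigenvalues.

```python
import numpy as np
from scipy.linalg import eig

# Hypothetical example: p1(l) = l^2 - 3 in the monomial basis {1, l, l^2},
# p2(l) = T1(l) = l in the Chebyshev basis {T0, T1, T2}.
p1 = np.array([-3.0, 0.0, 1.0])    # psi-basis (monomial) coefficients
p2 = np.array([0.0, 1.0, 0.0])     # phi-basis (Chebyshev) coefficients

# Dual-basis blocks L(l) = l*X + Y, written transposed (2 x 3), each
# annihilating the corresponding basis vector pi(l).
X_psi = np.array([[1.0, 0.0, 0.0],    # l*1 - l   = 0
                  [0.0, 1.0, 0.0]])   # l*l - l^2 = 0
Y_psi = np.array([[0.0, -1.0, 0.0],
                  [0.0, 0.0, -1.0]])
X_phi = np.array([[1.0, 0.0, 0.0],    # l*T0   - T1      = 0
                  [0.0, 2.0, 0.0]])   # 2l*T1  - T0 - T2 = 0
Y_phi = np.array([[0.0, -1.0, 0.0],
                  [-1.0, 0.0, -1.0]])

# Constant block M0 = w_phi p1^T - p2 w_psi^T, with w = e0 in both bases
# (since T0 = 1 and l^0 = 1); M1 = 0, so the pencil is L(l) = l*B + A.
e0 = np.array([1.0, 0.0, 0.0])
M0 = np.outer(e0, p1) - np.outer(p2, e0)

A = np.block([[M0, Y_phi.T], [Y_psi, np.zeros((2, 2))]])
B = np.block([[np.zeros((3, 3)), X_phi.T], [X_psi, np.zeros((2, 2))]])

# l*B*v + A*v = 0  <=>  A*v = l*(-B)*v; the grade/degree mismatch produces
# infinite eigenvalues, which we simply filter out here.
vals = eig(A, -B, right=False)
finite = np.sort(np.real(vals[np.abs(vals) < 1e3]))
print(finite)  # roots of p1 - p2 = l^2 - l - 3, i.e. (1 +/- sqrt(13))/2
```

No basis conversion happens anywhere: the coefficient vectors p_1 and p_2 enter the pencil as they are given.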

What is the advantage?

We have linearized a difference of polynomials expressed in different bases directly: no computations are needed. The linearization directly contains the coefficients of the two polynomials, thus avoiding the potentially ill-conditioned change of basis.

However, L(λ) is a linearization for a polynomial of grade ε + η, while the actual degree is only max{ε, η}. This gives us many infinite eigenvalues.

Infinite eigenvalues

Infinite eigenvalues can be deflated a posteriori, and we can easily characterize their complete eigenstructure: they form a very long Jordan chain at infinity, which is very badly conditioned. However, the other eigenvalues can be perfectly conditioned.

We can also deflate the infinite eigenstructure by the staircase algorithm (since we know the eigenstructure in advance, no rank decisions are needed, and the approach is perfectly stable).

Numerical accuracy

I claimed that this approach is more stable than conversion to a common basis. Is this true?

[Figure: norm of absolute errors (from 10^−17 up to 10^10) versus degree (10^1 to 10^3), comparing conversion to the Monomial and Chebyshev bases with the Linearization and Lin + Deflation approaches.]

Intersections of polynomials: Going to rational functions

Extensions to rational functions

The same idea can be applied to a slightly more general problem. Find the solutions of

    f(λ) := p(λ)/q(λ) + r(λ)/s(λ) = 0,

with p(λ), q(λ), r(λ), and s(λ) polynomials. This is equivalent to finding the roots of t(λ) := p(λ)s(λ) + q(λ)r(λ).

A linearization for rational functions

Similarly to the previous case,

    L(λ) = [ p s^T + q r^T   L_φ(λ)
             L_ψ(λ)^T        0      ]

is a linearization for t(λ). Its eigenvalues are the solutions of

    f(λ) := p(λ)/q(λ) + r(λ)/s(λ) = 0,

where p(λ) and q(λ) are expressed in the Φ basis, and r(λ) and s(λ) in the Ψ basis.
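As a small numerical sketch of this pencil (the rational function, the basis choices, and the sign conventions below are my own assumptions for illustration): take f(λ) = (λ − 2)/1 + 1/(1 + λ), with p, q in the monomial basis and r, s in the Chebyshev basis, so that t(λ) = (λ − 2)(1 + λ) + 1 = λ² − λ − 1.

```python
import numpy as np
from scipy.linalg import eig

# Hypothetical example: f(l) = p/q + r/s with
# p(l) = l - 2, q(l) = 1 in the monomial basis {1, l}, and
# r(l) = T0 = 1, s(l) = T0 + T1 = 1 + l in the Chebyshev basis {T0, T1}.
p = np.array([-2.0, 1.0]); q = np.array([1.0, 0.0])   # Phi coefficients
r = np.array([1.0, 0.0]);  s = np.array([1.0, 1.0])   # Psi coefficients

# Rank-2 constant top-left block p s^T + q r^T (no lambda term in it).
M0 = np.outer(p, s) + np.outer(q, r)

# 1 x 2 dual-basis blocks, l*X + Y: here l*1 - l = 0 (monomial) and
# l*T0 - T1 = 0 (Chebyshev) happen to give the same [l, -1] row.
X = np.array([[1.0, 0.0]]); Y = np.array([[0.0, -1.0]])

A = np.block([[M0, Y.T], [Y, np.zeros((1, 1))]])
B = np.block([[np.zeros((2, 2)), X.T], [X, np.zeros((1, 1))]])

# The 3 x 3 pencil has one simple infinite eigenvalue; the finite ones
# are the roots of t(l) = p*s + q*r = l^2 - l - 1.
vals = eig(A, -B, right=False)
finite = np.sort(np.real(vals[np.abs(vals) < 1e3]))
print(finite)  # the golden-ratio pair (1 +/- sqrt(5))/2
```

Note how the degree-1/degree-1 example exhibits exactly the single simple infinite eigenvalue promised two slides below.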

Easy proof

We can easily check what L(λ) linearizes:

    t(λ) = π_φ^T(λ)(p s^T + q r^T)π_ψ(λ) = p(λ)s(λ) + q(λ)r(λ),

where p(λ), q(λ) are represented in the basis Φ, and r(λ), s(λ) in the basis Ψ.

Notice that the top-left rank-2 block has no term depending on λ.

Good news

Under mild hypotheses on degree matching we have only one simple infinite eigenvalue. If one of the two bases satisfies the relation π_{ε,φ}(λ) = (λA + B)π_{ε−1,φ}(λ), we can get rid of all the infinite eigenvalues by slightly adjusting the construction.

Hint: any basis you can think of satisfies the above property (monomial, all orthogonal, Newton, Lagrange, and Hermite bases are all fine).

Another way to write p(λ)...

    L(λ) = [ M(λ)       L_φ(λ)
             L_ψ(λ)^T   0      ]

linearizes Σ_{i,j} M_{i,j}(λ) ψ_i(λ) φ_j(λ). We can see this as a linearization in the product family

    ψ ⊗ φ := { φ_i(λ)ψ_j(λ) : i = 1, ..., ε, j = 1, ..., η },

which is not a basis in general. However, when φ_i and ψ_j have degrees i and j, this is a generating family for the polynomials of degree up to ε + η, and we can exploit the redundancy in a good way.

Intersections of polynomials: More and more bases!

... which leads to a natural question

Can we handle more than two bases? Given polynomial bases φ^(1), ..., φ^(j), I can construct their product φ^(1) ⊗ ... ⊗ φ^(j). We know how to handle j = 1 (the classical companion case), and also j = 2 (what we have seen until now). What about going higher?

Enlarged dual bases

We can deal with this by constructing enlarged dual bases:

    π_{φ⊗ψ}(λ) := [ φ_0(λ)ψ_0(λ), ..., φ_i(λ)ψ_j(λ), ..., φ_ε(λ)ψ_η(λ) ]^T.

Can we build a suitable L(λ) such that L(λ)^T π_{φ⊗ψ}(λ) = 0? Yes!

Product dual bases

The rows of

    L_{k,φ⊗ψ}(λ)^T = A [ I ⊗ L_{η,ψ}(λ)^T
                         L_{ε,φ}(λ)^T ⊗ w^T ],    k := (ε + 1)(η + 1) − 1,

form a dual basis together with π_{φ⊗ψ}(λ), where A is any invertible matrix and w is any vector such that w^T π_{η,ψ}(λ) is a nonzero constant.

We can just plug this ingredient into our framework and obtain the desired linearizations!
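The duality above reduces to two Kronecker identities: (I ⊗ L_ψ^T)(π_φ ⊗ π_ψ) = π_φ ⊗ (L_ψ^T π_ψ) = 0 and (L_φ^T ⊗ w^T)(π_φ ⊗ π_ψ) = (L_φ^T π_φ)(w^T π_ψ) = 0. The sketch below verifies this numerically under my own assumptions (Kronecker ordering of π_{φ⊗ψ} with ψ varying fastest, monomial φ and Chebyshev ψ with ε = η = 2, A = I, w = e_0):

```python
import numpy as np

def pi_phi(lam):                     # monomial basis {1, l, l^2}
    return np.array([1.0, lam, lam**2])

def pi_psi(lam):                     # Chebyshev basis {T0, T1, T2}
    return np.array([1.0, lam, 2 * lam**2 - 1])

def L_phi_T(lam):                    # 2 x 3 dual block annihilating pi_phi
    return np.array([[lam, -1.0, 0.0], [0.0, lam, -1.0]])

def L_psi_T(lam):                    # 2 x 3 dual block annihilating pi_psi
    return np.array([[lam, -1.0, 0.0], [-1.0, 2 * lam, -1.0]])

w = np.array([1.0, 0.0, 0.0])        # w^T pi_psi(l) = T0 = 1, a nonzero constant

def L_prod_T(lam):                   # k x 9 with k = 3*3 - 1 = 8 (A = I here)
    top = np.kron(np.eye(3), L_psi_T(lam))       # 6 x 9: kills within groups
    bottom = np.kron(L_phi_T(lam), w[None, :])   # 2 x 9: links the groups
    return np.vstack([top, bottom])

for lam in [0.0, 0.7, -1.3]:
    pi = np.kron(pi_phi(lam), pi_psi(lam))       # enlarged dual basis vector
    assert np.allclose(L_prod_T(lam) @ pi, 0.0)
print("L_{k,phi x psi}(l)^T annihilates pi_{phi x psi}(l)")
```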

Some examples

Let p(λ) = Σ_{i=0}^{3} p_i λ^i be a degree-3 polynomial.

Choosing {ψ_i} = {1, λ, λ^2} and {φ_i} = {1} yields the Frobenius form:

    L(λ) = [ λp_3 + p_2   p_1   p_0
             1            −λ    0
             0            1     −λ ].

Choosing {ψ_i} = {φ_i} = {1, λ} yields, e.g., the symmetric

    L(λ) = [ λp_3 + p_2   ½p_1   −1
             ½p_1         p_0    λ
             −1           λ      0  ].

Some examples

Choosing {ψ_i} = {1, λ} ⊗ {1, λ} and {φ_i} = {1} yields

    L(λ) = [ λp_3 + p_2   ½p_1   ½p_1   p_0
             1            −λ     0      0
             0            0      1      −λ
             0            1      0      −λ ].

Note that the dimension can increase (the linearization is not strong anymore).
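The three pencils above can be checked numerically. This is my own sketch, for the hypothetical choice p(λ) = (λ − 1)(λ − 2)(λ − 3), with the sign placement of the −λ entries as reconstructed above; all three pencils should return the roots 1, 2, 3, the last one with an extra infinite eigenvalue.

```python
import numpy as np
from scipy.linalg import eig

# Hypothetical test polynomial p(l) = l^3 - 6l^2 + 11l - 6 = (l-1)(l-2)(l-3).
p0, p1, p2, p3 = -6.0, 11.0, -6.0, 1.0
roots = np.array([1.0, 2.0, 3.0])

# Frobenius form ({psi} = {1, l, l^2}, {phi} = {1}), written as l*B + A.
Af = np.array([[p2, p1, p0], [1, 0, 0], [0, 1, 0]], dtype=float)
Bf = np.diag([p3, -1.0, -1.0])

# Symmetric form ({psi} = {phi} = {1, l}): both coefficient matrices symmetric.
As = np.array([[p2, p1 / 2, -1], [p1 / 2, p0, 0], [-1, 0, 0]], dtype=float)
Bs = np.array([[p3, 0, 0], [0, 0, 1], [0, 1, 0]], dtype=float)

# Product-family form ({psi} = {1, l} x {1, l}, {phi} = {1}): 4 x 4, with one
# infinite eigenvalue since the pencil is larger than the degree requires.
Ap = np.array([[p2, p1 / 2, p1 / 2, p0], [1, 0, 0, 0],
               [0, 0, 1, 0], [0, 1, 0, 0]], dtype=float)
Bp = np.array([[p3, 0, 0, 0], [0, -1, 0, 0],
               [0, 0, 0, -1], [0, 0, 0, -1]], dtype=float)

for A, B in [(Af, Bf), (As, Bs), (Ap, Bp)]:
    vals = eig(A, -B, right=False)                      # l*B*v + A*v = 0
    finite = np.sort(np.real(vals[np.abs(vals) < 1e3])) # drop infinite ones
    assert np.allclose(finite, roots, atol=1e-6)
print("all three pencils have finite eigenvalues 1, 2, 3")
```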

Future outlook: Backward stability

Backward stability

Work ongoing in collaboration with many people (Piers Lawrence, Françoise Tisseur, Raf Vandebril, Paul Van Dooren, ...).

I want to solve the eigenvalue problem 𝓛(λ) via QZ, with

    𝓛(λ) = [ W^T
             L(λ)^T ],    L(λ)^T π(λ) = 0.

... but QZ will give me back the exact eigenvalues of a perturbed pencil:

    𝓛(λ) + δ𝓛(λ) = [ W^T + δW^T
                     L(λ)^T + δL(λ)^T ].

Relating the two pencils

The perturbation in W^T is fine: it is a perturbation to the coefficients, so it seems reasonable. The perturbation to L(λ) will correspond to a perturbation of the dual space, that is, I need to find δπ(λ) such that

    (L(λ) + δL(λ))^T (π(λ) + δπ(λ)) = 0.

Many interesting questions for us:
- When do small perturbations to L(λ) correspond to small perturbations of π(λ)?
- Can I measure the growth factor in the perturbation size?
- Does it also work flawlessly when using two bases (or more)?

To be continued...

... to know the answers to all the above questions, you will need to see the next episode of dual bases and matrix polynomials. Likely the talk of Piers Lawrence at ILAS 2016 will be about these matters. I have a draft on my laptop which seems promising; I am working on it these days, so let me know if you want to discuss it!

Future outlook: Multivariate linearizations

Multivariate polynomials

What we have seen also fits naturally into the framework of linearizing the bivariate polynomial

    p(λ, µ) = Σ_{i=0}^{n} Σ_{j=0}^{n} p_{ij} λ^i µ^j.

In fact, this can be naturally expressed as p(λ, µ) = π(λ)^T P π(µ), with P = (p_{ij}) and π(λ) the usual vector with the monomials.

Visualizing the linearization

The following is a linearization for p(λ, µ):

    L(λ, µ) = [ P        L(λ)
                L(µ)^T   0    ]

in the sense that det L(λ, µ) = p(λ, µ). One could use this to solve systems of the form

    p(λ, µ) = 0,    q(λ, µ) = 0

by turning them into a multiparameter eigenvalue problem

    L_1(λ, µ) v_1 = 0,    L_2(λ, µ) v_2 = 0.
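The determinant identity is easy to verify numerically. The example below is a minimal sketch with an arbitrary 2 × 2 coefficient matrix P of my own choosing and sign conventions in the dual blocks picked so that the determinant matches p(λ, µ) exactly (rather than up to a constant):

```python
import numpy as np

# Hypothetical bivariate example: p(l, m) = pi(l)^T P pi(m), pi(l) = [1, l]^T.
P = np.array([[1.0, 2.0],
              [3.0, 4.0]])          # p(l, m) = 1 + 2m + 3l + 4lm

def p(lam, mu):
    return np.array([1.0, lam]) @ P @ np.array([1.0, mu])

def L(lam, mu):
    """Assemble L(l, m) = [[P, L(l)], [L(m)^T, 0]] for the monomial basis."""
    L_lam = np.array([[-lam], [1.0]])    # [-l, 1]^T: (-l)*1 + 1*l = 0
    L_mu_T = np.array([[mu, -1.0]])      # [m, -1]:   m*1 - 1*m   = 0
    return np.block([[P, L_lam], [L_mu_T, np.zeros((1, 1))]])

# det L(l, m) reproduces p(l, m) at every sample point.
rng = np.random.default_rng(0)
for lam, mu in rng.standard_normal((5, 2)):
    assert np.isclose(np.linalg.det(L(lam, mu)), p(lam, mu))
print("det L(l, m) == p(l, m) at random sample points")
```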

Issues

The previous step is not free from numerical issues: to solve the multiparameter eigenvalue problem one usually turns it into a linear eigenvalue problem, which makes the dimension grow and adds a quite large singular part to the pencil. Work on this topic is being carried out by Bor Plestenjak.

Our framework easily allows extensions to more variables, and seems to be almost optimal in dimension for generic problems.

Conclusions

We have seen that dual bases and matrix polynomials play well together. This approach gives us a lot of flexibility in building linearizations and a better understanding of how different bases work; moreover, we can combine different bases in the construction. It also allows an easier analysis of backward stability issues: the results are promising and seem to give new insights on scalings and linearization constructions. Finally, there are promising developments for multivariate rootfinding.

Thanks for your attention!