A Framework for Structured Linearizations of Matrix Polynomials in Various Bases


Leonardo Robol. Joint work with Raf Vandebril and Paul Van Dooren (KU Leuven and Université Catholique de Louvain). 09 May 2016, Structured Matrix Days, Limoges.

Introduction / Linearizations, dual bases and structures

I will talk about different mathematical objects that play very well together:
- Linearizations of (matrix) polynomials.
- Dual (minimal) bases, related to linear system theory.
- How to take structures from polynomials and preserve them in linearizations (spoiler alert: using dual bases!).
- How to deal with different bases in the definition of a polynomial.

Introduction / Rational vector spaces / Vector spaces of rational functions

One can consider the vector space F^n(λ) of rational functions in the variable λ:

$$ v(\lambda) \in F^n(\lambda), \qquad v(\lambda) = \begin{bmatrix} p_1(\lambda)/q_1(\lambda) \\ \vdots \\ p_n(\lambda)/q_n(\lambda) \end{bmatrix}. $$

F(λ) is a field, so everything is nice. However, every vector can be written as v(λ) = q(λ)⁻¹ ṽ(λ) with ṽ(λ) ∈ F^n[λ].

Switching between polynomials and rational functions is useful for theoretical reasons.

Introduction / Rational vector spaces / Bases for rational vector spaces

Let V ⊆ F^n(λ) be an r-dimensional subspace of the vector space over the field of rational functions. We can write a basis for V as A(λ) ∈ F^{n×r}(λ), such that:

    f(λ) ∈ V  ⟺  f(λ) = A(λ) w_f(λ).

- V admits a basis composed only of polynomials.
- Among these bases, we can look for the ones whose sum of column degrees is minimal. The basis is not unique, but its column degrees are. We call these bases minimal, and their column degrees minimal indices.

Introduction / Dual bases / Dual spaces

We can look at the dual space V^⊥, which has dimension n − r:

    V^⊥ = { g(λ) ∈ F^n(λ) : g(λ)^T f(λ) = 0 for all f(λ) ∈ V }.

B(λ) is a dual basis to A(λ) if A(λ)^T B(λ) = 0. This is equivalent to saying that B(λ) is a basis of V^⊥. If both A(λ) and B(λ) are minimal we say that they form a pair of dual minimal bases. For example,

$$ L(\lambda)^T = \begin{bmatrix} -1 & \lambda & & \\ & \ddots & \ddots & \\ & & -1 & \lambda \end{bmatrix}, \qquad \pi(\lambda) = \begin{bmatrix} \lambda^k \\ \vdots \\ \lambda \\ 1 \end{bmatrix}, \qquad L(\lambda)^T \pi(\lambda) = 0, $$

are dual minimal bases.

Introduction / Dual bases / Properties of dual minimal bases

Let A(λ)^T B(λ) = 0 be a pair of dual minimal bases. The sums of the degrees of a minimal basis and of a minimal basis of the dual space always coincide:

$$ \sum_{i=1}^{r} \deg A(\lambda) e_i = \sum_{i=1}^{n-r} \deg B(\lambda) e_i. $$

Pairs of minimal bases behave in a good way under perturbations:
- Minimality is preserved for small enough changes.
- Small perturbations of A(λ) correspond to small perturbations of the dual space (in an appropriate sense).

These facts are crucial in proving backward stability results.

Linearizations / Basic examples / Building linearizations

We are interested in computing the eigenvalues and eigenstructure of a matrix polynomial P(λ):

$$ P(\lambda) = \sum_{i=0}^{n} P_i \lambda^i, \quad \text{or, more generally,} \quad P(\lambda) = \sum_{i=0}^{n} P_i \phi_i(\lambda), $$

where Φ = {φ_0(λ), ..., φ_n(λ)} is some polynomial basis (monomials, Chebyshev, Lagrange, Newton, ...).

We look for a pencil L(λ) with the same spectral properties. We say that L(λ) is a linearization for P(λ) if and only if:

$$ \begin{bmatrix} P(\lambda) & \\ & I_{m(n-1)} \end{bmatrix} = E(\lambda)\, L(\lambda)\, F(\lambda), $$

with det E(λ), det F(λ) invertible in F[λ] (i.e., E(λ) and F(λ) are unimodular).

Linearizations / Basic examples / From dual bases to linearizations

Given a basis Φ := {φ_0(λ), ..., φ_n(λ)} we say that L_φ(λ) and π_φ(λ) are dual bases associated to Φ if:

$$ L_\phi(\lambda)^T \pi_\phi(\lambda) = 0, \qquad \pi_\phi(\lambda) = \begin{bmatrix} \phi_n(\lambda) \\ \vdots \\ \phi_0(\lambda) \end{bmatrix}. $$

Theorem. If L_φ(λ) and π_φ(λ) are dual bases associated to Φ then

$$ L(\lambda) := \begin{bmatrix} W(\lambda)^T \\ L_\phi(\lambda)^T \otimes I_m \end{bmatrix}, \qquad W(\lambda) = \begin{bmatrix} W_n(\lambda)^T \\ \vdots \\ W_0(\lambda)^T \end{bmatrix}, $$

linearizes P(λ) := W(λ)^T (π_φ(λ) ⊗ I_m) = Σ_{i=0}^{n} W_i(λ) φ_i(λ).

Linearizations / Basic examples / Almost a proof

We check what the eigenvalues of such a pencil are:

$$ L(\lambda) v = 0 \iff \begin{cases} W(\lambda)^T v = 0 \\ (L_\phi(\lambda)^T \otimes I_m)\, v = 0 \end{cases} $$

Since v is in the dual of L_φ(λ) we can write it as v = (π_φ(λ) ⊗ I_m) w_v, where w_v ≠ 0, so that:

    W(λ)^T v = 0  ⟺  W(λ)^T (π_φ(λ) ⊗ I_m) w_v = 0  ⟺  P(λ) w_v = 0.

Basic idea: the eigenvector must lie in the dual of L_φ(λ), and thus has the correct structure.

Linearizations / Basic examples / A well-known example

One can consider the pair of dual minimal bases

$$ L(\lambda)^T = \begin{bmatrix} -1 & \lambda & & \\ & \ddots & \ddots & \\ & & -1 & \lambda \end{bmatrix}, \qquad \pi(\lambda) = \begin{bmatrix} \lambda^{n-1} \\ \vdots \\ \lambda \\ 1 \end{bmatrix}, $$

so that

$$ \begin{bmatrix} \lambda p_n + p_{n-1} & p_{n-2} & \cdots & p_0 \\ -1 & \lambda & & \\ & \ddots & \ddots & \\ & & -1 & \lambda \end{bmatrix} \quad \text{linearizes} \quad p(\lambda) := \sum_{i=0}^{n} p_i \lambda^i, $$

since p(λ) = [λp_n + p_{n-1}, p_{n-2}, ..., p_0] π(λ).
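As a quick numerical sanity check of this companion construction (a sketch of mine, not part of the slides), we can assemble the pencil A + λB for a cubic with known roots and verify that its eigenvalues are the roots of p:

```python
import numpy as np

# p(l) = 2*l^3 - 12*l^2 + 22*l - 12 = 2(l-1)(l-2)(l-3)
p3, p2, p1, p0 = 2.0, -12.0, 22.0, -12.0

# Companion pencil L(l) = A + l*B: the first row carries the coefficients,
# the rows below are the dual basis L(l)^T.
A = np.array([[p2, p1, p0],
              [-1.0, 0.0, 0.0],
              [0.0, -1.0, 0.0]])
B = np.array([[p3, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# det(A + l*B) = 0  <=>  l is an eigenvalue of -B^{-1} A  (B is invertible here)
eigs = np.sort(np.linalg.eigvals(np.linalg.solve(-B, A)).real)
print(eigs)  # approximately [1. 2. 3.]
```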

Linearizations / Basic examples / Other well-known examples

... the previous approach allows for linearizations in arbitrary bases!

$$ L(\lambda) = \begin{bmatrix} 2\lambda p_n + p_{n-1} & p_{n-2} - p_n & p_{n-3} & \cdots & p_0 \\ -1 & 2\lambda & -1 & & \\ & \ddots & \ddots & \ddots & \\ & & -1 & 2\lambda & -1 \\ & & & -1 & \lambda \end{bmatrix} $$

is a linearization for the polynomial expressed in the Chebyshev basis of the first kind:

    p(λ) = Σ_{i=0}^{n} p_i T_i(λ).

Already in [Amiraslani, Corless, Lancaster, 2009]: all the orthogonal bases, and also Newton, Lagrange, Hermite, are possible.
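A small numerical check of the Chebyshev pencil (my own sketch, instantiating the n = 3 case): take p(λ) = 0.25·T₃(λ) − 0.25·T₁(λ) = λ³ − λ, build the pencil, and compare its eigenvalues with NumPy's own Chebyshev root finder:

```python
import numpy as np

# Coefficients of p in the Chebyshev basis, p = sum_i c_i * T_i:
# p = 0.25*T3 - 0.25*T1  ( = l^3 - l, roots -1, 0, 1 )
c0, c1, c2, c3 = 0.0, -0.25, 0.0, 0.25

# Pencil A + l*B: first row uses the recurrence T3 = 2l*T2 - T1,
# the rows below annihilate [T2, T1, T0]^T.
A = np.array([[c2, c1 - c3, c0],
              [-1.0, 0.0, -1.0],
              [0.0, -1.0, 0.0]])
B = np.array([[2 * c3, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])

eigs = np.sort(np.linalg.eigvals(np.linalg.solve(-B, A)).real)
ref = np.sort(np.polynomial.chebyshev.chebroots([c0, c1, c2, c3]))
print(eigs, ref)  # both approximately [-1, 0, 1]
```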

Linearizations / Basic examples / Just to get the idea

Assume we have an orthogonal basis φ_j(λ) satisfying the three-term recurrence:

    α φ_{j+1}(λ) = (λ − β) φ_j(λ) − γ φ_{j−1}(λ),   α ≠ 0,   j > 0.   (1)

Then

$$ L_\phi(\lambda)^T := \begin{bmatrix} \alpha & (\beta - \lambda) & \gamma & & \\ & \ddots & \ddots & \ddots & \\ & & \alpha & (\beta - \lambda) & \gamma \\ & & & \phi_0(\lambda) & -\phi_1(\lambda) \end{bmatrix} $$

is a basis dual to π_φ(λ).
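To make the recurrence concrete, here is a small check of mine (using the Legendre basis, whose recurrence coefficients depend on j rather than being the constants α, β, γ written above): every row of L_φ(λ)ᵀ annihilates π_φ(λ) at any sample point.

```python
import numpy as np

# Legendre three-term recurrence: (j+1) P_{j+1} = (2j+1) x P_j - j P_{j-1},
# i.e. alpha_j = (j+1)/(2j+1), beta_j = 0, gamma_j = j/(2j+1).
n = 3
x = 0.3

# pi_phi(x) = [P_n(x), ..., P_1(x), P_0(x)]^T (descending, as in the slides)
P = np.polynomial.legendre.legvander(np.array([x]), n)[0]  # P_0(x) .. P_n(x)
pi = P[::-1]

# Rows of L_phi(x)^T: [alpha_j, beta_j - x, gamma_j] placed on the columns
# of P_{j+1}, P_j, P_{j-1}; the last row is [phi_0(x), -phi_1(x)].
L = np.zeros((n, n + 1))
for j in range(n - 1, 0, -1):
    a, g = (j + 1) / (2 * j + 1), j / (2 * j + 1)
    row = n - 1 - j
    L[row, row:row + 3] = [a, -x, g]
L[n - 1, n - 1:] = [1.0, -x]   # phi_0 = 1, phi_1 = x for Legendre

print(np.abs(L @ pi).max())  # ~ 0 (up to rounding)
```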

Linearizations / Basic examples / Using two dual bases at the same time

The construction of linearizations can be generalized to the case where we have two pairs of dual minimal bases.

Let Φ, Ψ be two polynomial bases, and L_φ(λ), π_φ(λ) and L_ψ(λ), π_ψ(λ) the corresponding dual minimal bases.

Theorem. The matrix pencil

$$ L(\lambda) := \begin{bmatrix} \lambda M_1 + M_0 & L_\phi(\lambda) \\ L_\psi(\lambda)^T & 0 \end{bmatrix} $$

is a linearization for the polynomial p(λ) := π_φ(λ)^T (λM_1 + M_0) π_ψ(λ).

Linearizations / Fiedler and more bases / Relation with Fiedler linearizations

The family of linearizations that can be built using the previous result is very large:
- The Frobenius linearizations (the classical companions) are a particular case.
- Every Fiedler pencil/companion is just a permutation of a pencil in the previous form.
- However, many more linearizations fit in this class!

Linearizations / Fiedler and more bases / Equivalence to Fiedler linearizations

Consider p(λ) = p_4λ^4 + p_3λ^3 + p_2λ^2 + p_1λ + p_0. A Fiedler linearization looks like:

$$ \lambda B - A = \begin{bmatrix} \lambda p_4 + p_3 & -1 & 0 & 0 \\ p_2 & \lambda & p_1 & p_0 \\ -1 & 0 & \lambda & 0 \\ 0 & 0 & -1 & \lambda \end{bmatrix}. $$

Permuting the above pencil in a good way yields:

$$ \begin{bmatrix} \lambda p_4 + p_3 & 0 & 0 & -1 \\ p_2 & p_1 & p_0 & \lambda \\ -1 & \lambda & 0 & 0 \\ 0 & -1 & \lambda & 0 \end{bmatrix} = \begin{bmatrix} \lambda M_1 + M_0 & L_\phi(\lambda) \\ L_\psi(\lambda)^T & 0 \end{bmatrix}. $$

Linearizations / Fiedler and more bases / Exploiting the extra freedom

According to the previous Theorem, we can read off the linearized polynomial:

$$ \begin{bmatrix} \lambda p_4 + p_3 & 0 & 0 & -1 \\ p_2 & p_1 & p_0 & \lambda \\ -1 & \lambda & 0 & 0 \\ 0 & -1 & \lambda & 0 \end{bmatrix} \quad \text{linearizes} \quad \begin{bmatrix} \lambda & 1 \end{bmatrix} \begin{bmatrix} \lambda p_4 + p_3 & 0 & 0 \\ p_2 & p_1 & p_0 \end{bmatrix} \begin{bmatrix} \lambda^2 \\ \lambda \\ 1 \end{bmatrix}. $$

The latter product corresponds to the sum of all the elements in the Hadamard product

$$ \begin{bmatrix} \lambda p_4 + p_3 & 0 & 0 \\ p_2 & p_1 & p_0 \end{bmatrix} \circ \begin{bmatrix} \lambda^3 & \lambda^2 & \lambda \\ \lambda^2 & \lambda & 1 \end{bmatrix}, $$

which clearly is p(λ). However, reshuffling is allowed!

Linearizations / Fiedler and more bases / Exploiting the extra freedom

We can get symmetric linearizations for odd degree polynomials:

$$ \begin{bmatrix} \lambda p_5 + p_4 & 0 & 0 & -1 & 0 \\ 0 & \lambda p_3 + p_2 & 0 & \lambda & -1 \\ 0 & 0 & \lambda p_1 + p_0 & 0 & \lambda \\ -1 & \lambda & 0 & 0 & 0 \\ 0 & -1 & \lambda & 0 & 0 \end{bmatrix} $$

This trivially extends to block symmetric linearizations for matrix polynomials, which are symmetric whenever the polynomial is.

Main underlying idea: construct the two bases L_φ(λ) and L_ψ(λ) so that they have the same symmetry as the matrix polynomial, and you will get a structured linearization.
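Numerically, for a scalar quintic (a sketch of mine, not from the slides), the pencil just shown is symmetric in both its constant and its λ part, and its five eigenvalues are the roots of p:

```python
import numpy as np

# p(l) = (l-1)(l-2)(l-3)(l-4)(l-5)
#      = l^5 - 15 l^4 + 85 l^3 - 225 l^2 + 274 l - 120
p5, p4, p3, p2, p1, p0 = 1.0, -15.0, 85.0, -225.0, 274.0, -120.0

# Symmetric pencil L(l) = A + l*B (phi = psi = monomials of degree 2):
A = np.array([[p4, 0, 0, -1, 0],
              [0, p2, 0, 0, -1],
              [0, 0, p0, 0, 0],
              [-1, 0, 0, 0, 0],
              [0, -1, 0, 0, 0]], dtype=float)
B = np.array([[p5, 0, 0, 0, 0],
              [0, p3, 0, 1, 0],
              [0, 0, p1, 0, 1],
              [0, 1, 0, 0, 0],
              [0, 0, 1, 0, 0]], dtype=float)

assert np.allclose(A, A.T) and np.allclose(B, B.T)   # structure is preserved

eigs = np.sort(np.linalg.eigvals(np.linalg.solve(-B, A)).real)
print(eigs)  # approximately [1. 2. 3. 4. 5.]
```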

Preserving structures / Symmetries / Why do we care about symmetries?

They are beautiful. They are fun to work with. But since we need to get research funded, we also notice that:
- They are naturally present in many engineering applications.
- Preserving symmetries in the linearization also preserves the spectral symmetries.
- Structured methods can give us approximations to the eigenvalues that make sense in the underlying model and have a lower condition number (structured vs. unstructured).
- We get better results in the end.

Famous example: the simulation of vibrations of high speed trains leads to palindromic matrix polynomials.

Preserving structures / Symmetries / Matrix and scalar polynomials

Most of the things that you see here for scalar polynomials are indeed true (and useful) for matrix polynomials. Just take some I_m and attach them on the right of (almost) every matrix you can see, that is,

$$ \begin{bmatrix} \lambda M_1 + M_0 & L_\phi(\lambda) \\ L_\psi(\lambda)^T & 0 \end{bmatrix}, \qquad \begin{bmatrix} \lambda M_1 + M_0 & L_\phi(\lambda) \otimes I_m \\ L_\psi(\lambda)^T \otimes I_m & 0 \end{bmatrix} $$

are linearizations for

$$ \pi_\phi(\lambda)^T (\lambda M_1 + M_0)\, \pi_\psi(\lambda), \qquad (\pi_\phi(\lambda)^T \otimes I_m)(\lambda M_1 + M_0)(\pi_\psi(\lambda) \otimes I_m), $$

respectively. I will switch back and forth between the two notations.

Preserving structures / Even-odd case / Even/odd matrix polynomials

Another common structure found in applications is the ⋆-even/odd polynomial (⋆ ∈ {T, ∗, 1}):

    P(λ)^⋆ = P(−λ)  (even),        P(λ)^⋆ = −P(−λ)  (odd).

The symmetries are reflected in the spectrum:

    λ eigenvalue ⟺ −λ eigenvalue,   if ⋆ ∈ {T, 1};
    λ eigenvalue ⟺ −λ̄ eigenvalue,   if ⋆ = ∗.

Given such a polynomial, how do we construct a ⋆-even/odd linearization? Let's construct two adapted dual bases L_φ(λ), L_ψ(λ).

Preserving structures / Even-odd case / ⋆-even/odd linearizations!

$$ L_\phi(\lambda)^T = \begin{bmatrix} -1 & \lambda & & \\ & \ddots & \ddots & \\ & & -1 & \lambda \end{bmatrix}, \qquad L_\psi(\lambda)^T = \begin{bmatrix} -1 & -\lambda & & \\ & \ddots & \ddots & \\ & & -1 & -\lambda \end{bmatrix}. $$

Notice that L_φ(λ) = L_ψ(−λ), so

$$ L(\lambda) = \begin{bmatrix} \lambda M_1 + M_0 & L_\phi(\lambda) \otimes I_m \\ L_\psi(\lambda)^T \otimes I_m & 0 \end{bmatrix}, \qquad M_1 = -M_1^T, \quad M_0 = M_0^T, $$

is a T-even linearization for the matrix polynomial

    P(λ) = (π_φ(λ)^T ⊗ I_m)(λM_1 + M_0)(π_ψ(λ) ⊗ I_m).

Preserving structures / Even-odd case / The final step

We can easily check that:

$$ \pi_\phi(\lambda) = \begin{bmatrix} \lambda^{n-1} \\ \vdots \\ \lambda \\ 1 \end{bmatrix}, \qquad \pi_\psi(\lambda) = \begin{bmatrix} (-1)^{n-1}\lambda^{n-1} \\ \vdots \\ -\lambda \\ 1 \end{bmatrix}, $$

and that a reasonable choice for λM_1 + M_0 is given by:

$$ \lambda M_1 + M_0 = \begin{bmatrix} (-1)^{n-1}(\lambda P_{2n-1} + P_{2n-2}) & & \\ & \ddots & \\ & & \lambda P_1 + P_0 \end{bmatrix}. $$

If P(λ) is ⋆-even then λM_1 + M_0 is also ⋆-even, and so we have built a structured linearization.
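As an illustration (my own small example, not from the slides), take a 2×2 T-even cubic P(λ) = λ³P₃ + λ²P₂ + λP₁ + P₀, with P₃, P₁ antisymmetric and P₂, P₀ symmetric. The construction above with n = 2 gives a 6×6 pencil A + λB with A symmetric and B antisymmetric, and a spectrum that is closed under λ ↦ −λ:

```python
import numpy as np

I2, Z = np.eye(2), np.zeros((2, 2))
P3 = np.array([[0.0, 1.0], [-1.0, 0.0]])  # antisymmetric (odd coefficients)
P1 = np.array([[0.0, 2.0], [-2.0, 0.0]])
P2 = np.array([[5.0, 0.0], [0.0, 3.0]])   # symmetric (even coefficients)
P0 = np.array([[1.0, 0.0], [0.0, 2.0]])

# n = 2: M(l) = diag(-(l*P3 + P2), l*P1 + P0), bordered by the dual bases
A = np.block([[-P2, Z, -I2],
              [Z, P0, Z],
              [-I2, Z, Z]])
B = np.block([[-P3, Z, Z],
              [Z, P1, I2],
              [Z, -I2, Z]])

assert np.allclose(A, A.T) and np.allclose(B, -B.T)  # T-even pencil

eigs = np.linalg.eigvals(np.linalg.solve(-B, A))
for e in eigs:  # the spectrum is closed under negation
    assert np.min(np.abs(eigs + e)) < 1e-8
```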

Preserving structures / Palindromic linearizations / Palindromic matrix polynomials

Another interesting case:

    P(λ)^⋆ = rev P(λ)   or   P(λ)^⋆ = −rev P(λ),   where rev P(λ) := λ^{deg P} P(λ^{−1}).

This induces the spectral symmetry

    λ eigenvalue ⟺ λ^{−1} eigenvalue,   if ⋆ ∈ {T, 1};
    λ eigenvalue ⟺ λ̄^{−1} eigenvalue,   if ⋆ = ∗.

The ⋆ = 1 case is not interesting, since it is impossible to build structured linearizations. In the other cases, however, we can use more or less the same trick as before.

Preserving structures / Palindromic linearizations / Building palindromic dual bases

$$ L_\phi(\lambda)^T = \begin{bmatrix} -1 & \lambda & & \\ & \ddots & \ddots & \\ & & -1 & \lambda \end{bmatrix}, \qquad L_\psi(\lambda)^T = \begin{bmatrix} -\lambda & 1 & & \\ & \ddots & \ddots & \\ & & -\lambda & 1 \end{bmatrix}. $$

We follow the now usual approach, and we have:

$$ \pi_\phi(\lambda) = \begin{bmatrix} \lambda^{n-1} \\ \vdots \\ \lambda \\ 1 \end{bmatrix}, \qquad \pi_\psi(\lambda) = \begin{bmatrix} 1 \\ \lambda \\ \vdots \\ \lambda^{n-1} \end{bmatrix}. $$

Preserving structures / Palindromic linearizations / ... and then a palindromic linearization

When building the linearization we have the correct symmetries, since rev L_φ(λ) = L_ψ(λ), and so

$$ L(\lambda) = \begin{bmatrix} \lambda M + M^\star & L_\phi(\lambda) \\ L_\psi(\lambda)^T & 0 \end{bmatrix} $$

is a palindromic linearization.

How to choose M? Some computations show that

$$ M = \begin{bmatrix} 0_m & \cdots & 0_m & P_0^\star \\ \vdots & & \vdots & \vdots \\ 0_m & \cdots & 0_m & P_{n-1}^\star \end{bmatrix} $$

is the right choice for a palindromic polynomial of degree 2n − 1.
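A scalar sanity check of this recipe (my own toy example): for n = 2, the T-palindromic cubic p(λ) = 2λ³ + 5λ² + 5λ + 2 gives an M with the coefficients p₀, p₁ in its last column, and the resulting 3×3 pencil A + λB satisfies Aᵀ = B (i.e., rev L(λ) = L(λ)ᵀ) and has eigenvalues closed under λ ↦ 1/λ:

```python
import numpy as np

p0, p1 = 2.0, 5.0  # T-palindromic cubic: p(l) = p0 + p1*l + p1*l^2 + p0*l^3

# n = 2: M carries the coefficients in its last column
M = np.array([[0.0, p0],
              [0.0, p1]])

# L(l) = A + l*B = [l*M + M^T, L_phi(l); L_psi(l)^T, 0],
# with L_phi(l) = [-1; l] and L_psi(l)^T = [-l, 1]
A = np.block([[M.T, np.array([[-1.0], [0.0]])],
              [np.array([[0.0, 1.0]]), np.zeros((1, 1))]])
B = np.block([[M, np.array([[0.0], [1.0]])],
              [np.array([[-1.0, 0.0]]), np.zeros((1, 1))]])

assert np.allclose(A.T, B)   # rev L(l) = L(l)^T: the pencil is T-palindromic

eigs = np.linalg.eigvals(np.linalg.solve(-B, A))
for e in eigs:               # spectrum closed under l -> 1/l
    assert np.min(np.abs(1.0 / eigs - e)) < 1e-8
# ... and the eigenvalues are the roots of p:
assert np.allclose(np.polyval([p0, p1, p1, p0], eigs), 0.0, atol=1e-7)
```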

Preserving structures / Palindromic linearizations / Midway conclusions

We are considering a family of pencils that includes the Fiedler linearizations, and we have found structured linearizations which are also possible in the Fiedler framework. However, we build bases which reflect the structure of the problem, instead of fighting with permutations and combinatorics. I think it is much easier to go this way.

Can we do other interesting things with this framework? Yes, let's see something completely different!

Intersections of polynomials / Scalar polynomials / Intersection of polynomials

Consider the problem: find the values of λ such that

    p_1(λ) = p_2(λ),

with p_1(λ), p_2(λ) polynomials. That's essentially saying: find the roots of p_1(λ) − p_2(λ).

However, what to do if p_1(λ) and p_2(λ) are represented using different bases? Can we still easily solve the problem?
- Convert them to the same basis, and use a companion matrix for that basis.
- Or: use our framework in a creative way.

Intersections of polynomials / Scalar polynomials / Let's recap

$$ L(\lambda) = \begin{bmatrix} \lambda M_1 + M_0 & L_\phi(\lambda) \\ L_\psi(\lambda)^T & 0 \end{bmatrix} $$

is a linearization for p(λ) := π_φ(λ)^T (λM_1 + M_0) π_ψ(λ).

Let w_φ, w_ψ be constant vectors such that w_φ^T π_φ(λ) = 1 = w_ψ^T π_ψ(λ) (the coordinates of the constant 1 in our bases). If we set λM_1 + M_0 = w_φ p_1^T − p_2 w_ψ^T we have:

$$ p(\lambda) = p_1^T \pi_\psi(\lambda) - \pi_\phi(\lambda)^T p_2 = \sum_{i=0}^{\eta} p_{1,i}\, \psi_i(\lambda) - \sum_{i=0}^{\epsilon} p_{2,i}\, \phi_i(\lambda), $$

where η and ε are the number of columns of L_ψ(λ) and L_φ(λ).
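Here is a sketch of this construction on two concrete scalar polynomials (my own example; SciPy is used for the generalized eigenvalue problem): p₂(λ) = λ² in the monomial basis and p₁(λ) = 2T₂(λ) + T₀(λ) = 4λ² − 1 in the Chebyshev basis, so p₁ = p₂ exactly at λ = ±1/√3. The grade is ε + η = 4 while the degree is only 2, so the 5×5 pencil also has eigenvalues at infinity; we keep the two finite ones of smallest modulus.

```python
import numpy as np
from scipy.linalg import eig

# pi_phi = [l^2, l, 1]^T (monomials),  pi_psi = [T2, T1, T0]^T (Chebyshev)
p2 = np.array([1.0, 0.0, 0.0])     # p2(l) = l^2        (coefficients vs pi_phi)
p1 = np.array([2.0, 0.0, 1.0])     # p1(l) = 2*T2 + T0  (coefficients vs pi_psi)
w_phi = np.array([0.0, 0.0, 1.0])  # 1 = w^T pi in either basis
w_psi = np.array([0.0, 0.0, 1.0])

M = np.outer(w_phi, p1) - np.outer(p2, w_psi)

# L(l) = A + l*B = [M, L_phi(l); L_psi(l)^T, 0]
A = np.zeros((5, 5)); B = np.zeros((5, 5))
A[:3, :3] = M
A[:3, 3:] = [[-1, 0], [0, -1], [0, 0]]   # L_phi(l)   = [[-1,0],[l,-1],[0,l]]
B[:3, 3:] = [[0, 0], [1, 0], [0, 1]]
A[3:, :3] = [[-1, 0, -1], [0, -1, 0]]    # L_psi(l)^T = [[-1,2l,-1],[0,-1,l]]
B[3:, :3] = [[0, 2, 0], [0, 0, 1]]

ev = eig(A, -B, right=False)
finite = ev[np.isfinite(ev)]
roots = np.sort(finite[np.argsort(np.abs(finite))[:2]].real)
print(roots)  # approximately [-0.5774  0.5774], i.e. +-1/sqrt(3)
```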

Intersections of polynomials / Scalar polynomials / What is the advantage?

We have linearized a difference of polynomials expressed in different bases directly (no computations needed!). The linearization directly contains the coefficients of the two polynomials, thus avoiding the potentially ill-conditioned change of basis.

However, L(λ) is a linearization for a polynomial of grade ε + η, while the degree is at most max{ε, η}. This gives us many infinite eigenvalues.

Intersections of polynomials / Scalar polynomials / Infinite eigenvalues

- Infinite eigenvalues can be deflated a posteriori.
- We can easily characterize their complete eigenstructure. They form a very long Jordan chain at infinity: very badly conditioned!
- However, the other eigenvalues can be perfectly conditioned.
- We can also deflate the infinite eigenstructure by the staircase algorithm (since we know the eigenstructure, no rank decisions are needed, and the approach is perfectly stable).

Intersections of polynomials / Scalar polynomials / Numerical accuracy

I claimed that this approach is more stable than conversion to a common basis. Is this true?

[Plot: norm of the absolute errors vs. degree; curves: Monomial, Chebyshev, Linearization, Lin + Deflation.]

Intersections of polynomials / Going to rational functions / Extensions to rational functions

The same idea can be applied to a slightly more general problem. Find the solutions of

    f(λ) := p(λ)/q(λ) + r(λ)/s(λ) = 0,

with p(λ), q(λ), r(λ), and s(λ) polynomials. This is equivalent to finding the roots of t(λ) := p(λ)s(λ) + q(λ)r(λ).

Intersections of polynomials / Going to rational functions / A linearization for rational functions

Similarly to the previous case, we have

$$ L(\lambda) = \begin{bmatrix} p s^T + q r^T & L_\phi(\lambda) \\ L_\psi(\lambda)^T & 0 \end{bmatrix}, $$

which is a linearization for t(λ). The eigenvalues are solutions of f(λ) := p(λ)/q(λ) + r(λ)/s(λ) = 0, where p(λ), q(λ) are expressed in the Φ basis, and r(λ) and s(λ) in the Ψ basis.
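A numerical sketch of this pencil (my own example, again using SciPy for the generalized eigenproblem): take p = λ² + 2 and q = λ + 3 in the monomial basis, and r = T₁, s = T₂ in the Chebyshev basis. Then t = ps + qr = 2λ⁴ + 4λ² + 3λ − 2 has degree 4 = ε + η, so the 5×5 pencil has exactly one simple infinite eigenvalue, and the four finite eigenvalues are the roots of t:

```python
import numpy as np
from scipy.linalg import eig

# Phi: monomials, pi_phi = [l^2, l, 1]; Psi: Chebyshev, pi_psi = [T2, T1, T0]
p = np.array([1.0, 0.0, 2.0])   # p(l) = l^2 + 2
q = np.array([0.0, 1.0, 3.0])   # q(l) = l + 3
s = np.array([1.0, 0.0, 0.0])   # s(l) = T2(l)
r = np.array([0.0, 1.0, 0.0])   # r(l) = T1(l)

M = np.outer(p, s) + np.outer(q, r)   # constant, rank 2

A = np.zeros((5, 5)); B = np.zeros((5, 5))
A[:3, :3] = M
A[:3, 3:] = [[-1, 0], [0, -1], [0, 0]]   # L_phi(l)
B[:3, 3:] = [[0, 0], [1, 0], [0, 1]]
A[3:, :3] = [[-1, 0, -1], [0, -1, 0]]    # L_psi(l)^T (Chebyshev recurrence)
B[3:, :3] = [[0, 2, 0], [0, 0, 1]]

ev = eig(A, -B, right=False)
kept = ev[np.isfinite(ev) & (np.abs(ev) < 1e6)]

# t(l) = p*s + q*r = 2 l^4 + 4 l^2 + 3 l - 2 (monomial basis)
residuals = np.abs(np.polyval([2.0, 0.0, 4.0, 3.0, -2.0], kept))
assert kept.size == 4 and residuals.max() < 1e-8
```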

Intersections of polynomials / Going to rational functions / Easy proof

We can easily check what is linearized by L(λ):

    t(λ) = π_φ(λ)^T (p s^T + q r^T) π_ψ(λ) = p(λ)s(λ) + q(λ)r(λ),

where p(λ), q(λ) are represented in the basis Φ, and r(λ), s(λ) in the basis Ψ.

Notice that the top-left rank-2 block has no term depending on λ.

Intersections of polynomials / Going to rational functions / Good news

- Under mild hypotheses on degree matching we have only one simple infinite eigenvalue.
- If one of the two bases satisfies the relation π_{ε,φ}(λ) = (λA + B) π_{ε−1,φ}(λ) we can get rid of all the infinite eigenvalues by slightly adjusting the construction.
- Hint: any basis you can think of satisfies the above property (monomials, all the orthogonal bases, Newton, Lagrange, Hermite are all OK).

99 Intersections of polynomials Going to rational functions Another way to write p(λ)... L(λ) = [ M(λ) ] Lφ (λ) L ψ (λ) T 0 linearizes M i,j (λ)ψ i (λ)φ j (λ). i,j 36 / 48

100 Intersections of polynomials Going to rational functions Another way to write p(λ)... L(λ) = [ M(λ) ] Lφ (λ) L ψ (λ) T 0 linearizes M i,j (λ)ψ i (λ)φ j (λ). i,j We can see this as a linearization in the product family ψ φ := {φ i (λ)ψ j (λ) i = 1,..., ɛ, j = 1,..., η} 36 / 48

101 Intersections of polynomials Going to rational functions Another way to write p(λ)... L(λ) = [ M(λ) ] Lφ (λ) L ψ (λ) T 0 linearizes M i,j (λ)ψ i (λ)φ j (λ). i,j We can see this as a linearization in the product family ψ φ := {φ i (λ)ψ j (λ) i = 1,..., ɛ, j = 1,..., η} Not a basis in general. 36 / 48

102 Intersections of polynomials Going to rational functions Another way to write p(λ)... L(λ) = [ M(λ), L_φ(λ) ; L_ψ(λ)^T, 0 ] linearizes Σ_{i,j} M_{i,j}(λ) ψ_i(λ) φ_j(λ). We can see this as a linearization in the product family φ ⊗ ψ := {φ_i(λ)ψ_j(λ) : i = 0, ..., ε, j = 0, ..., η}. Not a basis in general. However, when the φ_i and the ψ_j are bases (of the polynomials of degree at most ε and η, respectively) this is a generating family for the polynomials of degree up to ε + η. 36 / 48

103 Intersections of polynomials Going to rational functions Another way to write p(λ)... L(λ) = [ M(λ), L_φ(λ) ; L_ψ(λ)^T, 0 ] linearizes Σ_{i,j} M_{i,j}(λ) ψ_i(λ) φ_j(λ). We can see this as a linearization in the product family φ ⊗ ψ := {φ_i(λ)ψ_j(λ) : i = 0, ..., ε, j = 0, ..., η}. Not a basis in general. However, when the φ_i and the ψ_j are bases (of the polynomials of degree at most ε and η, respectively) this is a generating family for the polynomials of degree up to ε + η. We can exploit the redundancy in a good way. 36 / 48
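The generating-family claim can be checked numerically. The sketch below is an illustration (the specific bases are my choice, not from the slides): it multiplies the Chebyshev basis {T_0, T_1} (ε = 1) by the monomial basis {1, λ, λ²} (η = 2); the six products are redundant, but their monomial coefficient vectors span the full 4-dimensional space of polynomials of degree up to ε + η = 3.

```python
import numpy as np
from numpy.polynomial import chebyshev as C, polynomial as P

eps, eta = 1, 2
phi = [C.Chebyshev.basis(i) for i in range(eps + 1)]   # {T_0, T_1}
psi = [P.Polynomial.basis(j) for j in range(eta + 1)]  # {1, λ, λ²}

# Monomial-basis coefficient vector of every product φ_i(λ)ψ_j(λ).
rows = []
for f in phi:
    fp = P.Polynomial(C.cheb2poly(f.coef))  # convert T_i to power-basis form
    for g in psi:
        c = (fp * g).coef
        rows.append(np.pad(c, (0, eps + eta + 1 - len(c))))
M = np.array(rows)
# 6 products, but they only span the 4-dimensional space of degree <= 3.
print(M.shape, np.linalg.matrix_rank(M))  # (6, 4) 4
```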

104 Intersections of polynomials More and more bases!... which leads to a natural question Can we handle more than two bases? 37 / 48

105 Intersections of polynomials More and more bases!... which leads to a natural question Can we handle more than two bases? Given polynomial bases φ^(1), ..., φ^(j) I can construct their product: φ^(1) ⊗ ... ⊗ φ^(j) 37 / 48

106 Intersections of polynomials More and more bases!... which leads to a natural question Can we handle more than two bases? Given polynomial bases φ^(1), ..., φ^(j) I can construct their product: φ^(1) ⊗ ... ⊗ φ^(j) We know how to handle j = 1 (the classical companion case). 37 / 48

107 Intersections of polynomials More and more bases!... which leads to a natural question Can we handle more than two bases? Given polynomial bases φ^(1), ..., φ^(j) I can construct their product: φ^(1) ⊗ ... ⊗ φ^(j) We know how to handle j = 1 (the classical companion case)... and also j = 2 (what we have seen until now). 37 / 48

108 Intersections of polynomials More and more bases!... which leads to a natural question Can we handle more than two bases? Given polynomial bases φ^(1), ..., φ^(j) I can construct their product: φ^(1) ⊗ ... ⊗ φ^(j) We know how to handle j = 1 (the classical companion case)... and also j = 2 (what we have seen until now). What about going higher? 37 / 48

109 Intersections of polynomials More and more bases! Enlarged dual bases We can deal with this by constructing enlarged dual bases: π_{φ⊗ψ}(λ) := [φ_0(λ)ψ_0(λ), ..., φ_i(λ)ψ_j(λ), ..., φ_ε(λ)ψ_η(λ)]^T 38 / 48

110 Intersections of polynomials More and more bases! Enlarged dual bases We can deal with this by constructing enlarged dual bases: π_{φ⊗ψ}(λ) := [φ_0(λ)ψ_0(λ), ..., φ_i(λ)ψ_j(λ), ..., φ_ε(λ)ψ_η(λ)]^T Can we build a suitable L(λ) such that L(λ)^T π_{φ⊗ψ}(λ) = 0? 38 / 48

111 Intersections of polynomials More and more bases! Enlarged dual bases We can deal with this by constructing enlarged dual bases: π_{φ⊗ψ}(λ) := [φ_0(λ)ψ_0(λ), ..., φ_i(λ)ψ_j(λ), ..., φ_ε(λ)ψ_η(λ)]^T Can we build a suitable L(λ) such that L(λ)^T π_{φ⊗ψ}(λ) = 0? Yes! 38 / 48

112 Intersections of polynomials More and more bases! Product dual bases The rows of L_{k,φ⊗ψ}(λ)^T = A [ L_{η,ψ}(λ)^T ⊗ I_{ε+1} ; w^T ⊗ L_{ε,φ}(λ)^T ], k := (ε+1)(η+1) − 1, form a dual basis together with π_{φ⊗ψ}(λ), where A is any invertible matrix and w any vector such that w^T π_{η,ψ}(λ) is a nonzero constant. 39 / 48

113 Intersections of polynomials More and more bases! Product dual bases The rows of L_{k,φ⊗ψ}(λ)^T = A [ L_{η,ψ}(λ)^T ⊗ I_{ε+1} ; w^T ⊗ L_{ε,φ}(λ)^T ], k := (ε+1)(η+1) − 1, form a dual basis together with π_{φ⊗ψ}(λ), where A is any invertible matrix and w any vector such that w^T π_{η,ψ}(λ) is a nonzero constant. We can just plug this ingredient into our framework and obtain the desired linearizations! 39 / 48
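For monomial bases the product dual basis can be assembled with Kronecker products and verified numerically. The sketch below is a hypothetical instantiation (A = I, w = e_1, ascending-degree ordering are my assumptions): ordering π_{φ⊗ψ} as π_ψ ⊗ π_φ, the block L_{η,ψ}^T ⊗ I kills the ψ part and w^T ⊗ L_{ε,φ}^T kills the φ part, so the stacked k × (ε+1)(η+1) matrix annihilates π_{φ⊗ψ}(λ).

```python
import numpy as np

eps, eta = 2, 3  # degrees of the two monomial bases (chosen for illustration)

def pi(lam, d):
    # ascending monomial basis vector [1, λ, ..., λ^d]
    return lam ** np.arange(d + 1)

def Ldual(lam, d):
    # d x (d+1) dual pencil: row i is λ·e_i - e_{i+1}, so Ldual @ pi = 0
    return lam * np.eye(d, d + 1) - np.eye(d, d + 1, k=1)

lam = 1.37
pi_prod = np.kron(pi(lam, eta), pi(lam, eps))    # entries ψ_j(λ)·φ_i(λ)
w = np.zeros(eta + 1); w[0] = 1.0                # w^T π_{η,ψ}(λ) ≡ 1
top = np.kron(Ldual(lam, eta), np.eye(eps + 1))  # L_{η,ψ}(λ)^T ⊗ I_{ε+1}
bot = np.kron(w, Ldual(lam, eps))                # w^T ⊗ L_{ε,φ}(λ)^T
Lprod = np.vstack([top, bot])                    # k rows, k = (ε+1)(η+1) - 1 = 11
print(Lprod.shape, np.allclose(Lprod @ pi_prod, 0))  # (11, 12) True
```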

114 Intersections of polynomials More and more bases! Some examples Let p(λ) = Σ_{i=0}^{3} p_i λ^i be a degree-3 polynomial. 40 / 48

115 Intersections of polynomials More and more bases! Some examples Let p(λ) = Σ_{i=0}^{3} p_i λ^i be a degree-3 polynomial. Choosing {ψ_i} = {1, λ, λ²} and {φ_i} = {1} yields the Frobenius form: L(λ) = [ λp_3 + p_2, p_1, p_0 ; −1, λ, 0 ; 0, −1, λ ]. 40 / 48

116 Intersections of polynomials More and more bases! Some examples Let p(λ) = Σ_{i=0}^{3} p_i λ^i be a degree-3 polynomial. Choosing {ψ_i} = {1, λ, λ²} and {φ_i} = {1} yields the Frobenius form: L(λ) = [ λp_3 + p_2, p_1, p_0 ; −1, λ, 0 ; 0, −1, λ ]. Choosing {ψ_i} = {φ_i} = {1, λ} yields, e.g., the symmetric L(λ) = [ λp_3 + p_2, p_1/2, −1 ; p_1/2, p_0, λ ; −1, λ, 0 ]. 40 / 48
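Both 3-by-3 pencils on this slide can be checked numerically: their generalized eigenvalues must coincide with the roots of p. A minimal numpy sketch (the cubic coefficients are arbitrary, and the minus signs follow the reconstruction of the slide's matrices):

```python
import numpy as np

# Hypothetical cubic p(λ) = 2λ³ + 3λ² + 5λ + 7.
p3, p2, p1, p0 = 2.0, 3.0, 5.0, 7.0
roots = np.sort_complex(np.roots([p3, p2, p1, p0]))

# Frobenius-like pencil L(λ) = A0 + λ·A1.
A0 = np.array([[p2, p1, p0], [-1, 0, 0], [0, -1, 0]])
A1 = np.diag([p3, 1.0, 1.0])
eigs_frob = np.sort_complex(np.linalg.eigvals(np.linalg.solve(A1, -A0)))

# Symmetric pencil (note both S0 and S1 are symmetric matrices).
S0 = np.array([[p2, p1 / 2, -1], [p1 / 2, p0, 0], [-1, 0, 0]])
S1 = np.array([[p3, 0, 0], [0, 0, 1], [0, 1, 0]])
eigs_sym = np.sort_complex(np.linalg.eigvals(np.linalg.solve(S1, -S0)))

print(np.allclose(eigs_frob, roots), np.allclose(eigs_sym, roots))  # True True
```

Since both leading matrices A1 and S1 are invertible here, the pencils are strong: all three eigenvalues are finite.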

117 Intersections of polynomials More and more bases! Some examples Choosing {ψ_i} = {1, λ} ⊗ {1, λ} and {φ_i} = {1} yields L(λ) = [ λp_3 + p_2, p_1/2, p_1/2, p_0 ; −1, 0, λ, 0 ; 0, −1, 0, λ ; 0, 0, −1, λ ]. 41 / 48

118 Intersections of polynomials More and more bases! Some examples Choosing {ψ_i} = {1, λ} ⊗ {1, λ} and {φ_i} = {1} yields L(λ) = [ λp_3 + p_2, p_1/2, p_1/2, p_0 ; −1, 0, λ, 0 ; 0, −1, 0, λ ; 0, 0, −1, λ ]. Note, the dimension can increase (the linearization is not strong anymore). 41 / 48
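The redundant product family gives a 4-by-4 pencil for a cubic. In the sketch below the entry layout is my reconstruction of the garbled slide matrix (an assumption), but it is at least a correct instance of the construction: its determinant reproduces p(λ) exactly, while the leading matrix is singular, so one eigenvalue escapes to infinity (the linearization is no longer strong).

```python
import numpy as np

p3, p2, p1, p0 = 2.0, 3.0, 5.0, 7.0
p = np.array([p3, p2, p1, p0])

def L(lam):
    # 4x4 pencil in the redundant product family {1,λ}⊗{1,λ} (reconstructed layout).
    return np.array([
        [lam * p3 + p2, p1 / 2, p1 / 2, p0],
        [-1, 0, lam, 0],
        [0, -1, 0, lam],
        [0, 0, -1, lam],
    ])

lam = 0.73  # arbitrary sample point
print(np.isclose(np.linalg.det(L(lam)), np.polyval(p, lam)))  # True
```

Evaluating det L at each root of p likewise gives (numerically) zero, so the three finite eigenvalues are the roots of the cubic.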

119 Future outlook Backward stability Backward stability Work ongoing in collaboration with many people (Piers Lawrence, Françoise Tisseur, Raf Vandebril, Paul Van Dooren,... ). 42 / 48

120 Future outlook Backward stability Backward stability Work ongoing in collaboration with many people (Piers Lawrence, Françoise Tisseur, Raf Vandebril, Paul Van Dooren,... ). I want to solve the eigenvalue problem for the pencil 𝓛(λ) via QZ, where 𝓛(λ) = [ W^T ; L(λ)^T ], with L(λ)^T π(λ) = 0, 42 / 48

121 Future outlook Backward stability Backward stability Work ongoing in collaboration with many people (Piers Lawrence, Françoise Tisseur, Raf Vandebril, Paul Van Dooren,... ). I want to solve the eigenvalue problem for the pencil 𝓛(λ) via QZ, where 𝓛(λ) = [ W^T ; L(λ)^T ], with L(λ)^T π(λ) = 0, but QZ will give me back the exact eigenvalues of a perturbed pencil: 𝓛(λ) + δ𝓛(λ) = [ W^T + δW^T ; L(λ)^T + δL(λ)^T ]. 42 / 48

122 Future outlook Backward stability Relating the two pencils The perturbation in W^T is fine (it is a perturbation to the coefficients, so it seems reasonable). 43 / 48

123 Future outlook Backward stability Relating the two pencils The perturbation in W^T is fine (it is a perturbation to the coefficients, so it seems reasonable). The perturbation to L(λ) will correspond to a perturbation of the dual space, that is, I need to find δπ(λ) such that (L(λ) + δL(λ))^T (π(λ) + δπ(λ)) = 0. 43 / 48

124 Future outlook Backward stability Relating the two pencils The perturbation in W^T is fine (it is a perturbation to the coefficients, so it seems reasonable). The perturbation to L(λ) will correspond to a perturbation of the dual space, that is, I need to find δπ(λ) such that (L(λ) + δL(λ))^T (π(λ) + δπ(λ)) = 0. Many interesting questions for us: When do small perturbations to L(λ) correspond to small perturbations of π(λ)? 43 / 48

125 Future outlook Backward stability Relating the two pencils The perturbation in W^T is fine (it is a perturbation to the coefficients, so it seems reasonable). The perturbation to L(λ) will correspond to a perturbation of the dual space, that is, I need to find δπ(λ) such that (L(λ) + δL(λ))^T (π(λ) + δπ(λ)) = 0. Many interesting questions for us: When do small perturbations to L(λ) correspond to small perturbations of π(λ)? Can I measure the growth factor in the perturbation size? 43 / 48

126 Future outlook Backward stability Relating the two pencils The perturbation in W^T is fine (it is a perturbation to the coefficients, so it seems reasonable). The perturbation to L(λ) will correspond to a perturbation of the dual space, that is, I need to find δπ(λ) such that (L(λ) + δL(λ))^T (π(λ) + δπ(λ)) = 0. Many interesting questions for us: When do small perturbations to L(λ) correspond to small perturbations of π(λ)? Can I measure the growth factor in the perturbation size? Does it work flawlessly also when using two bases (or more)? 43 / 48

127 Future outlook Backward stability To be continued... To know the answer to all the above questions, you will need to see the next episode of dual bases and matrix polynomials. 44 / 48

128 Future outlook Backward stability To be continued... To know the answer to all the above questions, you will need to see the next episode of dual bases and matrix polynomials. Likely the talk of Piers Lawrence at ILAS 2016 will be about these matters. 44 / 48

129 Future outlook Backward stability To be continued... To know the answer to all the above questions, you will need to see the next episode of dual bases and matrix polynomials. Likely the talk of Piers Lawrence at ILAS 2016 will be about these matters. I have a draft on my laptop which seems promising. I am working on it these days; let me know if you want to discuss it! 44 / 48

130 Future outlook Multivariate linearizations Multivariate polynomials What we have seen also fits naturally into the framework of linearizing the bivariate polynomial p(λ, µ) = Σ_{i=0}^{n} Σ_{j=0}^{n} p_{ij} λ^i µ^j 45 / 48

131 Future outlook Multivariate linearizations Multivariate polynomials What we have seen also fits naturally into the framework of linearizing the bivariate polynomial p(λ, µ) = Σ_{i=0}^{n} Σ_{j=0}^{n} p_{ij} λ^i µ^j In fact, this can be naturally expressed as p(λ, µ) = π(λ)^T P π(µ), with P = (p_{ij}) and π(λ) the usual vector of monomials. 45 / 48

132 Future outlook Multivariate linearizations Visualizing the linearization The following is a linearization for p(λ, µ): L(λ, µ) = [ P, L(λ) ; L(µ)^T, 0 ] in the sense that det L(λ, µ) = p(λ, µ). 46 / 48

133 Future outlook Multivariate linearizations Visualizing the linearization The following is a linearization for p(λ, µ): L(λ, µ) = [ P, L(λ) ; L(µ)^T, 0 ] in the sense that det L(λ, µ) = p(λ, µ). One could use this to solve problems of the form { p(λ, µ) = 0, q(λ, µ) = 0 } by turning them into a multiparameter eigenvalue problem { L_1(λ, µ)v_1 = 0, L_2(λ, µ)v_2 = 0 }. 46 / 48
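The determinant identity behind this bivariate linearization can be checked numerically. A minimal sketch (the ascending monomial ordering, the particular dual-basis block D, and the random coefficient matrix are my choices for illustration; the determinant matches p(λ, µ) up to a fixed sign):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
P = rng.standard_normal((n + 1, n + 1))  # bivariate coefficients p_ij (random example)

def pi(x):
    # ascending monomial vector [1, x, ..., x^n]
    return x ** np.arange(n + 1)

def D(x):
    # (n+1) x n dual-basis block: column j is x·e_j - e_{j+1}, so pi(x)^T D(x) = 0
    return x * np.eye(n + 1, n) - np.eye(n + 1, n, k=-1)

lam, mu = 0.4, -1.2
Lbiv = np.block([[P, D(lam)], [D(mu).T, np.zeros((n, n))]])
pval = pi(lam) @ P @ pi(mu)
print(np.isclose(abs(np.linalg.det(Lbiv)), abs(pval)))  # det L(λ,µ) = ±p(λ,µ): True
```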

134 Future outlook Multivariate linearizations Issues The previous step is not free from numerical issues. To solve the MEP one usually turns it into a linear eigenvalue problem, which makes the dimension grow and adds a quite large singular part to the pencil. 47 / 48

135 Future outlook Multivariate linearizations Issues The previous step is not free from numerical issues. To solve the MEP one usually turns it into a linear eigenvalue problem, which makes the dimension grow and adds a quite large singular part to the pencil. Work on this topic is being carried out by Bor Plestenjak. Our framework easily allows extension to more variables, and seems to be almost optimal in dimension for generic problems. 47 / 48

136 Conclusions Conclusions We have seen how dual bases and matrix polynomials play well together. 48 / 48

137 Conclusions Conclusions We have seen how dual bases and matrix polynomials play well together. This approach gives us a lot of flexibility in building linearizations, and gives us a better understanding of how different bases work. Moreover, we can combine different bases in the construction. 48 / 48

138 Conclusions Conclusions We have seen how dual bases and matrix polynomials play well together. This approach gives us a lot of flexibility in building linearizations, and gives us a better understanding of how different bases work. Moreover, we can combine different bases in the construction. Easier analysis of backward stability issues. Results are promising and seem to give new insights on scalings and linearization constructions. 48 / 48

139 Conclusions Conclusions We have seen how dual bases and matrix polynomials play well together. This approach gives us a lot of flexibility in building linearizations, and gives us a better understanding of how different bases work. Moreover, we can combine different bases in the construction. Easier analysis of backward stability issues. Results are promising and seem to give new insights on scalings and linearization constructions. Promising developments for multivariate rootfinding. 48 / 48

140 Conclusions Conclusions We have seen how dual bases and matrix polynomials play well together. This approach gives us a lot of flexibility in building linearizations, and gives us a better understanding of how different bases work. Moreover, we can combine different bases in the construction. Easier analysis of backward stability issues. Results are promising and seem to give new insights on scalings and linearization constructions. Promising developments for multivariate rootfinding. Thanks for your attention! 48 / 48


More information

Final Review Sheet. B = (1, 1 + 3x, 1 + x 2 ) then 2 + 3x + 6x 2

Final Review Sheet. B = (1, 1 + 3x, 1 + x 2 ) then 2 + 3x + 6x 2 Final Review Sheet The final will cover Sections Chapters 1,2,3 and 4, as well as sections 5.1-5.4, 6.1-6.2 and 7.1-7.3 from chapters 5,6 and 7. This is essentially all material covered this term. Watch

More information

Math 291-2: Lecture Notes Northwestern University, Winter 2016

Math 291-2: Lecture Notes Northwestern University, Winter 2016 Math 291-2: Lecture Notes Northwestern University, Winter 2016 Written by Santiago Cañez These are lecture notes for Math 291-2, the second quarter of MENU: Intensive Linear Algebra and Multivariable Calculus,

More information

k In particular, the largest dimension of the subspaces of block-symmetric pencils we introduce is n 4

k In particular, the largest dimension of the subspaces of block-symmetric pencils we introduce is n 4 LARGE VECTOR SPACES OF BLOCK-SYMMETRIC STRONG LINEARIZATIONS OF MATRIX POLYNOMIALS M. I. BUENO, F. M. DOPICO, S. FURTADO, AND M. RYCHNOVSKY Abstract. Given a matrix polynomial P (λ) = P k i=0 λi A i of

More information

Linear Algebra Massoud Malek

Linear Algebra Massoud Malek CSUEB Linear Algebra Massoud Malek Inner Product and Normed Space In all that follows, the n n identity matrix is denoted by I n, the n n zero matrix by Z n, and the zero vector by θ n An inner product

More information

Linear Algebra Review

Linear Algebra Review Chapter 1 Linear Algebra Review It is assumed that you have had a course in linear algebra, and are familiar with matrix multiplication, eigenvectors, etc. I will review some of these terms here, but quite

More information

Lecture 7: Positive Semidefinite Matrices

Lecture 7: Positive Semidefinite Matrices Lecture 7: Positive Semidefinite Matrices Rajat Mittal IIT Kanpur The main aim of this lecture note is to prepare your background for semidefinite programming. We have already seen some linear algebra.

More information

Descriptor system techniques in solving H 2 -optimal fault detection problems

Descriptor system techniques in solving H 2 -optimal fault detection problems Descriptor system techniques in solving H 2 -optimal fault detection problems Andras Varga German Aerospace Center (DLR) DAE 10 Workshop Banff, Canada, October 25-29, 2010 Outline approximate fault detection

More information

Hermitian Matrix Polynomials with Real Eigenvalues of Definite Type. Part I: Classification. Al-Ammari, Maha and Tisseur, Francoise

Hermitian Matrix Polynomials with Real Eigenvalues of Definite Type. Part I: Classification. Al-Ammari, Maha and Tisseur, Francoise Hermitian Matrix Polynomials with Real Eigenvalues of Definite Type. Part I: Classification Al-Ammari, Maha and Tisseur, Francoise 2010 MIMS EPrint: 2010.9 Manchester Institute for Mathematical Sciences

More information

University of Colorado at Denver Mathematics Department Applied Linear Algebra Preliminary Exam With Solutions 16 January 2009, 10:00 am 2:00 pm

University of Colorado at Denver Mathematics Department Applied Linear Algebra Preliminary Exam With Solutions 16 January 2009, 10:00 am 2:00 pm University of Colorado at Denver Mathematics Department Applied Linear Algebra Preliminary Exam With Solutions 16 January 2009, 10:00 am 2:00 pm Name: The proctor will let you read the following conditions

More information

18.S34 linear algebra problems (2007)

18.S34 linear algebra problems (2007) 18.S34 linear algebra problems (2007) Useful ideas for evaluating determinants 1. Row reduction, expanding by minors, or combinations thereof; sometimes these are useful in combination with an induction

More information

PALINDROMIC LINEARIZATIONS OF A MATRIX POLYNOMIAL OF ODD DEGREE OBTAINED FROM FIEDLER PENCILS WITH REPETITION.

PALINDROMIC LINEARIZATIONS OF A MATRIX POLYNOMIAL OF ODD DEGREE OBTAINED FROM FIEDLER PENCILS WITH REPETITION. PALINDROMIC LINEARIZATIONS OF A MATRIX POLYNOMIAL OF ODD DEGREE OBTAINED FROM FIEDLER PENCILS WITH REPETITION. M.I. BUENO AND S. FURTADO Abstract. Many applications give rise to structured, in particular

More information

Math 25a Practice Final #1 Solutions

Math 25a Practice Final #1 Solutions Math 25a Practice Final #1 Solutions Problem 1. Suppose U and W are subspaces of V such that V = U W. Suppose also that u 1,..., u m is a basis of U and w 1,..., w n is a basis of W. Prove that is a basis

More information

Math 3013 Problem Set 6

Math 3013 Problem Set 6 Math 3013 Problem Set 6 Problems from 31 (pgs 189-190 of text): 11,16,18 Problems from 32 (pgs 140-141 of text): 4,8,12,23,25,26 1 (Problems 3111 and 31 16 in text) Determine whether the given set is closed

More information

(VI.C) Rational Canonical Form

(VI.C) Rational Canonical Form (VI.C) Rational Canonical Form Let s agree to call a transformation T : F n F n semisimple if there is a basis B = { v,..., v n } such that T v = λ v, T v 2 = λ 2 v 2,..., T v n = λ n for some scalars

More information

Lecture Notes in Linear Algebra

Lecture Notes in Linear Algebra Lecture Notes in Linear Algebra Dr. Abdullah Al-Azemi Mathematics Department Kuwait University February 4, 2017 Contents 1 Linear Equations and Matrices 1 1.2 Matrices............................................

More information

Optimal Scaling of Companion Pencils for the QZ-Algorithm

Optimal Scaling of Companion Pencils for the QZ-Algorithm Optimal Scaling of Companion Pencils for the QZ-Algorithm D Lemonnier, P Van Dooren 1 Introduction Computing roots of a monic polynomial may be done by computing the eigenvalues of the corresponding companion

More information

Inverse Eigenvalue Problem with Non-simple Eigenvalues for Damped Vibration Systems

Inverse Eigenvalue Problem with Non-simple Eigenvalues for Damped Vibration Systems Journal of Informatics Mathematical Sciences Volume 1 (2009), Numbers 2 & 3, pp. 91 97 RGN Publications (Invited paper) Inverse Eigenvalue Problem with Non-simple Eigenvalues for Damped Vibration Systems

More information

The following definition is fundamental.

The following definition is fundamental. 1. Some Basics from Linear Algebra With these notes, I will try and clarify certain topics that I only quickly mention in class. First and foremost, I will assume that you are familiar with many basic

More information

ELA

ELA SHARP LOWER BOUNDS FOR THE DIMENSION OF LINEARIZATIONS OF MATRIX POLYNOMIALS FERNANDO DE TERÁN AND FROILÁN M. DOPICO Abstract. A standard way of dealing with matrixpolynomial eigenvalue problems is to

More information

1. Introduction. In this paper we discuss a new class of structured matrix polynomial eigenproblems Q(λ)v = 0, where

1. Introduction. In this paper we discuss a new class of structured matrix polynomial eigenproblems Q(λ)v = 0, where STRUCTURED POLYNOMIAL EIGENPROBLEMS RELATED TO TIME-DELAY SYSTEMS H. FASSBENDER, D. S. MACKEY, N. MACKEY, AND C. SCHRÖDER Abstract. A new class of structured polynomial eigenproblems arising in the stability

More information

Problem Set (T) If A is an m n matrix, B is an n p matrix and D is a p s matrix, then show

Problem Set (T) If A is an m n matrix, B is an n p matrix and D is a p s matrix, then show MTH 0: Linear Algebra Department of Mathematics and Statistics Indian Institute of Technology - Kanpur Problem Set Problems marked (T) are for discussions in Tutorial sessions (T) If A is an m n matrix,

More information

STRUCTURE PRESERVING DEFLATION OF INFINITE EIGENVALUES IN STRUCTURED PENCILS

STRUCTURE PRESERVING DEFLATION OF INFINITE EIGENVALUES IN STRUCTURED PENCILS Electronic Transactions on Numerical Analysis Volume 44 pp 1 24 215 Copyright c 215 ISSN 168 9613 ETNA STRUCTURE PRESERVING DEFLATION OF INFINITE EIGENVALUES IN STRUCTURED PENCILS VOLKER MEHRMANN AND HONGGUO

More information

Bindel, Fall 2011 Intro to Scientific Computing (CS 3220) Week 3: Wednesday, Jan 9

Bindel, Fall 2011 Intro to Scientific Computing (CS 3220) Week 3: Wednesday, Jan 9 Problem du jour Week 3: Wednesday, Jan 9 1. As a function of matrix dimension, what is the asymptotic complexity of computing a determinant using the Laplace expansion (cofactor expansion) that you probably

More information

Conceptual Questions for Review

Conceptual Questions for Review Conceptual Questions for Review Chapter 1 1.1 Which vectors are linear combinations of v = (3, 1) and w = (4, 3)? 1.2 Compare the dot product of v = (3, 1) and w = (4, 3) to the product of their lengths.

More information

OHSx XM511 Linear Algebra: Solutions to Online True/False Exercises

OHSx XM511 Linear Algebra: Solutions to Online True/False Exercises This document gives the solutions to all of the online exercises for OHSx XM511. The section ( ) numbers refer to the textbook. TYPE I are True/False. Answers are in square brackets [. Lecture 02 ( 1.1)

More information

Dot Products. K. Behrend. April 3, Abstract A short review of some basic facts on the dot product. Projections. The spectral theorem.

Dot Products. K. Behrend. April 3, Abstract A short review of some basic facts on the dot product. Projections. The spectral theorem. Dot Products K. Behrend April 3, 008 Abstract A short review of some basic facts on the dot product. Projections. The spectral theorem. Contents The dot product 3. Length of a vector........................

More information

A Field Extension as a Vector Space

A Field Extension as a Vector Space Chapter 8 A Field Extension as a Vector Space In this chapter, we take a closer look at a finite extension from the point of view that is a vector space over. It is clear, for instance, that any is a linear

More information

Quantum wires, orthogonal polynomials and Diophantine approximation

Quantum wires, orthogonal polynomials and Diophantine approximation Quantum wires, orthogonal polynomials and Diophantine approximation Introduction Quantum Mechanics (QM) is a linear theory Main idea behind Quantum Information (QI): use the superposition principle of

More information

Linear Algebra. Matrices Operations. Consider, for example, a system of equations such as x + 2y z + 4w = 0, 3x 4y + 2z 6w = 0, x 3y 2z + w = 0.

Linear Algebra. Matrices Operations. Consider, for example, a system of equations such as x + 2y z + 4w = 0, 3x 4y + 2z 6w = 0, x 3y 2z + w = 0. Matrices Operations Linear Algebra Consider, for example, a system of equations such as x + 2y z + 4w = 0, 3x 4y + 2z 6w = 0, x 3y 2z + w = 0 The rectangular array 1 2 1 4 3 4 2 6 1 3 2 1 in which the

More information

MATHEMATICAL ENGINEERING TECHNICAL REPORTS. Eigenvector Error Bound and Perturbation for Polynomial and Rational Eigenvalue Problems

MATHEMATICAL ENGINEERING TECHNICAL REPORTS. Eigenvector Error Bound and Perturbation for Polynomial and Rational Eigenvalue Problems MATHEMATICAL ENGINEERING TECHNICAL REPORTS Eigenvector Error Bound and Perturbation for Polynomial and Rational Eigenvalue Problems Yuji NAKATSUKASA and Françoise TISSEUR METR 2016 04 April 2016 DEPARTMENT

More information

Frame Diagonalization of Matrices

Frame Diagonalization of Matrices Frame Diagonalization of Matrices Fumiko Futamura Mathematics and Computer Science Department Southwestern University 00 E University Ave Georgetown, Texas 78626 U.S.A. Phone: + (52) 863-98 Fax: + (52)

More information

Computing Unstructured and Structured Polynomial Pseudospectrum Approximations

Computing Unstructured and Structured Polynomial Pseudospectrum Approximations Computing Unstructured and Structured Polynomial Pseudospectrum Approximations Silvia Noschese 1 and Lothar Reichel 2 1 Dipartimento di Matematica, SAPIENZA Università di Roma, P.le Aldo Moro 5, 185 Roma,

More information

Key words. ellipsoids, signed distance, non-convex optimization, KKT conditions, Minkowski difference, two-parameter eigenvalue problem, Bézoutian

Key words. ellipsoids, signed distance, non-convex optimization, KKT conditions, Minkowski difference, two-parameter eigenvalue problem, Bézoutian COMPUTING THE SIGNED DISTANCE BETWEEN OVERLAPPING ELLIPSOIDS SATORU IWATA, YUJI NAKATSUKASA, AND AKIKO TAKEDA This version corrects an error in Sections 4., and revises Section 4.3 accordingly; the arguments

More information

Ir O D = D = ( ) Section 2.6 Example 1. (Bottom of page 119) dim(v ) = dim(l(v, W )) = dim(v ) dim(f ) = dim(v )

Ir O D = D = ( ) Section 2.6 Example 1. (Bottom of page 119) dim(v ) = dim(l(v, W )) = dim(v ) dim(f ) = dim(v ) Section 3.2 Theorem 3.6. Let A be an m n matrix of rank r. Then r m, r n, and, by means of a finite number of elementary row and column operations, A can be transformed into the matrix ( ) Ir O D = 1 O

More information

Distance bounds for prescribed multiple eigenvalues of matrix polynomials

Distance bounds for prescribed multiple eigenvalues of matrix polynomials Distance bounds for prescribed multiple eigenvalues of matrix polynomials Panayiotis J Psarrakos February 5, Abstract In this paper, motivated by a problem posed by Wilkinson, we study the coefficient

More information

Math 396. Quotient spaces

Math 396. Quotient spaces Math 396. Quotient spaces. Definition Let F be a field, V a vector space over F and W V a subspace of V. For v, v V, we say that v v mod W if and only if v v W. One can readily verify that with this definition

More information

Matrix Factorizations

Matrix Factorizations 1 Stat 540, Matrix Factorizations Matrix Factorizations LU Factorization Definition... Given a square k k matrix S, the LU factorization (or decomposition) represents S as the product of two triangular

More information

Proofs for Quizzes. Proof. Suppose T is a linear transformation, and let A be a matrix such that T (x) = Ax for all x R m. Then

Proofs for Quizzes. Proof. Suppose T is a linear transformation, and let A be a matrix such that T (x) = Ax for all x R m. Then Proofs for Quizzes 1 Linear Equations 2 Linear Transformations Theorem 1 (2.1.3, linearity criterion). A function T : R m R n is a linear transformation if and only if a) T (v + w) = T (v) + T (w), for

More information