Hessenberg Pairs of Linear Transformations

Ali Godjali

November 21, 2008

arXiv:0812.0019v1 [math.RA] 28 Nov 2008

Abstract

Let $\mathbb{K}$ denote a field and let $V$ denote a nonzero finite-dimensional vector space over $\mathbb{K}$. We consider an ordered pair of linear transformations $A : V \to V$ and $A^* : V \to V$ that satisfy (i)-(iii) below.

(i) Each of $A$, $A^*$ is diagonalizable on $V$.

(ii) There exists an ordering $\{V_i\}_{i=0}^{d}$ of the eigenspaces of $A$ such that
$$A^* V_i \subseteq V_0 + V_1 + \cdots + V_{i+1} \qquad (0 \le i \le d),$$
where $V_{-1} = 0$, $V_{d+1} = 0$.

(iii) There exists an ordering $\{V^*_i\}_{i=0}^{\delta}$ of the eigenspaces of $A^*$ such that
$$A V^*_i \subseteq V^*_0 + V^*_1 + \cdots + V^*_{i+1} \qquad (0 \le i \le \delta),$$
where $V^*_{-1} = 0$, $V^*_{\delta+1} = 0$.

We call such a pair a Hessenberg pair on $V$. In this paper we obtain some characterizations of Hessenberg pairs. We also explain how Hessenberg pairs are related to tridiagonal pairs.

Keywords: Leonard pair, tridiagonal pair, q-inverting pair, split decomposition.
2000 Mathematics Subject Classification: 15A04, 05E30.

1 Introduction

In [1, Definition 1.1] Ito, Tanabe and Terwilliger introduced the notion of a tridiagonal pair of linear transformations. Loosely speaking, this is a pair of diagonalizable linear transformations on a nonzero finite-dimensional vector space, each of which acts on the eigenspaces of the other in a certain restricted way. In [1, Theorem 4.6] Ito et al. showed that a tridiagonal pair induces a certain direct sum decomposition of the underlying vector space, called the split decomposition [1, Definition 4.1]. In order to clarify this result, in the present paper we introduce a generalization of a tridiagonal pair called a Hessenberg pair. Our main results are summarized as follows. Let $V$ denote a nonzero finite-dimensional vector space, and let $(A, A^*)$ denote a pair of diagonalizable linear transformations on $V$. We show that if $(A, A^*)$ induces a split decomposition of $V$, then $(A, A^*)$ is a Hessenberg pair on $V$. Moreover, the converse holds provided that $V$ has no proper nonzero subspaces that are invariant under each of $A$, $A^*$.

The rest of this section contains precise statements of our main definitions and results. We will use the following terms. Let $\mathbb{K}$ denote a field and let $V$ denote a nonzero finite-dimensional vector space over $\mathbb{K}$. By a linear transformation on $V$, we mean a $\mathbb{K}$-linear map from $V$ to $V$. Let $A$ denote a linear transformation on $V$ and let $W$ denote a subspace of $V$. We call $W$ an eigenspace of $A$ whenever $W \neq 0$ and there exists $\theta \in \mathbb{K}$ such that $W = \{v \in V \mid Av = \theta v\}$. In this case $\theta$ is called the eigenvalue of $A$ corresponding to $W$. We say $A$ is diagonalizable whenever $V$ is spanned by the eigenspaces of $A$.

Definition 1.1. By a Hessenberg pair on $V$, we mean an ordered pair $(A, A^*)$ of linear transformations on $V$ that satisfy (i)-(iii) below.

(i) Each of $A$, $A^*$ is diagonalizable on $V$.

(ii) There exists an ordering $\{V_i\}_{i=0}^{d}$ of the eigenspaces of $A$ such that
$$A^* V_i \subseteq V_0 + V_1 + \cdots + V_{i+1} \qquad (0 \le i \le d), \tag{1}$$
where $V_{-1} = 0$, $V_{d+1} = 0$.

(iii) There exists an ordering $\{V^*_i\}_{i=0}^{\delta}$ of the eigenspaces of $A^*$ such that
$$A V^*_i \subseteq V^*_0 + V^*_1 + \cdots + V^*_{i+1} \qquad (0 \le i \le \delta), \tag{2}$$
where $V^*_{-1} = 0$, $V^*_{\delta+1} = 0$.

Note 1.2. It is a common notational convention to use $A^*$ to represent the conjugate-transpose of $A$. We are not using this convention. In a Hessenberg pair $(A, A^*)$ the linear transformations $A$ and $A^*$ are arbitrary subject to (i)-(iii) above.

Note 1.3. The term Hessenberg comes from matrix theory. A square matrix is called upper Hessenberg whenever each entry below the subdiagonal is zero [2, p. 28].

Referring to Definition 1.1, the orderings $\{V_i\}_{i=0}^{d}$ and $\{V^*_i\}_{i=0}^{\delta}$ are not unique in general. To facilitate our discussion of these orderings we introduce some terms. Let $(A, A^*)$ denote an ordered pair of diagonalizable linear transformations on $V$. Let $\{V_i\}_{i=0}^{d}$ (resp. $\{V^*_i\}_{i=0}^{\delta}$) denote any ordering of the eigenspaces of $A$ (resp. $A^*$). We say that the pair $(A, A^*)$ is Hessenberg with respect to $(\{V_i\}_{i=0}^{d}; \{V^*_i\}_{i=0}^{\delta})$ whenever these orderings satisfy (1) and (2). Often it is convenient to focus on eigenvalues rather than eigenspaces. Let $\{\theta_i\}_{i=0}^{d}$ (resp. $\{\theta^*_i\}_{i=0}^{\delta}$) denote the ordering of the eigenvalues of $A$ (resp. $A^*$) that corresponds to $\{V_i\}_{i=0}^{d}$ (resp. $\{V^*_i\}_{i=0}^{\delta}$). We say that the pair $(A, A^*)$ is Hessenberg with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{\delta})$ whenever $(A, A^*)$ is Hessenberg with respect to $(\{V_i\}_{i=0}^{d}; \{V^*_i\}_{i=0}^{\delta})$.
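In matrix terms, condition (ii) of Definition 1.1 says that the matrix representing $A^*$ in the ordered eigenbasis of $A$ is upper Hessenberg, and condition (iii) says the same for $A$ in a suitably ordered eigenbasis of $A^*$. The following numerical sketch (our own illustration, not from the paper) checks both conditions for the classical $sl_2$-type pair $A = h$ (diagonal) and $A^* = e + f$ on a 3-dimensional module; the matrices and the helper function are our choices.

```python
import numpy as np

def is_upper_hessenberg(M, tol=1e-9):
    """True if every entry below the subdiagonal is (numerically) zero."""
    n = M.shape[0]
    return all(abs(M[j, i]) <= tol for i in range(n) for j in range(i + 2, n))

A = np.diag([2.0, 0.0, -2.0])          # eigenspaces V_i = span(e_i), eigenvalues ordered 2, 0, -2
Astar = np.array([[0.0, 2.0, 0.0],     # e + f, written in the eigenbasis of A
                  [1.0, 0.0, 2.0],
                  [0.0, 1.0, 0.0]])

# Condition (ii): the matrix of A* in the ordered eigenbasis of A is upper Hessenberg.
assert is_upper_hessenberg(Astar)

# Condition (iii): order the eigenvalues of A* as 2, 0, -2 and change basis.
w, P = np.linalg.eig(Astar)
P = P[:, np.argsort(-w.real)]          # columns = eigenvectors, eigenvalues descending
A_in_star_basis = np.linalg.inv(P) @ A @ P
assert is_upper_hessenberg(A_in_star_basis)
print("(A, A*) is Hessenberg with respect to these orderings")
```

In fact both matrices above come out tridiagonal, reflecting that this particular pair satisfies the stronger tridiagonal condition discussed in Section 4.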

Definition 1.4. Let $(A, A^*)$ denote an ordered pair of linear transformations on $V$. We say that the pair $(A, A^*)$ is irreducible whenever there is no subspace $W$ of $V$ such that $AW \subseteq W$, $A^* W \subseteq W$, $W \neq 0$, $W \neq V$.

We are primarily interested in the irreducible Hessenberg pairs. However, for parts of our argument the irreducibility assumption is not needed. As we will see in Proposition 2.4, for an irreducible Hessenberg pair the scalars $d$ and $\delta$ from Definition 1.1 are equal.

We now turn to the notion of a split decomposition. We will define this notion after a few preliminary comments. By a decomposition of $V$ we mean a sequence $\{U_i\}_{i=0}^{d}$ consisting of nonzero subspaces of $V$ such that
$$V = U_0 + U_1 + \cdots + U_d \quad \text{(direct sum)}. \tag{3}$$
For notational convenience we set $U_{-1} = 0$, $U_{d+1} = 0$. For an example of a decomposition, let $A$ denote a diagonalizable linear transformation on $V$. Then any ordering of the eigenspaces of $A$ is a decomposition of $V$.

Lemma 1.5. Let $A$ denote a linear transformation on $V$. Let $\{U_i\}_{i=0}^{d}$ denote a decomposition of $V$ and let $\{\theta_i\}_{i=0}^{d}$ denote a sequence of mutually distinct elements of $\mathbb{K}$. Assume
$$(A - \theta_i I)U_i \subseteq U_{i+1} \qquad (0 \le i \le d). \tag{4}$$
Then $A$ is diagonalizable and $\{\theta_i\}_{i=0}^{d}$ are the eigenvalues of $A$.

Proof: From (4) we see that, with respect to an appropriate basis for $V$, $A$ is represented by a lower triangular matrix which has diagonal entries $\{\theta_i\}_{i=0}^{d}$, with $\theta_i$ appearing $\dim(U_i)$ times for $0 \le i \le d$. Therefore $\{\theta_i\}_{i=0}^{d}$ are the roots of the characteristic polynomial of $A$. It remains to show that $A$ is diagonalizable. From (4) we see that $\prod_{i=0}^{d}(A - \theta_i I)$ vanishes on $V$. By this and since $\{\theta_i\}_{i=0}^{d}$ are distinct, we see that the minimal polynomial of $A$ has distinct roots. Therefore $A$ is diagonalizable and the result follows.

Definition 1.6. Let $d$ denote a nonnegative integer. Let $A$ (resp. $A^*$) denote a diagonalizable linear transformation on $V$ with eigenvalues $\{\theta_i\}_{i=0}^{d}$ (resp. $\{\theta^*_i\}_{i=0}^{d}$). By an $(A, A^*)$-split decomposition of $V$ with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$, we mean a decomposition $\{U_i\}_{i=0}^{d}$ of $V$ such that both
$$(A - \theta_{d-i} I)U_i \subseteq U_{i+1}, \tag{5}$$
$$(A^* - \theta^*_i I)U_i \subseteq U_{i-1} \tag{6}$$
for $0 \le i \le d$.

As we will see in Corollary 3.4, the $(A, A^*)$-split decomposition of $V$ with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$ is unique if it exists.

The main results of this paper are the following two theorems and subsequent corollary.
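As a quick numerical illustration of Lemma 1.5 (with matrices of our own choosing, not from the paper): take $U_i = \mathrm{span}(e_i)$, so that hypothesis (4) says $A$ is lower bidiagonal with the $\theta_i$ on the diagonal; the product $\prod_i (A - \theta_i I)$ then vanishes and $A$ is diagonalizable.

```python
import numpy as np
from functools import reduce

# U_i = span(e_i) and theta = (1, 2, 3): condition (4) means (A - theta_i I) e_i
# lies in U_{i+1}, i.e. A is lower bidiagonal with theta_i on the diagonal.
theta = np.array([1.0, 2.0, 3.0])
A = np.diag(theta) + np.diag([5.0, 7.0], k=-1)   # arbitrary subdiagonal entries

# prod_i (A - theta_i I) vanishes on V, so the minimal polynomial of A has
# distinct roots and A is diagonalizable with eigenvalues theta_i.
prod = reduce(lambda M, t: M @ (A - t * np.eye(3)), theta, np.eye(3))
assert np.allclose(prod, 0)

eigvals = np.sort(np.linalg.eigvals(A).real)
assert np.allclose(eigvals, theta)

# Diagonalizable: the eigenvector matrix has full rank (3 independent eigenvectors).
w, P = np.linalg.eig(A)
assert np.linalg.matrix_rank(P) == 3
```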

Theorem 1.7. Let $d$ denote a nonnegative integer. Let $A$ (resp. $A^*$) denote a diagonalizable linear transformation on $V$ with eigenvalues $\{\theta_i\}_{i=0}^{d}$ (resp. $\{\theta^*_i\}_{i=0}^{d}$). Suppose that the pair $(A, A^*)$ is irreducible, and Hessenberg with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$. Then there exists an $(A, A^*)$-split decomposition of $V$ with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$.

Theorem 1.8. Let $d$ denote a nonnegative integer. Let $A$ (resp. $A^*$) denote a diagonalizable linear transformation on $V$ with eigenvalues $\{\theta_i\}_{i=0}^{d}$ (resp. $\{\theta^*_i\}_{i=0}^{d}$). Suppose that there exists an $(A, A^*)$-split decomposition of $V$ with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$. Then the pair $(A, A^*)$ is Hessenberg with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$.

Combining Theorem 1.7 and Theorem 1.8 we obtain the following corollary.

Corollary 1.9. Let $d$ denote a nonnegative integer. Let $A$ (resp. $A^*$) denote a diagonalizable linear transformation on $V$ with eigenvalues $\{\theta_i\}_{i=0}^{d}$ (resp. $\{\theta^*_i\}_{i=0}^{d}$). Assume the pair $(A, A^*)$ is irreducible. Then the following (i), (ii) are equivalent.

(i) The pair $(A, A^*)$ is Hessenberg with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$.

(ii) There exists an $(A, A^*)$-split decomposition of $V$ with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$.

2 The Proof of Theorem 1.7

In this section we give a proof of Theorem 1.7. Along the way, we show that the scalars $d$ and $\delta$ from Definition 1.1 are equal. We will refer to the following setup.

Assumption 2.1. Let $A$ (resp. $A^*$) denote a diagonalizable linear transformation on $V$ with eigenvalues $\{\theta_i\}_{i=0}^{d}$ (resp. $\{\theta^*_i\}_{i=0}^{\delta}$). Let $\{V_i\}_{i=0}^{d}$ (resp. $\{V^*_i\}_{i=0}^{\delta}$) denote the corresponding eigenspaces of $A$ (resp. $A^*$). We assume that the pair $(A, A^*)$ is irreducible and Hessenberg with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{\delta})$. For all integers $i$ and $j$ we set
$$V_{ij} = (V_0 + \cdots + V_i) \cap (V^*_0 + \cdots + V^*_j). \tag{7}$$
We interpret the sum on the left in (7) to be $0$ (resp. $V$) if $i < 0$ (resp. $i > d$). We interpret the sum on the right in (7) to be $0$ (resp. $V$) if $j < 0$ (resp. $j > \delta$).

Lemma 2.2. With reference to Assumption 2.1, the following (i), (ii) hold for $0 \le i \le d$ and $0 \le j \le \delta$.

(i) $V_{i\delta} = V_0 + \cdots + V_i$.

(ii) $V_{dj} = V^*_0 + \cdots + V^*_j$.

Proof: (i) Set $j = \delta$ in (7) and use the fact that $V = V^*_0 + \cdots + V^*_{\delta}$.
(ii) Set $i = d$ in (7) and use the fact that $V = V_0 + \cdots + V_d$.

Lemma 2.3. With reference to Assumption 2.1, the following (i), (ii) hold for $0 \le i \le d$ and $0 \le j \le \delta$.

(i) $(A - \theta_i I)V_{ij} \subseteq V_{i-1,j+1}$.

(ii) $(A^* - \theta^*_j I)V_{ij} \subseteq V_{i+1,j-1}$.

Proof: (i) Since $V_i$ is the eigenspace of $A$ corresponding to the eigenvalue $\theta_i$, we have
$$(A - \theta_i I)\sum_{h=0}^{i} V_h = \sum_{h=0}^{i-1} V_h. \tag{8}$$
Using (2) we find
$$(A - \theta_i I)\sum_{h=0}^{j} V^*_h \subseteq \sum_{h=0}^{j+1} V^*_h. \tag{9}$$
Evaluating $(A - \theta_i I)V_{ij}$ using (7)-(9), we find it is contained in $V_{i-1,j+1}$.
(ii) Using (1) we find
$$(A^* - \theta^*_j I)\sum_{h=0}^{i} V_h \subseteq \sum_{h=0}^{i+1} V_h. \tag{10}$$
Since $V^*_j$ is the eigenspace of $A^*$ corresponding to the eigenvalue $\theta^*_j$, we have
$$(A^* - \theta^*_j I)\sum_{h=0}^{j} V^*_h = \sum_{h=0}^{j-1} V^*_h. \tag{11}$$
Evaluating $(A^* - \theta^*_j I)V_{ij}$ using (7), (10), (11), we find it is contained in $V_{i+1,j-1}$.

Proposition 2.4. With reference to Assumption 2.1, the scalars $d$ and $\delta$ from Definition 1.1 are equal. Moreover,
$$V_{ij} = 0 \quad \text{if } i + j < d \qquad (0 \le i, j \le d). \tag{12}$$

Proof: For all nonnegative integers $r$ such that $r \le d$ and $r \le \delta$, we define
$$W_r = V_{0r} + V_{1,r-1} + \cdots + V_{r0}. \tag{13}$$
We have $AW_r \subseteq W_r$ by Lemma 2.3(i) and $A^* W_r \subseteq W_r$ by Lemma 2.3(ii). Now $W_r = 0$ or $W_r = V$ since the pair $(A, A^*)$ is irreducible. Suppose for the moment that $r \le d - 1$. Each term on the right in (13) is contained in $V_0 + \cdots + V_r$, so $W_r \subseteq V_0 + \cdots + V_r$. Thus $W_r \neq V$ and hence $W_r = 0$. Next suppose $r = d$. Then $V_{d0} \subseteq W_r$. Recall $V_{d0} = V^*_0$ by Lemma 2.2(ii), and $V^*_0 \neq 0$, so $V_{d0} \neq 0$. Now $W_r \neq 0$, so $W_r = V$. We have now shown that $W_r = 0$ if $r \le d - 1$ and $W_r = V$ if $r = d$. Similarly $W_r = 0$ if $r \le \delta - 1$ and $W_r = V$ if $r = \delta$. Now $d = \delta$; otherwise we take $r = \min(d, \delta)$ in our above comments and find $W_r$ is both $0$ and $V$, for a contradiction. The result follows.

Lemma 2.5. With reference to Assumption 2.1, the sequence $\{V_{d-i,i}\}_{i=0}^{d}$ is an $(A, A^*)$-split decomposition of $V$ with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$.

Proof: Observe that (5) follows from Lemma 2.3(i) and (6) follows from Lemma 2.3(ii). It remains to show that the sequence $\{V_{d-i,i}\}_{i=0}^{d}$ is a decomposition. We first show
$$V = \sum_{i=0}^{d} V_{d-i,i}. \tag{14}$$
Let $W$ denote the sum on the right in (14). We have $AW \subseteq W$ by Lemma 2.3(i) and $A^* W \subseteq W$ by Lemma 2.3(ii). Now $W = 0$ or $W = V$ by the irreducibility assumption. Observe that $W$ contains $V_{d0}$, and $V_{d0} = V^*_0$ is nonzero, so $W \neq 0$. We conclude that $W = V$ and (14) follows. Next we show that the sum (14) is direct. To do this we show that
$$(V_{d0} + V_{d-1,1} + \cdots + V_{d-i+1,i-1}) \cap V_{d-i,i} \tag{15}$$
is zero for $1 \le i \le d$. Let $i$ be given. From the construction,
$$V_{d-j,j} \subseteq V^*_0 + V^*_1 + \cdots + V^*_{i-1}$$
for $0 \le j \le i-1$, and
$$V_{d-i,i} \subseteq V_0 + V_1 + \cdots + V_{d-i}.$$
Therefore (15) is contained in
$$(V_0 + V_1 + \cdots + V_{d-i}) \cap (V^*_0 + V^*_1 + \cdots + V^*_{i-1}). \tag{16}$$
But (16) is equal to $V_{d-i,i-1}$ and this is zero by (12), so (15) is zero. We have shown that the sum (14) is direct. Next we show that $V_{d-i,i} \neq 0$ for $0 \le i \le d$. Suppose there exists an integer $i$ ($0 \le i \le d$) such that $V_{d-i,i} = 0$. Observe that $i \neq 0$ since $V_{d0} = V^*_0$ is nonzero, and $i \neq d$ since $V_{0d} = V_0$ is nonzero. Set
$$W = V_{d0} + V_{d-1,1} + \cdots + V_{d-i+1,i-1}$$
and observe that $W \neq 0$ and $W \neq V$ by our above remarks. By Lemma 2.3(ii), we find $A^* W \subseteq W$. By Lemma 2.3(i) and since $V_{d-i,i} = 0$, we find $AW \subseteq W$. Now $W = 0$ or $W = V$ by our irreducibility assumption, which yields a contradiction. We conclude that $V_{d-i,i} \neq 0$ for $0 \le i \le d$. We have now shown that the sequence $\{V_{d-i,i}\}_{i=0}^{d}$ is a decomposition of $V$ and we are done.

Theorem 1.7 is immediate from Lemma 2.5.

3 The Proof of Theorem 1.8

In this section we give a proof of Theorem 1.8. Along the way, we show that the split decomposition from Definition 1.6 is unique if it exists. The following assumption sets the stage.

Assumption 3.1. Let $d$ denote a nonnegative integer. Let $A$ (resp. $A^*$) denote a diagonalizable linear transformation on $V$ with eigenvalues $\{\theta_i\}_{i=0}^{d}$ (resp. $\{\theta^*_i\}_{i=0}^{d}$). Let $\{V_i\}_{i=0}^{d}$ (resp. $\{V^*_i\}_{i=0}^{d}$) denote the corresponding eigenspaces of $A$ (resp. $A^*$). We assume that there exists a decomposition $\{U_i\}_{i=0}^{d}$ of $V$ that is $(A, A^*)$-split with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$.
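For a concrete feel for this setup, the split decomposition can be computed explicitly as $U_i = (V^*_0 + \cdots + V^*_i) \cap (V_0 + \cdots + V_{d-i})$ (this closed formula is Lemma 3.3 below). The following numerical sketch does this for a hypothetical 3-dimensional example of our own (the $sl_2$-type pair $A = h$, $A^* = e + f$) and then checks the defining conditions (5) and (6); all helper functions are our constructions.

```python
import numpy as np

A = np.diag([2.0, 0.0, -2.0])          # eigenvalue ordering theta: 2, 0, -2
Astar = np.array([[0.0, 2.0, 0.0],
                  [1.0, 0.0, 2.0],
                  [0.0, 1.0, 0.0]])    # eigenvalue ordering theta*: 2, 0, -2
theta = [2.0, 0.0, -2.0]
theta_star = [2.0, 0.0, -2.0]
d = 2
TOL = 1e-9

def colbasis(M):
    """Orthonormal basis (columns) for the column space of M."""
    u, s, _ = np.linalg.svd(M)
    return u[:, :int((s > TOL).sum())]

def intersect(S, T):
    """Orthonormal basis for col(S) ∩ col(T), via the nullspace of [S, -T]."""
    _, s, vh = np.linalg.svd(np.hstack([S, -T]))
    rank = int((s > TOL).sum())
    null = vh[rank:].T
    return colbasis(S @ null[:S.shape[1], :])

def eigvec(M, t):
    """One eigenvector of M for the eigenvalue closest to t."""
    w, P = np.linalg.eig(M)
    return P[:, [int(np.argmin(abs(w - t)))]].real

# Partial sums V_0 + ... + V_k and V*_0 + ... + V*_k of eigenspaces.
Vsum = [colbasis(np.hstack([eigvec(A, t) for t in theta[:k + 1]])) for k in range(d + 1)]
Vssum = [colbasis(np.hstack([eigvec(Astar, t) for t in theta_star[:k + 1]])) for k in range(d + 1)]

# U_i = (V*_0 + ... + V*_i) ∩ (V_0 + ... + V_{d-i})  (cf. Lemma 3.3)
U = [intersect(Vssum[i], Vsum[d - i]) for i in range(d + 1)]

def contained(S, T):
    """col(S) ⊆ col(T), checked with the orthogonal projector onto col(T)."""
    return np.allclose((T @ T.T) @ S, S, atol=1e-8)   # T has orthonormal columns

for i in range(d + 1):
    nxt = U[i + 1] if i < d else np.zeros((3, 0))
    prv = U[i - 1] if i > 0 else np.zeros((3, 0))
    assert contained((A - theta[d - i] * np.eye(3)) @ U[i], nxt)        # condition (5)
    assert contained((Astar - theta_star[i] * np.eye(3)) @ U[i], prv)   # condition (6)
print("split decomposition verified")
```

Each $U_i$ here is one-dimensional, consistent with Corollary 3.6 below, since all eigenspaces in this example are one-dimensional.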

Lemma 3.2. With reference to Assumption 3.1, for $0 \le i \le d$ both
$$U_i + U_{i+1} + \cdots + U_d = V_0 + V_1 + \cdots + V_{d-i}, \tag{17}$$
$$U_0 + U_1 + \cdots + U_i = V^*_0 + V^*_1 + \cdots + V^*_i. \tag{18}$$

Proof: First consider (17). We abbreviate
$$W = U_i + U_{i+1} + \cdots + U_d, \qquad Z = V_0 + V_1 + \cdots + V_{d-i}.$$
We show $W = Z$. To obtain $Z \subseteq W$, set $X = \prod_{h=0}^{i-1}(A - \theta_{d-h} I)$ and observe $Z = XV$ by elementary linear algebra. Using (5), we find $XU_j \subseteq W$ for $0 \le j \le d$, so $XV \subseteq W$ in view of (3). We now have $Z \subseteq W$. To obtain $W \subseteq Z$, set $Y = \prod_{h=i}^{d}(A - \theta_{d-h} I)$ and observe
$$Z = \{v \in V \mid Yv = 0\}. \tag{19}$$
Using (5), we find $YU_j = 0$ for $i \le j \le d$, so $YW = 0$. Combining this with (19), we find $W \subseteq Z$. We now have $Z = W$ and hence (17) holds. Line (18) is similarly obtained using (6).

Lemma 3.3. With reference to Assumption 3.1,
$$U_i = (V^*_0 + V^*_1 + \cdots + V^*_i) \cap (V_0 + V_1 + \cdots + V_{d-i}) \qquad (0 \le i \le d). \tag{20}$$

Proof: Since $\{U_i\}_{i=0}^{d}$ is a decomposition of $V$,
$$U_i = (U_0 + U_1 + \cdots + U_i) \cap (U_i + U_{i+1} + \cdots + U_d) \qquad (0 \le i \le d). \tag{21}$$
Evaluating (21) using (17), (18) we obtain (20).

Corollary 3.4. With reference to Assumption 3.1, the split decomposition $\{U_i\}_{i=0}^{d}$ is uniquely determined by the given orderings of the eigenvalues $\{\theta_i\}_{i=0}^{d}$ and $\{\theta^*_i\}_{i=0}^{d}$.

Proof: Immediate from Lemma 3.3.

Lemma 3.5. With reference to Assumption 3.1, for $0 \le i \le d$ both
$$A^* V_i \subseteq V_0 + V_1 + \cdots + V_{i+1}, \tag{22}$$
$$A V^*_i \subseteq V^*_0 + V^*_1 + \cdots + V^*_{i+1}. \tag{23}$$
Moreover $(A, A^*)$ is a Hessenberg pair on $V$.

Proof: To obtain (22), observe
$$A^* V_i \subseteq A^* \sum_{h=0}^{i} V_h = A^* \sum_{h=d-i}^{d} U_h \;\; \text{(by (17))} \;\; \subseteq \sum_{h=d-i-1}^{d} U_h \;\; \text{(by (6))} \;\; = \sum_{h=0}^{i+1} V_h \;\; \text{(by (17))}.$$

To obtain (23), observe
$$A V^*_i \subseteq A \sum_{h=0}^{i} V^*_h = A \sum_{h=0}^{i} U_h \;\; \text{(by (18))} \;\; \subseteq \sum_{h=0}^{i+1} U_h \;\; \text{(by (5))} \;\; = \sum_{h=0}^{i+1} V^*_h \;\; \text{(by (18))}.$$

Theorem 1.8 is immediate from Lemma 3.5.

We finish this section with a comment.

Corollary 3.6. With reference to Assumption 3.1, for $0 \le i \le d$ the dimensions of $V_{d-i}$, $V^*_i$, $U_i$ are the same.

Proof: Recall that $\{V_i\}_{i=0}^{d}$ and $\{U_i\}_{i=0}^{d}$ are decompositions of $V$. By this and (17),
$$\dim(U_i) + \dim(U_{i+1}) + \cdots + \dim(U_d) = \dim(V_0) + \dim(V_1) + \cdots + \dim(V_{d-i})$$
for $0 \le i \le d$. Consequently, the dimensions of $V_{d-i}$ and $U_i$ are the same for $0 \le i \le d$. A similar argument using (18) shows that the dimensions of $V^*_i$ and $U_i$ are the same for $0 \le i \le d$. The result follows.

4 Hessenberg pairs and tridiagonal pairs

In this section, we explain how Hessenberg pairs are related to tridiagonal pairs. Using this relationship we show that some results [1, Lemma 4.5], [1, Theorem 4.6] about tridiagonal pairs are direct consequences of our results on Hessenberg pairs. We start by recalling the definition of a tridiagonal pair.

Definition 4.1. [1, Definition 1.1] By a tridiagonal pair on $V$, we mean an ordered pair $(A, A^*)$ of linear transformations on $V$ that satisfy (i)-(iv) below.

(i) Each of $A$, $A^*$ is diagonalizable on $V$.

(ii) There exists an ordering $\{V_i\}_{i=0}^{d}$ of the eigenspaces of $A$ such that
$$A^* V_i \subseteq V_{i-1} + V_i + V_{i+1} \qquad (0 \le i \le d), \tag{24}$$
where $V_{-1} = 0$, $V_{d+1} = 0$.

(iii) There exists an ordering $\{V^*_i\}_{i=0}^{\delta}$ of the eigenspaces of $A^*$ such that
$$A V^*_i \subseteq V^*_{i-1} + V^*_i + V^*_{i+1} \qquad (0 \le i \le \delta), \tag{25}$$
where $V^*_{-1} = 0$, $V^*_{\delta+1} = 0$.

(iv) The pair $(A, A^*)$ is irreducible in the sense of Definition 1.4.

Definition 4.2. Let $(A, A^*)$ denote an ordered pair of diagonalizable linear transformations on $V$. Let $\{V_i\}_{i=0}^{d}$ (resp. $\{V^*_i\}_{i=0}^{\delta}$) denote any ordering of the eigenspaces of $A$ (resp. $A^*$). We say that the pair $(A, A^*)$ is tridiagonal with respect to $(\{V_i\}_{i=0}^{d}; \{V^*_i\}_{i=0}^{\delta})$ whenever conditions (ii)-(iv) in Definition 4.1 are satisfied.

Remark 4.3. With reference to Definition 4.2, assume that $(A, A^*)$ is tridiagonal with respect to $(\{V_i\}_{i=0}^{d}; \{V^*_i\}_{i=0}^{\delta})$. By [1, Lemma 2.4] the pair $(A, A^*)$ is tridiagonal with respect to each of $(\{V_{d-i}\}_{i=0}^{d}; \{V^*_i\}_{i=0}^{\delta})$, $(\{V_i\}_{i=0}^{d}; \{V^*_{\delta-i}\}_{i=0}^{\delta})$, $(\{V_{d-i}\}_{i=0}^{d}; \{V^*_{\delta-i}\}_{i=0}^{\delta})$ and no further orderings of the eigenspaces.

Hessenberg pairs and tridiagonal pairs are related as follows.

Proposition 4.4. Let $A$ (resp. $A^*$) denote a diagonalizable linear transformation on $V$ with eigenspaces $\{V_i\}_{i=0}^{d}$ (resp. $\{V^*_i\}_{i=0}^{\delta}$). Then the following (i)-(iv) are equivalent.

(i) The pair $(A, A^*)$ is tridiagonal with respect to $(\{V_i\}_{i=0}^{d}; \{V^*_i\}_{i=0}^{\delta})$.

(ii) The pair $(A, A^*)$ is irreducible, and Hessenberg with respect to each of $(\{V_i\}_{i=0}^{d}; \{V^*_i\}_{i=0}^{\delta})$, $(\{V_{d-i}\}_{i=0}^{d}; \{V^*_i\}_{i=0}^{\delta})$, $(\{V_i\}_{i=0}^{d}; \{V^*_{\delta-i}\}_{i=0}^{\delta})$, $(\{V_{d-i}\}_{i=0}^{d}; \{V^*_{\delta-i}\}_{i=0}^{\delta})$.

(iii) The pair $(A, A^*)$ is irreducible, and Hessenberg with respect to each of $(\{V_i\}_{i=0}^{d}; \{V^*_i\}_{i=0}^{\delta})$, $(\{V_{d-i}\}_{i=0}^{d}; \{V^*_{\delta-i}\}_{i=0}^{\delta})$.

(iv) The pair $(A, A^*)$ is irreducible, and Hessenberg with respect to each of $(\{V_{d-i}\}_{i=0}^{d}; \{V^*_i\}_{i=0}^{\delta})$, $(\{V_i\}_{i=0}^{d}; \{V^*_{\delta-i}\}_{i=0}^{\delta})$.

Proof: Observe that $\{V_i\}_{i=0}^{d}$ satisfies (24) if and only if both $\{V_i\}_{i=0}^{d}$ and $\{V_{d-i}\}_{i=0}^{d}$ satisfy (1). Similarly $\{V^*_i\}_{i=0}^{\delta}$ satisfies (25) if and only if both $\{V^*_i\}_{i=0}^{\delta}$ and $\{V^*_{\delta-i}\}_{i=0}^{\delta}$ satisfy (2). The result follows.

In Proposition 4.4 we showed how Hessenberg pairs are related to tridiagonal pairs.
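The observation driving Proposition 4.4 has a simple matrix form (our own illustration, not from the paper): reversing the ordering of an eigenbasis conjugates the representing matrix by the flip permutation $J$, and a matrix is upper Hessenberg for both an ordering and its reversal precisely when it is tridiagonal.

```python
import numpy as np

def is_upper_hessenberg(M, tol=0):
    """True if every entry below the subdiagonal is zero."""
    n = M.shape[0]
    return all(abs(M[j, i]) <= tol for i in range(n) for j in range(i + 2, n))

def is_tridiagonal(M, tol=0):
    """True if every entry off the three central diagonals is zero."""
    n = M.shape[0]
    return all(abs(M[i, j]) <= tol for i in range(n) for j in range(n) if abs(i - j) > 1)

J = np.fliplr(np.eye(4))                       # reversal permutation

# A tridiagonal matrix is upper Hessenberg for both orderings.
T = np.diag([1.0, 2, 3, 4]) + np.diag([1.0, 1, 1], 1) + np.diag([1.0, 1, 1], -1)
assert is_tridiagonal(T)
assert is_upper_hessenberg(T) and is_upper_hessenberg(J @ T @ J)

# An upper Hessenberg matrix that is not tridiagonal fails for the reversed ordering.
H = np.triu(np.ones((4, 4)), -1)
assert is_upper_hessenberg(H)
assert not is_upper_hessenberg(J @ H @ J)
assert not is_tridiagonal(H)
```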
We now use this relationship to obtain some results on tridiagonal pairs.

Theorem 4.5. [1, Lemma 4.5] Let $(A, A^*)$ denote a tridiagonal pair as in Definition 4.1. Then the scalars $d$ and $\delta$ from that definition are equal.

Proof: Combine Proposition 2.4 and Proposition 4.4.

Definition 4.6. Let $(A, A^*)$ denote an ordered pair of diagonalizable linear transformations on $V$. Let $\{\theta_i\}_{i=0}^{d}$ (resp. $\{\theta^*_i\}_{i=0}^{\delta}$) denote any ordering of the eigenvalues of $A$ (resp. $A^*$). Let $\{V_i\}_{i=0}^{d}$ (resp. $\{V^*_i\}_{i=0}^{\delta}$) denote the corresponding ordering of the eigenspaces of $A$ (resp. $A^*$). We say that the pair $(A, A^*)$ is tridiagonal with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{\delta})$ whenever $(A, A^*)$ is tridiagonal with respect to $(\{V_i\}_{i=0}^{d}; \{V^*_i\}_{i=0}^{\delta})$.

Theorem 4.7. [1, Theorem 4.6] Let $d$ denote a nonnegative integer. Let $A$ (resp. $A^*$) denote a diagonalizable linear transformation on $V$ with eigenvalues $\{\theta_i\}_{i=0}^{d}$ (resp. $\{\theta^*_i\}_{i=0}^{d}$). Then the following (i)-(iv) are equivalent.

(i) The pair $(A, A^*)$ is tridiagonal with respect to $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$.

(ii) The pair $(A, A^*)$ is irreducible, and there exist $(A, A^*)$-split decompositions of $V$ with respect to each of $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$, $(\{\theta_{d-i}\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$, $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_{d-i}\}_{i=0}^{d})$, $(\{\theta_{d-i}\}_{i=0}^{d}; \{\theta^*_{d-i}\}_{i=0}^{d})$.

(iii) The pair $(A, A^*)$ is irreducible, and there exist $(A, A^*)$-split decompositions of $V$ with respect to each of $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$, $(\{\theta_{d-i}\}_{i=0}^{d}; \{\theta^*_{d-i}\}_{i=0}^{d})$.

(iv) The pair $(A, A^*)$ is irreducible, and there exist $(A, A^*)$-split decompositions of $V$ with respect to each of $(\{\theta_{d-i}\}_{i=0}^{d}; \{\theta^*_i\}_{i=0}^{d})$, $(\{\theta_i\}_{i=0}^{d}; \{\theta^*_{d-i}\}_{i=0}^{d})$.

Proof: Combine Corollary 1.9 and Proposition 4.4.

5 Acknowledgement

This paper was written while the author was a graduate student at the University of Wisconsin-Madison. The author would like to thank his advisor Paul Terwilliger for his many valuable ideas and suggestions.

References

[1] T. Ito, K. Tanabe, and P. Terwilliger. Some algebra related to P- and Q-polynomial association schemes, in: Codes and Association Schemes (Piscataway, NJ, 1999), Amer. Math. Soc., Providence, RI, 2001, pp. 167-192; arXiv:math.CO/0406556.

[2] R. Horn and C. Johnson. Matrix Analysis, Cambridge University Press, New York, 1985.

Ali Godjali
Department of Mathematics
University of Wisconsin
Van Vleck Hall
480 Lincoln Drive
Madison, WI 53706-1388 USA
email: godjali@math.wisc.edu