Assistant Professor of Mathematics 3/7/18
Motivation

Let A = [ 1 3; -1 -2 ]. Suppose we'd like to find the 99th power of A,

A^99 = A·A···A  (99 times).

Would you believe that A^99 = [ 1 0; 0 1 ]?
Setup

In general, let A = [ a b; c d ]. Suppose we'd like to find the nth power of A,

A^n = A·A···A  (n times).

Matrix multiplication is a costly operation. Can we find a formula for A^n?
Setup

Yes! (With some caveats.) Suppose we already know the eigenvalues of A.

Definition. We say a matrix A has eigenvalue α if Ax = αx for some nonzero (eigen)vector x.
Setup

Definition. The complete homogeneous symmetric polynomial of degree k in two variables is

h_k(x, y) = Σ_{i+j=k, i,j≥0} x^i y^j.

Some examples:
h_0(x, y) = 1
h_1(x, y) = x + y
h_2(x, y) = x^2 + xy + y^2
h_3(x, y) = x^3 + x^2 y + x y^2 + y^3
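Since these polynomials drive everything that follows, a quick computational sketch may help; this is plain Python, and the function name h2 is my own choice:

```python
# Complete homogeneous symmetric polynomial in two variables:
# h_k(x, y) is the sum of all monomials x^i y^j with i + j = k.
def h2(k, x, y):
    return sum(x**i * y**(k - i) for i in range(k + 1))

print(h2(0, 3, 5))  # h_0 = 1
print(h2(1, 3, 5))  # h_1 = x + y = 8
print(h2(2, 3, 5))  # h_2 = x^2 + xy + y^2 = 49
```

Note there are k + 1 monomials of degree k, one for each split i + j = k.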
2×2 Theorem

Theorem (2015). Suppose a 2×2 matrix A has eigenvalues α and β. Then, for any integer n ≥ 2,

A^n = h_{n-1}(α, β) A − αβ h_{n-2}(α, β) I,

where I = [ 1 0; 0 1 ].
An Example

Example. Suppose A = [ 1 3; -1 -2 ]. Let's find A^3. A has eigenvalues α = (-1 + i√3)/2 and β = (-1 - i√3)/2.

A^3 = h_2(α, β) A − αβ h_1(α, β) I
    = (α^2 + αβ + β^2) A − αβ(α + β) I
    = 0·A − (1)(-1) I
    = I.
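A numerical sketch of this computation (assuming numpy, reading the slide's matrix as A = [ 1 3; -1 -2 ], with signs chosen so that A^3 = I; the helper name h is my own):

```python
import numpy as np

# The 2015 theorem: A^n = h_{n-1}(α, β) A − αβ h_{n-2}(α, β) I.
def h(k, x, y):
    # complete homogeneous symmetric polynomial in two variables
    return sum(x**i * y**(k - i) for i in range(k + 1))

A = np.array([[1, 3], [-1, -2]], dtype=complex)
alpha, beta = np.linalg.eigvals(A)   # (-1 ± i√3)/2, primitive cube roots of unity

n = 99
An = h(n - 1, alpha, beta) * A - alpha * beta * h(n - 2, alpha, beta) * np.eye(2)

# Agrees with brute-force repeated multiplication, and equals I since A^3 = I.
assert np.allclose(An, np.linalg.matrix_power(A, n))
assert np.allclose(An, np.eye(2))
```

The eigenvalues are complex here, but the formula still produces a real matrix power.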
Back to the First Slide

Notice that this matrix is the same A from the first slide. Since A^3 = I, it follows that

A^99 = (A^3)^33 = I^33 = I.

This behavior appears to be easiest to describe if we find the smallest n with A^n = I.
Projective Order

Definition. We say A has projective order n if n is the smallest positive integer so that A^n = λI, λ ≠ 0.

In the last example we discovered that A^3 = I. This is because A has projective order 3; it just so happens that in this case λ = 1.
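A brute-force sketch of this definition (assuming numpy; the function name projective_order is my own, and the example matrix is the one from the earlier slides with the signs as reconstructed there):

```python
import numpy as np

# Find the smallest n ≥ 1 with A^n = λI, λ ≠ 0, by direct search.
def projective_order(A, max_n=50):
    M = np.eye(len(A))
    for n in range(1, max_n + 1):
        M = M @ A
        lam = M[0, 0]
        if not np.isclose(lam, 0) and np.allclose(M, lam * np.eye(len(A))):
            return n, float(lam)
    return None

A = np.array([[1.0, 3.0], [-1.0, -2.0]])
print(projective_order(A))   # (3, 1.0): here A^3 = 1·I
```

A scalar matrix λI is detected by comparing M against its own top-left entry times the identity.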
Motivation, Part 2

Suppose we represent a function f with the matrix A. If A has projective order n and we compose f with itself n times,

f(f(f(··· f(x))))  (n times),

this composition behaves like a multiple of the identity!
Linear Fractional Transformation

Definition. Let M = [ a b; c d ] with ad − bc ≠ 0. The linear fractional transformation is defined as

M∘x = (ax + b)/(cx + d).

Note that if λ ≠ 0, (λM)∘x = M∘x.
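A small exact-arithmetic sketch of the definition and of the scaling property (λM)∘x = M∘x, using Python's fractions; the names lft and M are my own:

```python
from fractions import Fraction

# The linear fractional transformation M∘x = (ax + b)/(cx + d).
def lft(M, x):
    (a, b), (c, d) = M
    return (a * x + b) / (c * x + d)

M = ((1, 3), (-1, -2))
lamM = tuple(tuple(5 * entry for entry in row) for row in M)   # λM with λ = 5

x = Fraction(2)
print(lft(M, x), lft(lamM, x))   # both give (2 + 3)/(-2 - 2) = -5/4
```

The common factor λ cancels between numerator and denominator, which is why only the projective order of M matters for the dynamics of M∘x.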
Interpretation

Suppose M = [ a b; c d ] has projective order n. Then n is the smallest positive integer such that

M^n∘x = (λI)∘x = (λx + 0)/(0x + λ) = x.
A Particular Application

If c = 0, M∘x acts like a (boring) linear function. If c ≠ 0, we may essentially force c = 1, since λM and M are equivalent:

M = [ a b; c d ] ≡ [ a' b'; 1 d' ],

where a' = a/c, b' = b/c, d' = d/c. We will focus on this second case.
The Problem

If M = [ a b; 1 d ] has projective order n, what must its entries look like?
The Problem

Notice that M appears to have projective order 2 if a + d = 0. In this case,

M = [ a b; 1 -a ]  and  M^2 = [ a^2 + b, 0; 0, a^2 + b ].

To make sure that M^2 isn't the zero matrix, we should also require that b ≠ -a^2. This is exactly the condition ad − bc = -a^2 - b ≠ 0, so this case is already covered by the definition of the linear fractional transformation.
The Problem

Theorem (2013). If M ≠ λI has entries from Q, only these projective orders are possible:

order 2: M = λ[ a, b; 1, -a ],  b ≠ -a^2
order 3: M = λ[ a, -(a^2 + ad + d^2); 1, d ],  d ≠ -a
order 4: M = λ[ a, -(1/2)(a^2 + d^2); 1, d ],  d ≠ -a
order 6: M = λ[ a, -(1/3)(a^2 - ad + d^2); 1, d ],  d ≠ -a
Example for Order 4

Example. Let λ = 1, a = 2, and d = -4. Then, since b = -(1/2)(a^2 + d^2) = -10, we have

M = [ 2, -10; 1, -4 ]

and

M^2 = [ -6, 20; -2, 6 ]
M^3 = [ 8, -20; 2, -4 ]
M^4 = [ -4, 0; 0, -4 ] = -4I.
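The powers above are easy to confirm numerically (assuming numpy, with the signs read as in the example):

```python
import numpy as np

# The order-4 family with λ = 1, a = 2, d = -4, b = -(a^2 + d^2)/2 = -10.
M = np.array([[2, -10], [1, -4]])
for n in range(2, 5):
    print(f"M^{n} =\n{np.linalg.matrix_power(M, n)}")

# M^2 and M^3 are not scalar, while M^4 = -4I, so M has projective order 4.
assert (np.linalg.matrix_power(M, 4) == -4 * np.eye(2, dtype=int)).all()
```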
What About the Eigenvalues?

We can come up with the previous theorem in at least two ways:
- By simply multiplying n copies of a generic M together and solving the resulting nonlinear system of equations.
- By using M^n = h_{n-1}(α, β) M − αβ h_{n-2}(α, β) I and searching for a pattern in the eigenvalues.
Further Explorations

With some generalization, we may...

investigate cases besides just the rational numbers:
M = [ 1 1; 1 2 ] has projective order 5 over the integers modulo 11.

generalize to higher dimensions:
M = [ 1 3 3; 3 1 3; 2 2 0 ] has projective order 3 over the integers modulo 7.
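Both modular claims can be checked by brute force (assuming numpy; the helper name projective_order_mod is my own):

```python
import numpy as np

# Smallest n ≥ 1 with M^n ≡ λI (mod p), λ ≢ 0, found by direct search.
def projective_order_mod(M, p, max_n=100):
    A = np.eye(len(M), dtype=int)
    for n in range(1, max_n + 1):
        A = A @ M % p
        lam = A[0, 0]
        if lam != 0 and (A == lam * np.eye(len(M), dtype=int) % p).all():
            return n
    return None

M2 = np.array([[1, 1], [1, 2]])
M3 = np.array([[1, 3, 3], [3, 1, 3], [2, 2, 0]])
print(projective_order_mod(M2, 11))   # 5
print(projective_order_mod(M3, 7))    # 3  (in fact M3^3 ≡ 6I ≡ -I mod 7)
```

Working with integer matrices and reducing mod p after each product keeps all arithmetic exact.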
Continued Research

In order to move these concepts into higher dimensions, we need a similar formula for A^n when A is 3×3, 4×4, and beyond. My research over the last year has been devoted to this pursuit.
Higher Dimensions

Definition. The complete homogeneous symmetric polynomial of degree k in n variables is

h_k(x_1, x_2, ..., x_n) = Σ_{i_1 + i_2 + ··· + i_n = k, i_j ≥ 0} x_1^{i_1} x_2^{i_2} ··· x_n^{i_n}.

Some examples:
h_0(x, y, z) = 1
h_1(x, y, z) = x + y + z
h_3(x, y, z) = x^3 + x^2 y + x^2 z + x y^2 + x z^2 + xyz + y^3 + y^2 z + y z^2 + z^3
The 3×3 Case

Theorem (2017). Suppose A is a 3×3 matrix with eigenvalues α, β, γ. Then, for any integer n ≥ 2,

A^n = h_n(α, β, γ) I − h_{n-1}(α, β, γ) [(α + β + γ)I − A] + h_{n-2}(α, β, γ) [(αβ + αγ + βγ)I − (α + β + γ)A + A^2],

where I is the 3×3 identity matrix.
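A numerical spot-check of the 3×3 formula (assuming numpy; the test matrix and the helper name h are my own choices):

```python
import numpy as np
from itertools import combinations_with_replacement

# h_k evaluated at a tuple of eigenvalues: the sum of all degree-k monomials.
def h(k, xs):
    return sum(np.prod(c) for c in combinations_with_replacement(xs, k))

A = np.array([[2.0, 1, 0], [0, 3, 1], [0, 0, 5]])   # eigenvalues 2, 3, 5
a, b, g = np.linalg.eigvals(A)
e1, e2 = a + b + g, a * b + a * g + b * g
I = np.eye(3)

n = 7
An = (h(n, (a, b, g)) * I
      - h(n - 1, (a, b, g)) * (e1 * I - A)
      + h(n - 2, (a, b, g)) * (e2 * I - e1 * A + A @ A))
assert np.allclose(An, np.linalg.matrix_power(A, n))
```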
The 4×4 Case

Theorem (2018). Suppose A is a 4×4 matrix with eigenvalues α, β, γ, δ. Then, for any integer n ≥ 3,

A^n = h_n I − h_{n-1} [e_1 I − A] + h_{n-2} [e_2 I − e_1 A + A^2] − h_{n-3} [e_3 I − e_2 A + e_1 A^2 − A^3],

where I is the 4×4 identity matrix, h_k = h_k(α, β, γ, δ), e_1 = α + β + γ + δ, e_2 = αβ + αγ + αδ + βγ + βδ + γδ, and e_3 = αβγ + αβδ + αγδ + βγδ.
Noticing a Pattern

Definition. The elementary symmetric polynomial of degree k in n variables is

e_k(x_1, x_2, ..., x_n) = Σ_{1 ≤ i_1 < i_2 < ··· < i_k ≤ n} x_{i_1} x_{i_2} ··· x_{i_k}

if k ≤ n, and 0 otherwise. Some examples:
e_1(x, y, z) = x + y + z
e_2(x, y, z) = xy + xz + yz
e_3(a, b, c, d) = abc + abd + acd + bcd
e_k(x, y, z, 0) = e_k(x, y, z)
The Conjecture

Conjecture (2018). Suppose A is a k×k matrix with known eigenvalues. Let h_i and e_i refer to those polynomials evaluated at the eigenvalues. Then, for any integer n ≥ k − 1,

A^n = Σ_{i=0}^{k-1} (-1)^i h_{n-i} [ Σ_{j=0}^{i} (-1)^j e_{i-j} A^j ],

where we allow the convention A^0 = I.
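A sketch that tests the conjectured formula numerically on one 4×4 example (assuming numpy; all function names and the test matrix are my own, and this is a spot-check, not a proof):

```python
import numpy as np
from itertools import combinations, combinations_with_replacement

def h(k, xs):
    # complete homogeneous symmetric polynomial of degree k
    return sum(np.prod(c) for c in combinations_with_replacement(xs, k))

def e(k, xs):
    # elementary symmetric polynomial of degree k (0 if k > len(xs))
    return sum(np.prod(c) for c in combinations(xs, k)) if k <= len(xs) else 0.0

def power_via_conjecture(A, n):
    k = len(A)
    eig = np.linalg.eigvals(A)
    Apow = [np.linalg.matrix_power(A, j) for j in range(k)]   # A^0 = I, ..., A^(k-1)
    total = np.zeros_like(A, dtype=complex)
    for i in range(k):
        inner = sum((-1) ** j * e(i - j, eig) * Apow[j] for j in range(i + 1))
        total = total + (-1) ** i * h(n - i, eig) * inner
    return total

A = np.array([[2.0, 1, 0, 0], [0, 3, 1, 0], [0, 0, 5, 1], [0, 0, 0, 7]])
for n in range(3, 9):   # the conjecture requires n ≥ k − 1 = 3
    assert np.allclose(power_via_conjecture(A, n), np.linalg.matrix_power(A, n))
print("conjecture verified for this matrix")
```

Setting k = 2, 3, or 4 in the double sum reproduces the 2015, 2017, and 2018 theorems above term by term.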
One Last Note

Finding the eigenvalues of a matrix is a time-consuming job, and it appears to be only the first step to using these formulas. In general, eigenvalue decomposition allows you to (usually) write a square matrix A as A = QΛQ^{-1}, where Q is a square matrix whose columns are eigenvectors of A, and Λ is a diagonal matrix whose diagonal entries are the corresponding eigenvalues. Typically, this is the route taken to find large powers of A, since

A^n = QΛQ^{-1} · QΛQ^{-1} ··· QΛQ^{-1} = QΛ^n Q^{-1}.
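For comparison, the standard eigendecomposition route in numpy (a minimal sketch, assuming A is diagonalizable; the matrix is the one from the first slide as reconstructed):

```python
import numpy as np

# A^n = Q Λ^n Q^{-1}, where the columns of Q are eigenvectors of A.
A = np.array([[1.0, 3], [-1, -2]])
lam, Q = np.linalg.eig(A)

n = 99
An = Q @ np.diag(lam ** n) @ np.linalg.inv(Q)
assert np.allclose(An, np.eye(2))   # the identity again, since A^3 = I
```

Only the diagonal gets raised to the nth power, which is the whole appeal of the decomposition.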
One Last Note

Does this mean our formula is strictly worse than eigenvalue decomposition? No! If we're careful, we may actually write these symmetric polynomials entirely in terms of the entries of A, without doing any eigen-stuff at all! Additionally, our formula works even if the matrix isn't diagonalizable.
Future Research

- Special matrices (stochastic, symmetric, etc.)
- Geometric interpretation
- Further study of symmetric polynomials
- Computation!
References

J. Boone, Higher-order Lucas sequences and Dickson polynomials, Ph.D. dissertation, 2013.

R. Fitzgerald and J. Yucas, A generalization of Dickson polynomials via linear fractional transformations, International Journal of Mathematics and Computer Science 1 (2006), 391–416.

S. Friedberg, A. Insel, and L. Spence, Linear Algebra, Prentice Hall, 2003.

I. G. Macdonald, Symmetric Functions and Hall Polynomials, Oxford: Clarendon Press, 1995.
Questions/Comments?

Thank you for your kind attention! Feel free to contact me with any comments, concerns, ideas, or questions:

Hamilton Math & Science Building, Office 341
joshua.boone@lmunet.edu
joshuaboone.wordpress.com