JORDAN AND RATIONAL CANONICAL FORMS

MATH 551

Throughout this note, let V be an n-dimensional vector space over a field k, and let φ: V → V be a linear map. Let B = {e_1, ..., e_n} be a basis for V, and let A be the matrix for φ with respect to the basis B; thus φ(e_j) = Σ_{i=1}^n a_{ij} e_i (so the jth column of A records φ(e_j)). This is the standard convention for talking about vector spaces over a field k. To make these conventions coincide with Hungerford, consider V as a right module over k. Recall that if B′ is another basis for V, then the matrix for φ with respect to the basis B′ is CAC^{-1}, where C is the matrix whose ith column is the description of the ith element of B in the basis B′.

1. Rational Canonical Form

We give a k[x]-module structure to V by setting x · v = φ(v) = Av. Recall that the structure theorem for modules over a PID (such as k[x]) guarantees that

  V ≅ k[x]^r ⊕ (⊕_{i=1}^l k[x]/(f_i))

as a k[x]-module. Since V is a finite-dimensional k-module (vector space!), and k[x] is an infinite-dimensional k-module, we must have r = 0, so V ≅ ⊕_{i=1}^l k[x]/(f_i). By the classification theorem we may assume that f_1 | f_2 | ... | f_l. We may also assume that each f_i is monic (has leading coefficient one). To see this, let f = λ x^s + Σ_{i=0}^{s-1} a_i x^i. Then λ^{s-1} f = (λx)^s + Σ_{i=0}^{s-1} a_i λ^{s-1-i} (λx)^i. Now k[x]/(f) = k[x]/(λ^{s-1} f), since the two polynomials generate the same ideal, and k[x]/(λ^{s-1} f) ≅ k[y]/(f′), where f′(y) = y^s + Σ_{i=0}^{s-1} a_i λ^{s-1-i} y^i. This transformation can be done preserving the relationship that f_i divides f_{i+1}.

Let ψ : ⊕_{i=1}^l k[x]/(f_i) → V be the isomorphism, and let V_i be ψ(k[x]/(f_i)) (the image of this term of the direct sum). If we choose a basis for V consisting of the union of bases for each V_i, then the matrix for φ will be in block form, since φ(v) ∈ V_i for each v ∈ V_i. Thus we can restrict our attention to φ|_{V_i}. Let f_i = x^m + Σ_{j=0}^{m-1} a_{ij} x^j. Notice that {1, x, x^2, ..., x^{m-1}} is a basis for k[x]/(f_i). Let v = ψ(1) ∈ V_i. Then {v, Av, A^2 v, ..., A^{m-1} v} is a basis for V_i. The matrix for φ|_{V_i} in this basis is

  [ 0 0 ··· 0 -a_{i0}     ]
  [ 1 0 ··· 0 -a_{i1}     ]
  [ 0 1 ··· 0 -a_{i2}     ]
  [ ⋮ ⋮  ⋱  ⋮     ⋮       ]
  [ 0 0 ··· 0 -a_{i(m-2)} ]
  [ 0 0 ··· 1 -a_{i(m-1)} ]

Thus if we take as our basis for V the union of these bases for the V_i, we have proved the existence of the Rational Canonical Form.
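The change of basis to {v, Av, ..., A^{m-1} v} can be carried out concretely on a computer. The following sketch is not part of the original notes: the matrix A and the cyclic vector v are made-up examples, and the sympy library is assumed. It builds the Krylov basis for a matrix that happens to be cyclic (l = 1 in the decomposition above, i.e. the minimal and characteristic polynomials coincide) and recovers the companion shape displayed above.

    # Sketch (not from the notes): changing to the Krylov basis {v, Av, ..., A^(n-1) v}
    # puts a cyclic matrix into the companion form shown above.  The matrix A and
    # the cyclic vector v are made-up examples; sympy is assumed to be installed.
    from sympy import Matrix, symbols

    x = symbols('x')

    A = Matrix([[0, -1, 0],
                [1,  0, 0],
                [0,  0, 2]])
    v = Matrix([1, 0, 1])                               # assumed cyclic vector for this A

    n = A.rows
    C = Matrix.hstack(*[A**i * v for i in range(n)])    # columns v, Av, A^2 v
    assert C.rank() == n                                # so {v, Av, A^2 v} is a basis

    B = C.inv() * A * C                                 # the same map in the new basis
    print(B)                                            # last column is (-a_0, -a_1, -a_2)
    print(A.charpoly(x).as_expr())                      # x**3 - 2*x**2 + x - 2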

Definition 1. Let f = x^n + Σ_{i=0}^{n-1} a_i x^i. Then the companion matrix of f is

  [ 0 0 ··· 0 -a_0     ]
  [ 1 0 ··· 0 -a_1     ]
  [ 0 1 ··· 0 -a_2     ]
  [ ⋮ ⋮  ⋱  ⋮    ⋮     ]
  [ 0 0 ··· 0 -a_{n-2} ]
  [ 0 0 ··· 1 -a_{n-1} ]

Theorem 2. Every n × n matrix A is similar to a matrix B which is block-diagonal, with the ith block the companion matrix of a monic polynomial f_i, with f_1 | f_2 | ... | f_l.
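As a small sanity check on Definition 1, the sketch below (not part of the notes; the polynomial is a made-up example and sympy is assumed) assembles the companion matrix directly from the coefficients a_0, ..., a_{n-1} and confirms that its characteristic polynomial recovers f; Proposition 13 below shows that f is also its minimal polynomial.

    # Sketch (not from the notes): build the companion matrix of a monic polynomial
    # from its coefficients a_0, ..., a_{n-1}.  The polynomial f is a made-up
    # example; sympy is assumed.
    from sympy import Matrix, Poly, symbols, zeros

    x = symbols('x')
    f = Poly(x**4 + 2*x**3 - x + 5, x)

    def companion(f):
        coeffs = f.all_coeffs()              # [1, a_{n-1}, ..., a_1, a_0] for monic f
        n = len(coeffs) - 1
        C = zeros(n, n)
        for i in range(1, n):
            C[i, i - 1] = 1                  # the subdiagonal of 1's
        for i in range(n):
            C[i, n - 1] = -coeffs[n - i]     # last column: -a_0, -a_1, ..., -a_{n-1}
        return C

    C = companion(f)
    print(C)
    print(C.charpoly(x).as_expr())           # x**4 + 2*x**3 - x + 5, i.e. f again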

2. Jordan Canonical Form

For this section we assume that the field k is algebraically closed.

Definition 3. A field k is algebraically closed if for every nonconstant polynomial f ∈ k[x] there is a ∈ k with f(a) = 0.

Recall the alternative statement of the classification of modules over a PID: instead of having f_1 | f_2 | ... | f_l, we can choose to have each f_i = p_i^{n_i}, where p_i is a prime in k[x]. If k is algebraically closed, then the primes in k[x] are all of the form x - a for a ∈ k, so when V ≅ ⊕_i k[x]/(f_i) = ⊕_i V_i, the summand V_i is isomorphic to k[x]/((x - λ_i)^{n_i}) for some λ_i ∈ k, n_i ∈ N.

Let B = A - λ_i I, and consider the k[y]-module structure on V given by y · v = Bv. Then for v ∈ V_i, y · v = x · v - λ_i v ∈ V_i, so we also have V = ⊕_i V_i as a k[y]-module. Note that y^{n_i} · V_i = 0, but y^{n_i - 1} · V_i ≠ 0, so the rational canonical form of B|_{V_i} is

  [ 0 0 ··· 0 0 ]
  [ 1 0 ··· 0 0 ]
  [ 0 1 ··· 0 0 ]
  [ ⋮ ⋮  ⋱  ⋮ ⋮ ]
  [ 0 0 ··· 1 0 ]

The matrix of φ|_{V_i} with respect to this basis is thus

  [ λ_i 0   ··· 0 0   ]
  [ 1   λ_i ··· 0 0   ]
  [ 0   1   ··· 0 0   ]
  [ ⋮   ⋮    ⋱  ⋮ ⋮   ]
  [ 0   0   ··· 1 λ_i ]

Reversing the order of the basis, we get that the matrix of φ|_{V_i} is

(1)
  [ λ_i 1   0   ··· 0   ]
  [ 0   λ_i 1   ··· 0   ]
  [ ⋮   ⋮    ⋱   ⋱  ⋮   ]
  [ 0   0   ··· λ_i 1   ]
  [ 0   0   ··· 0   λ_i ]

We thus have:

Theorem 4. Any n × n matrix A is similar to a matrix J which is in block-diagonal form, where every block is of the form (1) for some λ_i.
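Computer algebra systems produce the form in Theorem 4 directly. The sketch below is not part of the notes: the matrix is a made-up example and sympy's Matrix.jordan_form method is assumed. It returns an invertible P and a Jordan form J with A = P J P^{-1}.

    # Sketch (not from the notes): sympy's jordan_form returns a pair (P, J) with
    # A = P * J * P**(-1) and J block-diagonal with blocks of the form (1).
    # The matrix A is a made-up example; sympy is assumed.
    from sympy import Matrix

    A = Matrix([[3, 1, 0],
                [-1, 1, 0],
                [0, 0, 2]])

    P, J = A.jordan_form()
    print(J)                        # here: one 2x2 block and one 1x1 block, both for 2
    assert A == P * J * P.inv()     # sanity check of the similarity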

3. Computing the Jordan Canonical Form

Recall first the definition of eigenvalues of a matrix A.

Definition 5. If A is an n × n matrix over k, then λ ∈ k is an eigenvalue for A if there is v ≠ 0 in V with Av = λv. If λ ∈ k is an eigenvalue, then v ∈ V is an eigenvector for A if Av = λv. The characteristic polynomial of A is p_A(x) = det(A - xI) ∈ k[x]. An element λ ∈ k is an eigenvalue for A if and only if p_A(λ) = 0.

Definition 6. For λ ∈ k and m ∈ N, let E_λ^m = {v ∈ V : (A - λI)^m v = 0}. Since E_λ^m is the kernel of a matrix, it is a subspace of V.

Lemma 7. The subspace E_λ^m ≠ 0 for some m if and only if λ is an eigenvalue of A, and E_λ^m ∩ E_µ^n ≠ {0} for some m, n > 0 implies that λ = µ.

Proof. Suppose first that λ is an eigenvalue of A. Then E_λ^1 is the eigenspace corresponding to λ, which is thus nonzero. Conversely, suppose that E_λ^m is nonzero for some m. We will show that E_λ^1 is nonzero, so λ is an eigenvalue of A. To see this, consider v ∈ E_λ^m with v ≠ 0. We may assume that v ∉ E_λ^{m-1} (otherwise replace m by m-1 until this is possible or until v ∈ E_λ^1). Consider w = (A - λI)^{m-1} v. Since v ∉ E_λ^{m-1}, w ≠ 0, and (A - λI)w = (A - λI)^m v = 0, so w ∈ E_λ^1 \ {0}, and thus λ is an eigenvalue.

Suppose that v ∈ E_λ^m ∩ E_µ^n with v ≠ 0. As above we may assume that m, n have been chosen minimally. Then consider w = (A - λI)^{m-1} v. Now w ∈ E_λ^1 ∩ E_µ^n and w ≠ 0. Replace n by a smaller integer if necessary so that w ∉ E_µ^{n-1}. Then w′ = (A - µI)^{n-1} w ≠ 0, and w′ ∈ E_λ^1 ∩ E_µ^1. But this means Aw′ = λw′ = µw′, so λ = µ.

Proposition 8. If the characteristic polynomial of A is p_A(x) = Π_λ (x - λ)^{n_λ}, then E_λ^m ⊆ E_λ^{n_λ} for all m, and dim E_λ^{n_λ} = n_λ. Furthermore, V = ⊕_λ E_λ^{n_λ}.

Proof. Since the characteristic polynomial is the same for similar matrices (since det(A - xI) = det(C(A - xI)C^{-1}) = det(CAC^{-1} - xI)), we can compute the characteristic polynomial from the Jordan canonical form. We thus see that n_λ is the sum of the sizes of all Jordan blocks with eigenvalue λ. Also, note that if J is a Jordan block with eigenvalue λ, then the corresponding standard basis vectors all lie in E_λ^m for some m ≤ n_λ, and are linearly independent, and by the Lemma the E_λ^m for different eigenvalues intersect only in 0, so we see that V = ⊕_λ E_λ^{n_λ}. Since Σ_λ n_λ = n, and dim E_λ^{n_λ} ≥ n_λ, we must thus have dim E_λ^{n_λ} = n_λ, and E_λ^m = E_λ^{n_λ} for m > n_λ.
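Proposition 8 is easy to check on examples: dim E_λ^m equals n minus the rank of (A - λI)^m, and it stabilizes at the multiplicity n_λ. The sketch below is not part of the notes; the matrix is a made-up example and sympy is assumed.

    # Sketch (not from the notes): dim E_lambda^m = dim ker (A - lambda*I)^m
    #                                            = n - rank((A - lambda*I)**m),
    # and by Proposition 8 it stabilizes at the multiplicity n_lambda.
    # The matrix A is a made-up example; sympy is assumed.
    from sympy import Matrix, eye

    A = Matrix([[5, 4, 2, 1],
                [0, 1, -1, -1],
                [-1, -1, 3, 0],
                [1, 1, -1, 2]])

    n = A.rows
    for lam, mult in A.eigenvals().items():     # {eigenvalue: multiplicity n_lambda}
        dims = [n - ((A - lam*eye(n))**m).rank() for m in range(1, mult + 1)]
        print(lam, mult, dims)                  # e.g. eigenvalue 4 here: mult 2, dims [1, 2]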

Proposition 8 gives the following algorithm to compute the Jordan Canonical Form of A.

Algorithm 9.
(1) Compute and factor the characteristic polynomial of A.
(2) For each λ, compute a basis B = {v_1, ..., v_k} for E_λ^{n_λ}/E_λ^{n_λ - 1}, and lift to elements of E_λ^{n_λ}. Add the elements (A - λI)^m v_i to B for 1 ≤ m < n_λ.
(3) Set i = n_λ - 1.
(4) Complete B ∩ E_λ^i to a basis for E_λ^i/E_λ^{i-1}. Add the elements (A - λI)^m v to B for all m and all newly added v ∈ B.
(5) If i > 1, set i = i - 1, and return to the previous step.
(6) Output B - the matrix for A with respect to a suitable ordering of B is in Jordan Canonical Form.

Proof of correctness. To show that this algorithm works we need to check that it is always possible to complete B ∩ E_λ^k to a basis for E_λ^k/E_λ^{k-1}. Suppose B ∩ E_λ^k is linearly dependent. Then there are v_1, ..., v_s ∈ B ∩ E_λ^k with Σ_i c_i v_i = 0, with not all c_i = 0. By the construction of B we know that v_i = (A - λI)w_i for some w_i ∈ B, so consider w = Σ_i c_i w_i. Then w ≠ 0, since the w_i are linearly independent, and not all c_i are zero. In fact, by the construction of the w_i, we know w ∉ E_λ^k. But (A - λI)w = Σ_i c_i v_i = 0, so w ∈ E_λ^1, which is a contradiction, since k ≥ 1.

Example 10. Consider the matrix

  A = [  2 -4  2  2 ]
      [ -2  0  1  3 ]
      [ -2 -2  3  3 ]
      [ -2 -6  3  7 ]

Then the characteristic polynomial of A is (x - 2)^2 (x - 4)^2. A basis for E_2^1 is {(2, 1, 0, 2), (0, 1, 2, 0)}, so since there is a two-dimensional eigenspace for 2, the Jordan canonical form will have two distinct blocks for the eigenvalue 2, each of size one. To confirm this, check that E_2^m = E_2^1 for all m > 1. A basis for E_4^1 is {(0, 1, 1, 1)}, while a basis for E_4^2 is {(0, 1, 1, 1), (1, 0, 0, 1)}, so we can take {(1, 0, 0, 1)} as a basis for E_4^2/E_4^1. Then (A - 4I)(1, 0, 0, 1)^T = (0, 1, 1, 1)^T, so our basis is then {(2, 1, 0, 2), (0, 1, 2, 0), (0, 1, 1, 1), (1, 0, 0, 1)}. The matrix of the transformation with respect to this basis is:

  [ 2 0 0 0 ]
  [ 0 2 0 0 ]
  [ 0 0 4 1 ]
  [ 0 0 0 4 ]
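The computation in Example 10 can be replayed with a computer algebra system. The sketch below is not part of the notes (sympy's nullspace and jordan_form methods are assumed); it recomputes the generalized eigenspace dimensions and the Jordan form of the matrix from the example.

    # Sketch (not from the notes; sympy assumed): re-checking Example 10.
    from sympy import Matrix, eye, symbols

    x = symbols('x')

    A = Matrix([[2, -4, 2, 2],
                [-2, 0, 1, 3],
                [-2, -2, 3, 3],
                [-2, -6, 3, 7]])

    print(A.charpoly(x).as_expr().factor())       # (x - 2)**2*(x - 4)**2
    print(len((A - 2*eye(4)).nullspace()))        # 2: two blocks of size one for 2
    print(len((A - 4*eye(4)).nullspace()))        # 1: a single block for 4
    print(len(((A - 4*eye(4))**2).nullspace()))   # 2: so that block has size two

    P, J = A.jordan_form()
    print(J)                                      # diag(2, 2) plus a 2x2 block for 4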

4. The minimal and characteristic polynomials

Definition 11. Let I = {f ∈ k[x] : f · v = 0 for all v ∈ V} = {f ∈ k[x] : f(A) = 0}. Then I is an ideal of k[x], so since k[x] is a PID, I = (g) for some polynomial g ∈ k[x]. We can choose g to be monic (have leading coefficient one). The polynomial g is called the minimal polynomial of the matrix A (or linear transformation φ).

Lemma 12. The minimal polynomial of a nonzero matrix A is nonzero.

Proof. Let {v_1, ..., v_n} be a basis for V. Then for each i the set O_{v_i} = {f ∈ k[x] : f(A)v_i = 0} is nonzero, since {v_i, Av_i, A^2 v_i, ..., A^n v_i} is linearly dependent. Pick a nonzero f_i ∈ O_{v_i} for each i. Then Π_{i=1}^n f_i ∈ ∩_{i=1}^n O_{v_i} = I, so I ≠ 0, and thus the generator is nonzero.

Proposition 13. If A is the companion matrix of a monic polynomial f, then f is the minimal polynomial of A.

Proof. First note that e_i = A^{i-1} e_1, so {e_1, Ae_1, ..., A^{n-1} e_1} is linearly independent. Thus the minimal polynomial of A has degree at least n. If f = x^n + Σ_{i=0}^{n-1} c_i x^i, then f(A)e_1 = A^n e_1 + Σ_{i=0}^{n-1} c_i e_{i+1} = -Σ_{i=0}^{n-1} c_i e_{i+1} + Σ_{i=0}^{n-1} c_i e_{i+1} = 0 by the construction of the companion matrix. Also f(A)e_i = f(A)A^{i-1} e_1 = A^{i-1} f(A)e_1 = 0, so f(A) = 0, and thus f ∈ I. If f were not the minimal polynomial, then there would be a monic g ∈ I properly dividing f. But since f is itself monic, g would have to have degree less than n, which we showed above is impossible, so f is the minimal polynomial of its companion matrix.

Corollary 14. The minimal polynomial of A is f_l, if its rational canonical form has blocks the companion matrices of f_1, ..., f_l with f_1 | f_2 | ... | f_l. This is Π_λ (x - λ)^{m(λ)}, where m(λ) is the size of the largest Jordan block corresponding to the eigenvalue λ.

Proof. Since applying a polynomial to a matrix in block-diagonal form applies it to each block, and each f_i divides f_l, we know that f_l(A) = 0, and thus the minimal polynomial of A divides f_l. Conversely, if f(A) = 0, then f_l divides f, since f applied to the last block of the rational canonical form is zero. Thus f_l is the minimal polynomial of A. The second description of the minimal polynomial follows from the method to convert between the two different descriptions of modules over a PID.

Theorem 15. If p_A(x) is the characteristic polynomial of A, then p_A(A) = 0.

Proof. By Proposition 8 we know that the characteristic polynomial of A is Π_i (x - λ_i)^{n_i}, where the ith Jordan block has eigenvalue λ_i and size n_i. Thus by Corollary 14 the minimal polynomial of A divides the characteristic polynomial of A, and thus p_A(A) = 0.
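As a final illustration (not part of the notes), the sketch below computes the minimal polynomial of a made-up matrix by finding the first power of A that is a linear combination of lower powers, and then checks Theorem 15 by evaluating the characteristic polynomial at A. The helper minimal_polynomial_of is an illustrative assumption, not a standard library routine; sympy is assumed.

    # Sketch (not from the notes; sympy assumed): the minimal polynomial of A is the
    # first monic relation among I, A, A^2, ..., and Theorem 15 says that the
    # characteristic polynomial evaluated at A is the zero matrix.
    from sympy import Matrix, eye, symbols, zeros

    x = symbols('x')

    def minimal_polynomial_of(A):
        # hypothetical helper: find the smallest m with {I, A, ..., A^m} dependent
        n = A.rows
        cols = [eye(n).reshape(n*n, 1)]
        for m in range(1, n + 1):
            cols.append((A**m).reshape(n*n, 1))
            relations = Matrix.hstack(*cols).nullspace()
            if relations:                            # a relation sum_i c_i A^i = 0 exists
                c = relations[0] / relations[0][m]   # normalize: coefficient of A^m is 1
                return sum(c[i] * x**i for i in range(m + 1))
        raise RuntimeError("unreachable: the degree is at most n by Theorem 15")

    A = Matrix([[2, 1, 0],
                [0, 2, 0],
                [0, 0, 2]])                          # one 2x2 and one 1x1 Jordan block for 2

    print(minimal_polynomial_of(A))                  # x**2 - 4*x + 4, i.e. (x - 2)**2

    # Theorem 15: evaluate the characteristic polynomial at A by Horner's rule.
    p_of_A = zeros(3, 3)
    for c in A.charpoly(x).all_coeffs():             # leading coefficient first
        p_of_A = p_of_A * A + c * eye(3)
    print(p_of_A)                                    # the zero matrix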