Minimal Polynomials and Jordan Normal Forms


1. Minimal Polynomials

Let A be an n × n real matrix.

M1. There is a non-zero polynomial p such that p(A) = 0.

Proof. The space M_{n×n}(R) of n × n real matrices is an n²-dimensional vector space over R. Take I, A, A², ..., A^{n²}. There are n² + 1 elements here, so they are linearly dependent: µ_0 I + µ_1 A + ... + µ_{n²} A^{n²} = 0 for some µ_i, not all 0. Let p(t) = Σ µ_i t^i; then p(A) = 0. So we have a polynomial giving 0 for A.

Let m be a polynomial such that m(A) = 0 which is monic (leading coefficient = 1) and of smallest degree. This is the minimal polynomial of A.

M2. The minimal polynomial of A is unique.

Proof. If we had two such polynomials, they would both have the same degree and the same leading coefficient 1, and so their difference would be a polynomial of smaller degree which still gives 0 when applied to A. This would contradict the minimality of m, unless the difference is 0, i.e. the two polynomials are equal.

M3. If p is some polynomial such that p(A) = 0, then m divides p.

Proof. Clearly deg(p) ≥ deg(m). By polynomial division, we may write p = qm + r for some polynomials q, r, such that deg(r) < deg(m). Then r(A) = p(A) − q(A)m(A) = 0, which contradicts the minimality of m, unless r = 0. So r = 0, and m divides p.

M4. If λ is an eigenvalue of A, then it is a root of m.

Proof. We have Av = λv for some v ≠ 0. Then A²v = λ²v, and continuing, A^k v = λ^k v for any integer k ≥ 0. Therefore, for any polynomial p, we have p(A)v = p(λ)v. In particular, we have m(A)v = m(λ)v. But m(A) = 0 and v ≠ 0, thus m(λ) = 0.

Cayley–Hamilton. Let χ(t) = det(A − tI) be the characteristic polynomial of A. Then χ(A) = 0.

Proof. See the notes.

If we are given χ(t), we have some idea what m(t) might be.

M5. If χ(t) = ∏_{i=1}^r (t − λ_i)^{a_i}, what are the possibilities for m(t)? By Cayley–Hamilton and M3, we know that m divides χ. And by M4 we know that every root of χ must appear as a root of m. Thus the only possibilities for m are ∏_{i=1}^r (t − λ_i)^{c_i}, where 1 ≤ c_i ≤ a_i.

Lastly, a useful property:

M6. χ(t) and m(t) are invariant under change of basis.

Proof. Let B = P⁻¹AP. Since B^k = (P⁻¹AP)^k = P⁻¹A^kP, if p(t) is some polynomial then p(B) = P⁻¹p(A)P. So p(B) = 0 iff p(A) = 0. Thus the minimal polynomials of A and B divide each other, so are equal. Also,

    det(B − tI) = det(P⁻¹(A − tI)P) = det(P⁻¹) det(A − tI) det(P) = det(A − tI),

so χ_B(t) = χ_A(t).
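For those who like to experiment, here is a small numpy sketch of M1, M3 and Cayley–Hamilton on a concrete matrix; poly_at is a helper of our own that evaluates a polynomial at a matrix by Horner's scheme (np.polyval would not work here, since ** on arrays is elementwise).

```python
import numpy as np

def poly_at(coeffs, A):
    """Evaluate a polynomial (coefficients listed highest degree first)
    at a square matrix A, using matrix products via Horner's scheme."""
    n = A.shape[0]
    result = np.zeros((n, n))
    for c in coeffs:
        result = result @ A + c * np.eye(n)
    return result

# A is already in JNF: blocks J_2(3) and J_1(3), so
# chi(t) = (t - 3)^3 while m(t) = (t - 3)^2.
A = np.array([[3.0, 1.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 3.0]])

# Cayley-Hamilton: chi(t) = t^3 - 9t^2 + 27t - 27 kills A.
print(np.allclose(poly_at([1.0, -9.0, 27.0, -27.0], A), 0))  # True

# m(t) = (t-3)^2 = t^2 - 6t + 9 also kills A, but (t-3) does not,
# consistent with M3: every polynomial killing A is a multiple of m.
print(np.allclose(poly_at([1.0, -6.0, 9.0], A), 0))   # True
print(np.allclose(poly_at([1.0, -3.0], A), 0))        # False
```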

2. Jordan Blocks & Jordan Normal Form

What do the characteristic and minimal polynomials tell us about the Jordan Normal Form? (Be warned that most of the following is explanation rather than proof.)

First, a Jordan block. These are matrices of the form

    [λ 1      ]
    [  λ 1    ]
    [    . .  ]
    [      . 1]
    [        λ]

for some λ, with all other entries 0. The Jordan Normal Form of a matrix A looks like

    [J_1        ]
    [    .      ]
    [      .    ]
    [        J_n]

where the J_i are Jordan block matrices of some (possibly different) sizes and with (possibly different) λs, and all entries outside the blocks are 0. The actual details of what sizes and λs we have depend upon the matrix A. (We are allowed to write the blocks in a different order and still call it the JNF. That's a cosmetic change.)

Note: I assume here that we're working over an algebraically closed field such as C, so that the eigenvalues and JNF actually exist.

J1. Let J be an n × n Jordan block with diagonal entries λ. Then λ is the only eigenvalue, and the associated eigenspace is only 1-dimensional.

Proof. Since J is upper-triangular, it is clear that the only eigenvalue is λ. Solving Jx = λx gives us the equations λx_i + x_{i+1} = λx_i for i < n, and λx_n = λx_n, from which we see that x_2 = ... = x_n = 0, giving the eigenvector (1, 0, ..., 0). (Or just note that n(J − λI) = 1 since r(J − λI) = n − 1, as the last n − 1 columns of J − λI are clearly linearly independent.)

J2. Let J be a k × k Jordan block with eigenvalue λ. Then (J − λI)^n = 0 for n = k, but is non-zero for n < k.

Sketch. J − λI has 1s just above the diagonal and 0s elsewhere. If we square it, the diagonal of 1s moves to two places above the main diagonal:

    [0 1      ]²   [0 0 1    ]
    [  0 1    ]    [  0 0 1  ]
    [    . .  ]  = [    . .  ]
    [      . 1]    [        0]
    [        0]    [        0]

Each subsequent multiplication by (J − λI) pushes the diagonal of 1s further towards the top-right corner. After k − 1 steps, there is a single 1 in the top-right, and after k steps we have 0.

J3. Suppose that A, B are block diagonal matrices, with the same block sizes. Then

    [A_1        ] [B_1        ]   [A_1 B_1            ]
    [    .      ] [    .      ] = [        .          ]
    [      .    ] [      .    ]   [          .        ]
    [        A_n] [        B_n]   [            A_n B_n]

Sketch. Just try it. For a row/column passing through a particular block, the rest of the row/column outside the block is entirely full of 0s, so the blocks never interact when multiplying, and every entry in AB outside the blocks is 0.

J4. Suppose a matrix A is in Jordan Normal Form, containing some number of Jordan blocks down the diagonal. Suppose that the distinct diagonal entries of A are λ_1, ..., λ_k, with λ_i appearing with multiplicity a_i. Suppose that the number of Jordan blocks whose diagonal entries are λ_i is g_i. And suppose that the largest Jordan block whose diagonal entries are λ_i has size c_i. Then we have the following results:

(i) χ(t) = ∏_{i=1}^k (t − λ_i)^{a_i}
(ii) m(t) = ∏_{i=1}^k (t − λ_i)^{c_i} ("the power of (t − λ_i) in m(t) is the size of the largest λ_i-block")
(iii) n(A − λ_i I) = g_i ("the dimension of the λ_i-eigenspace is the number of λ_i-blocks")

Sketch.

(i) Since A is in Jordan Normal Form, it is upper-triangular, and so its eigenvalues are the diagonal entries.

(ii) By the previous point and M5, we know that m(t) = ∏_{i=1}^k (t − λ_i)^{k_i} for some k_i. First consider (A − λ_i I)^{k_i} for one particular i. By J3, this is the same as replacing each Jordan block J in A by (J − λ_i I)^{k_i}. By J2, we need k_i = c_i to kill off the largest λ_i-block, by which point any smaller λ_i-block has already become 0. So in (A − λ_i I)^{c_i}, the λ_i-blocks are now all 0. Similarly, (A − λ_j I)^{c_j} makes the λ_j-blocks all 0. So, by J3, the product (A − λ_i I)^{c_i}(A − λ_j I)^{c_j} has the λ_i- and λ_j-blocks all 0. Repeating this for each eigenvalue leaves us with the 0 matrix. At each stage, we needed the exponent to be as big as c_i to kill off the largest block.

(iii) By J1, we get one dimension of eigenspace for each Jordan block, thus n(A − λ_i I) = g_i.

3. Small Examples

Exercise. Let A be the matrix

    [λ 1      ]
    [  λ 1    ]
    [    λ    ]
    [      µ 1]
    [        µ]

Find (A − λI)², (A − λI)³, (A − µI)² and show (A − λI)³(A − µI)² = 0. Solve Ax = λx and Ax = µx.

Example. Suppose that we are told that A is a 3 × 3 matrix, and that we are given its characteristic and minimal polynomials. Then we know the Jordan Normal Form of A.

If χ(t) = (t − λ)(t − µ)(t − ν) with λ, µ, ν distinct, then by M5 we know m(t) = χ(t), so by J4(ii) we know that the JNF is just diagonal.

If χ(t) = (t − λ)(t − µ)² with λ ≠ µ, then by M5 we know m(t) is either (t − λ)(t − µ) or equals χ. Then we know whether there are two 1 × 1 µ-blocks or one 2 × 2 µ-block, respectively.

And if χ(t) = (t − λ)³, we know m(t) = (t − λ)^k where k = 1, 2 or 3. Again, knowing the size of the largest block determines the JNF exactly.
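The middle case is easy to test numerically. Here is a numpy sketch (jnf_3x3_shape is a helper name of our own) that distinguishes the two possible JNFs when χ(t) = (t − λ)(t − µ)², simply by checking whether the smaller candidate for m kills A:

```python
import numpy as np

def jnf_3x3_shape(A, lam, mu):
    """For a 3x3 matrix with chi(t) = (t-lam)(t-mu)^2, decide the JNF:
    if m(t) = (t-lam)(t-mu) kills A, the mu-part splits into two 1x1
    blocks (A is diagonalizable); otherwise m = chi and there is a
    single 2x2 mu-block."""
    I = np.eye(3)
    if np.allclose((A - lam * I) @ (A - mu * I), 0):
        return "diagonal"
    return "one 2x2 mu-block"

# Diagonalizable: JNF is diag(1, 5, 5).
print(jnf_3x3_shape(np.diag([1.0, 5.0, 5.0]), 1, 5))   # diagonal

# Not diagonalizable: JNF is diag(J_1(1), J_2(5)).
B = np.array([[1.0, 0.0, 0.0],
              [0.0, 5.0, 1.0],
              [0.0, 0.0, 5.0]])
print(jnf_3x3_shape(B, 1, 5))                          # one 2x2 mu-block
```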

Example. Knowing χ and m isn't enough if A is a 4 × 4 matrix. We can assume that the eigenvalues are all the same, otherwise we look at each eigenvalue in turn and are done by the previous example. So χ(t) = (t − λ)^4 for some λ. If the largest Jordan block is size 1, 3 or 4 then the JNF is forced. But if the largest block is size 2, we can write the rest of the matrix as either another block of size 2, or two blocks of size 1:

    [λ 1    ]      [λ 1    ]
    [  λ    ]      [  λ    ]
    [    λ 1]  or  [    λ  ]
    [      λ]      [      λ]

Note that those two 4 × 4 JNFs could have been distinguished if we had known the eigenspace dimension, since they have a different number of blocks.

Exercise. Suppose we are told that A is a 6 × 6 matrix, and that we are given its characteristic and minimal polynomials, and also the dimensions of the eigenspaces. Show that the JNF is uniquely determined. Give two 7 × 7 matrices which have the same characteristic and minimal polynomials, and the same eigenspace dimensions, but which have distinct JNFs.

4. Finding the basis for a JNF

So we know what JNFs look like and some of their properties. But how do we find the change of basis that gives us them? Here's an example. Let A be the matrix

    [ 0  1  0]
    [-4  4  0]
    [-2  1  2]

We find that χ(t) = (t − 2)³. Clearly A ≠ 2I, so m(t) is either (t − 2)² or (t − 2)³. So let's try it:

    A − 2I = [-2  1  0]
             [-4  2  0],    (A − 2I)² = 0.
             [-2  1  0]

So m(t) = (t − 2)², which tells us that the JNF of A is

    [2 1 0]
    [0 2 0]
    [0 0 2]

If we are in a basis {e_1, e_2, e_3} with respect to which A is in JNF, then the matrix tells us:

    Ae_1 = 2e_1,  Ae_2 = 2e_2 + e_1,  Ae_3 = 2e_3.

Let's rewrite those equations as:

    (A − 2I)e_1 = 0,  (A − 2I)e_2 = e_1,  (A − 2I)e_3 = 0.

So we want two vectors that are in ker(A − 2I), and a vector that gets sent to one of those. So let's find e_2 first and then define e_1 to be (A − 2I)e_2. That makes sure that our e_1 and e_2 behave appropriately.

Now ker(A − 2I) = {(x, y, z) : 2x = y}, by the matrix for A − 2I above. Let e_2 be some vector not in there, say e_2 = (1, 0, 0). Then let e_1 = (A − 2I)e_2 = (−2, −4, −2). For e_3 we can then just pick any vector of ker(A − 2I) that is linearly independent of e_1, such as e_3 = (1, 2, 3).
Let P be the change of basis matrix whose columns are e_1, e_2, e_3. Then

    P⁻¹AP = 1/8 [0 -3  2] [ 0  1  0] [-2  1  1]   [2 1 0]
                [8 -4  0] [-4  4  0] [-4  0  2] = [0 2 0]
                [0 -2  4] [-2  1  2] [-2  0  3]   [0 0 2]

If we'd wanted a bigger block, say an n × n block for the eigenvalue λ, then we could pick any non-zero e_n that isn't in ker(A − λI)^{n−1}. Then we can let e_{n−1} = (A − λI)e_n, e_{n−2} = (A − λI)e_{n−1}, ..., e_1 = (A − λI)e_2. Note that (A − λI)e_1 = 0, so e_1 is our single eigenvector that we expected by J1. These e_i then behave precisely as required for the matrix to become the Jordan block.
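The whole worked example can be checked with a few lines of numpy, a sketch in which P⁻¹ is simply computed with numpy.linalg.inv rather than by hand:

```python
import numpy as np

# The worked example: chi(t) = (t-2)^3, and we will see m(t) = (t-2)^2.
A = np.array([[ 0.0, 1.0, 0.0],
              [-4.0, 4.0, 0.0],
              [-2.0, 1.0, 2.0]])

N = A - 2 * np.eye(3)
print(np.allclose(N @ N, 0))       # True, so m(t) = (t-2)^2

# Basis as in the text: e2 outside ker(A-2I), e1 = (A-2I)e2,
# and e3 a kernel vector independent of e1.
e2 = np.array([1.0, 0.0, 0.0])
e1 = N @ e2                         # (-2, -4, -2)
e3 = np.array([1.0, 2.0, 3.0])

P = np.column_stack([e1, e2, e3])
J = np.linalg.inv(P) @ A @ P
print(np.round(J))                  # the JNF: diag(J_2(2), J_1(2))
```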

5. Uniqueness

For a given matrix A, the JNF is unique. (Well, up to reordering its Jordan blocks, but that's not an exciting change.) Let's do an example. Let A be a 12 × 12 matrix whose JNF is the following matrix J (assuming only one eigenvalue λ for now), with Jordan blocks of sizes 4, 3, 2, 2 and 1 down the diagonal:

    J = diag(J_4(λ), J_3(λ), J_2(λ), J_2(λ), J_1(λ)),

so that J − λI is block diagonal, each block having 1s just above its diagonal and 0s everywhere else.

As before, the number of eigenvectors gives the number of blocks. And indeed, J − λI has five independent eigenvectors. This can be seen by looking at the nullity of J − λI: the first column of each block is entirely 0, and the remaining columns are linearly independent, so the nullity is 5, from those five columns full of 0s.

Now square this, using our block behaviour from section 2. In (J − λI)², the diagonal of 1s in each block has moved to two places above the main diagonal, so every block of size at least 2 now begins with two columns of 0s, while the 1 × 1 block is still 0. This gives nine columns of 0s, and so the nullity of (J − λI)² equals 9. But more importantly, notice where those new columns came from. The first four blocks each provided an extra column of 0s, but the little 1 × 1 block can't do so, as it has no more columns to give. I.e., we gain a new column of 0s for each block of size at least 2.

Let's try the cube: in (J − λI)³, the 1s have moved to three places above the diagonal, and there are now eleven columns of 0s. The two new columns appeared in the two biggest blocks, because this time the 2 × 2 blocks have no more columns to give. So this time, we gain a new column of 0s for each block of size at least 3.

And the fourth power: (J − λI)⁴ = 0. This time, only the biggest block could give us a new column of 0s, because the 3 × 3 block ran out.

We see that as we go from (J − λI)^{k−1} to (J − λI)^k, the nullity increases by 1 for each block still able to provide us with a column of 0s. And since in (J − λI)^{k−1}, any block of size less than k is already entirely 0, we see that this increase is precisely the number of blocks of size at least k. In notation, we have

    n((J − λI)^k) − n((J − λI)^{k−1}) = number of λ-blocks of size at least k.

This formula holds for A as well as J. Since J is the JNF of A, there is some invertible matrix P with J = P⁻¹AP. Then J − λI = P⁻¹AP − λI = P⁻¹(A − λI)P, and more generally (J − λI)^k = P⁻¹(A − λI)^k P, and since P is invertible we have n((J − λI)^k) = n((A − λI)^k). So the number of blocks of each size in the JNF is determined by A, and hence the JNF is unique.

In this example, whatever the original A was, we'd find:

n(A − λI) = 5, so there are 5 λ-blocks in total.
n((A − λI)²) − n(A − λI) = 4, so there are 4 blocks of size at least 2, and hence 5 − 4 = 1 block of size 1.
n((A − λI)³) − n((A − λI)²) = 2, so there are 2 blocks of size at least 3, and hence 4 − 2 = 2 blocks of size 2.
n((A − λI)⁴) − n((A − λI)³) = 1, so there is 1 block of size at least 4, and hence 2 − 1 = 1 block of size 3.
n((A − λI)⁵) − n((A − λI)⁴) = 0, so there are no blocks of size 5 or more, and hence 1 − 0 = 1 block of size 4.

So the number of blocks of each size is determined, and hence so is the JNF. (This method works for matrices with more than one eigenvalue, of course: just treat each eigenvalue in turn.)
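This block-counting recipe translates directly into a short numpy sketch. The helpers jordan_block, block_diag and block_size_counts are names of our own, nullities are computed via numpy.linalg.matrix_rank, and the eigenvalue 7 is chosen arbitrarily for the 12 × 12 example with blocks of sizes 4, 3, 2, 2, 1:

```python
import numpy as np

def jordan_block(lam, k):
    """The k x k Jordan block J_k(lam)."""
    return lam * np.eye(k) + np.eye(k, k=1)

def block_diag(*blocks):
    """Assemble square blocks down the diagonal of one big matrix."""
    n = sum(b.shape[0] for b in blocks)
    M = np.zeros((n, n))
    i = 0
    for b in blocks:
        k = b.shape[0]
        M[i:i+k, i:i+k] = b
        i += k
    return M

def block_size_counts(A, lam, n):
    """Count the lam-blocks of each size in the JNF of the n x n matrix A,
    using n((A-lam I)^k) - n((A-lam I)^(k-1)) = #blocks of size >= k."""
    N = A - lam * np.eye(n)
    nullity = lambda M: n - np.linalg.matrix_rank(M)
    ge = [nullity(np.linalg.matrix_power(N, k)) for k in range(n + 1)]
    diffs = [ge[k] - ge[k - 1] for k in range(1, n + 1)]  # blocks of size >= k
    # blocks of size exactly k = (#size >= k) - (#size >= k+1)
    return {k: int(diffs[k - 1] - (diffs[k] if k < n else 0))
            for k in range(1, n + 1) if diffs[k - 1] > 0}

J = block_diag(*(jordan_block(7.0, k) for k in [4, 3, 2, 2, 1]))
print(block_size_counts(J, 7.0, 12))   # {1: 1, 2: 2, 3: 1, 4: 1}
```

Since the nullities are similarity-invariant, running block_size_counts on any matrix similar to J recovers the same block sizes, which is exactly the uniqueness argument above.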