# Chapter 8. Jordan Normal Form


## 8.1 Minimal Polynomials

Recall that $p_A(x) = \det(xI - A)$ is called the characteristic polynomial of the matrix $A$.

**Theorem 8.1.1.** Let $A \in M_n$. Then there exists a unique monic polynomial $q_A(x)$ of minimum degree for which $q_A(A) = 0$. If $p(x)$ is any polynomial such that $p(A) = 0$, then $q_A(x)$ divides $p(x)$.

**Proof.** Since there is at least one polynomial for which $p(A) = 0$ (the characteristic polynomial $p_A(x)$, by the Cayley–Hamilton theorem), there is one of minimal degree, which we can assume is monic. By the Euclidean algorithm,
$$p_A(x) = q_A(x)h(x) + r(x), \qquad \deg r(x) < \deg q_A(x).$$
Then $p_A(A) = q_A(A)h(A) + r(A)$. Hence $r(A) = 0$, and by the minimality assumption $r(x) \equiv 0$. Thus $q_A$ divides $p_A(x)$, and by the same argument any polynomial $p$ for which $p(A) = 0$. To establish that $q_A$ is unique, suppose $q(x)$ is another monic polynomial of the same degree for which $q(A) = 0$. Then $r(x) = q(x) - q_A(x)$ is a polynomial of degree less than that of $q_A(x)$ for which $r(A) = q(A) - q_A(A) = 0$. This cannot be unless $r(x) \equiv 0$, that is, $q = q_A$. $\blacksquare$

**Definition 8.1.1.** The polynomial $q_A(x)$ in the theorem above is called the **minimal polynomial** of $A$.
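The theorem can be checked numerically. The sketch below (assuming NumPy is available; the helper `poly_of_matrix` is ad hoc, evaluating a polynomial at a matrix by Horner's rule) uses the matrix $A = \begin{pmatrix}2 & 1\\ 0 & 2\end{pmatrix}$: its characteristic polynomial $(x-2)^2$ annihilates $A$, while the lower-degree candidate $(x-2)$ does not, so here the minimal polynomial coincides with the characteristic polynomial.

```python
import numpy as np

def poly_of_matrix(coeffs, A):
    """Evaluate p(A) by Horner's rule; coeffs are listed highest degree first."""
    n = A.shape[0]
    result = np.zeros_like(A, dtype=float)
    for c in coeffs:
        result = result @ A + c * np.eye(n)
    return result

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])          # a single Jordan block with eigenvalue 2
p = np.poly(A)                      # characteristic polynomial coefficients
assert np.allclose(p, [1, -4, 4])   # p_A(x) = x^2 - 4x + 4 = (x - 2)^2

# Cayley-Hamilton: p_A(A) = 0.
assert np.allclose(poly_of_matrix(p, A), 0)

# (A - 2I) != 0 but (A - 2I)^2 = 0: no monic divisor of smaller degree works,
# so (x - 2)^2 is also the minimal polynomial of A.
B = A - 2 * np.eye(2)
assert not np.allclose(B, 0)
assert np.allclose(B @ B, 0)
```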

**Corollary 8.1.1.** If $A, B \in M_n$ are similar, then they have the same minimal polynomial.

**Proof.** If $B = S^{-1}AS$, then
$$q_A(B) = q_A(S^{-1}AS) = S^{-1}q_A(A)S = 0.$$
If there were a minimal polynomial for $B$ of smaller degree, say $q_B(x)$, then $q_B(A) = S\,q_B(B)\,S^{-1} = 0$ by the same argument. This contradicts the minimality of $q_A(x)$. $\blacksquare$

Now that we have a minimal polynomial for any matrix, can we find a matrix with a given polynomial as its minimal polynomial? Can the degree of the polynomial and the size of the matrix match? The answers to both questions are affirmative and presented below in one theorem.

**Theorem 8.1.2.** For any $n$th degree monic polynomial
$$p(x) = x^n + a_{n-1}x^{n-1} + a_{n-2}x^{n-2} + \cdots + a_1 x + a_0$$
there is a matrix $A \in M_n(\mathbb{C})$ for which it is the minimal polynomial.

**Proof.** Consider the matrix given by
$$A = \begin{pmatrix}
0 & 0 & \cdots & 0 & -a_0\\
1 & 0 & \cdots & 0 & -a_1\\
0 & 1 & \cdots & 0 & -a_2\\
\vdots & & \ddots & & \vdots\\
0 & 0 & \cdots & 1 & -a_{n-1}
\end{pmatrix}.$$
Observe that
$$Ie_1 = e_1 = A^0 e_1,\qquad Ae_1 = e_2,\qquad Ae_2 = e_3 = A^2 e_1,\qquad \ldots,\qquad Ae_{n-1} = e_n = A^{n-1}e_1$$

and
$$Ae_n = -a_{n-1}e_n - a_{n-2}e_{n-1} - \cdots - a_1 e_2 - a_0 e_1.$$
Since $Ae_n = A^n e_1$, it follows that
$$p(A)e_1 = A^n e_1 + a_{n-1}A^{n-1}e_1 + a_{n-2}A^{n-2}e_1 + \cdots + a_1 Ae_1 + a_0 Ie_1 = 0.$$
Also,
$$p(A)e_k = p(A)A^{k-1}e_1 = A^{k-1}p(A)e_1 = A^{k-1}(0) = 0, \qquad k = 2, \ldots, n.$$
Hence $p(A)e_j = 0$ for $j = 1, \ldots, n$, and thus $p(A) = 0$. We know also that $p(x)$ is monic. Suppose now that
$$q(x) = x^m + b_{m-1}x^{m-1} + \cdots + b_1 x + b_0$$
where $m < n$ and $q(A) = 0$. Then
$$q(A)e_1 = A^m e_1 + b_{m-1}A^{m-1}e_1 + \cdots + b_1 Ae_1 + b_0 e_1 = e_{m+1} + b_{m-1}e_m + \cdots + b_1 e_2 + b_0 e_1.$$
But the vectors $e_{m+1}, e_m, \ldots, e_1$ are linearly independent, so this cannot vanish, and we conclude that $q(A) = 0$ is impossible. Thus $p(x)$ is minimal. $\blacksquare$

**Definition 8.1.2.** For a given monic polynomial $p(x)$, the matrix $A$ constructed above is called the **companion matrix** of $p$.

The transpose of the companion matrix can also be used to generate a linear differential system which has the same characteristic polynomial as a given $n$th order differential equation. Consider the linear differential equation
$$y^{(n)} + a_{n-1}y^{(n-1)} + \cdots + a_1 y' + a_0 y = 0.$$
This $n$th order ODE can be converted to a first order system as follows:
$$u_1 = y,\qquad u_2 = u_1' = y',\qquad u_3 = u_2' = y'',\qquad \ldots,\qquad u_n = u_{n-1}' = y^{(n-1)}.$$

Then we have
$$\begin{pmatrix} u_1\\ u_2\\ \vdots\\ u_n \end{pmatrix}' =
\begin{pmatrix}
0 & 1 & 0 & \cdots & 0\\
0 & 0 & 1 & \cdots & 0\\
\vdots & & & \ddots & \vdots\\
0 & 0 & 0 & \cdots & 1\\
-a_0 & -a_1 & -a_2 & \cdots & -a_{n-1}
\end{pmatrix}
\begin{pmatrix} u_1\\ u_2\\ \vdots\\ u_n \end{pmatrix}.$$

## 8.2 Invariant Subspaces

There seems to be no truly simple route to the Jordan normal form. The approach taken here is intended to reveal a number of features of a matrix that are interesting in their own right. In particular, we will construct generalized eigenspaces that capture the entire connection of a matrix with its eigenvalues.

We have in various ways considered subspaces $V$ of $\mathbb{C}^n$ that are invariant under the matrix $A \in M_n(\mathbb{C})$. Recall this means that $AV \subset V$. For example, eigenvectors can be used to create invariant subspaces. Null spaces, the eigenspaces of the zero eigenvalue, are invariant as well. Triangular matrices furnish an easily recognizable sequence of invariant subspaces: assuming $T \in M_n(\mathbb{C})$ is upper triangular, it is easy to see that the subspaces spanned by the coordinate vectors $\{e_1, \ldots, e_m\}$ for $m = 1, \ldots, n$ are invariant under $T$.

We now consider a specific type of invariant subspace that will lead to the so-called Jordan normal form of a matrix, the closest matrix similar to $A$ that resembles a diagonal matrix.

**Definition 8.2.1 (Generalized eigenspace).** Let $A \in M_n(\mathbb{C})$ with spectrum $\sigma(A) = \{\lambda_1, \ldots, \lambda_k\}$. Define the **generalized eigenspace** pertaining to $\lambda_i$ by
$$V_{\lambda_i} = \{x \in \mathbb{C}^n \mid (A - \lambda_i I)^n x = 0\}.$$

Observe that all the eigenvectors pertaining to $\lambda_i$ are contained in $V_{\lambda_i}$. If the span of the eigenvectors pertaining to $\lambda_i$ is not equal to $V_{\lambda_i}$, then there must be a positive power $p$ and a vector $x$ such that $(A - \lambda_i I)^p x = 0$ but $y = (A - \lambda_i I)^{p-1}x \ne 0$. Thus $y$ is an eigenvector pertaining to $\lambda_i$. For this reason we will call $V_{\lambda_i}$ the space of **generalized eigenvectors** pertaining to $\lambda_i$. Our first result, that $V_{\lambda_i}$ is invariant under $A$, is simple to prove, noting that only closure under vector addition and scalar multiplication need be established.
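The generalized eigenspace of Definition 8.2.1 can be computed numerically as the null space of $(A - \lambda I)^n$. A sketch (assuming NumPy; `nullity` is an ad-hoc helper that counts singular values below a cutoff, and the matrix `A` is an illustrative choice):

```python
import numpy as np

def nullity(M, tol=1e-10):
    """Dimension of the null space of M, counted via small singular values."""
    s = np.linalg.svd(M, compute_uv=False)
    return int(np.sum(s < tol))

# A has eigenvalue 2 with algebraic multiplicity 2 (one 2x2 Jordan block)
# and eigenvalue 3 with multiplicity 1.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
n = A.shape[0]

B = A - 2 * np.eye(n)
dim_eigenspace  = nullity(B)                             # span of ordinary eigenvectors
dim_generalized = nullity(np.linalg.matrix_power(B, n))  # V_2 = ker (A - 2I)^n

assert dim_eigenspace == 1    # geometric multiplicity: only e_1 is an eigenvector
assert dim_generalized == 2   # the generalized eigenspace is strictly larger
```

Here the generalized eigenspace is strictly larger than the span of the eigenvectors; its dimension equals the algebraic multiplicity, a fact proved in Theorem 8.2.2 below.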

**Theorem 8.2.1.** Let $A \in M_n(\mathbb{C})$ with spectrum $\sigma(A) = \{\lambda_1, \ldots, \lambda_k\}$. Then for each $i = 1, \ldots, k$, $V_{\lambda_i}$ is an invariant subspace of $A$.

One might ask whether $V_{\lambda_i}$ could be enlarged by allowing higher powers than $n$ in the definition. The negative answer is most simply expressed by invoking the Cayley–Hamilton theorem. We write the characteristic polynomial $p_A(\lambda) = \prod (\lambda - \lambda_i)^{m_A(\lambda_i)}$, where $m_A(\lambda_i)$ is the algebraic multiplicity of $\lambda_i$. Since $p_A(A) = \prod (A - \lambda_i I)^{m_A(\lambda_i)} = 0$, it is an easy matter to see that the spaces $V_{\lambda_i}$ already exhaust all of $\mathbb{C}^n$. This is to say that allowing higher powers in the definition will not increase the subspaces $V_{\lambda_i}$. Indeed, as we shall see, the power of $(\lambda - \lambda_i)$ can usually be decreased below $n$; the minimal power that suffices is identified in Theorem 8.2.3 below. For now the general power $n$ will suffice.

One very important result, and an essential first step in deriving the Jordan form, is to establish that any square matrix $A$ is similar to a block diagonal matrix, with each block carrying a single eigenvalue.

**Theorem 8.2.2.** Let $A \in M_n(\mathbb{C})$ with spectrum $\sigma(A) = \{\lambda_1, \ldots, \lambda_k\}$ and with invariant subspaces $V_{\lambda_i}$, $i = 1, 2, \ldots, k$. Then

(i) the spaces $V_{\lambda_i}$, $i = 1, \ldots, k$, are mutually linearly independent;

(ii) $\bigoplus_{i=1}^k V_{\lambda_i} = \mathbb{C}^n$ (alternatively, $\mathbb{C}^n = S(V_{\lambda_1}, \ldots, V_{\lambda_k})$);

(iii) $\dim V_{\lambda_i} = m_A(\lambda_i)$;

(iv) $A$ is similar to a block diagonal matrix with $k$ blocks $A_1, \ldots, A_k$. Moreover, $\sigma(A_i) = \{\lambda_i\}$ and $\dim A_i = m_A(\lambda_i)$.

**Proof.** (i) Suppose there is a nonzero vector $x$ in both $V_{\lambda_i}$ and $V_{\lambda_j}$ with $i \ne j$. Then for some integer $q$ it must be true that $(A - \lambda_j I)^{q-1}x \ne 0$ but $(A - \lambda_j I)^q x = 0$. This means that $y = (A - \lambda_j I)^{q-1}x$ is an eigenvector pertaining to $\lambda_j$. Since $(A - \lambda_i I)^n x = 0$ we must also have
$$(A - \lambda_j I)^{q-1}(A - \lambda_i I)^n x = (A - \lambda_i I)^n (A - \lambda_j I)^{q-1}x = (A - \lambda_i I)^n y = 0.$$
Expanding,
$$(A - \lambda_i I)^n y = \sum_{\ell=0}^{n}\binom{n}{\ell}(-\lambda_i)^{n-\ell}A^{\ell}y = \sum_{\ell=0}^{n}\binom{n}{\ell}(-\lambda_i)^{n-\ell}\lambda_j^{\ell}\,y = (\lambda_j - \lambda_i)^n y = 0.$$
This is impossible unless $\lambda_j = \lambda_i$.

(ii) The key part of the proof is to block diagonalize $A$ with respect to these invariant subspaces. To that end, let $S$

be the matrix whose columns are generated from bases of the individual $V_{\lambda_i}$, taken in the order of the indices. Supposing there are more linearly independent vectors in $\mathbb{C}^n$ than those already selected, fill out the matrix $S$ with vectors linearly independent of the subspaces $V_{\lambda_i}$, $i = 1, \ldots, k$. Now define $\tilde{A} = S^{-1}AS$. We conclude by the invariance of the subspaces and their mutual linear independence that $\tilde{A}$ has the following block structure:
$$\tilde{A} = S^{-1}AS = \begin{pmatrix}
A_1 & & & & \\
 & A_2 & & & \\
 & & \ddots & & \\
 & & & A_k & \\
 & & & & B
\end{pmatrix}.$$
It follows that
$$p_A(\lambda) = p_{\tilde{A}}(\lambda) = \Bigl(\prod_i p_{A_i}(\lambda)\Bigr)\, p_B(\lambda).$$
Any root $r$ of $p_B(\lambda)$ must be an eigenvalue of $A$, say $\lambda_j$, and there must be an eigenvector $x$ of $\tilde{A}$ pertaining to $\lambda_j$. Moreover, due to the block structure we can assume that $x = [0, \ldots, 0, \hat{x}]^T$, where there are $k$ blocked zeros of the sizes of the $A_i$ respectively. Then it is easy to see that $A(Sx) = \lambda_j (Sx)$, and this implies that $Sx \in V_{\lambda_j}$. Thus there is another vector in $V_{\lambda_j}$ beyond the basis columns already chosen, which contradicts the construction of $S$. Therefore $B$ is empty, or what is the same thing, $\bigoplus_{i=1}^k V_{\lambda_i} = \mathbb{C}^n$.

(iii) Let $d_i = \dim V_{\lambda_i}$. From (ii) we know $\sum d_i = n$. Suppose that $\lambda_i \in \sigma(A_j)$ for some $j \ne i$. Then there is an eigenvector $x$ of $\tilde{A}$ pertaining to $\lambda_i$ of the form $x = [0, \ldots, 0, \hat{x}, 0, \ldots, 0]^T$, nonzero only in the $j$th block, analogous to the argument above. By construction $Sx \notin V_{\lambda_i}$, but $A(Sx) = \lambda_i(Sx)$, and this contradicts the definition of $V_{\lambda_i}$. We thus have that $p_{A_i}(\lambda) = (\lambda - \lambda_i)^{d_i}$. Since $p_A(\lambda) = \prod p_{A_i}(\lambda) = \prod (\lambda - \lambda_i)^{m_A(\lambda_i)}$, it follows that $d_i = m_A(\lambda_i)$.

(iv) Putting (ii) and (iii) together gives the block diagonal structure as required. $\blacksquare$

On account of the mutual linear independence of the invariant subspaces $V_{\lambda_i}$ and the fact that they exhaust $\mathbb{C}^n$, the following corollary is immediate.

**Corollary 8.2.1.** Let $A \in M_n(\mathbb{C})$ with spectrum $\sigma(A) = \{\lambda_1, \ldots, \lambda_k\}$ and with generalized eigenspaces $V_{\lambda_i}$, $i = 1, 2, \ldots, k$. Then each $x \in \mathbb{C}^n$ has a unique representation $x = \sum_{i=1}^k x_i$ where $x_i \in V_{\lambda_i}$.

Another interesting result, which reveals how the matrix works as a linear transformation, is to decompose it into components with respect to the

generalized eigenspaces. In particular, viewing the block diagonal form
$$\tilde{A} = S^{-1}AS = \begin{pmatrix}
A_1 & & & \\
 & A_2 & & \\
 & & \ddots & \\
 & & & A_k
\end{pmatrix},$$
the space $\mathbb{C}^n$ can be split into a direct sum of subspaces $E_1, \ldots, E_k$ based on coordinate blocks. This is accomplished in such a way that any vector $y \in \mathbb{C}^n$ can be written uniquely as $y = \sum_{i=1}^k y_i$, where $y_i \in E_i$. (Keep in mind that each $y_i \in \mathbb{C}^n$; its coordinates are zero outside the coordinate block pertaining to $E_i$.) Then
$$\tilde{A}y = \tilde{A}\sum_{i=1}^k y_i = \sum_{i=1}^k \tilde{A}y_i = \sum_{i=1}^k A_i y_i.$$
This provides a computational tool when the block diagonal form is known. Note that the blocks correspond directly to the invariant subspaces via $SE_i = V_{\lambda_i}$.

We can use these invariant subspaces to get at the minimal polynomial. For each $i = 1, \ldots, k$ define
$$m_i = \min\{\, j \mid (A - \lambda_i I)^j x = 0 \ \text{for all}\ x \in V_{\lambda_i} \,\}.$$

**Theorem 8.2.3.** Let $A \in M_n(\mathbb{C})$ with spectrum $\sigma(A) = \{\lambda_1, \ldots, \lambda_k\}$ and with invariant subspaces $V_{\lambda_i}$, $i = 1, 2, \ldots, k$. Then the minimal polynomial of $A$ is given by
$$q(\lambda) = \prod_{i=1}^k (\lambda - \lambda_i)^{m_i}.$$

**Proof.** Certainly we see that for any vector $x \in V_{\lambda_j}$,
$$q(A)x = \Bigl(\prod_{i=1}^k (A - \lambda_i I)^{m_i}\Bigr)x = 0.$$
Hence, the minimal polynomial $q_A(\lambda)$ divides $q(\lambda)$. To see that they are in fact equal, suppose that the minimal polynomial has the form
$$q_A(\lambda) = \prod_{i=1}^k (\lambda - \lambda_i)^{\hat{m}_i}$$
where $\hat{m}_i \le m_i$ for $i = 1, \ldots, k$, and in particular $\hat{m}_j < m_j$ for some $j$. By construction there must exist a vector $x \in V_{\lambda_j}$ such that $(A - \lambda_j I)^{m_j}x = 0$ but $y =$

$(A - \lambda_j I)^{m_j - 1}x \ne 0$. Then, if $q_A(A)x = 0$,
$$0 = q_A(A)x = \Bigl(\prod_{i=1}^k (A - \lambda_i I)^{\hat{m}_i}\Bigr)x = \prod_{\substack{i=1\\ i \ne j}}^{k} (A - \lambda_i I)^{\hat{m}_i}\, z, \qquad z = (A - \lambda_j I)^{\hat{m}_j}x \ne 0.$$
This cannot be, because the contrary would imply that one of the invariant subspaces $V_{\lambda_i}$, $i \ne j$, contains a nonzero vector of $V_{\lambda_j}$, contradicting part (i) of Theorem 8.2.2. $\blacksquare$

Just one more step is needed before the Jordan normal form can be derived. For a given $V_{\lambda_i}$ we can interpret the space from a hierarchical viewpoint. We know that $V_{\lambda_i}$ contains all the eigenvectors pertaining to $\lambda_i$. Call these eigenvectors the **first order** generalized eigenvectors. If their span is not equal to $V_{\lambda_i}$, then there must be a vector $x \in V_{\lambda_i}$ for which $(A - \lambda_i I)^2 x = 0$ but $y = (A - \lambda_i I)x \ne 0$. That is to say, $y$ is an eigenvector of $A$ pertaining to $\lambda_i$. Call such vectors $x$ **second order** generalized eigenvectors. In general we call $x \in V_{\lambda_i}$ a generalized eigenvector of **order** $j$ if $(A - \lambda_i I)^j x = 0$ but $(A - \lambda_i I)^{j-1}x \ne 0$. In light of our previous discussion, $V_{\lambda_i}$ contains generalized eigenvectors of order up to but not greater than $m_i$.

**Theorem 8.2.4.** Let $A \in M_n(\mathbb{C})$ with spectrum $\sigma(A) = \{\lambda_1, \ldots, \lambda_k\}$ and with invariant subspaces $V_{\lambda_i}$, $i = 1, 2, \ldots, k$.

(i) Let $x \in V_{\lambda_i}$ be a generalized eigenvector of order $p$. Then the vectors
$$x,\ (A - \lambda_i I)x,\ (A - \lambda_i I)^2 x,\ \ldots,\ (A - \lambda_i I)^{p-1}x \tag{1}$$
are linearly independent.

(ii) The subspace of $\mathbb{C}^n$ generated by the vectors in (1) is an invariant subspace of $A$.

**Proof.** (i) To prove linear independence of a set of vectors we suppose linear dependence. That is, there is a smallest integer $k$ and constants $b_j$ such that
$$\sum_{j=0}^{k} b_j (A - \lambda_i I)^j x = 0$$

where $b_k \ne 0$. Solving, we obtain $b_k (A - \lambda_i I)^k x = -\sum_{j=0}^{k-1} b_j (A - \lambda_i I)^j x$. Now apply $(A - \lambda_i I)^{p-k}$ to both sides and obtain a new linearly dependent set, as the following calculation shows:
$$0 = b_k (A - \lambda_i I)^{p}x = -\sum_{j=0}^{k-1} b_j (A - \lambda_i I)^{j + p - k}x = -\sum_{j=p-k}^{p-1} b_{j-p+k}(A - \lambda_i I)^j x.$$
The key point to note here is that the lower limit of the sum has increased. This new linearly dependent set, which we denote by $\sum_{j=p-k}^{p-1} c_j (A - \lambda_i I)^j x = 0$, can be split in the same way as before, where we assume with no loss of generality that $c_{p-1} \ne 0$. Then
$$c_{p-1}(A - \lambda_i I)^{p-1}x = -\sum_{j=p-k}^{p-2} c_j (A - \lambda_i I)^j x.$$
Apply $(A - \lambda_i I)$ to both sides to get
$$0 = c_{p-1}(A - \lambda_i I)^{p}x = -\sum_{j=p-k}^{p-2} c_j (A - \lambda_i I)^{j+1}x = -\sum_{j=p-k+1}^{p-1} c_{j-1}(A - \lambda_i I)^j x.$$
Thus we have obtained another linearly dependent set with the lower limit of powers increased by one. Continue this process until the linear dependence of $(A - \lambda_i I)^{p-1}x$ and $(A - \lambda_i I)^{p-2}x$ is achieved:
$$c(A - \lambda_i I)^{p-1}x = d(A - \lambda_i I)^{p-2}x, \qquad\text{that is,}\qquad (A - \lambda_i I)y = \frac{d}{c}\,y$$
where $y = (A - \lambda_i I)^{p-2}x$. This implies that $\lambda_i + \frac{d}{c}$ is a new eigenvalue with eigenvector $y \in V_{\lambda_i}$, and of course this is a contradiction.

(ii) The invariance under $A$ is more straightforward. First note that, with $x_1 = x$,

$x_2 = (A - \lambda_i I)x = Ax - \lambda_i x$, so that $Ax = x_2 + \lambda_i x_1$. Consider any vector $y$ defined by $y = \sum_{j=0}^{p-1} b_j (A - \lambda_i I)^j x$. It follows that
$$Ay = A\sum_{j=0}^{p-1} b_j (A - \lambda_i I)^j x
= \sum_{j=0}^{p-1} b_j (A - \lambda_i I)^j Ax
= \sum_{j=0}^{p-1} b_j (A - \lambda_i I)^j (x_2 + \lambda_i x_1)
= \sum_{j=0}^{p-1} b_j (A - \lambda_i I)^j \bigl[(A - \lambda_i I)x_1 + \lambda_i x_1\bigr]
= \sum_{j=0}^{p-1} c_j (A - \lambda_i I)^j x,$$
where $c_0 = \lambda_i b_0$, $c_j = b_{j-1} + \lambda_i b_j$ for $j > 0$, and the term $b_{p-1}(A - \lambda_i I)^{p}x$ vanishes because $x$ has order $p$. Thus $Ay$ lies in the same subspace, which proves the result. $\blacksquare$

## 8.3 The Jordan Normal Form

We need a lemma that points in the direction we are headed, that being the use of invariant subspaces as a basis for the (Jordan) block diagonalization of any matrix. These results were discussed in detail in Section 8.2. The restatement here illustrates the invariant subspace nature of the result, irrespective of generalized eigenspaces. Its proof is elementary and is left to the reader.

**Lemma 8.3.1.** Let $A \in M_n(\mathbb{C})$ with invariant subspace $V \subset \mathbb{C}^n$.

(i) Suppose $v_1, \ldots, v_k$ is a basis for $V$ and $S$ is an invertible matrix with the first $k$ columns given by $v_1, \ldots, v_k$. Then
$$S^{-1}AS = \begin{pmatrix} B_{11} & B_{12}\\ 0 & B_{22} \end{pmatrix}$$
where $B_{11}$ is $k \times k$.

(ii) Suppose that $V_1, V_2 \subset \mathbb{C}^n$ are two invariant subspaces of $A$ and $\mathbb{C}^n = V_1 \oplus V_2$. Let the (invertible) matrix $S$ consist respectively of bases from $V_1$ and $V_2$ as its columns. Then
$$S^{-1}AS = \begin{pmatrix} B_1 & 0\\ 0 & B_2 \end{pmatrix}$$
where $B_1$ is $\dim V_1 \times \dim V_1$ and $B_2$ is $\dim V_2 \times \dim V_2$.

**Definition 8.3.1.** Let $\lambda \in \mathbb{C}$. A **Jordan block** $J_k(\lambda)$ is a $k \times k$ upper triangular matrix of the form
$$J_k(\lambda) = \begin{pmatrix}
\lambda & 1 & & \\
 & \lambda & \ddots & \\
 & & \ddots & 1\\
 & & & \lambda
\end{pmatrix}.$$
A **Jordan matrix** is any matrix of the form
$$J = \begin{pmatrix}
J_{n_1}(\lambda_1) & & \\
 & \ddots & \\
 & & J_{n_k}(\lambda_k)
\end{pmatrix}$$
where the matrices $J_{n_i}(\lambda_i)$ are Jordan blocks. If $J \in M_n(\mathbb{C})$, then $n_1 + n_2 + \cdots + n_k = n$.

**Theorem 8.3.1 (Jordan normal form).** Let $A \in M_n(\mathbb{C})$. Then there is a nonsingular matrix $S \in M_n$ such that
$$A = S\begin{pmatrix}
J_{n_1}(\lambda_1) & & \\
 & \ddots & \\
 & & J_{n_k}(\lambda_k)
\end{pmatrix}S^{-1} = SJS^{-1}$$
where $J_{n_i}(\lambda_i)$ is a Jordan block and $n_1 + n_2 + \cdots + n_k = n$. $J$ is unique up to permutations of the blocks. The eigenvalues $\lambda_1, \ldots, \lambda_k$ are not necessarily distinct. If $A$ is real with real eigenvalues, then $S$ can be taken to be real.

**Proof.** This result is proved in four steps.

(1) *Block diagonalize (by similarity) into invariant subspaces pertaining to $\sigma(A)$.* This is accomplished as follows. First block diagonalize the matrix according to the generalized eigenspaces $V_{\lambda_i} = \{x \in \mathbb{C}^n \mid (A - \lambda_i I)^n x = 0\}$ as discussed in the previous section. Beginning with a generalized eigenvector $x \in V_{\lambda_i}$ of highest order, construct the invariant subspace as in (1) of Section 8.2. Repeat this process until all generalized eigenvectors have been included in an invariant subspace. This includes of course first order eigenvectors that are not associated with higher order generalized eigenvectors; these invariant subspaces have dimension one. Each of these invariant subspaces is linearly independent from the others. Continue this process for all the generalized eigenspaces. This exhausts $\mathbb{C}^n$. Each of the blocks contains exactly one eigenvector. The dimensions of these invariant subspaces can range from one to $m_i$, there being at least one subspace of dimension $m_i$.

(2) *Triangularize each block by Schur's theorem*, so that each block has the form
$$K(\lambda) = \begin{pmatrix}
\lambda & & *\\
 & \ddots & \\
 & & \lambda
\end{pmatrix}.$$
You will note that $K(\lambda) = \lambda I + N$ where $N$ is nilpotent, or $K(\lambda)$ is $1 \times 1$.

(3) *Jordanize each triangular block.* Assume that $K_1(\lambda)$ is $m \times m$, where $m > 1$. By construction $K_1(\lambda)$ pertains to an invariant subspace for which there is a vector $x$ with $N^{m-1}x \ne 0$ and $N^m x = 0$. Thus $N^{m-1}x$ is an eigenvector of $K_1(\lambda)$, the unique eigenvector up to scalar multiples. Define
$$y_i = N^{i-1}x, \qquad i = 1, 2, \ldots, m.$$
The set $\{y_i\}_{i=1}^m$ is a basis of $\mathbb{C}^m$. Define
$$S_1 = \begin{bmatrix} y_m & y_{m-1} & \cdots & y_1 \end{bmatrix}.$$
Then
$$NS_1 = \begin{bmatrix} 0 & y_m & y_{m-1} & \cdots & y_2 \end{bmatrix}.$$

So
$$S_1^{-1}NS_1 = \begin{pmatrix}
0 & 1 & & \\
 & 0 & \ddots & \\
 & & \ddots & 1\\
 & & & 0
\end{pmatrix}.$$
We conclude that
$$S_1^{-1}K_1(\lambda)S_1 = \begin{pmatrix}
\lambda & 1 & & \\
 & \lambda & \ddots & \\
 & & \ddots & 1\\
 & & & \lambda
\end{pmatrix}.$$

(4) *Assemble all of the blocks to form the Jordan form.* For example, the block $K_1(\lambda)$ and the corresponding similarity transformation $S_1$ studied above can be treated in the assembly process as follows. Define the $n \times n$ matrix
$$\hat{S}_1 = \begin{pmatrix}
I & 0 & 0\\
0 & S_1 & 0\\
0 & 0 & I
\end{pmatrix}$$
where $S_1$ is the block constructed above, placed in the $n \times n$ matrix in the position from which $K_1(\lambda)$ was extracted in the block triangular form of $A$. Repeat this for each of the blocks pertaining to minimal invariant subspaces. This gives a sequence of block diagonal matrices $\hat{S}_1, \hat{S}_2, \ldots, \hat{S}_k$. Define $T = \hat{S}_1\hat{S}_2\cdots\hat{S}_k$; it is block diagonal, with the blocks $S_1, S_2, \ldots, S_k$ on its diagonal. Together with the original matrix $P$ that transformed the matrix to the minimal-invariant-subspace blocked form and the unitary matrix

$V$ used to triangularize $A$, it follows that
$$A = PVT\begin{pmatrix}
J_{n_1}(\lambda_1) & & \\
 & \ddots & \\
 & & J_{n_k}(\lambda_k)
\end{pmatrix}(PVT)^{-1} = SJS^{-1}$$
with $S = PVT$. $\blacksquare$

**Example 8.3.1.** Let
$$J = \begin{pmatrix}
2 & 1 & & & & & \\
 & 2 & & & & & \\
 & & 2 & & & & \\
 & & & 3 & 1 & & \\
 & & & & 3 & 1 & \\
 & & & & & 3 & \\
 & & & & & & -1
\end{pmatrix}.$$
In this example, there are four blocks, with two of the blocks pertaining to the single eigenvalue 2. For the first block there is the single eigenvector $e_1$, but the invariant subspace is $S(e_1, e_2)$. For the second block, the eigenvector $e_3$ generates a one dimensional invariant subspace. The block pertaining to the eigenvalue 3 has the single eigenvector $e_4$, while the minimal invariant subspace is $S(e_4, e_5, e_6)$. Finally, the one dimensional subspace pertaining to the eigenvalue $-1$ is spanned by $e_7$. The minimal polynomial is
$$q(\lambda) = (\lambda - 2)^2(\lambda - 3)^3(\lambda + 1).$$

## 8.4 Convergent Matrices

Using the Jordan normal form, the study of convergent matrices becomes relatively straightforward.

**Theorem 8.4.1.** If $A \in M_n$ and $\rho(A) < 1$, then $\lim_{k\to\infty} A^k = 0$.

**Proof.** We may assume $A$ is a Jordan matrix, since if $A = SJS^{-1}$ then $A^m = SJ^mS^{-1}$. Each Jordan block $J_{n_k}(\lambda)$ can be written as $J_{n_k}(\lambda) = \lambda I_{n_k} + N_k$, where
$$N_k = \begin{pmatrix}
0 & 1 & & \\
 & 0 & \ddots & \\
 & & \ddots & 1\\
 & & & 0
\end{pmatrix}$$
is nilpotent. Now
$$A = \begin{pmatrix}
J_{n_1}(\lambda_1) & & \\
 & \ddots & \\
 & & J_{n_k}(\lambda_k)
\end{pmatrix}.$$
We compute, for $m > n_k$,
$$\bigl(J_{n_k}(\lambda_k)\bigr)^m = (\lambda I + N)^m = \lambda^m I + \sum_{j=1}^{n_k - 1}\binom{m}{j}\lambda^{m-j}N^j,$$
because $N^j = 0$ for $j \ge n_k$. We have
$$\binom{m}{j}\lambda^{m-j} \to 0 \quad\text{as } m \to \infty,$$
since $|\lambda| < 1$: the binomial coefficient grows only polynomially in $m$ while $\lambda^{m-j}$ decays geometrically. The result follows. $\blacksquare$

## 8.5 Exercises

1. Prove Theorem 8.2.1.

2. Find a $3 \times 3$ matrix whose eigenvalues are the squares of the roots of the equation $\lambda^3 - 3\lambda^2 + 4\lambda - 5 = 0$.

3. Suppose that $A$ is a square matrix with $\sigma(A) = \{3\}$, $m_a(3) = 6$, and $m_g(3) = 3$. Up to permutations of the blocks, show all possible Jordan normal forms for $A$.

4. Let $A \in M_n(\mathbb{C})$ and let $x_1 \in \mathbb{C}^n$. Define $x_{i+1} = Ax_i$ for $i = 1, \ldots, n-1$. Show that $V = S(\{x_1, \ldots, x_n\})$ is an invariant subspace of $A$. Show that $V$ contains an eigenvector of $A$.

5. Referring to the previous problem, let $A \in M_n(\mathbb{R})$ be a permutation matrix. (i) Find starting vectors so that $\dim V = n$. (ii) Find starting vectors so that $\dim V = 1$. (iii) Show that if $\lambda = 1$ is a simple eigenvalue of $A$, then $\dim V = 1$ or $\dim V = n$.
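Jordan structure such as that asked for in Exercise 3 can also be explored computationally. The following sketch (assuming SymPy is available; the matrices `J_true` and `S` are illustrative choices, not from the text) hides a known Jordan matrix behind a similarity, recovers it with `Matrix.jordan_form()`, and checks the minimal polynomial predicted by Theorem 8.2.3:

```python
from sympy import Matrix, eye

# Build A similar to diag(J_2(2), J_1(2), J_1(3)); S is an arbitrary
# invertible (unit lower triangular) change of basis.
J_true = Matrix([[2, 1, 0, 0],
                 [0, 2, 0, 0],
                 [0, 0, 2, 0],
                 [0, 0, 0, 3]])
S = Matrix([[1, 0, 0, 0],
            [1, 1, 0, 0],
            [0, 1, 1, 0],
            [0, 0, 1, 1]])
A = S * J_true * S.inv()

# jordan_form returns P and J with A = P * J * P**(-1); arithmetic is exact.
P, J = A.jordan_form()
assert P * J * P.inv() == A

# The largest block for lambda = 2 has size 2, so by Theorem 8.2.3 the
# minimal polynomial is (x - 2)^2 (x - 3): degree 3 rather than 4.
I = eye(4)
assert (A - 2*I)**2 * (A - 3*I) == Matrix.zeros(4, 4)
assert (A - 2*I) * (A - 3*I) != Matrix.zeros(4, 4)
```

Because SymPy works in exact rational arithmetic here, the similarity and annihilation checks hold exactly rather than up to rounding.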

# Chapter 9. Hermitian and Symmetric Matrices

**Example 9.0.1.** Let $f : D \to \mathbb{R}$, $D \subset \mathbb{R}^n$. The **Hessian** is defined by
$$H(x) = \bigl[h_{ij}(x)\bigr] = \left[\frac{\partial^2 f}{\partial x_i \partial x_j}\right] \in M_n.$$
Since for functions $f \in C^2$ it is known that
$$\frac{\partial^2 f}{\partial x_i \partial x_j} = \frac{\partial^2 f}{\partial x_j \partial x_i},$$
it follows that $H(x)$ is symmetric.

**Definition 9.0.1.** A function $f : \mathbb{R} \to \mathbb{R}$ is **convex** if
$$f(\lambda x + (1 - \lambda)y) \le \lambda f(x) + (1 - \lambda)f(y)$$
for $x, y \in D$ (the domain) and $0 \le \lambda \le 1$.

**Proposition 9.0.1.** If $f \in C^2(D)$ and $f''(x) \ge 0$ on $D$, then $f(x)$ is convex.

**Proof.** Because $f'' \ge 0$, $f'(x)$ is increasing. Therefore if $x < x_m < y$ we must have
$$f(x_m) \le f(x) + f'(x_m)(x_m - x)$$
and
$$f(x_m) \le f(y) + f'(x_m)(x_m - y).$$


### JORDAN AND RATIONAL CANONICAL FORMS

JORDAN AND RATIONAL CANONICAL FORMS MATH 551 Throughout this note, let V be a n-dimensional vector space over a field k, and let φ: V V be a linear map Let B = {e 1,, e n } be a basis for V, and let A

### Homework For each of the following matrices, find the minimal polynomial and determine whether the matrix is diagonalizable.

Math 5327 Fall 2018 Homework 7 1. For each of the following matrices, find the minimal polynomial and determine whether the matrix is diagonalizable. 3 1 0 (a) A = 1 2 0 1 1 0 x 3 1 0 Solution: 1 x 2 0

### 4. Linear transformations as a vector space 17

4 Linear transformations as a vector space 17 d) 1 2 0 0 1 2 0 0 1 0 0 0 1 2 3 4 32 Let a linear transformation in R 2 be the reflection in the line = x 2 Find its matrix 33 For each linear transformation

### LECTURE VII: THE JORDAN CANONICAL FORM MAT FALL 2006 PRINCETON UNIVERSITY. [See also Appendix B in the book]

LECTURE VII: THE JORDAN CANONICAL FORM MAT 204 - FALL 2006 PRINCETON UNIVERSITY ALFONSO SORRENTINO [See also Appendix B in the book] 1 Introduction In Lecture IV we have introduced the concept of eigenvalue

### LINEAR ALGEBRA BOOT CAMP WEEK 4: THE SPECTRAL THEOREM

LINEAR ALGEBRA BOOT CAMP WEEK 4: THE SPECTRAL THEOREM Unless otherwise stated, all vector spaces in this worksheet are finite dimensional and the scalar field F is R or C. Definition 1. A linear operator

### 2 Eigenvectors and Eigenvalues in abstract spaces.

MA322 Sathaye Notes on Eigenvalues Spring 27 Introduction In these notes, we start with the definition of eigenvectors in abstract vector spaces and follow with the more common definition of eigenvectors

### BASIC ALGORITHMS IN LINEAR ALGEBRA. Matrices and Applications of Gaussian Elimination. A 2 x. A T m x. A 1 x A T 1. A m x

BASIC ALGORITHMS IN LINEAR ALGEBRA STEVEN DALE CUTKOSKY Matrices and Applications of Gaussian Elimination Systems of Equations Suppose that A is an n n matrix with coefficents in a field F, and x = (x,,

### Remark By definition, an eigenvector must be a nonzero vector, but eigenvalue could be zero.

Sec 6 Eigenvalues and Eigenvectors Definition An eigenvector of an n n matrix A is a nonzero vector x such that A x λ x for some scalar λ A scalar λ is called an eigenvalue of A if there is a nontrivial

### 1 Last time: least-squares problems

MATH Linear algebra (Fall 07) Lecture Last time: least-squares problems Definition. If A is an m n matrix and b R m, then a least-squares solution to the linear system Ax = b is a vector x R n such that

### Econ Slides from Lecture 7

Econ 205 Sobel Econ 205 - Slides from Lecture 7 Joel Sobel August 31, 2010 Linear Algebra: Main Theory A linear combination of a collection of vectors {x 1,..., x k } is a vector of the form k λ ix i for

### October 4, 2017 EIGENVALUES AND EIGENVECTORS. APPLICATIONS

October 4, 207 EIGENVALUES AND EIGENVECTORS. APPLICATIONS RODICA D. COSTIN Contents 4. Eigenvalues and Eigenvectors 3 4.. Motivation 3 4.2. Diagonal matrices 3 4.3. Example: solving linear differential

### Elementary linear algebra

Chapter 1 Elementary linear algebra 1.1 Vector spaces Vector spaces owe their importance to the fact that so many models arising in the solutions of specific problems turn out to be vector spaces. The

### First of all, the notion of linearity does not depend on which coordinates are used. Recall that a map T : R n R m is linear if

5 Matrices in Different Coordinates In this section we discuss finding matrices of linear maps in different coordinates Earlier in the class was the matrix that multiplied by x to give ( x) in standard

### JORDAN NORMAL FORM. Contents Introduction 1 Jordan Normal Form 1 Conclusion 5 References 5

JORDAN NORMAL FORM KATAYUN KAMDIN Abstract. This paper outlines a proof of the Jordan Normal Form Theorem. First we show that a complex, finite dimensional vector space can be decomposed into a direct

### The Cayley-Hamilton Theorem and the Jordan Decomposition

LECTURE 19 The Cayley-Hamilton Theorem and the Jordan Decomposition Let me begin by summarizing the main results of the last lecture Suppose T is a endomorphism of a vector space V Then T has a minimal

### The Jordan canonical form

The Jordan canonical form Francisco Javier Sayas University of Delaware November 22, 213 The contents of these notes have been translated and slightly modified from a previous version in Spanish. Part

### Eigenvectors. Prop-Defn

Eigenvectors Aim lecture: The simplest T -invariant subspaces are 1-dim & these give rise to the theory of eigenvectors. To compute these we introduce the similarity invariant, the characteristic polynomial.

### Linear Algebra II Lecture 22

Linear Algebra II Lecture 22 Xi Chen University of Alberta March 4, 24 Outline Characteristic Polynomial, Eigenvalue, Eigenvector and Eigenvalue, Eigenvector and Let T : V V be a linear endomorphism. We

### A proof of the Jordan normal form theorem

A proof of the Jordan normal form theorem Jordan normal form theorem states that any matrix is similar to a blockdiagonal matrix with Jordan blocks on the diagonal. To prove it, we first reformulate it

### Q N id β. 2. Let I and J be ideals in a commutative ring A. Give a simple description of

Additional Problems 1. Let A be a commutative ring and let 0 M α N β P 0 be a short exact sequence of A-modules. Let Q be an A-module. i) Show that the naturally induced sequence is exact, but that 0 Hom(P,

### Diagonalization of Matrix

of Matrix King Saud University August 29, 2018 of Matrix Table of contents 1 2 of Matrix Definition If A M n (R) and λ R. We say that λ is an eigenvalue of the matrix A if there is X R n \ {0} such that

### LINEAR ALGEBRA BOOT CAMP WEEK 1: THE BASICS

LINEAR ALGEBRA BOOT CAMP WEEK 1: THE BASICS Unless otherwise stated, all vector spaces in this worksheet are finite dimensional and the scalar field F has characteristic zero. The following are facts (in

### Eigenvalues and Eigenvectors

Eigenvalues and Eigenvectors week -2 Fall 26 Eigenvalues and eigenvectors The most simple linear transformation from R n to R n may be the transformation of the form: T (x,,, x n ) (λ x, λ 2,, λ n x n

### September 26, 2017 EIGENVALUES AND EIGENVECTORS. APPLICATIONS

September 26, 207 EIGENVALUES AND EIGENVECTORS. APPLICATIONS RODICA D. COSTIN Contents 4. Eigenvalues and Eigenvectors 3 4.. Motivation 3 4.2. Diagonal matrices 3 4.3. Example: solving linear differential

### Lecture 22: Jordan canonical form of upper-triangular matrices (1)

Lecture 22: Jordan canonical form of upper-triangular matrices (1) Travis Schedler Tue, Dec 6, 2011 (version: Tue, Dec 6, 1:00 PM) Goals (2) Definition, existence, and uniqueness of Jordan canonical form

### Further linear algebra. Chapter IV. Jordan normal form.

Further linear algebra. Chapter IV. Jordan normal form. Andrei Yafaev In what follows V is a vector space of dimension n and B is a basis of V. In this chapter we are concerned with linear maps T : V V.

### Infinite-Dimensional Triangularization

Infinite-Dimensional Triangularization Zachary Mesyan March 11, 2018 Abstract The goal of this paper is to generalize the theory of triangularizing matrices to linear transformations of an arbitrary vector

### a 11 a 12 a 11 a 12 a 13 a 21 a 22 a 23 . a 31 a 32 a 33 a 12 a 21 a 23 a 31 a = = = = 12

24 8 Matrices Determinant of 2 2 matrix Given a 2 2 matrix [ ] a a A = 2 a 2 a 22 the real number a a 22 a 2 a 2 is determinant and denoted by det(a) = a a 2 a 2 a 22 Example 8 Find determinant of 2 2

### LINEAR ALGEBRA MICHAEL PENKAVA

LINEAR ALGEBRA MICHAEL PENKAVA 1. Linear Maps Definition 1.1. If V and W are vector spaces over the same field K, then a map λ : V W is called a linear map if it satisfies the two conditions below: (1)

### Lecture Summaries for Linear Algebra M51A

These lecture summaries may also be viewed online by clicking the L icon at the top right of any lecture screen. Lecture Summaries for Linear Algebra M51A refers to the section in the textbook. Lecture

### OHSx XM511 Linear Algebra: Solutions to Online True/False Exercises

This document gives the solutions to all of the online exercises for OHSx XM511. The section ( ) numbers refer to the textbook. TYPE I are True/False. Answers are in square brackets [. Lecture 02 ( 1.1)

### Contents. Preface for the Instructor. Preface for the Student. xvii. Acknowledgments. 1 Vector Spaces 1 1.A R n and C n 2

Contents Preface for the Instructor xi Preface for the Student xv Acknowledgments xvii 1 Vector Spaces 1 1.A R n and C n 2 Complex Numbers 2 Lists 5 F n 6 Digression on Fields 10 Exercises 1.A 11 1.B Definition

### MATH 205 HOMEWORK #3 OFFICIAL SOLUTION. Problem 1: Find all eigenvalues and eigenvectors of the following linear transformations. (a) F = R, V = R 3,

MATH 205 HOMEWORK #3 OFFICIAL SOLUTION Problem 1: Find all eigenvalues and eigenvectors of the following linear transformations. a F = R, V = R 3, b F = R or C, V = F 2, T = T = 9 4 4 8 3 4 16 8 7 0 1

### Eigenvalues and Eigenvectors

/88 Chia-Ping Chen Department of Computer Science and Engineering National Sun Yat-sen University Linear Algebra Eigenvalue Problem /88 Eigenvalue Equation By definition, the eigenvalue equation for matrix

### A linear algebra proof of the fundamental theorem of algebra

A linear algebra proof of the fundamental theorem of algebra Andrés E. Caicedo May 18, 2010 Abstract We present a recent proof due to Harm Derksen, that any linear operator in a complex finite dimensional

### A linear algebra proof of the fundamental theorem of algebra

A linear algebra proof of the fundamental theorem of algebra Andrés E. Caicedo May 18, 2010 Abstract We present a recent proof due to Harm Derksen, that any linear operator in a complex finite dimensional

### Solution. That ϕ W is a linear map W W follows from the definition of subspace. The map ϕ is ϕ(v + W ) = ϕ(v) + W, which is well-defined since

MAS 5312 Section 2779 Introduction to Algebra 2 Solutions to Selected Problems, Chapters 11 13 11.2.9 Given a linear ϕ : V V such that ϕ(w ) W, show ϕ induces linear ϕ W : W W and ϕ : V/W V/W : Solution.

### MATH 115A: SAMPLE FINAL SOLUTIONS

MATH A: SAMPLE FINAL SOLUTIONS JOE HUGHES. Let V be the set of all functions f : R R such that f( x) = f(x) for all x R. Show that V is a vector space over R under the usual addition and scalar multiplication

### Chap 3. Linear Algebra

Chap 3. Linear Algebra Outlines 1. Introduction 2. Basis, Representation, and Orthonormalization 3. Linear Algebraic Equations 4. Similarity Transformation 5. Diagonal Form and Jordan Form 6. Functions

### CSL361 Problem set 4: Basic linear algebra

CSL361 Problem set 4: Basic linear algebra February 21, 2017 [Note:] If the numerical matrix computations turn out to be tedious, you may use the function rref in Matlab. 1 Row-reduced echelon matrices

### Linear Algebra: Matrix Eigenvalue Problems

CHAPTER8 Linear Algebra: Matrix Eigenvalue Problems Chapter 8 p1 A matrix eigenvalue problem considers the vector equation (1) Ax = λx. 8.0 Linear Algebra: Matrix Eigenvalue Problems Here A is a given

### Eigenvalues and Eigenvectors

LECTURE 3 Eigenvalues and Eigenvectors Definition 3.. Let A be an n n matrix. The eigenvalue-eigenvector problem for A is the problem of finding numbers λ and vectors v R 3 such that Av = λv. If λ, v are

### Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 2015

Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 205. If A is a 3 3 triangular matrix, explain why det(a) is equal to the product of entries on the diagonal. If A is a lower triangular or diagonal

### Generalized eigenspaces

Generalized eigenspaces November 30, 2012 Contents 1 Introduction 1 2 Polynomials 2 3 Calculating the characteristic polynomial 5 4 Projections 7 5 Generalized eigenvalues 10 6 Eigenpolynomials 15 1 Introduction