Jordan Normal Form. Chapter 8.1 Minimal Polynomials


Chapter 8 Jordan Normal Form

8.1 Minimal Polynomials

Recall that $p_A(x) = \det(xI - A)$ is called the characteristic polynomial of the matrix $A$.

Theorem 8.1.1. Let $A \in M_n$. Then there exists a unique monic polynomial $q_A(x)$ of minimum degree for which $q_A(A) = 0$. If $p(x)$ is any polynomial such that $p(A) = 0$, then $q_A(x)$ divides $p(x)$.

Proof. Since there is a polynomial, namely $p_A(x)$, for which $p_A(A) = 0$, there is one of minimal degree, which we can assume is monic. By the Euclidean algorithm,
$$p_A(x) = q_A(x)h(x) + r(x), \qquad \deg r(x) < \deg q_A(x).$$
We know $p_A(A) = q_A(A)h(A) + r(A)$. Hence $r(A) = 0$, and by the minimality assumption $r(x) \equiv 0$. Thus $q_A(x)$ divides $p_A(x)$, and by the same argument it divides any polynomial $p(x)$ for which $p(A) = 0$. To establish that $q_A$ is unique, suppose $q(x)$ is another monic polynomial of the same degree for which $q(A) = 0$. Then $r(x) = q(x) - q_A(x)$ is a polynomial of degree less than that of $q_A(x)$ for which $r(A) = q(A) - q_A(A) = 0$. This cannot be unless $r(x) \equiv 0$, that is, $q = q_A$. $\square$

Definition 8.1.1. The polynomial $q_A(x)$ in the theorem above is called the minimal polynomial of $A$.
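The theorem suggests a direct computation: the minimal polynomial appears at the first power of $A$ that is a linear combination of the lower powers. A small numerical sketch, assuming numpy is available (the helper `minimal_polynomial` is ours, not from the text):

```python
import numpy as np

def minimal_polynomial(A, tol=1e-9):
    """Return monic coefficients [1, c_{d-1}, ..., c_0] of the minimal
    polynomial of A, found as the first power A^d that is a linear
    combination of I, A, ..., A^(d-1)."""
    n = A.shape[0]
    powers = [np.eye(n)]
    for d in range(1, n + 1):
        powers.append(powers[-1] @ A)
        # Try to write A^d as a combination of the lower powers.
        M = np.column_stack([P.ravel() for P in powers[:-1]])
        b = powers[-1].ravel()
        c, *_ = np.linalg.lstsq(M, b, rcond=None)
        if np.linalg.norm(M @ c - b) < tol:
            # A^d - c_{d-1} A^{d-1} - ... - c_0 I = 0
            return np.concatenate(([1.0], -c[::-1]))
    raise RuntimeError("unreachable: the degree is at most n")

# For this 3x3 matrix the characteristic polynomial is (x - 2)^3,
# but the minimal polynomial is (x - 2)^2 = x^2 - 4x + 4.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 2.0]])
q = minimal_polynomial(A)   # [1, -4, 4]
```

The example illustrates that the minimal polynomial can have strictly smaller degree than the characteristic polynomial.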

Corollary 8.1.1. If $A, B \in M_n$ are similar, then they have the same minimal polynomial.

Proof. Write $B = S^{-1}AS$. Then
$$q_A(B) = q_A(S^{-1}AS) = S^{-1}q_A(A)S = 0.$$
If there were a minimal polynomial for $B$ of smaller degree, say $q_B(x)$, then $q_B(A) = 0$ by the same argument. This contradicts the minimality of $q_A(x)$. $\square$

Now that we have a minimal polynomial for any matrix, can we find a matrix with a given polynomial as its minimal polynomial? Can the degree of the polynomial and the size of the matrix match? The answers to both questions are affirmative and are presented below in one theorem.

Theorem 8.1.2. For any $n$th degree monic polynomial
$$p(x) = x^n + a_{n-1}x^{n-1} + a_{n-2}x^{n-2} + \cdots + a_1x + a_0$$
there is a matrix $A \in M_n(\mathbb{C})$ for which it is the minimal polynomial.

Proof. Consider the matrix given by
$$A = \begin{pmatrix}
0 & 0 & \cdots & 0 & -a_0 \\
1 & 0 & \cdots & 0 & -a_1 \\
0 & 1 & \cdots & 0 & -a_2 \\
\vdots & & \ddots & & \vdots \\
0 & 0 & \cdots & 1 & -a_{n-1}
\end{pmatrix}.$$
Observe that
$$Ie_1 = e_1 = A^0e_1, \quad Ae_1 = e_2 = A^1e_1, \quad Ae_2 = e_3 = A^2e_1, \quad \ldots, \quad Ae_{n-1} = e_n = A^{n-1}e_1$$

and
$$Ae_n = -a_{n-1}e_n - a_{n-2}e_{n-1} - \cdots - a_1e_2 - a_0e_1.$$
Since $Ae_n = A \cdot A^{n-1}e_1 = A^ne_1$, it follows that
$$p(A)e_1 = A^ne_1 + a_{n-1}A^{n-1}e_1 + a_{n-2}A^{n-2}e_1 + \cdots + a_1Ae_1 + a_0Ie_1 = 0.$$
Also
$$p(A)e_k = p(A)A^{k-1}e_1 = A^{k-1}p(A)e_1 = A^{k-1}(0) = 0, \qquad k = 2, \ldots, n.$$
Hence $p(A)e_j = 0$ for $j = 1, \ldots, n$, and thus $p(A) = 0$. We know also that $p(x)$ is monic. Suppose now that
$$q(x) = x^m + b_{m-1}x^{m-1} + \cdots + b_1x + b_0$$
where $m < n$ and $q(A) = 0$. Then
$$q(A)e_1 = A^me_1 + b_{m-1}A^{m-1}e_1 + \cdots + b_1Ae_1 + b_0e_1 = e_{m+1} + b_{m-1}e_m + \cdots + b_1e_2 + b_0e_1 = 0.$$
But the vectors $e_{m+1}, \ldots, e_1$ are linearly independent, from which we conclude that $q(A) = 0$ is impossible. Thus $p(x)$ is minimal. $\square$

Definition 8.1.2. For a given monic polynomial $p(x)$, the matrix $A$ constructed above is called the companion matrix of $p$.

The transpose of the companion matrix can also be used to generate a linear differential system which has the same characteristic polynomial as a given $n$th order differential equation. Consider the linear differential equation
$$y^{(n)} + a_{n-1}y^{(n-1)} + \cdots + a_1y' + a_0y = 0.$$
This $n$th order ODE can be converted to a first order system as follows:
$$u_1 = y, \quad u_2 = u_1' = y', \quad u_3 = u_2' = y'', \quad \ldots, \quad u_n = u_{n-1}' = y^{(n-1)}.$$
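As a numerical aside (not from the text), the companion matrix construction in the proof of Theorem 8.1.2 can be checked with numpy, whose `poly` routine returns the coefficients of the characteristic polynomial of a matrix; the helper `companion` below is our own, mirroring the construction in the proof:

```python
import numpy as np

def companion(coeffs):
    """Companion matrix of the monic polynomial
    x^n + coeffs[0]*x^(n-1) + ... + coeffs[-1],
    with ones on the subdiagonal and -a_i in the last column."""
    a = np.asarray(coeffs, dtype=float)   # [a_{n-1}, ..., a_1, a_0]
    n = len(a)
    A = np.zeros((n, n))
    A[1:, :-1] = np.eye(n - 1)            # subdiagonal of ones
    A[:, -1] = -a[::-1]                   # last column: -a_0, ..., -a_{n-1}
    return A

# p(x) = x^3 + 2x^2 + 3x + 4
A = companion([2.0, 3.0, 4.0])
# characteristic polynomial coefficients, highest degree first:
# approximately [1, 2, 3, 4]
p = np.poly(A)
```

Since the minimal polynomial of a companion matrix has full degree $n$, it coincides with the characteristic polynomial recovered here.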

Then we have
$$\begin{pmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{pmatrix}' =
\begin{pmatrix}
0 & 1 & & & \\
& 0 & 1 & & \\
& & \ddots & \ddots & \\
& & & 0 & 1 \\
-a_0 & -a_1 & \cdots & & -a_{n-1}
\end{pmatrix}
\begin{pmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{pmatrix}.$$

8.2 Invariant subspaces

There seems to be no truly simple route to the Jordan normal form. The approach taken here is intended to reveal a number of features of a matrix that are interesting in their own right. In particular, we will construct generalized eigenspaces that capture the entire connection of a matrix with its eigenvalues.

We have in various ways considered subspaces $V$ of $\mathbb{C}^n$ that are invariant under the matrix $A \in M_n(\mathbb{C})$. Recall this means that $AV \subset V$. For example, eigenvectors can be used to create invariant subspaces. Null spaces, the eigenspaces of the zero eigenvalue, are invariant as well. Triangular matrices furnish an easily recognizable sequence of invariant subspaces: assuming $T \in M_n(\mathbb{C})$ is upper triangular, it is easy to see that the subspaces generated by the coordinate vectors $\{e_1, \ldots, e_m\}$ for $m = 1, \ldots, n$ are invariant under $T$.

We now consider a specific type of invariant subspace that will lead to the so-called Jordan normal form of a matrix, the closest matrix similar to $A$ that resembles a diagonal matrix.

Definition 8.2.1 (Generalized eigenspace). Let $A \in M_n(\mathbb{C})$ with spectrum $\sigma(A) = \{\lambda_1, \ldots, \lambda_k\}$. Define the generalized eigenspace pertaining to $\lambda_i$ by
$$V_{\lambda_i} = \{x \in \mathbb{C}^n \mid (A - \lambda_iI)^nx = 0\}.$$

Observe that all the eigenvectors pertaining to $\lambda_i$ are contained in $V_{\lambda_i}$. If the span of the eigenvectors pertaining to $\lambda_i$ is not equal to $V_{\lambda_i}$, then there must be a positive power $p$ and a vector $x$ such that $(A - \lambda_iI)^px = 0$ but $y = (A - \lambda_iI)^{p-1}x \neq 0$. Thus $y$ is an eigenvector pertaining to $\lambda_i$. For this reason we will call $V_{\lambda_i}$ the space of generalized eigenvectors pertaining to $\lambda_i$. Our first result, that $V_{\lambda_i}$ is invariant under $A$, is simple to prove, noting that only closure under vector addition and scalar multiplication need be established.
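A quick numerical illustration of the definition, assuming numpy (the example matrix is ours): for a defective matrix, the generalized eigenspace is strictly larger than the span of the ordinary eigenvectors, and its dimension equals the algebraic multiplicity.

```python
import numpy as np

# J_2(2) + J_1(5): the eigenvalue 2 has algebraic multiplicity 2
# but only a one-dimensional ordinary eigenspace.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
n = A.shape[0]
lam = 2.0

B = A - lam * np.eye(n)
# ordinary eigenspace: null space of (A - 2I)
dim_eigenspace = n - np.linalg.matrix_rank(B)
# generalized eigenspace V_2: null space of (A - 2I)^n
dim_generalized = n - np.linalg.matrix_rank(np.linalg.matrix_power(B, n))
```

Here the eigenspace has dimension 1 while $V_2$ has dimension 2, picking up the generalized eigenvector $e_2$ that the ordinary eigenspace misses.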

Theorem 8.2.1. Let $A \in M_n(\mathbb{C})$ with spectrum $\sigma(A) = \{\lambda_1, \ldots, \lambda_k\}$. Then for each $i = 1, \ldots, k$, $V_{\lambda_i}$ is an invariant subspace of $A$.

One might ask whether $V_{\lambda_i}$ could be enlarged by allowing higher powers than $n$ in the definition. The negative answer is most simply expressed by invoking the Hamilton-Cayley theorem. We write the characteristic polynomial $p_A(\lambda) = \prod (\lambda - \lambda_i)^{m_A(\lambda_i)}$, where $m_A(\lambda_i)$ is the algebraic multiplicity of $\lambda_i$. Since $p_A(A) = \prod (A - \lambda_iI)^{m_A(\lambda_i)} = 0$, it is an easy matter to see that we exhaust all of $\mathbb{C}^n$ with the spaces $V_{\lambda_i}$. This is to say that allowing higher powers in the definition will not increase the subspaces $V_{\lambda_i}$. Indeed, as we shall see, the power of $(\lambda - \lambda_i)$ can be decreased to the geometric multiplicity $m_g(\lambda_i)$. For now the general power $n$ will suffice.

One very important result, and an essential first step in deriving the Jordan form, is to establish that any square matrix $A$ is similar to a block diagonal matrix, with each block carrying a single eigenvalue.

Theorem 8.2.2. Let $A \in M_n(\mathbb{C})$ with spectrum $\sigma(A) = \{\lambda_1, \ldots, \lambda_k\}$ and with invariant subspaces $V_{\lambda_i}$, $i = 1, 2, \ldots, k$. Then
(i) the spaces $V_{\lambda_i}$, $i = 1, \ldots, k$, are mutually linearly independent;
(ii) $\bigoplus_{i=1}^k V_{\lambda_i} = \mathbb{C}^n$ (alternatively, $\mathbb{C}^n = S(V_{\lambda_1}, \ldots, V_{\lambda_k})$);
(iii) $\dim V_{\lambda_i} = m_A(\lambda_i)$;
(iv) $A$ is similar to a block diagonal matrix with $k$ blocks $A_1, \ldots, A_k$. Moreover, $\sigma(A_i) = \{\lambda_i\}$ and $\dim A_i = m_A(\lambda_i)$.

Proof. (i) It should be clear that the subspaces $V_{\lambda_i}$ are linearly independent of each other. For if there is a nonzero vector $x$ in both $V_{\lambda_i}$ and $V_{\lambda_j}$, $i \neq j$, then for some integer $q$ it must be true that $(A - \lambda_jI)^{q-1}x \neq 0$ but $(A - \lambda_jI)^qx = 0$. This means that $y = (A - \lambda_jI)^{q-1}x$ is an eigenvector pertaining to $\lambda_j$. Since $(A - \lambda_iI)^nx = 0$, we must also have
$$(A - \lambda_jI)^{q-1}(A - \lambda_iI)^nx = (A - \lambda_iI)^n(A - \lambda_jI)^{q-1}x = (A - \lambda_iI)^ny = 0.$$
But
$$(A - \lambda_iI)^ny = \sum_{k=0}^{n} \binom{n}{k}(-\lambda_i)^{n-k}A^ky = \sum_{k=0}^{n} \binom{n}{k}(-\lambda_i)^{n-k}\lambda_j^k\,y = (\lambda_j - \lambda_i)^ny = 0.$$
This is impossible unless $\lambda_j = \lambda_i$. (ii)
The key part of the proof is to block diagonalize $A$ with respect to these invariant subspaces. To that end, let $S$

be the matrix with columns generated from bases of the individual $V_{\lambda_i}$, taken in the order of the indices. Supposing there are more linearly independent vectors in $\mathbb{C}^n$ than those already selected, fill out the matrix $S$ with vectors linearly independent of the subspaces $V_{\lambda_i}$, $i = 1, \ldots, k$. Now define $\tilde{A} = S^{-1}AS$. We conclude by the invariance of the subspaces and their mutual linear independence that $\tilde{A}$ has the following block structure:
$$\tilde{A} = S^{-1}AS = \begin{pmatrix}
A_1 & & & & 0 \\
& A_2 & & & \\
& & \ddots & & \\
& & & A_k & \\
0 & & & & B
\end{pmatrix}.$$
It follows that
$$p_A(\lambda) = p_{\tilde{A}}(\lambda) = \Bigl(\prod p_{A_i}(\lambda)\Bigr)\,p_B(\lambda).$$
Any root $r$ of $p_B(\lambda)$ must be an eigenvalue of $A$, say $\lambda_j$, and there must be an eigenvector pertaining to $\lambda_j$. Moreover, due to the block structure we can assume that it has the form $x = [0, \ldots, 0, \hat{x}]^T$, where there are $k$ blocked zeros of the sizes of the $A_i$ respectively. Then it is easy to see that $A(Sx) = \lambda_j(Sx)$, and this implies that $Sx \in V_{\lambda_j}$. Thus there is another vector in $V_{\lambda_j}$, which contradicts its definition. Therefore the block $B$ is empty, or what is the same thing, $\bigoplus_{i=1}^k V_{\lambda_i} = \mathbb{C}^n$.

(iii) Let $d_i = \dim V_{\lambda_i}$. From (ii) we know $\sum d_i = n$. Suppose that $\lambda_i \in \sigma(A_j)$ for some $j \neq i$. Then there is another eigenvector $\hat{x}$ pertaining to $\lambda_i$ and for which $A_j\hat{x} = \lambda_i\hat{x}$. Moreover, this vector has the form $x = [0, \ldots, 0, \hat{x}, 0, \ldots, 0]^T$, analogous to the argument above. By construction $Sx \notin V_{\lambda_i}$, but $ASx = \lambda_iSx$, and this contradicts the definition of $V_{\lambda_i}$. We thus have that $p_{A_i}(\lambda) = (\lambda - \lambda_i)^{d_i}$. Since $p_A(\lambda) = \prod p_{A_i}(\lambda) = \prod (\lambda - \lambda_i)^{m_A(\lambda_i)}$, it follows that $d_i = m_A(\lambda_i)$.

(iv) Putting (ii) and (iii) together gives the block diagonal structure as required. $\square$

On account of the mutual linear independence of the invariant subspaces $V_{\lambda_i}$ and the fact that they exhaust $\mathbb{C}^n$, the following corollary is immediate.

Corollary 8.2.1. Let $A \in M_n(\mathbb{C})$ with spectrum $\sigma(A) = \{\lambda_1, \ldots, \lambda_k\}$ and with generalized eigenspaces $V_{\lambda_i}$, $i = 1, 2, \ldots, k$. Then each $x \in \mathbb{C}^n$ has a unique representation $x = \sum_{i=1}^k x_i$, where $x_i \in V_{\lambda_i}$.

Another interesting result, which reveals how the matrix works as a linear transformation, is to decompose it into
components with respect to the

generalized eigenspaces. In particular, viewing the block diagonal form
$$\tilde{A} = S^{-1}AS = \begin{pmatrix}
A_1 & & & 0 \\
& A_2 & & \\
& & \ddots & \\
0 & & & A_k
\end{pmatrix},$$
the space $\mathbb{C}^n$ can be split into a direct sum of subspaces $E_1, \ldots, E_k$ based on coordinate blocks. This is accomplished in such a way that any vector $y \in \mathbb{C}^n$ can be written uniquely as $y = \sum_{i=1}^k y_i$, where $y_i \in E_i$. (Keep in mind that each $y_i \in \mathbb{C}^n$; its coordinates are zero outside the coordinate block pertaining to $E_i$.) Then
$$\tilde{A}y = \tilde{A}\sum_{i=1}^k y_i = \sum_{i=1}^k \tilde{A}y_i = \sum_{i=1}^k A_iy_i.$$
This provides a computational tool when this block diagonal form is known. Note that the blocks correspond directly to the invariant subspaces by $SE_i = V_{\lambda_i}$.

We can use these invariant subspaces to get at the minimal polynomial. For each $i = 1, \ldots, k$ define
$$m_i = \min\{\, j \mid (A - \lambda_iI)^jx = 0 \text{ for all } x \in V_{\lambda_i} \,\}.$$

Theorem 8.2.3. Let $A \in M_n(\mathbb{C})$ with spectrum $\sigma(A) = \{\lambda_1, \ldots, \lambda_k\}$ and with invariant subspaces $V_{\lambda_i}$, $i = 1, 2, \ldots, k$. Then the minimal polynomial of $A$ is given by
$$q(\lambda) = \prod_{i=1}^k (\lambda - \lambda_i)^{m_i}.$$

Proof. Certainly we see that for any vector $x \in V_{\lambda_j}$,
$$q(A)x = \Bigl(\prod_{i=1}^k (A - \lambda_iI)^{m_i}\Bigr)x = 0.$$
Hence the minimal polynomial $q_A(\lambda)$ divides $q(\lambda)$. To see that they are in fact equal, suppose that the minimal polynomial has the form
$$q_A(\lambda) = \prod_{i=1}^k (\lambda - \lambda_i)^{\hat{m}_i}$$
where $\hat{m}_i \le m_i$ for $i = 1, \ldots, k$, and in particular $\hat{m}_j < m_j$ for some $j$. By construction there must exist a vector $x \in V_{\lambda_j}$ such that $(A - \lambda_jI)^{m_j}x = 0$ but $y =$

$(A - \lambda_jI)^{m_j-1}x \neq 0$. Then (it suffices to consider $\hat{m}_j = m_j - 1$)
$$q_A(A)x = \Bigl(\prod_{i=1}^k (A - \lambda_iI)^{\hat{m}_i}\Bigr)x = \prod_{\substack{i=1 \\ i \neq j}}^{k} (A - \lambda_iI)^{\hat{m}_i}\,y = 0.$$
This cannot be, because the contrary implies that there is another vector in one of the invariant subspaces $V_{\lambda_i}$. $\square$

Just one more step is needed before the Jordan normal form can be derived. For a given $V_{\lambda_i}$ we can interpret the space from a hierarchical viewpoint. We know that $V_{\lambda_i}$ contains all the eigenvectors pertaining to $\lambda_i$. Call these eigenvectors the first order generalized eigenvectors. If the span of these is not equal to $V_{\lambda_i}$, then there must be a vector $x \in V_{\lambda_i}$ for which $(A - \lambda_iI)^2x = 0$ but $y = (A - \lambda_iI)x \neq 0$. That is to say, $y$ is an eigenvector of $A$ pertaining to $\lambda_i$. Call such vectors $x$ second order generalized eigenvectors. In general we call $x \in V_{\lambda_i}$ a generalized eigenvector of order $j$ if $(A - \lambda_iI)^jx = 0$ but $(A - \lambda_iI)^{j-1}x \neq 0$. In light of our previous discussion, $V_{\lambda_i}$ contains generalized eigenvectors of order up to but not greater than $m_{\lambda_i}$.

Theorem 8.2.4. Let $A \in M_n(\mathbb{C})$ with spectrum $\sigma(A) = \{\lambda_1, \ldots, \lambda_k\}$ and with invariant subspaces $V_{\lambda_i}$, $i = 1, 2, \ldots, k$.
(i) Let $x \in V_{\lambda_i}$ be a generalized eigenvector of order $p$. Then the vectors
$$x,\ (A - \lambda_iI)x,\ (A - \lambda_iI)^2x,\ \ldots,\ (A - \lambda_iI)^{p-1}x \tag{1}$$
are linearly independent.
(ii) The subspace of $\mathbb{C}^n$ generated by the vectors in (1) is an invariant subspace of $A$.

Proof. (i) To prove linear independence of a set of vectors we suppose linear dependence. That is, there is a smallest integer $k$ and constants $b_j$ such that
$$\sum_{j=0}^{k} b_j(A - \lambda_iI)^jx = 0$$

where $b_k \neq 0$. Solving, we obtain $b_k(A - \lambda_iI)^kx = -\sum_{j=0}^{k-1} b_j(A - \lambda_iI)^jx$. Now apply $(A - \lambda_iI)^{p-k}$ to both sides and obtain a new linearly dependent set, as the following calculation shows:
$$0 = b_k(A - \lambda_iI)^{p-k+k}x = -\sum_{j=0}^{k-1} b_j(A - \lambda_iI)^{j+p-k}x = -\sum_{j=p-k}^{p-1} b_{j-(p-k)}(A - \lambda_iI)^{j}x.$$
The key point to note here is that the lower limit of the sum has increased. This new linearly dependent set, which we denote by $\sum_{j=p-k}^{p-1} c_j(A - \lambda_iI)^jx = 0$, can be split in the same way as before, where we assume with no loss of generality that $c_{p-1} \neq 0$. Then
$$c_{p-1}(A - \lambda_iI)^{p-1}x = -\sum_{j=p-k}^{p-2} c_j(A - \lambda_iI)^jx.$$
Apply $(A - \lambda_iI)$ to both sides to get
$$0 = c_{p-1}(A - \lambda_iI)^px = -\sum_{j=p-k}^{p-2} c_j(A - \lambda_iI)^{j+1}x = -\sum_{j=p-k+1}^{p-1} c_{j-1}(A - \lambda_iI)^jx.$$
Thus we have obtained another linearly dependent set, with the lower limit of the powers increased by one. Continue this process until the linear dependence of $(A - \lambda_iI)^{p-1}x$ and $(A - \lambda_iI)^{p-2}x$ is achieved. Thus we have
$$c(A - \lambda_iI)^{p-1}x = d(A - \lambda_iI)^{p-2}x, \qquad \text{that is,} \qquad (A - \lambda_iI)y = \frac{d}{c}\,y,$$
where $y = (A - \lambda_iI)^{p-2}x$. This implies that $\lambda_i + \frac{d}{c}$ is a new eigenvalue with eigenvector $y \in V_{\lambda_i}$, and of course this is a contradiction.

(ii) The invariance under $A$ is more straightforward. First note that while $x_1 = x$,

$x_2 = (A - \lambda_iI)x = Ax - \lambda_ix$, so that $Ax = x_2 + \lambda_ix_1$. Consider any vector $y$ defined by $y = \sum_{j=0}^{p-1} b_j(A - \lambda_iI)^jx$. It follows that
$$Ay = A\sum_{j=0}^{p-1} b_j(A - \lambda_iI)^jx = \sum_{j=0}^{p-1} b_j(A - \lambda_iI)^jAx = \sum_{j=0}^{p-1} b_j(A - \lambda_iI)^j(x_2 + \lambda_ix_1) = \sum_{j=0}^{p-1} b_j(A - \lambda_iI)^j[(A - \lambda_iI)x_1 + \lambda_ix_1] = \sum_{j=0}^{p-1} c_j(A - \lambda_iI)^jx$$
where $c_j = b_{j-1} + \lambda_ib_j$ for $j > 0$ and $c_0 = \lambda_ib_0$; here the term with $(A - \lambda_iI)^px = 0$ has dropped out. This proves the result. $\square$

8.3 The Jordan Normal Form

We need a lemma that points in the direction we are headed, that being the use of invariant subspaces as a basis for the (Jordan) block diagonalization of any matrix. These results were discussed in detail in Section 8.2. The restatement here illustrates the invariant subspace nature of the result, irrespective of generalized eigenspaces. Its proof is elementary and is left to the reader.

Lemma 8.3.1. Let $A \in M_n(\mathbb{C})$ with invariant subspace $V \subset \mathbb{C}^n$.
(i) Suppose $v_1, \ldots, v_k$ is a basis for $V$ and $S$ is an invertible matrix with the first $k$ columns given by $v_1, \ldots, v_k$. Then
$$S^{-1}AS = \begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix}$$
where $A_{11}$ is $k \times k$.

(ii) Suppose that $V_1, V_2 \subset \mathbb{C}^n$ are two invariant subspaces of $A$ and $\mathbb{C}^n = V_1 \oplus V_2$. Let the (invertible) matrix $S$ consist respectively of bases from $V_1$ and $V_2$ as its columns. Then
$$S^{-1}AS = \begin{pmatrix} A_{11} & 0 \\ 0 & A_{22} \end{pmatrix}.$$

Definition 8.3.1. Let $\lambda \in \mathbb{C}$. A Jordan block $J_k(\lambda)$ is a $k \times k$ upper triangular matrix of the form
$$J_k(\lambda) = \begin{pmatrix}
\lambda & 1 & & 0 \\
& \lambda & \ddots & \\
& & \ddots & 1 \\
0 & & & \lambda
\end{pmatrix}.$$
A Jordan matrix is any matrix of the form
$$J = \begin{pmatrix}
J_{n_1}(\lambda_1) & & 0 \\
& \ddots & \\
0 & & J_{n_k}(\lambda_k)
\end{pmatrix}$$
where the matrices $J_{n_i}(\lambda_i)$ are Jordan blocks. If $J \in M_n(\mathbb{C})$, then $n_1 + n_2 + \cdots + n_k = n$.

Theorem 8.3.1 (Jordan normal form). Let $A \in M_n(\mathbb{C})$. Then there is a nonsingular matrix $S \in M_n$ such that
$$A = S\begin{pmatrix}
J_{n_1}(\lambda_1) & & 0 \\
& \ddots & \\
0 & & J_{n_k}(\lambda_k)
\end{pmatrix}S^{-1} = SJS^{-1}$$
where $J_{n_i}(\lambda_i)$ is a Jordan block and $n_1 + n_2 + \cdots + n_k = n$. $J$ is unique up to permutations of the blocks. The eigenvalues $\lambda_1, \ldots, \lambda_k$ are not necessarily distinct. If $A$ is real with real eigenvalues, then $S$ can be taken to be real.

Proof. This result is proved in four steps.

(1) Block diagonalize $A$ (by similarity) into invariant subspaces pertaining to $\sigma(A)$. This is accomplished as follows. First block diagonalize the matrix according to the generalized eigenspaces $V_{\lambda_i} = \{x \in \mathbb{C}^n \mid (A - \lambda_iI)^nx =$

$0\}$, as discussed in the previous section. Beginning with the highest order generalized eigenvector $x \in V_{\lambda_i}$, construct the invariant subspace as in (1) of Section 8.2. Repeat this process until all generalized eigenvectors have been included in an invariant subspace. This includes of course first order eigenvectors that are not associated with higher order generalized eigenvectors; these invariant subspaces have dimension one. Each of these invariant subspaces is linearly independent of the others. Continue this process for all the generalized eigenspaces. This exhausts $\mathbb{C}^n$. Each of the blocks contains exactly one eigenvector. The dimensions of these invariant subspaces can range from one to $m_{\lambda_i}$, there being at least one subspace of dimension $m_{\lambda_i}$.

(2) Triangularize each block by Schur's theorem, so that each block has the form
$$K(\lambda) = \begin{pmatrix}
\lambda & * & \cdots & * \\
& \lambda & \ddots & \vdots \\
& & \ddots & * \\
0 & & & \lambda
\end{pmatrix}.$$
You will note that $K(\lambda) = \lambda I + N$ where $N$ is nilpotent, or $K(\lambda)$ is $1 \times 1$.

(3) Jordanize each triangular block. Assume that $K_1(\lambda)$ is $m \times m$, where $m > 1$. By construction $K_1(\lambda)$ pertains to an invariant subspace for which there is a vector $x$ with $N^{m-1}x \neq 0$ and $N^mx = 0$. Thus $N^{m-1}x$ is an eigenvector of $K_1(\lambda)$, the unique eigenvector. Define
$$y_i = N^{i-1}x, \qquad i = 1, 2, \ldots, m.$$
Take the set $\{y_i\}_{i=1}^m$ as a basis of $\mathbb{C}^m$ and define
$$S_1 = [\,y_m \ \ y_{m-1} \ \cdots \ y_1\,].$$
Then
$$NS_1 = [\,0 \ \ y_m \ \ y_{m-1} \ \cdots \ y_2\,].$$

So
$$S_1^{-1}NS_1 = \begin{pmatrix}
0 & 1 & & 0 \\
& 0 & \ddots & \\
& & \ddots & 1 \\
0 & & & 0
\end{pmatrix}.$$
We conclude that
$$S_1^{-1}K_1(\lambda)S_1 = \begin{pmatrix}
\lambda & 1 & & 0 \\
& \lambda & \ddots & \\
& & \ddots & 1 \\
0 & & & \lambda
\end{pmatrix}.$$

(4) Assemble all of the blocks to form the Jordan form. For example, the block $K_1(\lambda)$ and the corresponding similarity transformation $S_1$ studied above can be treated in the assembly process as follows. Define the $n \times n$ matrix
$$\hat{S}_1 = \begin{pmatrix}
I & 0 & 0 \\
0 & S_1 & 0 \\
0 & 0 & I
\end{pmatrix}$$
where $S_1$ is the block constructed above, placed in the $n \times n$ matrix in the position from which $K_1(\lambda)$ was extracted in the block triangular form of $A$. Repeat this for each of the blocks pertaining to minimally invariant subspaces. This gives a sequence of block diagonal matrices $\hat{S}_1, \hat{S}_2, \ldots, \hat{S}_k$. Define $T = \hat{S}_1\hat{S}_2 \cdots \hat{S}_k$. It has the block diagonal form
$$T = \begin{pmatrix}
S_1 & & 0 \\
& S_2 & \\
& & \ddots \\
0 & & & S_k
\end{pmatrix}.$$
Together with the original matrix $P$ that transformed the matrix to the minimal invariant subspace blocked form and the unitary matrix

$V$ used to triangularize $A$, it follows that
$$A = PVT\begin{pmatrix}
J_{n_1}(\lambda_1) & & 0 \\
& \ddots & \\
0 & & J_{n_k}(\lambda_k)
\end{pmatrix}(PVT)^{-1} = SJS^{-1}$$
with $S = PVT$. $\square$

Example 8.3.1. Let
$$J = \begin{pmatrix}
2 & 1 & 0 & 0 & 0 & 0 & 0 \\
0 & 2 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 2 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 3 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 3 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 3 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & -1
\end{pmatrix}.$$
In this example there are four blocks, with two of the blocks pertaining to the single eigenvalue 2. For the first block there is the single eigenvector $e_1$, but the invariant subspace is $S(e_1, e_2)$. For the second block, the eigenvector $e_3$ generates the one dimensional invariant subspace. The block pertaining to the eigenvalue 3 has the single eigenvector $e_4$, while the minimal invariant subspace is $S(e_4, e_5, e_6)$. Finally, the one dimensional subspace pertaining to the eigenvalue $-1$ is spanned by $e_7$. The minimal polynomial is
$$q(\lambda) = (\lambda - 2)^2(\lambda - 3)^3(\lambda + 1).$$

8.4 Convergent matrices

Using the Jordan normal form, the study of convergent matrices becomes relatively straightforward.

Theorem 8.4.1. If $A \in M_n$ and $\rho(A) < 1$, then $\lim_{m \to \infty} A^m = 0$.

Proof. We may assume $A$ is a Jordan matrix, since $A = SJS^{-1}$ implies $A^m = SJ^mS^{-1}$. Each Jordan block $J_{n_k}(\lambda)$ can be written as $J_{n_k}(\lambda) = \lambda I_{n_k} + N_k$, where
$$N_k = \begin{pmatrix}
0 & 1 & & 0 \\
& 0 & \ddots & \\
& & \ddots & 1 \\
0 & & & 0
\end{pmatrix}$$
is nilpotent. Now
$$A = \begin{pmatrix}
J_{n_1}(\lambda_1) & & 0 \\
& \ddots & \\
0 & & J_{n_k}(\lambda_k)
\end{pmatrix}.$$
We compute, for $m > n_k$,
$$(J_{n_k}(\lambda_k))^m = (\lambda I + N)^m = \lambda^mI + \sum_{j=1}^{n_k-1}\binom{m}{j}\lambda^{m-j}N^j$$
because $N^j = 0$ for $j \geq n_k$. We have
$$\binom{m}{j}\lambda^{m-j} \to 0 \quad \text{as } m \to \infty$$
since $|\lambda| < 1$. The result follows. $\square$

8.5 Exercises

1. Prove Theorem 8.2.1.

2. Find a $3 \times 3$ matrix whose eigenvalues are the squares of the roots of the equation $\lambda^3 - 3\lambda^2 + 4\lambda - 5 = 0$.

3. Suppose that $A$ is a square matrix with $\sigma(A) = \{3\}$, $m_a(3) = 6$, and $m_g(3) = 3$. Up to permutations of the blocks, show all possible Jordan normal forms for $A$.

4. Let $A \in M_n(\mathbb{C})$ and let $x_1 \in \mathbb{C}^n$. Define $x_{i+1} = Ax_i$ for $i = 1, \ldots, n-1$. Show that $V = S(\{x_1, \ldots, x_n\})$ is an invariant subspace of $A$. Show that $V$ contains an eigenvector of $A$.

5. Referring to the previous problem, let $A \in M_n(\mathbb{R})$ be a permutation matrix. (i) Find starting vectors so that $\dim V = n$. (ii) Find starting vectors so that $\dim V = 1$. (iii) Show that if $\lambda = 1$ is a simple eigenvalue of $A$, then $\dim V = 1$ or $\dim V = n$.
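For experimenting with exercises such as these, a computer algebra system can produce Jordan forms directly. A sketch assuming sympy is available (the matrices are our own):

```python
from sympy import Matrix

# Conjugate a known Jordan matrix J by an invertible S to get a
# test matrix A whose Jordan form we already know.
J = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 3]])
S = Matrix([[1, 1, 0],
            [0, 1, 1],
            [0, 0, 1]])
A = S * J * S.inv()

# jordan_form returns (P, J2) with A == P * J2 * P.inv()
P, J2 = A.jordan_form()
```

sympy returns $(P, J_2)$ with $A = PJ_2P^{-1}$; the order of the blocks in $J_2$ may differ from the one written down by hand, consistent with uniqueness up to permutation of the blocks.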

Chapter 9 Hermitian and Symmetric Matrices

Example 9.0.1. Let $f : D \to \mathbb{R}$, $D \subset \mathbb{R}^n$. The Hessian is defined by $H(x) = [h_{ij}(x)] = \left[\frac{\partial^2 f}{\partial x_i\,\partial x_j}\right] \in M_n$. Since for functions $f \in C^2$ it is known that
$$\frac{\partial^2 f}{\partial x_i\,\partial x_j} = \frac{\partial^2 f}{\partial x_j\,\partial x_i},$$
it follows that $H(x)$ is symmetric.

Definition 9.0.1. A function $f : \mathbb{R} \to \mathbb{R}$ is convex if
$$f(\lambda x + (1 - \lambda)y) \leq \lambda f(x) + (1 - \lambda)f(y)$$
for $x, y \in D$ (the domain) and $0 \leq \lambda \leq 1$.

Proposition 9.0.1. If $f \in C^2(D)$ and $f''(x) \geq 0$ on $D$, then $f(x)$ is convex.

Proof. Because $f'' \geq 0$, $f'(x)$ is increasing. Therefore if $x < x_m < y$ we must have
$$f(x_m) \leq f(x) + f'(x_m)(x_m - x)$$
and
$$f(x_m) \leq f(y) + f'(x_m)(x_m - y)$$


More information

The University of Texas at Austin Department of Electrical and Computer Engineering. EE381V: Large Scale Learning Spring 2013.

The University of Texas at Austin Department of Electrical and Computer Engineering. EE381V: Large Scale Learning Spring 2013. The University of Texas at Austin Department of Electrical and Computer Engineering EE381V: Large Scale Learning Spring 2013 Assignment Two Caramanis/Sanghavi Due: Tuesday, Feb. 19, 2013. Computational

More information

av 1 x 2 + 4y 2 + xy + 4z 2 = 16.

av 1 x 2 + 4y 2 + xy + 4z 2 = 16. 74 85 Eigenanalysis The subject of eigenanalysis seeks to find a coordinate system, in which the solution to an applied problem has a simple expression Therefore, eigenanalysis might be called the method

More information

Math Linear Algebra Final Exam Review Sheet

Math Linear Algebra Final Exam Review Sheet Math 15-1 Linear Algebra Final Exam Review Sheet Vector Operations Vector addition is a component-wise operation. Two vectors v and w may be added together as long as they contain the same number n of

More information

Solutions for Math 225 Assignment #5 1

Solutions for Math 225 Assignment #5 1 Solutions for Math 225 Assignment #5 1 (1) Find a polynomial f(x) of degree at most 3 satisfying that f(0) = 2, f( 1) = 1, f(1) = 3 and f(3) = 1. Solution. By Lagrange Interpolation, ( ) (x + 1)(x 1)(x

More information

2 so Q[ 2] is closed under both additive and multiplicative inverses. a 2 2b 2 + b

2 so Q[ 2] is closed under both additive and multiplicative inverses. a 2 2b 2 + b . FINITE-DIMENSIONAL VECTOR SPACES.. Fields By now you ll have acquired a fair knowledge of matrices. These are a concrete embodiment of something rather more abstract. Sometimes it is easier to use matrices,

More information

Math Linear Algebra II. 1. Inner Products and Norms

Math Linear Algebra II. 1. Inner Products and Norms Math 342 - Linear Algebra II Notes 1. Inner Products and Norms One knows from a basic introduction to vectors in R n Math 254 at OSU) that the length of a vector x = x 1 x 2... x n ) T R n, denoted x,

More information

Linear Algebra in Actuarial Science: Slides to the lecture

Linear Algebra in Actuarial Science: Slides to the lecture Linear Algebra in Actuarial Science: Slides to the lecture Fall Semester 2010/2011 Linear Algebra is a Tool-Box Linear Equation Systems Discretization of differential equations: solving linear equations

More information

w T 1 w T 2. w T n 0 if i j 1 if i = j

w T 1 w T 2. w T n 0 if i j 1 if i = j Lyapunov Operator Let A F n n be given, and define a linear operator L A : C n n C n n as L A (X) := A X + XA Suppose A is diagonalizable (what follows can be generalized even if this is not possible -

More information

Lecture 11: Finish Gaussian elimination and applications; intro to eigenvalues and eigenvectors (1)

Lecture 11: Finish Gaussian elimination and applications; intro to eigenvalues and eigenvectors (1) Lecture 11: Finish Gaussian elimination and applications; intro to eigenvalues and eigenvectors (1) Travis Schedler Tue, Oct 18, 2011 (version: Tue, Oct 18, 6:00 PM) Goals (2) Solving systems of equations

More information

MATH 369 Linear Algebra

MATH 369 Linear Algebra Assignment # Problem # A father and his two sons are together 00 years old. The father is twice as old as his older son and 30 years older than his younger son. How old is each person? Problem # 2 Determine

More information

Math Camp Lecture 4: Linear Algebra. Xiao Yu Wang. Aug 2010 MIT. Xiao Yu Wang (MIT) Math Camp /10 1 / 88

Math Camp Lecture 4: Linear Algebra. Xiao Yu Wang. Aug 2010 MIT. Xiao Yu Wang (MIT) Math Camp /10 1 / 88 Math Camp 2010 Lecture 4: Linear Algebra Xiao Yu Wang MIT Aug 2010 Xiao Yu Wang (MIT) Math Camp 2010 08/10 1 / 88 Linear Algebra Game Plan Vector Spaces Linear Transformations and Matrices Determinant

More information

Quadratic forms. Here. Thus symmetric matrices are diagonalizable, and the diagonalization can be performed by means of an orthogonal matrix.

Quadratic forms. Here. Thus symmetric matrices are diagonalizable, and the diagonalization can be performed by means of an orthogonal matrix. Quadratic forms 1. Symmetric matrices An n n matrix (a ij ) n ij=1 with entries on R is called symmetric if A T, that is, if a ij = a ji for all 1 i, j n. We denote by S n (R) the set of all n n symmetric

More information

22m:033 Notes: 7.1 Diagonalization of Symmetric Matrices

22m:033 Notes: 7.1 Diagonalization of Symmetric Matrices m:33 Notes: 7. Diagonalization of Symmetric Matrices Dennis Roseman University of Iowa Iowa City, IA http://www.math.uiowa.edu/ roseman May 3, Symmetric matrices Definition. A symmetric matrix is a matrix

More information

Recall the convention that, for us, all vectors are column vectors.

Recall the convention that, for us, all vectors are column vectors. Some linear algebra Recall the convention that, for us, all vectors are column vectors. 1. Symmetric matrices Let A be a real matrix. Recall that a complex number λ is an eigenvalue of A if there exists

More information

(a) II and III (b) I (c) I and III (d) I and II and III (e) None are true.

(a) II and III (b) I (c) I and III (d) I and II and III (e) None are true. 1 Which of the following statements is always true? I The null space of an m n matrix is a subspace of R m II If the set B = {v 1,, v n } spans a vector space V and dimv = n, then B is a basis for V III

More information

MTH50 Spring 07 HW Assignment 7 {From [FIS0]}: Sec 44 #4a h 6; Sec 5 #ad ac 4ae 4 7 The due date for this assignment is 04/05/7 Sec 44 #4a h Evaluate the erminant of the following matrices by any legitimate

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

MATH 20F: LINEAR ALGEBRA LECTURE B00 (T. KEMP)

MATH 20F: LINEAR ALGEBRA LECTURE B00 (T. KEMP) MATH 20F: LINEAR ALGEBRA LECTURE B00 (T KEMP) Definition 01 If T (x) = Ax is a linear transformation from R n to R m then Nul (T ) = {x R n : T (x) = 0} = Nul (A) Ran (T ) = {Ax R m : x R n } = {b R m

More information

The Spectral Theorem for normal linear maps

The Spectral Theorem for normal linear maps MAT067 University of California, Davis Winter 2007 The Spectral Theorem for normal linear maps Isaiah Lankham, Bruno Nachtergaele, Anne Schilling (March 14, 2007) In this section we come back to the question

More information

(VI.C) Rational Canonical Form

(VI.C) Rational Canonical Form (VI.C) Rational Canonical Form Let s agree to call a transformation T : F n F n semisimple if there is a basis B = { v,..., v n } such that T v = λ v, T v 2 = λ 2 v 2,..., T v n = λ n for some scalars

More information

Generalized eigenvector - Wikipedia, the free encyclopedia

Generalized eigenvector - Wikipedia, the free encyclopedia 1 of 30 18/03/2013 20:00 Generalized eigenvector From Wikipedia, the free encyclopedia In linear algebra, for a matrix A, there may not always exist a full set of linearly independent eigenvectors that

More information

Stat 159/259: Linear Algebra Notes

Stat 159/259: Linear Algebra Notes Stat 159/259: Linear Algebra Notes Jarrod Millman November 16, 2015 Abstract These notes assume you ve taken a semester of undergraduate linear algebra. In particular, I assume you are familiar with the

More information

Calculating determinants for larger matrices

Calculating determinants for larger matrices Day 26 Calculating determinants for larger matrices We now proceed to define det A for n n matrices A As before, we are looking for a function of A that satisfies the product formula det(ab) = det A det

More information

Generalized Eigenvectors and Jordan Form

Generalized Eigenvectors and Jordan Form Generalized Eigenvectors and Jordan Form We have seen that an n n matrix A is diagonalizable precisely when the dimensions of its eigenspaces sum to n. So if A is not diagonalizable, there is at least

More information

8. Limit Laws. lim(f g)(x) = lim f(x) lim g(x), (x) = lim x a f(x) g lim x a g(x)

8. Limit Laws. lim(f g)(x) = lim f(x) lim g(x), (x) = lim x a f(x) g lim x a g(x) 8. Limit Laws 8.1. Basic Limit Laws. If f and g are two functions and we know the it of each of them at a given point a, then we can easily compute the it at a of their sum, difference, product, constant

More information

EXERCISE SET 5.1. = (kx + kx + k, ky + ky + k ) = (kx + kx + 1, ky + ky + 1) = ((k + )x + 1, (k + )y + 1)

EXERCISE SET 5.1. = (kx + kx + k, ky + ky + k ) = (kx + kx + 1, ky + ky + 1) = ((k + )x + 1, (k + )y + 1) EXERCISE SET 5. 6. The pair (, 2) is in the set but the pair ( )(, 2) = (, 2) is not because the first component is negative; hence Axiom 6 fails. Axiom 5 also fails. 8. Axioms, 2, 3, 6, 9, and are easily

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

Linear algebra II Homework #1 solutions A = This means that every eigenvector with eigenvalue λ = 1 must have the form

Linear algebra II Homework #1 solutions A = This means that every eigenvector with eigenvalue λ = 1 must have the form Linear algebra II Homework # solutions. Find the eigenvalues and the eigenvectors of the matrix 4 6 A =. 5 Since tra = 9 and deta = = 8, the characteristic polynomial is f(λ) = λ (tra)λ+deta = λ 9λ+8 =

More information

HOMEWORK PROBLEMS FROM STRANG S LINEAR ALGEBRA AND ITS APPLICATIONS (4TH EDITION)

HOMEWORK PROBLEMS FROM STRANG S LINEAR ALGEBRA AND ITS APPLICATIONS (4TH EDITION) HOMEWORK PROBLEMS FROM STRANG S LINEAR ALGEBRA AND ITS APPLICATIONS (4TH EDITION) PROFESSOR STEVEN MILLER: BROWN UNIVERSITY: SPRING 2007 1. CHAPTER 1: MATRICES AND GAUSSIAN ELIMINATION Page 9, # 3: Describe

More information

Eigenvalues, Eigenvectors, and Diagonalization

Eigenvalues, Eigenvectors, and Diagonalization Week12 Eigenvalues, Eigenvectors, and Diagonalization 12.1 Opening Remarks 12.1.1 Predicting the Weather, Again Let us revisit the example from Week 4, in which we had a simple model for predicting the

More information

SQUARE ROOTS OF 2x2 MATRICES 1. Sam Northshield SUNY-Plattsburgh

SQUARE ROOTS OF 2x2 MATRICES 1. Sam Northshield SUNY-Plattsburgh SQUARE ROOTS OF x MATRICES Sam Northshield SUNY-Plattsburgh INTRODUCTION A B What is the square root of a matrix such as? It is not, in general, A B C D C D This is easy to see since the upper left entry

More information

Group Theory. 1. Show that Φ maps a conjugacy class of G into a conjugacy class of G.

Group Theory. 1. Show that Φ maps a conjugacy class of G into a conjugacy class of G. Group Theory Jan 2012 #6 Prove that if G is a nonabelian group, then G/Z(G) is not cyclic. Aug 2011 #9 (Jan 2010 #5) Prove that any group of order p 2 is an abelian group. Jan 2012 #7 G is nonabelian nite

More information

MATH JORDAN FORM

MATH JORDAN FORM MATH 53 JORDAN FORM Let A,, A k be square matrices of size n,, n k, respectively with entries in a field F We define the matrix A A k of size n = n + + n k as the block matrix A 0 0 0 0 A 0 0 0 0 A k It

More information

Symmetric Matrices and Eigendecomposition

Symmetric Matrices and Eigendecomposition Symmetric Matrices and Eigendecomposition Robert M. Freund January, 2014 c 2014 Massachusetts Institute of Technology. All rights reserved. 1 2 1 Symmetric Matrices and Convexity of Quadratic Functions

More information

Jordan normal form notes (version date: 11/21/07)

Jordan normal form notes (version date: 11/21/07) Jordan normal form notes (version date: /2/7) If A has an eigenbasis {u,, u n }, ie a basis made up of eigenvectors, so that Au j = λ j u j, then A is diagonal with respect to that basis To see this, let

More information

5 Linear Transformations

5 Linear Transformations Lecture 13 5 Linear Transformations 5.1 Basic Definitions and Examples We have already come across with the notion of linear transformations on euclidean spaces. We shall now see that this notion readily

More information

REPRESENTATION THEORY WEEK 7

REPRESENTATION THEORY WEEK 7 REPRESENTATION THEORY WEEK 7 1. Characters of L k and S n A character of an irreducible representation of L k is a polynomial function constant on every conjugacy class. Since the set of diagonalizable

More information

Review of Linear Algebra Definitions, Change of Basis, Trace, Spectral Theorem

Review of Linear Algebra Definitions, Change of Basis, Trace, Spectral Theorem Review of Linear Algebra Definitions, Change of Basis, Trace, Spectral Theorem Steven J. Miller June 19, 2004 Abstract Matrices can be thought of as rectangular (often square) arrays of numbers, or as

More information

Math 2331 Linear Algebra

Math 2331 Linear Algebra 5. Eigenvectors & Eigenvalues Math 233 Linear Algebra 5. Eigenvectors & Eigenvalues Shang-Huan Chiu Department of Mathematics, University of Houston schiu@math.uh.edu math.uh.edu/ schiu/ Shang-Huan Chiu,

More information

Further Mathematical Methods (Linear Algebra) 2002

Further Mathematical Methods (Linear Algebra) 2002 Further Mathematical Methods (Linear Algebra) 00 Solutions For Problem Sheet 0 In this Problem Sheet we calculated some left and right inverses and verified the theorems about them given in the lectures.

More information

Cartan s Criteria. Math 649, Dan Barbasch. February 26

Cartan s Criteria. Math 649, Dan Barbasch. February 26 Cartan s Criteria Math 649, 2013 Dan Barbasch February 26 Cartan s Criteria REFERENCES: Humphreys, I.2 and I.3. Definition The Cartan-Killing form of a Lie algebra is the bilinear form B(x, y) := Tr(ad

More information

Diagonalisierung. Eigenwerte, Eigenvektoren, Mathematische Methoden der Physik I. Vorlesungsnotizen zu

Diagonalisierung. Eigenwerte, Eigenvektoren, Mathematische Methoden der Physik I. Vorlesungsnotizen zu Eigenwerte, Eigenvektoren, Diagonalisierung Vorlesungsnotizen zu Mathematische Methoden der Physik I J. Mark Heinzle Gravitational Physics, Faculty of Physics University of Vienna Version 5/5/2 2 version

More information

MATH 304 Linear Algebra Lecture 33: Bases of eigenvectors. Diagonalization.

MATH 304 Linear Algebra Lecture 33: Bases of eigenvectors. Diagonalization. MATH 304 Linear Algebra Lecture 33: Bases of eigenvectors. Diagonalization. Eigenvalues and eigenvectors of an operator Definition. Let V be a vector space and L : V V be a linear operator. A number λ

More information

Problems in Linear Algebra and Representation Theory

Problems in Linear Algebra and Representation Theory Problems in Linear Algebra and Representation Theory (Most of these were provided by Victor Ginzburg) The problems appearing below have varying level of difficulty. They are not listed in any specific

More information

Notes on Linear Algebra and Matrix Analysis

Notes on Linear Algebra and Matrix Analysis Notes on Linear Algebra and Matrix Analysis Maxim Neumann May 2006, Version 0.1.1 1 Matrix Basics Literature to this topic: [1 4]. x y < y,x >: standard inner product. x x = 1 : x is normalized x y = 0

More information

Eigenvalues and Eigenvectors

Eigenvalues and Eigenvectors Eigenvalues and Eigenvectors Philippe B. Laval KSU Fall 2015 Philippe B. Laval (KSU) Eigenvalues and Eigenvectors Fall 2015 1 / 14 Introduction We define eigenvalues and eigenvectors. We discuss how to

More information

Diagonalisierung. Eigenwerte, Eigenvektoren, Mathematische Methoden der Physik I. Vorlesungsnotizen zu

Diagonalisierung. Eigenwerte, Eigenvektoren, Mathematische Methoden der Physik I. Vorlesungsnotizen zu Eigenwerte, Eigenvektoren, Diagonalisierung Vorlesungsnotizen zu Mathematische Methoden der Physik I J. Mark Heinzle Gravitational Physics, Faculty of Physics University of Vienna Version /6/29 2 version

More information

Matrices and Matrix Algebra.

Matrices and Matrix Algebra. Matrices and Matrix Algebra 3.1. Operations on Matrices Matrix Notation and Terminology Matrix: a rectangular array of numbers, called entries. A matrix with m rows and n columns m n A n n matrix : a square

More information

Eigenvectors and Hermitian Operators

Eigenvectors and Hermitian Operators 7 71 Eigenvalues and Eigenvectors Basic Definitions Let L be a linear operator on some given vector space V A scalar λ and a nonzero vector v are referred to, respectively, as an eigenvalue and corresponding

More information

Linear Algebra Practice Problems

Linear Algebra Practice Problems Linear Algebra Practice Problems Math 24 Calculus III Summer 25, Session II. Determine whether the given set is a vector space. If not, give at least one axiom that is not satisfied. Unless otherwise stated,

More information

LECTURE VI: SELF-ADJOINT AND UNITARY OPERATORS MAT FALL 2006 PRINCETON UNIVERSITY

LECTURE VI: SELF-ADJOINT AND UNITARY OPERATORS MAT FALL 2006 PRINCETON UNIVERSITY LECTURE VI: SELF-ADJOINT AND UNITARY OPERATORS MAT 204 - FALL 2006 PRINCETON UNIVERSITY ALFONSO SORRENTINO 1 Adjoint of a linear operator Note: In these notes, V will denote a n-dimensional euclidean vector

More information

Math 240 Calculus III

Math 240 Calculus III Generalized Calculus III Summer 2015, Session II Thursday, July 23, 2015 Agenda 1. 2. 3. 4. Motivation Defective matrices cannot be diagonalized because they do not possess enough eigenvectors to make

More information

Eigenvalues and Eigenvectors

Eigenvalues and Eigenvectors CHAPTER Eigenvalues and Eigenvectors CHAPTER CONTENTS. Eigenvalues and Eigenvectors 9. Diagonalization. Complex Vector Spaces.4 Differential Equations 6. Dynamical Systems and Markov Chains INTRODUCTION

More information

Vector Space Basics. 1 Abstract Vector Spaces. 1. (commutativity of vector addition) u + v = v + u. 2. (associativity of vector addition)

Vector Space Basics. 1 Abstract Vector Spaces. 1. (commutativity of vector addition) u + v = v + u. 2. (associativity of vector addition) Vector Space Basics (Remark: these notes are highly formal and may be a useful reference to some students however I am also posting Ray Heitmann's notes to Canvas for students interested in a direct computational

More information

Eigenvalues, Eigenvectors, and Diagonalization

Eigenvalues, Eigenvectors, and Diagonalization Math 240 TA: Shuyi Weng Winter 207 February 23, 207 Eigenvalues, Eigenvectors, and Diagonalization The concepts of eigenvalues, eigenvectors, and diagonalization are best studied with examples. We will

More information

Math 25a Practice Final #1 Solutions

Math 25a Practice Final #1 Solutions Math 25a Practice Final #1 Solutions Problem 1. Suppose U and W are subspaces of V such that V = U W. Suppose also that u 1,..., u m is a basis of U and w 1,..., w n is a basis of W. Prove that is a basis

More information

EE/ACM Applications of Convex Optimization in Signal Processing and Communications Lecture 2

EE/ACM Applications of Convex Optimization in Signal Processing and Communications Lecture 2 EE/ACM 150 - Applications of Convex Optimization in Signal Processing and Communications Lecture 2 Andre Tkacenko Signal Processing Research Group Jet Propulsion Laboratory April 5, 2012 Andre Tkacenko

More information

over a field F with char F 2: we define

over a field F with char F 2: we define Chapter 3 Involutions In this chapter, we define the standard involution (also called conjugation) on a quaternion algebra. In this way, we characterize division quaternion algebras as noncommutative division

More information