The Jordan canonical form


Francisco Javier Sayas
University of Delaware
November 22, 2013

The contents of these notes have been translated and slightly modified from a previous version in Spanish. Part of the notation for the iterated kernels and many of the ideas for the proofs of Jordan's decomposition theorems are borrowed from [1]. The flavor of functional analysis (Riesz theory) is taken from [2]. Finally, notation has been adapted from [3] to fit what I was teaching in MATH 672 in the Fall 2013 semester.

1 Preliminary notions

Definition 1. Let A ∈ M(n; F), and let V be a subspace of F^n (understood as column vectors). We say that V is A-invariant if Au ∈ V for all u ∈ V.

Examples. If λ is an eigenvalue of A, then ker(A − λI) = {u ∈ F^n : Au = λu} is A-invariant. More generally, ker(A − λI)^j is A-invariant for all j.

Proof. If u ∈ ker(A − λI)^j, then

    (A − λI)^j u = 0  ⇒  0 = A (A − λI)^j u  (*)=  (A − λI)^j A u  ⇒  Au ∈ ker(A − λI)^j.

To prove (*), we just have to notice that

    (A − λI)^j = Σ_{i=0}^{j} C(j, i) (−λ)^{j−i} A^i,

and therefore A (A − λI)^j = (A − λI)^j A.

Definition 2. If A ∈ M(n; F) and F^n = V ⊕ W, where V and W are A-invariant, we say that F^n = V ⊕ W is an A-invariant decomposition. The same definition extends to F^n = V_1 ⊕ ... ⊕ V_k, where V_j is A-invariant for all j.
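The commutation identity (*) can be checked numerically. This is a small sketch; the matrix and the values of λ and j are illustrative choices of mine, not taken from the notes:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
lam, j = 2.0, 3                      # arbitrary illustrative choices

M = np.linalg.matrix_power(A - lam * np.eye(5), j)
# (A - lam I)^j is a polynomial in A, so it commutes with A
assert np.allclose(A @ M, M @ A)
```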

Example. If A is diagonalizable and λ_1, ..., λ_k are the distinct eigenvalues of A, then

    F^n = ker(A − λ_1 I) ⊕ ... ⊕ ker(A − λ_k I)

is an A-invariant decomposition. Also, if A v_j = μ_j v_j for all j and {v_1, ..., v_n} is a basis for F^n, then

    F^n = span[v_1] ⊕ span[v_2] ⊕ ... ⊕ span[v_n]

is an A-invariant decomposition.

The associated matrix. If F^n = V ⊕ W is an A-invariant decomposition, we construct a basis for F^n as follows:

    v_1, ..., v_l (basis for V),  v_{l+1}, ..., v_n (basis for W),

and we build the matrix P = [v_1 ... v_l v_{l+1} ... v_n]. Then

    P^{−1} A P = [ A_{l×l}   0
                   0         A_{(n−l)×(n−l)} ].

In general, invariant decompositions produce block diagonal matrices. Note that if

    P^{−1} A P = diag(A_1, A_2, ..., A_k),

where A_j is a square matrix for all j, then: (a) the corresponding partition of the columns of P creates an A-invariant decomposition of F^n into k subspaces; (b) the characteristic polynomial of A is the product of the characteristic polynomials of the matrices A_j.

First goal. In a first step we will try to find a matrix P such that

    P^{−1} A P = diag(A_1, A_2, ..., A_k),  where  χ_{A_j}(x) = (λ_j − x)^{m_j},  λ_i ≠ λ_j for i ≠ j.
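The block diagonal structure produced by an invariant decomposition can be seen in a tiny example. Here the matrix A and the basis-mixing matrix P are illustrative choices of mine (span[e_1, e_2] and span[e_3] are A-invariant, and P only mixes vectors inside each subspace):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# P respects the decomposition span[e1, e2] + span[e3]
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])

B = np.linalg.inv(P) @ A @ P
# Off-diagonal blocks vanish: B is block diagonal (a 2x2 block and a 1x1 block)
assert np.allclose(B[2, :2], 0) and np.allclose(B[:2, 2], 0)
```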

This will be possible for any matrix such that

    χ_A(x) = (λ_1 − x)^{m_1} ... (λ_k − x)^{m_k},  m_1 + ... + m_k = n.

We will have thus created an A-invariant decomposition F^n = V_1 ⊕ V_2 ⊕ ... ⊕ V_k, where dim V_j = m_j. From now on, we will restrict our attention to the case F = C. The case where A is a real matrix whose characteristic polynomial has only real roots will follow as a particular case.

2 Iterated kernels

Notation. From now on we consider a matrix whose characteristic polynomial is

    χ_A(x) = (λ_1 − x)^{m_1} ... (λ_k − x)^{m_k},

with pairwise different roots {λ_1, ..., λ_k} with multiplicities m_j ≥ 1.

Definition 3. Let A ∈ M(n; C) and let λ be an eigenvalue of A. The subspaces

    E_j(λ) = ker(A − λI)^j,  j ≥ 1,

are called iterated kernels. We denote E_0(λ) = ker(A − λI)^0 = ker I = {0}.

Elementary properties.

(1) E_1(λ) is the set of all eigenvectors associated to λ together with the zero vector (which is never considered an eigenvector).

(2) E_j(λ) ⊆ E_{j+1}(λ) for all j.

Proof. It follows from the straightforward implication

    (A − λI)^j u = 0  ⇒  (A − λI)^{j+1} u = 0.

(3) If E_l(λ) = E_{l+1}(λ), then E_{l+1}(λ) = E_{l+2}(λ). Consequently E_l(λ) = E_{l+1}(λ) = E_{l+2}(λ) = ....

Proof. Let u be such that (A − λI)^{l+2} u = 0 and define v = (A − λI)u. Then (A − λI)^{l+1} v = (A − λI)^{l+2} u = 0, which implies that v ∈ E_{l+1}(λ) = E_l(λ), and therefore (A − λI)^{l+1} u = (A − λI)^l v = 0. This argument shows that E_{l+2}(λ) ⊆ E_{l+1}(λ) ⊆ E_{l+2}(λ).

(4) The spaces E_j(λ) are A-invariant.

Not so elementary properties. The proofs of the following results will be given in Section 3.

(5) For all j ≥ 1,

    dim E_{j+1}(λ) − dim E_j(λ) ≤ dim E_j(λ) − dim E_{j−1}(λ).

(6) Let λ_1, ..., λ_k be pairwise different eigenvalues of A and let j_1, ..., j_k ≥ 1. Then the following sum of subspaces is direct:

    E_{j_1}(λ_1) ⊕ E_{j_2}(λ_2) ⊕ ... ⊕ E_{j_k}(λ_k).

In other words, the spaces E_{j_1}(λ_1), ..., E_{j_k}(λ_k) are independent.

(7) The maximum dimension of the iterated kernels E_j(λ) is the multiplicity of λ as a root of the characteristic polynomial. In other words, for every eigenvalue λ, there exists l such that

    E_l(λ) = E_{l+1}(λ),  dim E_l(λ) = m_λ,

where m_λ is the algebraic multiplicity of λ as a root of χ_A.

On the dimensions of the iterated kernels. Let

    n_j = dim E_j(λ) = dim ker(A − λI)^j  for all j

(note that n_0 = 0). The preceding properties imply that this sequence of integers satisfies:

(1) 0 = n_0 < n_1 < ... < n_l = n_{l+1} = ...,
(2) n_l − n_{l−1} ≤ n_{l−1} − n_{l−2} ≤ ... ≤ n_2 − n_1 ≤ n_1 − n_0 = n_1,
(3) n_l = m_λ.

The sequence of iterated kernels is then

    E_0(λ) ⊊ E_1(λ) ⊊ E_2(λ) ⊊ ... ⊊ E_l(λ) = E_{l+1}(λ) = ...
    dim:  0 = n_0 < n_1 < n_2 < ... < n_l = n_{l+1} = ...

A simple but important observation. If B = P^{−1}AP, then

    u ∈ ker(B − λI)^j  ⟺  Pu ∈ ker(A − λI)^j.

Therefore, the dimensions of the iterated kernels are invariant under similarity transformations. In different words: the dimensions of the iterated kernels depend on the operator and not on the particular matrix representation of the operator.
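The behavior of the sequence n_j (strict growth, decreasing increments, stagnation at m_λ) can be observed numerically by computing dim E_j(λ) as n − rank((A − λI)^j). The matrix below, built from Jordan blocks of sizes 3, 1 (for λ = 2) and 2 (for λ = 5), is an illustrative choice of mine:

```python
import numpy as np

def jordan_block(lam, k):
    # k x k block: lam on the diagonal, 1 on the superdiagonal
    return lam * np.eye(k) + np.diag(np.ones(k - 1), 1)

# Illustrative matrix: blocks J_3(2), J_1(2), J_2(5), so m_2 = 4
blocks = [jordan_block(2.0, 3), jordan_block(2.0, 1), jordan_block(5.0, 2)]
n = sum(b.shape[0] for b in blocks)
A = np.zeros((n, n))
i = 0
for b in blocks:
    k = b.shape[0]
    A[i:i + k, i:i + k] = b
    i += k

def n_j(A, lam, j):
    # dim ker(A - lam I)^j = size - rank
    M = np.linalg.matrix_power(A - lam * np.eye(A.shape[0]), j)
    return A.shape[0] - np.linalg.matrix_rank(M)

dims = [n_j(A, 2.0, j) for j in range(5)]
# n_0 = 0 < n_1 < ... stagnates at n_3 = m_2 = 4, with non-increasing increments
assert dims == [0, 2, 3, 4, 4]
```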

Theorem 1 (First Jordan decomposition theorem). If

    χ_A(x) = (λ_1 − x)^{m_1} (λ_2 − x)^{m_2} ... (λ_k − x)^{m_k},

then there exist integers l_j such that

    C^n = E_{l_1}(λ_1) ⊕ ... ⊕ E_{l_k}(λ_k),

and there is a change of basis P such that

    P^{−1} A P = diag(A_1, ..., A_k),  where  χ_{A_p}(x) = (λ_p − x)^{m_p}.

Furthermore, the dimensions of the iterated kernels of the submatrices A_p coincide with the dimensions of the kernels E_j(λ_p) for the matrix A, that is,

    dim ker(A_p − λ_p I)^j = dim ker(A − λ_p I)^j  for all j.

Proof. The first assertion is a direct consequence of properties (6) and (7). Taking bases of the iterated kernels E_{l_j}(λ_j) and placing them as columns of a matrix P, we reach a block diagonal form. Let us show that the dimensions of the iterated kernels of A_1 coincide with the dimensions of E_j(λ_1). Let B = P^{−1} A P. Since

    u ∈ ker(B − λ_1 I)^j  ⟺  Pu ∈ E_j(λ_1) ⊆ E_{l_1}(λ_1) = span[p_1, p_2, ..., p_{m_1}]

(the p_j are the column vectors of P), then

    ker(B − λ_1 I)^j ⊆ span[e_1, e_2, ..., e_{m_1}]

(the e_j are the canonical vectors). This means that only the first m_1 entries of an element of ker(B − λ_1 I)^j can be non-vanishing. From that, it is obvious that

    u ∈ ker(B − λ_1 I)^j  ⟺  u = [v; 0],  v ∈ ker(A_1 − λ_1 I)^j.

The same proof applies to all other eigenvalues.

Very relevant conclusion. Once the first Jordan theorem has been proved, we can restrict our attention to matrices whose characteristic polynomial is of the form (λ_1 − x)^n, since this theorem allows us to separate the space into smaller subspaces where the operator/matrix only involves one eigenvalue.

3 Proofs

This section can be skipped in a first reading.

Property (5). For all j ≥ 1,

    dim E_{j+1}(λ) − dim E_j(λ) ≤ dim E_j(λ) − dim E_{j−1}(λ).

Proof. Let m = dim E_{j+1}(λ) − dim E_j(λ) and let us take independent vectors {v_1, ..., v_m} such that

    E_{j+1}(λ) = E_j(λ) ⊕ span[v_1, ..., v_m].

This can be written in the following form:

    v_1, ..., v_m ∈ E_{j+1}(λ),  and  ξ_1 v_1 + ... + ξ_m v_m ∈ E_j(λ)  ⇒  ξ_1 = ... = ξ_m = 0.

Let now w_i := (A − λI) v_i. It is easy to verify that

    w_1, ..., w_m ∈ E_j(λ),  and  ξ_1 w_1 + ... + ξ_m w_m ∈ E_{j−1}(λ)  ⇒  ξ_1 = ... = ξ_m = 0,

and therefore

    dim span[w_1, ..., w_m] = m  (the vectors are independent),
    span[w_1, ..., w_m] ⊆ E_j(λ),
    span[w_1, ..., w_m] ∩ E_{j−1}(λ) = {0}.

Therefore dim E_j(λ) − dim E_{j−1}(λ) ≥ m.

Lemma 1. If p(A)u = 0 and q(A)u = 0, where p, q ∈ C[x], and we define r = g.c.d.(p, q), then r(A)u = 0.

Proof. Both p and q have to be multiples of the minimal polynomial of u with respect to A. Therefore r is a multiple of this same polynomial, which implies the result. (Recall that the minimal polynomial for a vector u with respect to a matrix A is the lowest order monic polynomial p such that p(A)u = 0, and that if q(A)u = 0, then p divides q.)

Property (6). Let λ_1, ..., λ_k be pairwise different eigenvalues of A. Let j_1, ..., j_k ≥ 1. Then the following sum of spaces is direct:

    E_{j_1}(λ_1) ⊕ E_{j_2}(λ_2) ⊕ ... ⊕ E_{j_k}(λ_k).

Proof. Let u_1, ..., u_k be such that

    u_1 + u_2 + ... + u_k = 0,  u_i ∈ E_{j_i}(λ_i).

Then

    (A − λ_1 I)^{j_1} u_1 = 0,
    (A − λ_2 I)^{j_2} ... (A − λ_k I)^{j_k} u_1 = −(A − λ_2 I)^{j_2} ... (A − λ_k I)^{j_k} (u_2 + ... + u_k) = 0.

Since

    g.c.d.( (x − λ_1)^{j_1}, (x − λ_2)^{j_2} ... (x − λ_k)^{j_k} ) = 1,

Lemma 1 gives u_1 = 0. Proceeding by induction we prove that u_j = 0 for all j. This proves that the subspaces are independent.

Property (7a). For all j,

    dim E_j(λ) ≤ m_λ,  where χ_A(x) = (λ − x)^{m_λ} q(x) with q(λ) ≠ 0.

Proof. Let P be invertible such that its first m columns form a basis of E_j(λ) (m = dim E_j(λ)). Therefore

    P^{−1} A P = [ A_m  C_m
                   0    B_m ],  A_m ∈ M(m; C).

If A_m v = μ v with v ≠ 0, then

    P^{−1} A P [v; 0] = μ [v; 0]  ⇒  A P [v; 0] = μ P [v; 0].

Therefore w = P [v; 0] ∈ E_1(μ) and also w ∈ E_j(λ). If μ ≠ λ, then E_1(μ) ∩ E_j(λ) = {0}, so w = 0 and v = 0, a contradiction. This shows that χ_{A_m}(x) = (λ − x)^m. Since

    χ_A(x) = χ_{A_m}(x) χ_{B_m}(x) = (λ − x)^m χ_{B_m}(x) = (λ − x)^{m_λ} q(x),  q(λ) ≠ 0,

it follows that m ≤ m_λ.

Property (7b). If E_{l+1}(λ) = E_l(λ), then dim E_l(λ) ≥ m_λ.

Proof. Let m = dim E_l(λ). We take P as in the previous proof, so that

    Ã := P^{−1} A P = [ A_m  C_m
                        0    B_m ],  A_m ∈ M(m; C).

We are going to see that B_m cannot have λ as an eigenvalue. Let us recall that for all j,

    u ∈ ker(Ã − λI)^j  ⟺  Pu ∈ ker(A − λI)^j.

Therefore

    ker(Ã − λI)^l = ker(Ã − λI)^{l+k}  for all k,  and  ker(Ã − λI)^l = span[e_1, ..., e_m].

The latter identity implies that the first m columns of (Ã − λI)^l vanish. We can then write

    (Ã − λI)^l = [ (A_m − λI)^l  Ĉ_m
                   0             (B_m − λI)^l ] = [ 0  Ĉ_m
                                                    0  (B_m − λI)^l ].

Now, if (B_m − λI) v = 0, then (B_m − λI)^l v = 0, so

    (Ã − λI)^l [0; v] = [Ĉ_m v; 0],  and therefore  (Ã − λI)^{2l} [0; v] = (Ã − λI)^l [Ĉ_m v; 0] = 0,

since the first m columns of (Ã − λI)^l vanish. This implies

    [0; v] ∈ ker(Ã − λI)^{2l} = ker(Ã − λI)^l = span[e_1, ..., e_m],

and therefore v = 0. Hence χ_{B_m}(λ) ≠ 0, so the factor (λ − x)^{m_λ} of χ_A must divide χ_{A_m}, whose degree is m; this gives m ≥ m_λ.

Note. Properties (7a) and (7b) imply Property (7). However, Property (7b) is already enough to show that dim E_l(λ) = m_λ. The reason for this is that Property (6) implies

    n ≥ dim E_{l_1}(λ_1) + ... + dim E_{l_k}(λ_k) ≥ m_1 + ... + m_k = n,

which forces all the inequalities to be equalities.

4 Jordan blocks and Jordan matrices

Definition 4. A Jordan block of order k associated to a value λ is a k×k matrix of the form

    J_k(λ) := [ λ 1
                  λ 1
                    ⋱ ⋱
                      λ 1
                        λ ]

(λ on the diagonal, 1 on the superdiagonal, zeros elsewhere).

Properties.

(1) The characteristic polynomial of J_k(λ) is (λ − x)^k. Moreover, dim ker(J_k(λ) − λI) = 1.

(2) The dimensions of the iterated kernels for J_k(λ) are:

    dim ker(J_k(λ) − λI)^j = min(k, j).
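Property (2) of Jordan blocks can be verified numerically by taking powers of the nilpotent part; the block size and eigenvalue below are illustrative choices of mine:

```python
import numpy as np

k, lam = 4, 3.0
J = lam * np.eye(k) + np.diag(np.ones(k - 1), 1)   # Jordan block J_k(lam)
N = J - lam * np.eye(k)                            # nilpotent part

# dim ker(N^j) = k - rank(N^j); the rank drops by one with each power
dims = [k - np.linalg.matrix_rank(np.linalg.matrix_power(N, j)) for j in range(1, 7)]
assert dims == [min(k, j) for j in range(1, 7)]    # 1, 2, 3, 4, 4, 4
```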

Proof. Note that J_k(λ) − λI is the nilpotent matrix with ones on the superdiagonal and zeros elsewhere, and take increasing powers of this matrix, noticing how the rank decreases by one with each power. Alternatively, note that n_1 = 1 (the dimension of the space of eigenvectors is one) and that, by Property (5) of the iterated kernels, the dimensions can only increase one by one until reaching k.

Definition 5. A Jordan matrix associated to a value λ is a block diagonal matrix whose blocks are Jordan blocks for the value λ:

    diag( J_{k_1}(λ), J_{k_2}(λ), ..., J_{k_r}(λ) ).

A general Jordan matrix is a block diagonal matrix whose blocks are Jordan matrices associated to different values. Equivalently, it is a block diagonal matrix whose blocks are Jordan blocks.

5 Two combinatorial problems

Goal. We will next show how to establish a one-to-one correspondence between the different possibilities for the dimensions of the iterated kernels of an m×m Jordan matrix (with a single eigenvalue) and the possibilities of setting up the Jordan blocks until we fill the m×m space.

Problem (P). Find a sequence of integers satisfying

(a) 0 = n_0 < n_1 < ... < n_l = n_{l+1} = ...,
(b) n_l − n_{l−1} ≤ n_{l−1} − n_{l−2} ≤ ... ≤ n_2 − n_1 ≤ n_1 − n_0 = n_1,
(c) n_l = m.

Problem (Q). Find a sequence of non-negative integers q_1, q_2, ..., q_l, ... such that

(a) q_{l+1} = q_{l+2} = ... = 0,
(b) q_1 + 2q_2 + 3q_3 + ... + l q_l = m.

Remark. Before showing how Problem (P) and Problem (Q) are related, let us look ahead and explain what they are going to mean in the context of Jordan matrices. We also give the interpretation of the solutions of an intermediate problem:

    {n_j} will be the dimensions of the iterated kernels;
    {p_j} will be the number of Jordan blocks of size j×j or larger;
    {q_j} will be the number of j×j Jordan blocks.

The relation between both sequences will be established without taking into account the value of l where the solution of (P) stagnates and where that of (Q) takes its last non-zero value. This value will, however, be the same in both sequences.

From (P) to (Q). Given a solution to (P), we define

    p_j = n_j − n_{j−1},  j ≥ 1.

This sequence satisfies

    0 = ... = p_{l+2} = p_{l+1} < p_l ≤ p_{l−1} ≤ ... ≤ p_1 = n_1.

We next define

    q_j = p_j − p_{j+1},  j ≥ 1,

so that q_l = p_l and q_j = 0 for all j > l. The sequence {q_j} is a solution to (Q).

From (Q) to (P). Given a solution to (Q), we define

    p_j = q_j + q_{j+1} + ... + q_l,  j = 1, ..., l,  p_j = 0 for j > l,

and then define a new sequence using the recurrence

    n_0 = 0,  n_j = p_j + n_{j−1},  j ≥ 1.

Then {n_j} is a solution of (P). This shows that there is a bijection between solutions of (P) and (Q).

Theorem 2. Let {n_j} be a solution to (P) and let {q_j} be the associated solution to (Q). Then the m×m Jordan matrix associated to the value λ, built with q_j blocks of size j×j for all j, satisfies

    dim E_j(λ) = n_j  for all j.

Proof. Let us consider the Jordan matrix constructed as explained in the statement. Since the matrix is block diagonal, it follows that

    dim E_j(λ) = q_l dim ker(J_l(λ) − λI)^j + q_{l−1} dim ker(J_{l−1}(λ) − λI)^j + ... + q_1 dim ker(J_1(λ) − λI)^j
               = q_l min(j, l) + q_{l−1} min(j, l−1) + ... + q_1 min(j, 1)
               = j (q_l + q_{l−1} + ... + q_{j+1}) + j q_j + (j−1) q_{j−1} + ... + 2 q_2 + q_1
               = j (p_{j+1} − p_{l+1}) + (p_1 + p_2 + ... + p_j − j p_{j+1})
               = j p_{j+1} − j p_{j+1} + p_j + p_{j−1} + ... + p_1 = n_j.

This proves the result.

Conclusions.

(1) For any solution of Problem (P), there exists an m×m Jordan matrix with a single eigenvalue such that dim E_j(λ) = n_j for all j.

(2) Consequently, given possible configurations of the dimensions of the iterated kernels associated to all eigenvalues, there exists a matrix (a Jordan matrix, actually) whose iterated kernels have the given dimensions.

(3) Two essentially different Jordan matrices (that cannot be obtained from each other by permuting the order of the blocks) are not similar.

Proof. The block configuration (how many blocks of each size for each eigenvalue) determines univocally the dimensions of the iterated kernels (different block configurations imply different dimensions of the kernels). However, the dimensions of the iterated kernels do not vary under similarity transformations.

6 Existence of a Jordan canonical form

An observation. If P^{−1} A P = diag(J_k(λ), B) (for some submatrix B) and u_1, ..., u_k are the first k columns of P, then

    A u_1 = λ u_1,
    A u_2 = λ u_2 + u_1,
      ⋮
    A u_k = λ u_k + u_{k−1},

so for all j ≥ 2, u_{j−1} = (A − λI) u_j.
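The correspondence of Theorem 2 can be checked numerically: build a Jordan matrix from a sequence {q_j}, convert (Q) to (P) as described above, and compare. The eigenvalue and block counts below are a made-up example:

```python
import numpy as np

def jordan_block(lam, k):
    # k x k Jordan block: lam on the diagonal, 1 on the superdiagonal
    return lam * np.eye(k) + np.diag(np.ones(k - 1), 1)

lam = 2.0
q = [1, 0, 2]                  # q_j = number of j x j blocks (one 1x1, two 3x3)
l = len(q)

# (Q) -> (P): p_j = q_j + ... + q_l, then n_j = p_1 + ... + p_j
p = [sum(q[j:]) for j in range(l)]
n_seq = np.cumsum([0] + p)     # [n_0, n_1, ..., n_l]

# Build the Jordan matrix with q_j blocks of size j x j
sizes = [j for j in range(1, l + 1) for _ in range(q[j - 1])]
m = sum(sizes)
J = np.zeros((m, m))
i = 0
for k in sizes:
    J[i:i + k, i:i + k] = jordan_block(lam, k)
    i += k

def dim_E(A, lam, j):
    # dim ker(A - lam I)^j = size - rank
    N = np.linalg.matrix_power(A - lam * np.eye(A.shape[0]), j)
    return A.shape[0] - np.linalg.matrix_rank(N)

dims = [dim_E(J, lam, j) for j in range(l + 1)]
assert dims == list(n_seq)     # the iterated-kernel dimensions are exactly n_j
```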

Lemma 2. If V is a subspace satisfying

    E_{j−1}(λ) ⊕ V ⊆ E_j(λ)  (j ≥ 2)  and  W = (A − λI)V = {(A − λI)u : u ∈ V},

then

    E_{j−2}(λ) ⊕ W ⊆ E_{j−1}(λ),  dim W = dim V.

Proof. There are three things to check:

(a) W ⊆ E_{j−1}(λ), which is quite obvious, since u ∈ E_j(λ) ⇒ (A − λI)u ∈ E_{j−1}(λ);

(b) W ∩ E_{j−2}(λ) = {0}, since if v ∈ W ∩ E_{j−2}(λ) (so v = (A − λI)u for some u ∈ V), then

    (A − λI)^{j−2} v = 0  ⇒  (A − λI)^{j−1} u = 0  ⇒  u ∈ V ∩ E_{j−1}(λ) = {0};

(c) if u ∈ V and (A − λI)u = 0, then u ∈ E_1(λ) ∩ V ⊆ E_{j−1}(λ) ∩ V = {0}, so linear independence is preserved by multiplication by (A − λI).

The proof of Theorem 3 is quite technical. It can be skipped in a first reading.

Theorem 3 (Second Jordan decomposition theorem). Every m×m matrix with characteristic polynomial (λ − x)^m is similar to a Jordan matrix associated to the value λ. Moreover, the sizes of the Jordan blocks are given by the coefficients {q_j} associated to the dimensions {n_j} of the iterated kernels.

Proof. Let E_l(λ) = C^m be the first iterated kernel of maximum dimension. Let V_l be such that

    E_l(λ) = E_{l−1}(λ) ⊕ V_l,

so that dim V_l = n_l − n_{l−1} = p_l = q_l. Using Lemma 2 above,

    E_{l−1}(λ) = E_{l−2}(λ) ⊕ (A − λI)V_l ⊕ V_{l−1},

with

    dim V_{l−1} = (n_{l−1} − n_{l−2}) − dim V_l = p_{l−1} − p_l = q_{l−1}.

Likewise

    E_{l−2}(λ) = E_{l−3}(λ) ⊕ (A − λI)[(A − λI)V_l ⊕ V_{l−1}] ⊕ V_{l−2}
               = E_{l−3}(λ) ⊕ (A − λI)^2 V_l ⊕ (A − λI)V_{l−1} ⊕ V_{l−2},

with dim V_{l−2} = q_{l−2}. Proceeding successively, we can build a decomposition of E_l(λ):

    E_l(λ) = V_l ⊕ (A − λI)V_l ⊕ ... ⊕ (A − λI)^{l−2} V_l ⊕ (A − λI)^{l−1} V_l
           ⊕ V_{l−1} ⊕ ... ⊕ (A − λI)^{l−3} V_{l−1} ⊕ (A − λI)^{l−2} V_{l−1}
           ⊕ ...
           ⊕ V_2 ⊕ (A − λI)V_2
           ⊕ V_1,

so that dim V_j = q_j, j = 1, ..., l. Note that we can decompose E_l(λ) in two different ways:

    E_l(λ) = V'_l ⊕ V'_{l−1} ⊕ ... ⊕ V'_2 ⊕ V'_1 = W_l ⊕ W_{l−1} ⊕ ... ⊕ W_2 ⊕ W_1,

where

    V'_j = V_j ⊕ (A − λI)V_{j+1} ⊕ ... ⊕ (A − λI)^{l−j} V_l  (add by columns)

and

    W_j = V_j ⊕ (A − λI)V_j ⊕ ... ⊕ (A − λI)^{j−1} V_j  (add by rows).

We next build a basis for E_l(λ) in the following form. If V_j ≠ {0}, we start with a basis {u_{j,1}, ..., u_{j,q_j}} of V_j. We then create a sequence of j vectors for each of the above vectors. If u is any of them, we define

    v_1 = (A − λI)^{j−1} u = (A − λI)v_2 ∈ (A − λI)^{j−1} V_j,
    v_2 = (A − λI)^{j−2} u = (A − λI)v_3 ∈ (A − λI)^{j−2} V_j,
      ⋮
    v_{j−1} = (A − λI)u = (A − λI)v_j ∈ (A − λI)V_j,
    v_j = u ∈ V_j,

so that

    A v_1 = λ v_1,  A v_k = v_{k−1} + λ v_k,  k = 2, ..., j.

We thus obtain a basis for

    W_j = (A − λI)^{j−1} V_j ⊕ ... ⊕ (A − λI)V_j ⊕ V_j.

If we change to this new basis (collecting the corresponding basis vectors for each row of the large decomposition above, noting that some rows might be zero), we obtain a Jordan matrix with q_j blocks of size j×j for each j.

Theorem 4. Every matrix is similar to a Jordan matrix (its Jordan canonical form). The Jordan form is unique up to permutation of its blocks, and it is the only general Jordan matrix such that the dimensions of the iterated kernels for all the eigenvalues coincide with those of A.

Proof. Use the First Jordan Theorem to separate eigenvalues, and then use the construction of the second theorem.

Corollary 1. The similarity class of a matrix is determined by the characteristic polynomial and the dimensions of all the iterated kernels (for all roots of the characteristic polynomial). Matrices with different Jordan forms are not similar.

7 Additional topics

7.1 Cayley-Hamilton's Theorem

Theorem 5. For every matrix, χ_A(A) = 0.

Proof. A simple way of proving this theorem is by using the Jordan form. Note that there are other proofs using much less complicated results. Let J be the Jordan form of A. Since χ_J = χ_A, we only need to prove that χ_J(J) = 0. However,

    χ_J(J) = diag( χ_J(J_{k_1}(λ_1)), χ_J(J_{k_2}(λ_2)), ..., χ_J(J_{k_r}(λ_r)) ),

and on the other hand

    (J_k(λ) − λI)^k = 0,

so each diagonal block vanishes, since χ_J contains the factor (λ_i − x)^{m_i} with m_i ≥ k_i. This proves the theorem.

Another proof. It is possible to prove this same result using the First Jordan decomposition theorem. Using this theorem and the fact that

    p(P^{−1} A P) = diag( p(A_1), p(A_2), ..., p(A_k) ),

it is enough to prove the result for matrices with only one eigenvalue. Now, if the characteristic polynomial of A is (λ − x)^n, there exists l ≤ n such that

    E_l(λ) = ker(A − λI)^l = C^n,

which implies that (A − λI)^l = 0 with l ≤ n.
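The Cayley-Hamilton identity can be sanity-checked numerically for a random matrix; this sketch evaluates χ_A(A) by Horner's rule using the characteristic polynomial coefficients from `numpy.poly`:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

# Coefficients of the characteristic polynomial det(xI - A), leading coefficient 1
c = np.poly(A)

# Evaluate chi_A(A) by Horner's rule with matrix arguments
Z = np.zeros_like(A)
for coeff in c:
    Z = Z @ A + coeff * np.eye(4)

assert np.allclose(Z, 0, atol=1e-8)   # chi_A(A) = 0 up to roundoff
```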

Some comments. The minimal polynomial of A is the lowest degree monic polynomial satisfying Q(A) = 0. Assume that

    χ_A(x) = (λ_1 − x)^{m_1} ... (λ_k − x)^{m_k}.

Since the minimal polynomials of A and of its Jordan form J are the same, it is easy to see that if for every λ_j we pick the minimum l_j such that

    dim E_{l_j}(λ_j) = m_j,

then the minimal polynomial is

    (x − λ_1)^{l_1} ... (x − λ_k)^{l_k}.

Another simple argument shows that the following statements are equivalent:

(a) the matrix is diagonalizable;
(b) the Jordan form of the matrix is diagonal (all blocks are 1×1);
(c) all iterated kernels stagnate at the first step: E_2(λ) = E_1(λ) for every eigenvalue λ of A;
(d) the minimal polynomial has simple roots.

7.2 The transposed Jordan form

In many contexts it is common to meet a Jordan form made up of blocks of the form

    [ λ
      1 λ
        ⋱ ⋱
          1 λ ],

that is, with the 1s under the main diagonal. If we have been able to get to the usual Jordan form, we only need to reorder the basis to change to this transposed form: namely, if the vectors u_1, u_2, ..., u_p correspond to a p×p block associated to λ, then the vectors u_p, u_{p−1}, ..., u_1 give the same block with the one elements under the main diagonal.

Corollary 2. Every matrix is similar to its transpose.

Proof. A simple reordering of the basis shows that every Jordan matrix is similar to its transpose. Also, the transpose of a matrix is similar to the transpose of its Jordan form. This finishes the proof.
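The basis-reversal argument above can be seen concretely: the reversal permutation matrix conjugates a Jordan block into its transpose. Block size and eigenvalue are illustrative choices of mine:

```python
import numpy as np

k, lam = 4, 5.0
J = lam * np.eye(k) + np.diag(np.ones(k - 1), 1)   # usual Jordan block

# Reversing the order of the basis vectors is the similarity transform;
# R is the reversal permutation matrix, and R is its own inverse.
R = np.eye(k)[::-1]
assert np.allclose(R @ J @ R, J.T)
```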

7.3 The real Jordan form

If A ∈ M(n; R), λ = α + βı (with β ≠ 0) is an eigenvalue, and we have the basis for E_l(λ) giving the Jordan blocks for λ, then we just need to conjugate the vectors of the basis to obtain the basis giving the Jordan blocks for λ̄ = α − βı. The reason is simple:

    u ∈ E_j(λ)  ⟺  ū ∈ E_j(λ̄).

This means that the configurations of Jordan blocks for λ and λ̄ are the same. If u_1, ..., u_p are the basis vectors associated to λ = α + βı (with β ≠ 0) for a basis leading to the Jordan form, then ū_1, ..., ū_p can be taken as the vectors associated to λ̄. We can then take

    v_1 = Re u_1,  v_2 = Im u_1,
    v_3 = Re u_2,  v_4 = Im u_2,
      ⋮
    v_{2p−1} = Re u_p,  v_{2p} = Im u_p

(Re and Im denote the real and imaginary parts of the vector). With these vectors, the blocks corresponding to λ and λ̄ mix, producing bigger blocks with real numbers. For instance,

    [ α+βı  1     0     0            [ α  β  1  0
      0     α+βı  0     0       →     −β  α  0  1
      0     0     α−βı  1              0  0  α  β
      0     0     0     α−βı ]         0  0  −β α ].

In this way, when a real matrix has complex eigenvalues, we can restrict our bases to have real components and work with block-based Jordan blocks

    J_k(λ, λ̄) := [ Λ  I_2
                     Λ  I_2
                        ⋱  ⋱
                           Λ ],  Λ := [ α  β
                                        −β α ],  I_2 := [ 1  0
                                                          0  1 ].

7.4 Symmetric matrices

Theorem 6. Every real symmetric matrix is diagonalizable.

Proof. Assume that A ∈ M(n; R) satisfies A^T = A. Let Au = λu, with 0 ≠ u ∈ C^n and λ ∈ C. Then

    λ ‖u‖^2 = ū^T A u = (A ū)^T u = λ̄ ū^T u = λ̄ ‖u‖^2,

which proves that λ is real. We now only need to prove that E_2(λ) = E_1(λ). This is a consequence of the following chain of implications (using the symmetry of A − λI):

    u ∈ E_2(λ)  ⇒  (A − λI)^2 u = 0  ⇒  u^T (A − λI)^2 u = 0  ⇒  ((A − λI)u)^T ((A − λI)u) = 0
                ⇒  ‖(A − λI)u‖^2 = 0  ⇒  u ∈ E_1(λ).

8 Exercises

1. (Section 1) Show that if BA = AB, then ker B and range B are A-invariant.

2. (Section 1) Let V be A-invariant and let W be such that F^n = V ⊕ W. Show that A is similar to a block upper triangular matrix. (Hint. Take a basis of F^n composed of vectors of V followed by vectors of W.)

3. (Section 1) Let A be such that P^{−1}AP is upper triangular and let {p_1, ..., p_4} be the columns of P. Show that span[p_1], span[p_1, p_2] and span[p_1, p_2, p_3] are A-invariant.

4. (Section 2) We can define the iterated ranges as

    R_j(λ) = range(A − λI)^j,  j ≥ 1,

with R_0(λ) = range(I) = C^n. Show that

(a) R_{j+1}(λ) ⊆ R_j(λ) for all j.
(b) If R_{l+1}(λ) = R_l(λ), then R_{l+2}(λ) = R_{l+1}(λ). (Hint. This result can be proved directly or using the equivalent result for the iterated kernels.)
(c) dim R_j(λ) − dim R_{j+1}(λ) ≤ dim R_{j−1}(λ) − dim R_j(λ).

If R_{l+1}(λ) = R_l(λ), what is the dimension of this subspace?

5. (Section 4) Let A be such that P^{−1}AP = J_n(λ) and let {p_1, p_2, ..., p_n} be the column vectors of P.

(a) Show that p_j ∈ E_j(λ).
(b) Show that (A − λI)^{j−1} p_j = p_1.
(c) Use the previous results to find the minimal polynomial of p_j.
(d) Use (c) to find the minimal polynomial of A.

6. (Sections 5 and 6) In a 12-dimensional space, let q_1 = 2, q_2 = 0, q_3 = 2, q_4 = 1, q_j = 0 for j ≥ 5. (This corresponds to a Jordan matrix with two 1×1 blocks, two 3×3 blocks and one 4×4 block.) What is the associated solution of Problem (P)? In other words, what are the dimensions of the iterated kernels E_j(λ) for a matrix with the Jordan structure given above?

7. (Section 6) Let A be such that χ_A(x) = (2 − x)^3 (4 − x)^2. List all possible Jordan forms for A. List the associated minimal polynomials. (Hint. It is easy to find the minimal polynomial for all the vectors in the basis that leads to the Jordan form. The l.c.m. of the minimal polynomials for the vectors in any basis is the minimal polynomial for the matrix.)

8. (Section 6) Let A be such that χ_A(x) = (3 − x)^4 (2 − x)^2 and minp_A(x) = (x − 3)^2 (x − 2)^2. How many possible Jordan forms can A have?

9. (Section 6) Let A be such that χ_A(x) = (2 − x)^6. Write down all possible Jordan forms for A. For each of them, list the dimensions of the iterated kernels and the minimal polynomial of the matrix.

10. (Section 7) Prove that every complex Hermitian matrix has only real eigenvalues and is diagonalizable. (Hint. Instead of transposing, use conjugate transposition in the proof of Theorem 6.)

References

[1] Hernández, Eugenio. Algebra and Geometry (Spanish). Addison-Wesley.
[2] Kress, Rainer. Linear Integral Operators. Springer Verlag.
[3] Katznelson, Yitzhak, and Katznelson, Yonatan R. A (Terse) Introduction to Linear Algebra. AMS Student Mathematical Library.

by F. J. Sayas. Revised and translated November 2013.


More information

(f + g)(s) = f(s) + g(s) for f, g V, s S (cf)(s) = cf(s) for c F, f V, s S

(f + g)(s) = f(s) + g(s) for f, g V, s S (cf)(s) = cf(s) for c F, f V, s S 1 Vector spaces 1.1 Definition (Vector space) Let V be a set with a binary operation +, F a field, and (c, v) cv be a mapping from F V into V. Then V is called a vector space over F (or a linear space

More information

(Can) Canonical Forms Math 683L (Summer 2003) M n (F) C((x λ) ) =

(Can) Canonical Forms Math 683L (Summer 2003) M n (F) C((x λ) ) = (Can) Canonical Forms Math 683L (Summer 2003) Following the brief interlude to study diagonalisable transformations and matrices, we must now get back to the serious business of the general case. In this

More information

4. Linear transformations as a vector space 17

4. Linear transformations as a vector space 17 4 Linear transformations as a vector space 17 d) 1 2 0 0 1 2 0 0 1 0 0 0 1 2 3 4 32 Let a linear transformation in R 2 be the reflection in the line = x 2 Find its matrix 33 For each linear transformation

More information

OHSx XM511 Linear Algebra: Solutions to Online True/False Exercises

OHSx XM511 Linear Algebra: Solutions to Online True/False Exercises This document gives the solutions to all of the online exercises for OHSx XM511. The section ( ) numbers refer to the textbook. TYPE I are True/False. Answers are in square brackets [. Lecture 02 ( 1.1)

More information

Generalized eigenspaces

Generalized eigenspaces Generalized eigenspaces November 30, 2012 Contents 1 Introduction 1 2 Polynomials 2 3 Calculating the characteristic polynomial 5 4 Projections 7 5 Generalized eigenvalues 10 6 Eigenpolynomials 15 1 Introduction

More information

Throughout these notes we assume V, W are finite dimensional inner product spaces over C.

Throughout these notes we assume V, W are finite dimensional inner product spaces over C. Math 342 - Linear Algebra II Notes Throughout these notes we assume V, W are finite dimensional inner product spaces over C 1 Upper Triangular Representation Proposition: Let T L(V ) There exists an orthonormal

More information

0.1 Rational Canonical Forms

0.1 Rational Canonical Forms We have already seen that it is useful and simpler to study linear systems using matrices. But matrices are themselves cumbersome, as they are stuffed with many entries, and it turns out that it s best

More information

Jordan blocks. Defn. Let λ F, n Z +. The size n Jordan block with e-value λ is the n n upper triangular matrix. J n (λ) =

Jordan blocks. Defn. Let λ F, n Z +. The size n Jordan block with e-value λ is the n n upper triangular matrix. J n (λ) = Jordan blocks Aim lecture: Even over F = C, endomorphisms cannot always be represented by a diagonal matrix. We give Jordan s answer, to what is the best form of the representing matrix. Defn Let λ F,

More information

We simply compute: for v = x i e i, bilinearity of B implies that Q B (v) = B(v, v) is given by xi x j B(e i, e j ) =

We simply compute: for v = x i e i, bilinearity of B implies that Q B (v) = B(v, v) is given by xi x j B(e i, e j ) = Math 395. Quadratic spaces over R 1. Algebraic preliminaries Let V be a vector space over a field F. Recall that a quadratic form on V is a map Q : V F such that Q(cv) = c 2 Q(v) for all v V and c F, and

More information

The Jordan Normal Form and its Applications

The Jordan Normal Form and its Applications The and its Applications Jeremy IMPACT Brigham Young University A square matrix A is a linear operator on {R, C} n. A is diagonalizable if and only if it has n linearly independent eigenvectors. What happens

More information

Lecture 11: Finish Gaussian elimination and applications; intro to eigenvalues and eigenvectors (1)

Lecture 11: Finish Gaussian elimination and applications; intro to eigenvalues and eigenvectors (1) Lecture 11: Finish Gaussian elimination and applications; intro to eigenvalues and eigenvectors (1) Travis Schedler Tue, Oct 18, 2011 (version: Tue, Oct 18, 6:00 PM) Goals (2) Solving systems of equations

More information

Jordan Canonical Form

Jordan Canonical Form Jordan Canonical Form Massoud Malek Jordan normal form or Jordan canonical form (named in honor of Camille Jordan) shows that by changing the basis, a given square matrix M can be transformed into a certain

More information

Jordan Normal Form. Chapter Minimal Polynomials

Jordan Normal Form. Chapter Minimal Polynomials Chapter 8 Jordan Normal Form 81 Minimal Polynomials Recall p A (x) =det(xi A) is called the characteristic polynomial of the matrix A Theorem 811 Let A M n Then there exists a unique monic polynomial q

More information

A PRIMER ON SESQUILINEAR FORMS

A PRIMER ON SESQUILINEAR FORMS A PRIMER ON SESQUILINEAR FORMS BRIAN OSSERMAN This is an alternative presentation of most of the material from 8., 8.2, 8.3, 8.4, 8.5 and 8.8 of Artin s book. Any terminology (such as sesquilinear form

More information

Math 102, Winter Final Exam Review. Chapter 1. Matrices and Gaussian Elimination

Math 102, Winter Final Exam Review. Chapter 1. Matrices and Gaussian Elimination Math 0, Winter 07 Final Exam Review Chapter. Matrices and Gaussian Elimination { x + x =,. Different forms of a system of linear equations. Example: The x + 4x = 4. [ ] [ ] [ ] vector form (or the column

More information

Jordan Canonical Form

Jordan Canonical Form Jordan Canonical Form Suppose A is a n n matrix operating on V = C n. First Reduction (to a repeated single eigenvalue). Let φ(x) = det(x A) = r (x λ i ) e i (1) be the characteristic equation of A. Factor

More information

Lecture Note 12: The Eigenvalue Problem

Lecture Note 12: The Eigenvalue Problem MATH 5330: Computational Methods of Linear Algebra Lecture Note 12: The Eigenvalue Problem 1 Theoretical Background Xianyi Zeng Department of Mathematical Sciences, UTEP The eigenvalue problem is a classical

More information

Eigenvalues, Eigenvectors. Eigenvalues and eigenvector will be fundamentally related to the nature of the solutions of state space systems.

Eigenvalues, Eigenvectors. Eigenvalues and eigenvector will be fundamentally related to the nature of the solutions of state space systems. Chapter 3 Linear Algebra In this Chapter we provide a review of some basic concepts from Linear Algebra which will be required in order to compute solutions of LTI systems in state space form, discuss

More information

NOTES ON LINEAR ODES

NOTES ON LINEAR ODES NOTES ON LINEAR ODES JONATHAN LUK We can now use all the discussions we had on linear algebra to study linear ODEs Most of this material appears in the textbook in 21, 22, 23, 26 As always, this is a preliminary

More information

Solution for Homework 5

Solution for Homework 5 Solution for Homework 5 ME243A/ECE23A Fall 27 Exercise 1 The computation of the reachable subspace in continuous time can be handled easily introducing the concepts of inner product, orthogonal complement

More information

Diagonalization of Matrix

Diagonalization of Matrix of Matrix King Saud University August 29, 2018 of Matrix Table of contents 1 2 of Matrix Definition If A M n (R) and λ R. We say that λ is an eigenvalue of the matrix A if there is X R n \ {0} such that

More information

Math 291-2: Lecture Notes Northwestern University, Winter 2016

Math 291-2: Lecture Notes Northwestern University, Winter 2016 Math 291-2: Lecture Notes Northwestern University, Winter 2016 Written by Santiago Cañez These are lecture notes for Math 291-2, the second quarter of MENU: Intensive Linear Algebra and Multivariable Calculus,

More information

First we introduce the sets that are going to serve as the generalizations of the scalars.

First we introduce the sets that are going to serve as the generalizations of the scalars. Contents 1 Fields...................................... 2 2 Vector spaces.................................. 4 3 Matrices..................................... 7 4 Linear systems and matrices..........................

More information

Math 4153 Exam 3 Review. The syllabus for Exam 3 is Chapter 6 (pages ), Chapter 7 through page 137, and Chapter 8 through page 182 in Axler.

Math 4153 Exam 3 Review. The syllabus for Exam 3 is Chapter 6 (pages ), Chapter 7 through page 137, and Chapter 8 through page 182 in Axler. Math 453 Exam 3 Review The syllabus for Exam 3 is Chapter 6 (pages -2), Chapter 7 through page 37, and Chapter 8 through page 82 in Axler.. You should be sure to know precise definition of the terms we

More information

LINEAR ALGEBRA BOOT CAMP WEEK 4: THE SPECTRAL THEOREM

LINEAR ALGEBRA BOOT CAMP WEEK 4: THE SPECTRAL THEOREM LINEAR ALGEBRA BOOT CAMP WEEK 4: THE SPECTRAL THEOREM Unless otherwise stated, all vector spaces in this worksheet are finite dimensional and the scalar field F is R or C. Definition 1. A linear operator

More information

Matrix Theory. A.Holst, V.Ufnarovski

Matrix Theory. A.Holst, V.Ufnarovski Matrix Theory AHolst, VUfnarovski 55 HINTS AND ANSWERS 9 55 Hints and answers There are two different approaches In the first one write A as a block of rows and note that in B = E ij A all rows different

More information

No books, no notes, no calculators. You must show work, unless the question is a true/false, yes/no, or fill-in-the-blank question.

No books, no notes, no calculators. You must show work, unless the question is a true/false, yes/no, or fill-in-the-blank question. Math 304 Final Exam (May 8) Spring 206 No books, no notes, no calculators. You must show work, unless the question is a true/false, yes/no, or fill-in-the-blank question. Name: Section: Question Points

More information

Elementary Linear Algebra

Elementary Linear Algebra Matrices J MUSCAT Elementary Linear Algebra Matrices Definition Dr J Muscat 2002 A matrix is a rectangular array of numbers, arranged in rows and columns a a 2 a 3 a n a 2 a 22 a 23 a 2n A = a m a mn We

More information

Lecture Notes for Math 414: Linear Algebra II Fall 2015, Michigan State University

Lecture Notes for Math 414: Linear Algebra II Fall 2015, Michigan State University Lecture Notes for Fall 2015, Michigan State University Matthew Hirn December 11, 2015 Beginning of Lecture 1 1 Vector Spaces What is this course about? 1. Understanding the structural properties of a wide

More information

Symmetric and self-adjoint matrices

Symmetric and self-adjoint matrices Symmetric and self-adjoint matrices A matrix A in M n (F) is called symmetric if A T = A, ie A ij = A ji for each i, j; and self-adjoint if A = A, ie A ij = A ji or each i, j Note for A in M n (R) that

More information

Problem Set (T) If A is an m n matrix, B is an n p matrix and D is a p s matrix, then show

Problem Set (T) If A is an m n matrix, B is an n p matrix and D is a p s matrix, then show MTH 0: Linear Algebra Department of Mathematics and Statistics Indian Institute of Technology - Kanpur Problem Set Problems marked (T) are for discussions in Tutorial sessions (T) If A is an m n matrix,

More information

Online Exercises for Linear Algebra XM511

Online Exercises for Linear Algebra XM511 This document lists the online exercises for XM511. The section ( ) numbers refer to the textbook. TYPE I are True/False. Lecture 02 ( 1.1) Online Exercises for Linear Algebra XM511 1) The matrix [3 2

More information

Solution. That ϕ W is a linear map W W follows from the definition of subspace. The map ϕ is ϕ(v + W ) = ϕ(v) + W, which is well-defined since

Solution. That ϕ W is a linear map W W follows from the definition of subspace. The map ϕ is ϕ(v + W ) = ϕ(v) + W, which is well-defined since MAS 5312 Section 2779 Introduction to Algebra 2 Solutions to Selected Problems, Chapters 11 13 11.2.9 Given a linear ϕ : V V such that ϕ(w ) W, show ϕ induces linear ϕ W : W W and ϕ : V/W V/W : Solution.

More information

Linear Algebra II Lecture 13

Linear Algebra II Lecture 13 Linear Algebra II Lecture 13 Xi Chen 1 1 University of Alberta November 14, 2014 Outline 1 2 If v is an eigenvector of T : V V corresponding to λ, then v is an eigenvector of T m corresponding to λ m since

More information

A linear algebra proof of the fundamental theorem of algebra

A linear algebra proof of the fundamental theorem of algebra A linear algebra proof of the fundamental theorem of algebra Andrés E. Caicedo May 18, 2010 Abstract We present a recent proof due to Harm Derksen, that any linear operator in a complex finite dimensional

More information

A linear algebra proof of the fundamental theorem of algebra

A linear algebra proof of the fundamental theorem of algebra A linear algebra proof of the fundamental theorem of algebra Andrés E. Caicedo May 18, 2010 Abstract We present a recent proof due to Harm Derksen, that any linear operator in a complex finite dimensional

More information

The minimal polynomial

The minimal polynomial The minimal polynomial Michael H Mertens October 22, 2015 Introduction In these short notes we explain some of the important features of the minimal polynomial of a square matrix A and recall some basic

More information

Further linear algebra. Chapter IV. Jordan normal form.

Further linear algebra. Chapter IV. Jordan normal form. Further linear algebra. Chapter IV. Jordan normal form. Andrei Yafaev In what follows V is a vector space of dimension n and B is a basis of V. In this chapter we are concerned with linear maps T : V V.

More information

GENERALIZED EIGENVECTORS, MINIMAL POLYNOMIALS AND THEOREM OF CAYLEY-HAMILTION

GENERALIZED EIGENVECTORS, MINIMAL POLYNOMIALS AND THEOREM OF CAYLEY-HAMILTION GENERALIZED EIGENVECTORS, MINIMAL POLYNOMIALS AND THEOREM OF CAYLEY-HAMILTION FRANZ LUEF Abstract. Our exposition is inspired by S. Axler s approach to linear algebra and follows largely his exposition

More information

3 (Maths) Linear Algebra

3 (Maths) Linear Algebra 3 (Maths) Linear Algebra References: Simon and Blume, chapters 6 to 11, 16 and 23; Pemberton and Rau, chapters 11 to 13 and 25; Sundaram, sections 1.3 and 1.5. The methods and concepts of linear algebra

More information

Elementary linear algebra

Elementary linear algebra Chapter 1 Elementary linear algebra 1.1 Vector spaces Vector spaces owe their importance to the fact that so many models arising in the solutions of specific problems turn out to be vector spaces. The

More information

Lecture Summaries for Linear Algebra M51A

Lecture Summaries for Linear Algebra M51A These lecture summaries may also be viewed online by clicking the L icon at the top right of any lecture screen. Lecture Summaries for Linear Algebra M51A refers to the section in the textbook. Lecture

More information

Chapter 6: Orthogonality

Chapter 6: Orthogonality Chapter 6: Orthogonality (Last Updated: November 7, 7) These notes are derived primarily from Linear Algebra and its applications by David Lay (4ed). A few theorems have been moved around.. Inner products

More information

Jordan Normal Form Revisited

Jordan Normal Form Revisited Mathematics & Statistics Auburn University, Alabama, USA Oct 3, 2007 Jordan Normal Form Revisited Speaker: Tin-Yau Tam Graduate Student Seminar Page 1 of 19 tamtiny@auburn.edu Let us start with some online

More information

Chapter 0 Preliminaries

Chapter 0 Preliminaries Chapter 0 Preliminaries Objective of the course Introduce basic matrix results and techniques that are useful in theory and applications. It is important to know many results, but there is no way I can

More information

Math 554 Qualifying Exam. You may use any theorems from the textbook. Any other claims must be proved in details.

Math 554 Qualifying Exam. You may use any theorems from the textbook. Any other claims must be proved in details. Math 554 Qualifying Exam January, 2019 You may use any theorems from the textbook. Any other claims must be proved in details. 1. Let F be a field and m and n be positive integers. Prove the following.

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

Linear Algebra: Matrix Eigenvalue Problems

Linear Algebra: Matrix Eigenvalue Problems CHAPTER8 Linear Algebra: Matrix Eigenvalue Problems Chapter 8 p1 A matrix eigenvalue problem considers the vector equation (1) Ax = λx. 8.0 Linear Algebra: Matrix Eigenvalue Problems Here A is a given

More information

1 Last time: least-squares problems

1 Last time: least-squares problems MATH Linear algebra (Fall 07) Lecture Last time: least-squares problems Definition. If A is an m n matrix and b R m, then a least-squares solution to the linear system Ax = b is a vector x R n such that

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

Math 25a Practice Final #1 Solutions

Math 25a Practice Final #1 Solutions Math 25a Practice Final #1 Solutions Problem 1. Suppose U and W are subspaces of V such that V = U W. Suppose also that u 1,..., u m is a basis of U and w 1,..., w n is a basis of W. Prove that is a basis

More information

MATH SOLUTIONS TO PRACTICE MIDTERM LECTURE 1, SUMMER Given vector spaces V and W, V W is the vector space given by

MATH SOLUTIONS TO PRACTICE MIDTERM LECTURE 1, SUMMER Given vector spaces V and W, V W is the vector space given by MATH 110 - SOLUTIONS TO PRACTICE MIDTERM LECTURE 1, SUMMER 2009 GSI: SANTIAGO CAÑEZ 1. Given vector spaces V and W, V W is the vector space given by V W = {(v, w) v V and w W }, with addition and scalar

More information

Numerical Linear Algebra

Numerical Linear Algebra University of Alabama at Birmingham Department of Mathematics Numerical Linear Algebra Lecture Notes for MA 660 (1997 2014) Dr Nikolai Chernov April 2014 Chapter 0 Review of Linear Algebra 0.1 Matrices

More information

JORDAN NORMAL FORM. Contents Introduction 1 Jordan Normal Form 1 Conclusion 5 References 5

JORDAN NORMAL FORM. Contents Introduction 1 Jordan Normal Form 1 Conclusion 5 References 5 JORDAN NORMAL FORM KATAYUN KAMDIN Abstract. This paper outlines a proof of the Jordan Normal Form Theorem. First we show that a complex, finite dimensional vector space can be decomposed into a direct

More information

Lecture notes: Applied linear algebra Part 1. Version 2

Lecture notes: Applied linear algebra Part 1. Version 2 Lecture notes: Applied linear algebra Part 1. Version 2 Michael Karow Berlin University of Technology karow@math.tu-berlin.de October 2, 2008 1 Notation, basic notions and facts 1.1 Subspaces, range and

More information

The Spectral Theorem for normal linear maps

The Spectral Theorem for normal linear maps MAT067 University of California, Davis Winter 2007 The Spectral Theorem for normal linear maps Isaiah Lankham, Bruno Nachtergaele, Anne Schilling (March 14, 2007) In this section we come back to the question

More information

Algebra C Numerical Linear Algebra Sample Exam Problems

Algebra C Numerical Linear Algebra Sample Exam Problems Algebra C Numerical Linear Algebra Sample Exam Problems Notation. Denote by V a finite-dimensional Hilbert space with inner product (, ) and corresponding norm. The abbreviation SPD is used for symmetric

More information

1-/5 i-y. A-3i. b[ - O o. x-f -' o -^ ^ Math 545 Final Exam Fall 2011

1-/5 i-y. A-3i. b[ - O o. x-f -' o -^ ^ Math 545 Final Exam Fall 2011 Math 545 Final Exam Fall 2011 Name: Aluf XCWl/ Solve five of the following six problems. Show all your work and justify all your answers. Please indicate below which problem is not to be graded (otherwise,

More information

Here is an example of a block diagonal matrix with Jordan Blocks on the diagonal: J

Here is an example of a block diagonal matrix with Jordan Blocks on the diagonal: J Class Notes 4: THE SPECTRAL RADIUS, NORM CONVERGENCE AND SOR. Math 639d Due Date: Feb. 7 (updated: February 5, 2018) In the first part of this week s reading, we will prove Theorem 2 of the previous class.

More information

(VI.C) Rational Canonical Form

(VI.C) Rational Canonical Form (VI.C) Rational Canonical Form Let s agree to call a transformation T : F n F n semisimple if there is a basis B = { v,..., v n } such that T v = λ v, T v 2 = λ 2 v 2,..., T v n = λ n for some scalars

More information

Linear Algebra review Powers of a diagonalizable matrix Spectral decomposition

Linear Algebra review Powers of a diagonalizable matrix Spectral decomposition Linear Algebra review Powers of a diagonalizable matrix Spectral decomposition Prof. Tesler Math 283 Fall 2016 Also see the separate version of this with Matlab and R commands. Prof. Tesler Diagonalizing

More information

ON THE MATRIX EQUATION XA AX = X P

ON THE MATRIX EQUATION XA AX = X P ON THE MATRIX EQUATION XA AX = X P DIETRICH BURDE Abstract We study the matrix equation XA AX = X p in M n (K) for 1 < p < n It is shown that every matrix solution X is nilpotent and that the generalized

More information

Lecture 21: The decomposition theorem into generalized eigenspaces; multiplicity of eigenvalues and upper-triangular matrices (1)

Lecture 21: The decomposition theorem into generalized eigenspaces; multiplicity of eigenvalues and upper-triangular matrices (1) Lecture 21: The decomposition theorem into generalized eigenspaces; multiplicity of eigenvalues and upper-triangular matrices (1) Travis Schedler Tue, Nov 29, 2011 (version: Tue, Nov 29, 1:00 PM) Goals

More information

Math 240 Calculus III

Math 240 Calculus III Generalized Calculus III Summer 2015, Session II Thursday, July 23, 2015 Agenda 1. 2. 3. 4. Motivation Defective matrices cannot be diagonalized because they do not possess enough eigenvectors to make

More information

Generalized Eigenvectors and Jordan Form

Generalized Eigenvectors and Jordan Form Generalized Eigenvectors and Jordan Form We have seen that an n n matrix A is diagonalizable precisely when the dimensions of its eigenspaces sum to n. So if A is not diagonalizable, there is at least

More information

Math Matrix Algebra

Math Matrix Algebra Math 44 - Matrix Algebra Review notes - 4 (Alberto Bressan, Spring 27) Review of complex numbers In this chapter we shall need to work with complex numbers z C These can be written in the form z = a+ib,

More information

2 b 3 b 4. c c 2 c 3 c 4

2 b 3 b 4. c c 2 c 3 c 4 OHSx XM511 Linear Algebra: Multiple Choice Questions for Chapter 4 a a 2 a 3 a 4 b b 1. What is the determinant of 2 b 3 b 4 c c 2 c 3 c 4? d d 2 d 3 d 4 (a) abcd (b) abcd(a b)(b c)(c d)(d a) (c) abcd(a

More information

Math 408 Advanced Linear Algebra

Math 408 Advanced Linear Algebra Math 408 Advanced Linear Algebra Chi-Kwong Li Chapter 4 Hermitian and symmetric matrices Basic properties Theorem Let A M n. The following are equivalent. Remark (a) A is Hermitian, i.e., A = A. (b) x

More information

Q N id β. 2. Let I and J be ideals in a commutative ring A. Give a simple description of

Q N id β. 2. Let I and J be ideals in a commutative ring A. Give a simple description of Additional Problems 1. Let A be a commutative ring and let 0 M α N β P 0 be a short exact sequence of A-modules. Let Q be an A-module. i) Show that the naturally induced sequence is exact, but that 0 Hom(P,

More information

A = 3 1. We conclude that the algebraic multiplicity of the eigenvalues are both one, that is,

A = 3 1. We conclude that the algebraic multiplicity of the eigenvalues are both one, that is, 65 Diagonalizable Matrices It is useful to introduce few more concepts, that are common in the literature Definition 65 The characteristic polynomial of an n n matrix A is the function p(λ) det(a λi) Example

More information