Further linear algebra. Chapter IV. Jordan normal form.


Andrei Yafaev

In what follows V is a vector space of dimension n and B is a basis of V. In this chapter we are concerned with linear maps T : V -> V. Let A be the matrix representing T in the basis B. Because A is an n x n matrix, we can form powers A^k for any k >= 0, with the convention that A^0 = I_n. Note that A^k represents the transformation T composed k times.

Notice for example that when the matrix is diagonal with coefficients lambda_i on the diagonal, then A^k is diagonal with coefficients lambda_i^k. Notice also that such a matrix is invertible if and only if all the lambda_i are non-zero, and then A^(-1) is the diagonal matrix with coefficients lambda_i^(-1) on the diagonal.

Definition 1.1 Let f(X) = sum_i a_i X^i in k[X]. We define f(T) = sum_i a_i T^i, where T^0 = Id. This is a linear transformation. If A in M_n(k) is a matrix, then we define f(A) = sum_i a_i A^i. The matrix f(A) represents f(T) in the basis B.

What this means is that we can evaluate a polynomial at an n x n matrix and get another n x n matrix. We write [f(T)]_B for this matrix in the basis B; obviously it is the same as f([T]_B).

Let's look at an example. Take a 2 x 2 matrix A and f(x) = x^2 - 5x + 3. Then f(A) = A^2 - 5A + 3 I_2, which we can compute explicitly.
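The definition above can be sketched directly in code: Horner-free evaluation of f(A) by accumulating powers of A, with the convention A^0 = I_n. This is a minimal illustration; the matrix A below is our own choice (the entries of the example matrix did not survive transcription).

```python
# Evaluating a polynomial at a square matrix, as in Definition 1.1.
# The matrix A is an illustrative choice, not necessarily the one
# from the original example.

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def poly_at_matrix(coeffs, A):
    """f(A) for f(x) = coeffs[0] + coeffs[1] x + ... + coeffs[d] x^d,
    using the convention A^0 = I_n."""
    n = len(A)
    power = [[1 if i == j else 0 for j in range(n)] for i in range(n)]  # A^0
    result = [[0] * n for _ in range(n)]
    for a in coeffs:
        result = [[result[i][j] + a * power[i][j] for j in range(n)]
                  for i in range(n)]
        power = mat_mul(power, A)
    return result

A = [[1, 2], [3, 4]]
print(poly_at_matrix([3, -5, 1], A))  # f(x) = x^2 - 5x + 3 -> [[5, 0], [0, 5]]
```

For this particular A the result happens to be the scalar matrix 5 I_2, i.e. f(A) = 5 I_2.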

Another example: V = M_n(k) and T sends M to M^t (the transpose). Consider f(x) = x^2 - 1. As T^2 = Id, we see that f(T) = 0.

Notice that [f(T)]_B = f([T]_B). It follows that if T is a linear map, then f(T) = 0 if and only if f(A) = 0, where A is the matrix of T in some (equivalently, any) basis.

Another property worth noting is that if f, g in k[X], then

f(T)g(T) = (fg)(T) = (gf)(T) = g(T)f(T).

Definition 1.2 Recall that the characteristic polynomial of an n x n matrix A is defined by

ch_A(x) = det(x I_n - A) = (-1)^n det(A - x I_n).

This is a monic polynomial of degree n over k.

Now suppose T : V -> V is a linear map. We can define ch_T to be ch_{[T]_B}, but we need to check that this does not depend on the basis B. If C is another basis with transition matrix M, then we have:

ch_{[T]_C}(X) = det(X I_n - M^(-1) [T]_B M)
             = det(M^(-1) (X I_n - [T]_B) M)
             = det(M)^(-1) det(X I_n - [T]_B) det(M)
             = det(X I_n - [T]_B) = ch_{[T]_B}(X).

In other words, the characteristic polynomial does not depend on the choice of the basis in which we write our matrix.

The following (important!) theorem was proved in the first year courses.

Theorem 1.1 (Cayley-Hamilton Theorem) For any n x n matrix A we have ch_A(A) = 0.

We therefore have:

Theorem 1.2 (Cayley-Hamilton Theorem) For any linear map T : V -> V, we have ch_T(T) = 0.
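The Cayley-Hamilton theorem is easy to check by hand in the 2 x 2 case, where ch_A(x) = x^2 - tr(A) x + det(A). The following sketch does exactly that; the sample matrix is our own arbitrary choice.

```python
# A quick check of Cayley-Hamilton for 2x2 matrices, where
# ch_A(x) = x^2 - tr(A) x + det(A).

def cayley_hamilton_residue(A):
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    A2 = [[sum(A[i][k] * A[k][j] for k in range(2)) for j in range(2)]
          for i in range(2)]
    I = [[1, 0], [0, 1]]
    # ch_A(A) = A^2 - tr(A) A + det(A) I, which should be the zero matrix
    return [[A2[i][j] - tr * A[i][j] + det * I[i][j] for j in range(2)]
            for i in range(2)]

print(cayley_hamilton_residue([[1, 2], [3, 4]]))  # -> [[0, 0], [0, 0]]
```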

Example 1.3 Take A = diag(lambda_1, lambda_2). Then ch_A(x) = (x - lambda_1)(x - lambda_2) and clearly ch_A(A) = 0.

Take

A = ( 1 2 )
    ( 3 4 ).

Calculate ch_A(x) and check that ch_A(A) = 0.

There are plenty of polynomials f such that f(A) = 0: all the multiples of ch_A, for example. What can also happen is that some divisor g of ch_A already satisfies g(A) = 0. Take the identity I_n, for example. Its characteristic polynomial is (x - 1)^n, but in fact g = x - 1 already satisfies g(I_n) = 0. This leads us to the notion of minimal polynomial.

1.1 Minimal polynomials.

Definition 1.3 Let V be a finite dimensional vector space over a field k and T : V -> V a linear map. A minimal polynomial of T is a monic polynomial m in k[X] such that m(T) = 0, and if f(T) = 0 with f non-zero then deg f >= deg m.

Notice that if m is a polynomial as in the definition, and f is a polynomial such that f(T) = 0 and deg(f) < deg(m), then f = 0. Indeed, if f were not zero, then dividing f by the coefficient of its leading term one would obtain a monic polynomial annihilating T of degree strictly less than that of m, which contradicts the definition of m.

Theorem 1.4 Every linear map T : V -> V has a unique minimal polynomial m_T. Furthermore, for f in k[X] we have f(T) = 0 if and only if m_T | f.

Proof. Firstly, the Cayley-Hamilton theorem implies that there exists a non-zero polynomial f satisfying f(T) = 0, namely f = ch_T. Among monic polynomials satisfying f(T) = 0 we choose one of smallest degree, call it m_T. This shows that the minimal polynomial exists.

Suppose that m_T is not unique; then there exists another monic polynomial n(X) in k[X] satisfying the conditions of the definition. Because n(T) = 0, deg(n) >= deg(m_T), and because m_T(T) = 0, deg(m_T) >= deg(n); hence deg(m_T) = deg(n).

If f(X) = m_T(X) - n(X), then f(T) = m_T(T) - n(T) = 0, and also deg(f) < deg(m_T) = deg(n). By the remark following the definition of the minimal polynomial, we see that f = 0, i.e. m_T = n. This proves the uniqueness.

Suppose f in k[X] and f(T) = 0. By the Division Algorithm for polynomials there exist unique q, r in k[X] with deg(r) < deg(m_T) and

f = q m_T + r.

Then

r(T) = f(T) - q(T) m_T(T) = 0 - q(T) . 0 = 0.

So r is the zero polynomial (by the remark following the definition). Hence f = q m_T and so m_T | f. Conversely, if f in k[X] and m_T | f, then f = q m_T for some q in k[X] and so f(T) = q(T) m_T(T) = q(T) . 0 = 0.

Corollary 1.5 If T : V -> V is a linear map then m_T | ch_T.

Proof. By the Cayley-Hamilton Theorem, ch_T(T) = 0.

Using the corollary we can calculate the minimal polynomial as follows: calculate ch_T and factorise it into irreducibles; make a list of all the monic factors; find the monic factor m of smallest degree such that m(T) = 0.

Example 1.6 Suppose T is represented by a 3 x 3 matrix A with characteristic polynomial ch_T(X) = (X - 2)^3. The monic factors of this are: 1, (X - 2), (X - 2)^2, (X - 2)^3. One checks that A - 2I is non-zero but (A - 2I)^2 = 0, so the minimal polynomial is (X - 2)^2.

In fact this method can be speeded up: there are certain factors of the characteristic polynomial which cannot arise. To explain this we recall the definition of an eigenvalue.

Definition 1.4 Recall that a scalar lambda in k is called an eigenvalue of T if there is a non-zero vector v satisfying T(v) = lambda v. The non-zero vector v is called an eigenvector.

Remark 1.7 It is important that an eigenvector be non-zero. If you allowed zero to be an eigenvector, then any lambda would be an eigenvalue.

Proposition 1.8 Let v be an eigenvector of T with eigenvalue lambda in k. Then for any polynomial f in k[X],

(f(T))(v) = f(lambda) v.

Proof. Just use that T(v) = lambda v, so T^i(v) = lambda^i v for every i >= 0.

Theorem 1.9 If T : V -> V is linear and lambda in k, then the following are equivalent:

(i) lambda is an eigenvalue of T.
(ii) m_T(lambda) = 0.
(iii) ch_T(lambda) = 0.

Proof. (i) => (ii): Assume T(v) = lambda v with v non-zero. Then by the proposition, (m_T(T))(v) = m_T(lambda) v. But m_T(T) = 0, so we have m_T(lambda) v = 0. Since v is non-zero, this implies m_T(lambda) = 0.

(ii) => (iii): This is immediate since we have already shown that m_T is a factor of ch_T.

(iii) => (i): Suppose ch_T(lambda) = 0. Therefore det(lambda Id - T) = 0. It follows that lambda Id - T is not invertible, hence lambda Id - T has a non-zero kernel. Therefore there exists a non-zero v in V such that (lambda Id - T)(v) = 0. But then T(v) = lambda v.

Now suppose the characteristic polynomial of T factorises into irreducibles as

ch_T(X) = prod_{i=1}^{r} (X - lambda_i)^{a_i}.

By the fundamental theorem of algebra, if k = C we can always factorise it like this. Then the minimal polynomial has the form

m_T(X) = prod_{i=1}^{r} (X - lambda_i)^{b_i},  with 1 <= b_i <= a_i.

This makes it much quicker to calculate the minimal polynomial. Indeed, in practice the number of factors and the a_i are quite small.

Example 1.10 Suppose T is represented by the matrix diag(2, 2, 3). The characteristic polynomial is ch_T(X) = (X - 2)^2 (X - 3). The possibilities for the minimal polynomial are: (X - 2)(X - 3) and (X - 2)^2 (X - 3). Since (T - 2 Id)(T - 3 Id) = 0, the minimal polynomial is (X - 2)(X - 3).

1.2 Generalised Eigenspaces

Definition 1.5 Let V be a finite dimensional vector space over a field k, and let lambda in k be an eigenvalue of a linear map T : V -> V. We define for t in N the t-th generalised eigenspace by:

V_t(lambda) = ker((lambda Id - T)^t).

Note that V_1(lambda) is the usual eigenspace (i.e. the set of eigenvectors together with zero).

Remark 1.11 We obviously have V_1(lambda) contained in V_2(lambda) contained in ...
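The search procedure for the minimal polynomial can be sketched as a small program: enumerate the candidate exponents (b_1, ..., b_r) in order of increasing total degree and keep the first product that annihilates the matrix. The helper names below are ours, and the matrix is the diag(2, 2, 3) of the example above.

```python
# Minimal-polynomial search for A = diag(2, 2, 3): try exponents
# (b_1, b_2) with 1 <= b_1 <= 2, b_2 = 1, smallest degree first.

from itertools import product

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def shifted(A, lam):  # A - lam * I
    return [[A[i][j] - (lam if i == j else 0) for j in range(len(A))]
            for i in range(len(A))]

def annihilates(A, powers):  # powers = [(lambda_i, b_i), ...]
    n = len(A)
    M = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
    for lam, b in powers:
        for _ in range(b):
            M = mat_mul(M, shifted(A, lam))
    return all(e == 0 for row in M for e in row)

A = [[2, 0, 0], [0, 2, 0], [0, 0, 3]]
candidates = sorted(product([1, 2], [1]), key=lambda bs: sum(bs))
b1, b2 = next(bs for bs in candidates
              if annihilates(A, [(2, bs[0]), (3, bs[1])]))
print((b1, b2))  # -> (1, 1), i.e. m_A(X) = (X - 2)(X - 3)
```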

and by definition, dim V_t(lambda) = Nullity((lambda Id - T)^t).

Another property is that generalised eigenspaces are T-invariant:

T(V_t(lambda)) contained in V_t(lambda).

To see this, let v in V_t(lambda). Then (lambda Id - T)^t (Tv) = T((lambda Id - T)^t v) = T(0) = 0, therefore T(v) in V_t(lambda).

Example 1.12 Let

A = ( 2 1 0 )
    ( 0 2 1 )
    ( 0 0 2 ).

We have ch_A(X) = (X - 2)^3, so 2 is the only eigenvalue. We now calculate the generalised eigenspaces V_t(2) = ker((A - 2I)^t), computing each kernel by row-reducing the relevant matrix:

V_1(2) = ker(A - 2I) = Span(e_1),
V_2(2) = ker((A - 2I)^2) = Span(e_1, e_2),
V_3(2) = ker((A - 2I)^3) = Span(e_1, e_2, e_3) = C^3.
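The dimensions dim V_t(2) = 3 - rank((A - 2I)^t) can be computed mechanically by Gaussian elimination in exact arithmetic. A minimal sketch, assuming the example matrix is the 3 x 3 Jordan-block matrix printed above:

```python
# dim V_t(2) = nullity((A - 2I)^t) = 3 - rank((A - 2I)^t), t = 1, 2, 3.

from fractions import Fraction

def rank(M):
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        pivot = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 1, 0], [0, 2, 1], [0, 0, 2]]
N = [[A[i][j] - (2 if i == j else 0) for j in range(3)] for i in range(3)]
M = [[1 if i == j else 0 for j in range(3)] for i in range(3)]
dims = []
for t in range(1, 4):
    M = mat_mul(M, N)          # M = N^t
    dims.append(3 - rank(M))   # dim V_t(2)
print(dims)  # -> [1, 2, 3]
```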

Let V be a vector space and U and W two subspaces. Then (exercise on Sheet 4) U + W is a subspace of V. If furthermore U intersect W = {0}, then we call this subspace the direct sum of U and W and denote it by U (+) W. In this case

dim(U (+) W) = dim U + dim W.

Theorem 1.14 (Primary Decomposition Theorem) Let V be a finite dimensional vector space over k and T : V -> V a linear map with distinct eigenvalues lambda_1, ..., lambda_r in k. Let f be a monic polynomial such that f(T) = 0, and suppose that f factorises in k[X] into a product of degree one polynomials:

f(X) = prod_{i=1}^{r} (X - lambda_i)^{b_i}.

Then

V = V_{b_1}(lambda_1) (+) ... (+) V_{b_r}(lambda_r).

Lemma 1.15 Let k be a field. If f, g in k[X] satisfy gcd(f, g) = 1 and T is as above, then

ker((fg)(T)) = ker(f(T)) (+) ker(g(T)).

Proof (of the theorem). We have f(T) = 0, so ker(f(T)) = V. We have a factorisation of f into pairwise coprime factors of the form (x - lambda_i)^{b_i}, so the lemma (applied repeatedly) implies that

V = ker(f(T)) = ker(prod_{i=1}^{r} (T - lambda_i Id)^{b_i})
  = ker((T - lambda_1 Id)^{b_1}) (+) ... (+) ker((T - lambda_r Id)^{b_r})
  = V_{b_1}(lambda_1) (+) ... (+) V_{b_r}(lambda_r).

Proof (of the lemma). Let f, g in k[X] satisfy gcd(f, g) = 1. Firstly, if v in ker f(T) + ker g(T), say v = w_1 + w_2 with w_1 in ker f(T) and w_2 in ker g(T), then

(fg)(T)v = (fg)(T)(w_1 + w_2) = g(T)(f(T)w_1) + f(T)(g(T)w_2) = 0 + 0 = 0,

where we used that f and g are polynomials in k[X], hence fg = gf.

9 f(g(t)w ) = g(f(t)w ) = because w 2 ker(f(t)). Therefore ker(f(α)) + ker(g(α)) ker(f g(α)). We need to prove the eqaulity, here we will use that gcd(f,g) =. Now since gcd(f,g) = there exist a,b k[x] such that af + bg =. So a(t)f(t) + b(t)g(t) = Let v ker(fg(t)). If v = a(t)f(t)v, (the identity map). v 2 = b(t)g(t)v then v = v + v 2 and g(t)v = (gaf)(t)v = a(fg(t)v) = a(t) =. So v ker(g(t)). Similarly v 2 ker(f(t)) since f(t)v 2 = (fbg)(t)v = b(fg(t)v) = b(t) =. Hence ker fg(t) = kerf(t) + kerg(t). Moreover, if v ker f(t) ker g(t) then v = = v 2 so v =. Hence ker fg(t) = kerf(t) kerg(t). Notice that, by fundamental theorem of algebra, in C[x] every polynomial factorises into a product of degree one ones. The theorem applies to ch T (x) and m T (x). Definition.6 Recall that a linear map T : V V is diagonalisable if there is a basis B of V such that [T] B is a diagonal matrix. This is equivalent to saying that the basis vectors in B are all eigenvectors. 9

Theorem 1.16 Let V be a finite dimensional vector space over a field k and let T : V -> V be a linear map with distinct eigenvalues lambda_1, ..., lambda_r in k. Then T is diagonalisable if and only if we have (in k[X]):

m_T(X) = (X - lambda_1) ... (X - lambda_r).

Proof. First suppose that T is diagonalisable and let B be a basis of eigenvectors. Let f(X) = (X - lambda_1) ... (X - lambda_r). We already know that f | m_T, so to prove that f = m_T we just have to check that f(T) = 0. To show this, it is sufficient to check that f(T)(v) = 0 for each basis vector v in B. Suppose v in B, so v is an eigenvector with some eigenvalue lambda_i. Then we have

f(T)(v) = f(lambda_i) v = 0 . v = 0.

Therefore m_T = f.

Conversely, if m_T = f, then by the primary decomposition theorem we have

V = V_1(lambda_1) (+) ... (+) V_1(lambda_r).

Let B_i be a basis for V_1(lambda_i). Then obviously the elements of B_i are eigenvectors and B = B_1 union ... union B_r is a basis of V. Therefore T is diagonalisable.

Remark 1.17 Observe that if the characteristic polynomial splits as ch_T(x) = (x - lambda_1) ... (x - lambda_r) where the lambda_i are distinct eigenvalues, then m_T = ch_T and the matrix is diagonalisable. The converse, of course, does not hold.

Example 1.18 Let k = C and let A be a 2 x 2 matrix whose characteristic polynomial is (x - 1)(x - 6). The minimal polynomial is the same, so the matrix is diagonalisable, and one finds a basis of eigenvectors. In fact such a matrix is diagonalisable over R, or even over Q when its entries are rational.
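Theorem 1.16 gives a concrete diagonalisability test: form the product (A - lambda_1 I)...(A - lambda_r I) over the distinct eigenvalues and check whether it vanishes. A minimal sketch; both sample matrices below are our own illustrative choices (the first has eigenvalues 1 and 6, the second has minimal polynomial (x-1)^2).

```python
# A is diagonalisable iff (A - lam_1 I) ... (A - lam_r I) = 0,
# where lam_1, ..., lam_r are the distinct eigenvalues of A.

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def diagonalisable(A, eigenvalues):
    n = len(A)
    M = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
    for lam in eigenvalues:
        S = [[A[i][j] - (lam if i == j else 0) for j in range(n)]
             for i in range(n)]
        M = mat_mul(M, S)
    return all(e == 0 for row in M for e in row)

print(diagonalisable([[3, 2], [3, 4]], [1, 6]))  # eigenvalues 1, 6 -> True
print(diagonalisable([[1, 1], [0, 1]], [1]))     # m_A = (x-1)^2   -> False
```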

Example 1.19 Let k = R and let A be a 2 x 2 real matrix with characteristic polynomial x^2 + 1 (for instance, rotation by a right angle). The polynomial x^2 + 1 is irreducible over R. The minimal polynomial is the same. The matrix is not diagonalisable over R; however, over C we have x^2 + 1 = (x - i)(x + i), and the matrix is diagonalisable.

Example 1.20 Let k = C and let A be a non-zero 2 x 2 matrix with characteristic polynomial X^2. Since A is non-zero, the minimal polynomial is also X^2. Since this is not a product of distinct linear factors, A is not diagonalisable, even over C.

Example 1.21 Let k = C and let A be a matrix whose minimal polynomial is (x - 1)^2. It is not diagonalisable.

1.3 Jordan Bases in the one eigenvalue case

Let T : V -> V be a linear map. Fix a basis for V and let A be the matrix of T in this basis. As we have seen above, it is not always the case that T can be diagonalised; i.e. there is not always a basis of V consisting of eigenvectors of T. In the case that there is no basis of eigenvectors, the best kind of basis is a Jordan basis. We shall define a Jordan basis in several steps.

Suppose lambda in k is the only eigenvalue of a linear map T : V -> V. We have defined generalised eigenspaces:

V_1(lambda) contained in V_2(lambda) contained in ... contained in V_b(lambda),

where b is the power of X - lambda in the minimal polynomial m_T. We can choose a basis B_1 for V_1(lambda). Then we can choose B_2 so that B_1 union B_2 is a basis for V_2(lambda), etc. Eventually we end up with a basis B_1 union ... union B_b for V_b(lambda). We call such a basis a pre-Jordan basis.

Consider

A = ( 3 -2 )
    ( 8 -5 ).

One calculates the characteristic polynomial and finds (x + 1)^2, hence lambda = -1 is the only eigenvalue. Up to scalar, the unique eigenvector is v_1 = (1, 2)^t. Hence V_1(-1) = Span(v_1). Of course we have (A + I_2)^2 = 0, hence V_2(-1) = C^2, and we complete v_1 to a basis of C^2 = V_2(-1) by taking v_2 = (0, 1)^t, for example. We have Av_2 = -2v_1 - v_2, and hence in the basis {v_1, v_2} the matrix of A is

( -1 -2 )
(  0 -1 ).

The basis {v_1, v_2} is a pre-Jordan basis for A.

Example 1.22 Let A be a 3 x 3 matrix with ch_A(X) = (X - 1)^3 and m_A(X) = (X - 1)^2. There is only one eigenvalue, lambda = 1, and we have generalised eigenspaces

V_1(1) = ker(A - I_3) (of dimension 2),  V_2(1) = ker((A - I_3)^2) = ker(0) = C^3.

So we can choose a pre-Jordan basis by taking a basis B_1 of the two-dimensional eigenspace V_1(1) and completing it by one further vector B_2 = {v_3} to a basis of C^3. This in fact also works over R.

Now note the following:

Lemma 1.23 If v in V_t(lambda) with t > 1, then (T - lambda Id)(v) in V_{t-1}(lambda).

Proof. Clear from the definition of the generalised eigenspaces.

Now suppose we have a pre-Jordan basis B_1 union ... union B_b. We call this a Jordan basis if in addition we have the condition:

(T - lambda Id)B_t contained in B_{t-1},  t = 2, 3, ..., b.

If we have a pre-Jordan basis B_1 union ... union B_b, then to find a Jordan basis we do the following:

For each basis vector v in B_b, replace one of the vectors in B_{b-1} by (T - lambda Id)(v). When choosing which vector to replace, we just need to take care that we still have a basis at the end.

For each basis vector v in B_{b-1}, replace one of the vectors in B_{b-2} by (T - lambda Id)(v), with the same care. Etc.

For each basis vector v in B_2, replace one of the vectors in B_1 by (T - lambda Id)(v), again taking care that we still have a basis at the end.

Once finished, order the vectors of the basis appropriately. We shall prove later that this method always works.

Example 1.24 Let us look again at

A = ( 3 -2 )
    ( 8 -5 ).

We have seen that {v_1, v_2} is a pre-Jordan basis, where v_2 is the second vector of the standard basis. Replace v_1 by the vector (A + I_2)v_2 = (-2, -4)^t. Then {v_1, v_2} still forms a basis of C^2. This is a Jordan basis for A. We have Av_1 = -v_1 and Av_2 = v_1 - v_2 (you do not need to calculate: just use (A + I_2)v_2 = v_1). Hence in the new basis the matrix is

( -1  1 )
(  0 -1 ).

Example 1.25 In Example 1.22 above, we replace one of the vectors in B_1 by (A - I_3)v_3, where B_2 = {v_3}. With this replacement, B_1 union B_2 is a Jordan basis.

Example 1.26 Take k = R and let A be a non-zero 2 x 2 real matrix with A^2 = 0. Here the characteristic and minimal polynomials are both x^2, so 0 is the only eigenvalue. Let v_1 be a vector generating the eigenspace ker(A); we have V_2(0) = R^2, and we complete the basis by taking any v_2 not in ker(A). We get a pre-Jordan basis.

Let us construct a Jordan basis. Replace v_1 by Av_2, which is a non-zero vector in ker(A). This is a Jordan basis, and the matrix of A in the new basis is

( 0 1 )
( 0 0 ).

Example 1.27 Let

A = ( 2 2 0 )
    ( 0 2 2 )
    ( 0 0 2 ).

Clearly the characteristic polynomial is (x - 2)^3 and it is equal to the minimal polynomial; 2 is the only eigenvalue. V_1(2) has equations y = z = 0, hence it is spanned by v_1 = e_1.

V_2(2) has equation z = 0, and V_3(2) is R^3. Therefore the standard basis (v_1, v_2, v_3) = (e_1, e_2, e_3) is a pre-Jordan basis. We have

A - 2I_3 = ( 0 2 0 )        (A - 2I_3)^2 = ( 0 0 4 )
           ( 0 0 2 )                       ( 0 0 0 )
           ( 0 0 0 ),                      ( 0 0 0 ).

Now,

(A - 2I_3)v_3 = (0, 2, 0)^t,

and we replace v_2 by this vector. Next,

(A - 2I_3)v_2 = (4, 0, 0)^t,

and we replace v_1 by it. We get:

v_1 = (4, 0, 0)^t,  v_2 = (0, 2, 0)^t,  v_3 = (0, 0, 1)^t.

We have

Av_1 = 2v_1,  Av_2 = v_1 + 2v_2,  Av_3 = v_2 + 2v_3.

In this basis the matrix of A is:

( 2 1 0 )
( 0 2 1 )
( 0 0 2 ).

This is a Jordan basis.
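The chain construction of the example above can be checked mechanically: start from v_3 = e_3, apply N = A - 2I twice, and verify the relations A v_i = 2 v_i + v_{i-1}. A minimal sketch, assuming the example matrix is as printed:

```python
# Chain construction: v2 = N v3, v1 = N v2, where N = A - 2I.

A = [[2, 2, 0], [0, 2, 2], [0, 0, 2]]
N = [[A[i][j] - (2 if i == j else 0) for j in range(3)] for i in range(3)]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

v3 = [0, 0, 1]
v2 = matvec(N, v3)  # -> [0, 2, 0]
v1 = matvec(N, v2)  # -> [4, 0, 0]

# chain relations giving the Jordan block J_3(2) in the basis (v1, v2, v3)
print(matvec(A, v1) == [2 * x for x in v1])                  # -> True
print(matvec(A, v2) == [a + 2 * b for a, b in zip(v1, v2)])  # -> True
print(matvec(A, v3) == [a + 2 * b for a, b in zip(v2, v3)])  # -> True
```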

1.4 Jordan Canonical (or Normal) Form in the one eigenvalue case

The Jordan canonical form of a linear map T : V -> V is essentially the matrix of T with respect to a Jordan basis. We just need to order the vectors appropriately. Everything is over a field k; often k will be C.

Suppose for the moment that m_T = (x - lambda)^b; in particular T has only one eigenvalue lambda. Choose a Jordan basis:

B = B_1 union ... union B_b.

Of course, as (T - lambda Id)^b = 0, we have V_b(lambda) = V. We have a chain of subspaces

V_1(lambda) contained in V_2(lambda) contained in ... contained in V_b(lambda) = V,

and the pre-Jordan basis was constructed by starting with a basis of V_1(lambda) and completing it successively to get a basis of V_b(lambda) = V. We then altered this basis so that

(T - lambda Id)B_t contained in B_{t-1}.

Notice that we can arrange the basis elements in chains: starting with a vector v in B_b we get a chain

v, (T - lambda Id)v, (T - lambda Id)^2 v, ..., (T - lambda Id)^{b-1} v.

The last vector w = (T - lambda Id)^{b-1} v is in V_1(lambda). Indeed, (T - lambda Id)^b v = 0, hence (T - lambda Id)w = 0, therefore Tw = lambda w, therefore w in V_1(lambda).

We have the following:

Lemma 1.28 For any v in B_b (in particular v not in V_t(lambda) for t < b!), the vectors

v, (T - lambda Id)v, (T - lambda Id)^2 v, ..., (T - lambda Id)^{b-1} v

are linearly independent.

Proof. Suppose that

sum_{i=0}^{b-1} mu_i (T - lambda Id)^i v = 0.

Then

mu_0 v + (T - lambda Id)w = 0,

where w is a linear combination of the vectors (T - lambda Id)^i v. Multiplying by (T - lambda Id)^{b-1}, we get

mu_0 (T - lambda Id)^{b-1} v = 0,

but, as v not in V_{b-1}(lambda), we see that (T - lambda Id)^{b-1} v is non-zero, hence mu_0 = 0. Repeating the process inductively, we get that mu_i = 0 for all i, and the vectors we consider are linearly independent.

Let us number the vectors in this chain as

v_1 = (T - lambda Id)^{b-1} v, ..., v_b = v.

In other words, v_i = (T - lambda Id)^{b-i} v. Then

(T - lambda Id)v_i = v_{i-1},

i.e.

T v_i = lambda v_i + v_{i-1}

(with the convention v_0 = 0). In other words, in the basis formed by this chain, the matrix of T restricted to the span of the chain is a Jordan block, i.e. the b x b matrix:

J_b(lambda) = ( lambda 1                    )
              (        lambda 1             )
              (               ...  ...      )
              (                   lambda 1  )
              (                      lambda ),

with lambda on the diagonal, 1 on the superdiagonal and 0 elsewhere.

In this way, we arrange our Jordan basis in chains starting in B_t (for t = b, b-1, ..., 1) and terminating in V_1(lambda). By putting the chains together, we get that in the Jordan basis the matrix is block diagonal with Jordan blocks on the diagonal. We can write it as

[T]_B = diag(J_{h_1}(lambda), ..., J_{h_w}(lambda)),

where the J_{h_i} are blocks corresponding to a chain of length h_i.

We can actually say more; in fact the following result determines the number of blocks:

Lemma 1.29 The number of blocks is the dimension of the eigenspace V_1(lambda).

Proof. Let (v_1, ..., v_k) be the Jordan basis of the subspace U corresponding to one block. It is a chain: we have

T v_1 = lambda v_1
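Jordan blocks and their block-diagonal assemblies are easy to build programmatically; the helper names below are our own. A minimal sketch:

```python
# Building the b x b Jordan block J_b(lambda) and assembling a
# block-diagonal matrix diag(J_{h_1}(lambda), ...).

def jordan_block(lam, b):
    return [[lam if i == j else 1 if j == i + 1 else 0 for j in range(b)]
            for i in range(b)]

def block_diag(blocks):
    n = sum(len(B) for B in blocks)
    M = [[0] * n for _ in range(n)]
    offset = 0
    for B in blocks:
        for i in range(len(B)):
            for j in range(len(B)):
                M[offset + i][offset + j] = B[i][j]
        offset += len(B)
    return M

print(jordan_block(5, 3))
# -> [[5, 1, 0], [0, 5, 1], [0, 0, 5]]
print(block_diag([jordan_block(5, 2), jordan_block(5, 1)]))
# -> [[5, 1, 0], [0, 5, 0], [0, 0, 5]]
```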

and

T v_i = lambda v_i + v_{i-1} for 2 <= i <= k.

Let v in U be an eigenvector: Tv = lambda v. Write v = sum_{i=1}^{k} c_i v_i. Then

Tv = c_1 lambda v_1 + sum_{i>=2} c_i (lambda v_i + v_{i-1}) = lambda v + sum_{i>=2} c_i v_{i-1}.

It follows that Tv = lambda v if and only if sum_{i>=2} c_i v_{i-1} = 0, which implies that c_2 = ... = c_k = 0, and hence v is in the subspace generated by v_1. Therefore each block determines exactly one eigenvector (up to scalar) for the eigenvalue lambda. As eigenvectors from different blocks are linearly independent (they are members of a basis), the number of blocks is exactly the dimension of the eigenspace V_1(lambda).

SUMMARY: To summarise what we have seen so far. Suppose T has one eigenvalue lambda, and let m_T(x) = (x - lambda)^b be its minimal polynomial. We construct a pre-Jordan basis by choosing a basis B_1 for the eigenspace V_1(lambda), then completing by B_2 (a certain number of vectors in V_2(lambda)), and then by B_3, ..., B_b. Note that V_b(lambda) = V. We get a pre-Jordan basis B = B_1 union ... union B_b.

Now we alter the pre-Jordan basis by doing the following. Start with a vector v_b in B_b; replace one of the vectors in B_{b-1} by v_{b-1} = (T - lambda Id)v_b, making sure that this v_{b-1} is linearly independent of the other vectors in B_{b-1}. Then replace a vector in B_{b-2} by v_{b-2} = (T - lambda Id)v_{b-1} (again choosing which vector to replace so that the resulting set is still linearly independent). Continue until you get to V_1(lambda). The last vector will be v_1 in V_1(lambda), i.e. v_1 is an eigenvector. We obtain a chain of vectors

v_1 = (T - lambda Id)v_2, v_2 = (T - lambda Id)v_3, ..., v_{b-1} = (T - lambda Id)v_b, v_b.

Hence in particular

T v_k = v_{k-1} + lambda v_k.

The subspace U spanned by this chain is T-stable (because T v_k = v_{k-1} + lambda v_k), and this chain is linearly independent, hence the chain forms a basis of U.

Restricted to U, and with respect to this basis, the matrix of T is the b x b Jordan block

J_b(lambda) = ( lambda 1                    )
              (        lambda 1             )
              (               ...  ...      )
              (                   lambda 1  )
              (                      lambda ).

One constructs such chains with all elements of B_b. Once done, one looks for elements of B_{b-1} which are not in the previously constructed chains starting in B_b, and constructs chains with them. Then with B_{b-2}, etc. In the end, the union of the chains will be a Jordan basis, and in it the matrix of T is of the form:

diag(J_{h_1}(lambda), ..., J_{h_w}(lambda)).

Notice the following two observations:

1. There is always a block of size b x b. Hence, by knowing the degree of the minimal polynomial, in some cases it is possible to determine the shape of the Jordan normal form.

2. The number of blocks is the dimension of the eigenspace V_1(lambda).

Here are some examples. Suppose you have a matrix such that

ch_A = (x - lambda)^5 and m_A(x) = (x - lambda)^4.

There is always a block of size 4 x 4, hence the Jordan normal form has one 4 x 4 block and one 1 x 1 block.

Suppose ch_A is the same but m_A(x) = (x - lambda)^3. Here you need to know more. There is one 3 x 3 block and then either two 1 x 1 blocks or one 2 x 2

block. This is determined by the dimension of V_1(lambda): if it is three, then the first possibility holds; if it is two, then the second.

Consider a 4 x 4 matrix A for which one calculates that ch_A(x) = (x - 1)^4 and that the rank of A - I is 1, so that dim V_1(1) = 3. This means that the Jordan normal form will have three blocks. Therefore there will be two blocks of size 1 x 1 and one of size 2 x 2. The Jordan normal form is

( 1 1 0 0 )
( 0 1 0 0 )
( 0 0 1 0 )
( 0 0 0 1 ).

Another example: a 4 x 4 matrix A for which one calculates that ch_A(x) = (x + 2)^4. One sees that A + 2I is non-zero but (A + 2I)^2 = 0, and therefore m_A(x) = (x + 2)^2. As there is always a block of size two, there are two possibilities: either two 2 x 2 blocks, or one 2 x 2 block and two 1 x 1 blocks.
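The two observations above extend to a complete recipe: for N = A - lambda I, the number of Jordan blocks of size at least t equals rank(N^(t-1)) - rank(N^t). A minimal sketch of this rank bookkeeping, using a made-up nilpotent 5 x 5 matrix whose block sizes (4 and 1) we know in advance as a sanity check:

```python
# Reading off the block structure from ranks of powers of N = A - lambda*I:
# number of blocks of size >= t is rank(N^(t-1)) - rank(N^t).

from fractions import Fraction

def rank(M):
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        pivot = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# N is nilpotent with Jordan blocks of sizes 4 and 1.
N = [[0, 1, 0, 0, 0],
     [0, 0, 1, 0, 0],
     [0, 0, 0, 1, 0],
     [0, 0, 0, 0, 0],
     [0, 0, 0, 0, 0]]
ranks = [5]  # rank(N^0) = rank(I) = 5
M = [[1 if i == j else 0 for j in range(5)] for i in range(5)]
while ranks[-1] > 0:
    M = mat_mul(M, N)
    ranks.append(rank(M))
at_least = [ranks[t - 1] - ranks[t] for t in range(1, len(ranks))]
print(at_least)  # blocks of size >= 1, >= 2, >= 3, >= 4 -> [2, 1, 1, 1]
```

Here [2, 1, 1, 1] says there are two blocks in total, of which one has size at least 4: exactly one 4 x 4 block and one 1 x 1 block.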

To decide which one it is, we see that the rank of A + 2I is 2, hence the dimension of the kernel is 2. There are therefore 2 blocks, and the Jordan normal form is

( -2  1  0  0 )
(  0 -2  0  0 )
(  0  0 -2  1 )
(  0  0  0 -2 ).

1.5 Jordan canonical form in general.

Once we know how to determine the Jordan canonical form in the one eigenvalue case, the general case is easy. Let T be a linear transformation and lambda_1, ..., lambda_r its eigenvalues. Suppose that the minimal polynomial decomposes as

m_T(x) = prod_{i=1}^{r} (x - lambda_i)^{b_i}

(recall again that this is always possible over C). Then we have seen that

V = V_{b_1}(lambda_1) (+) ... (+) V_{b_r}(lambda_r),

and each V_{b_i}(lambda_i) is stable under T. Therefore, restricted to each V_{b_i}(lambda_i), T is a linear transformation with one eigenvalue, namely lambda_i, and the minimal polynomial of T restricted to V_{b_i}(lambda_i) is (x - lambda_i)^{b_i}. One gets a Jordan basis by taking the union of Jordan bases for each V_{b_i}(lambda_i), which are constructed as previously.

Let us look at an example: a 3 x 3 matrix A for which one calculates that ch_A(x) = x(x - 1)^2. Then 0 and 1 are the only eigenvalues, and

V = V_1(0) (+) V_2(1).

One finds an eigenvector v generating V_1(0).

That will be the first vector of the basis. For lambda = 1, we compute A - I and find that V_1(1) is spanned by a vector v_1, while computing (A - I)^2 shows that V_2(1) is spanned by v_1 together with a second vector v_2. One checks that (A - I)v_2 = v_1, and therefore this is already a Jordan basis. The matrix in the basis (v, v_1, v_2) is

( 0 0 0 )
( 0 1 1 )
( 0 0 1 ).

Another example: a 3 x 3 matrix A with characteristic polynomial ch_A(x) = (x - lambda_1)^2 (x - lambda_2), where lambda_1 and lambda_2 are its only (distinct) eigenvalues.

One finds that V_1(lambda_1) = Span(u_1, u_2). The dimension is two, therefore there will be two blocks of size 1 x 1 corresponding to the eigenvalue lambda_1. For lambda_2, one finds that V_1(lambda_2) = Span(u_3) is one-dimensional. In the basis (u_1, u_2, u_3) the matrix is

( lambda_1     0        0     )
(    0      lambda_1    0     )
(    0         0     lambda_2 ).

It is diagonal: the matrix is diagonalisable, and in fact m_A = (x - lambda_1)(x - lambda_2).

And a last example: find a Jordan basis and the normal form of a 4 x 4 matrix A for which one finds that the characteristic polynomial is

ch_A(x) = (x - 2)^2 (x - 3)^2.

Hence 2 and 3 are the eigenvalues. Computing A - 2I, one sees that its kernel has dimension 2 and is spanned by e_2 and e_4. The eigenspace of 2 has dimension two, so we will have two blocks of size 1 x 1 corresponding to the eigenvalue 2. For the eigenvalue 3:

Computing A - 3I, we see that rows one and three are identical, while the other rows are linearly independent. It follows that the eigenspace of 3 is one-dimensional, spanned by a vector u. So we will have one 2 x 2 block for the eigenvalue 3. We then calculate (A - 3I)^2 and take a vector v completing u to a basis of ker((A - 3I)^2). Now we have (A - 3I)v = u, hence we already have a Jordan basis. The basis (e_2, e_4, u, v) is a Jordan basis, and in this basis the matrix is

( 2 0 0 0 )
( 0 2 0 0 )
( 0 0 3 1 )
( 0 0 0 3 ).
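The whole two-eigenvalue analysis can be replayed as a computation: from the nullities of (A - 2I), (A - 3I) and (A - 3I)^2 one reads off the Jordan form diag(2, 2, J_2(3)). A minimal sketch on a made-up 4 x 4 matrix (not the one from the example, whose entries did not survive transcription) chosen so that ch_A(x) = (x - 2)^2 (x - 3)^2:

```python
# Nullities of (A - 2I), (A - 3I), (A - 3I)^2 determine the Jordan form.

from fractions import Fraction

def rank(M):
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        pivot = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def shifted(A, lam):
    return [[A[i][j] - (lam if i == j else 0) for j in range(len(A))]
            for i in range(len(A))]

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 0, 0, 0],
     [0, 2, 0, 0],
     [0, 0, 2, 1],
     [0, 0, -1, 4]]

n2 = 4 - rank(shifted(A, 2))                           # dim V_1(2)
n3 = 4 - rank(shifted(A, 3))                           # dim V_1(3)
n3b = 4 - rank(mat_mul(shifted(A, 3), shifted(A, 3)))  # dim V_2(3)
print(n2, n3, n3b)  # -> 2 1 2
```

The output says: two blocks for eigenvalue 2 filling a subspace of dimension 2 (so two 1 x 1 blocks), and one block for eigenvalue 3 filling a subspace of dimension 2 (so one 2 x 2 block), giving diag(2, 2, J_2(3)) as in the example.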


More information

Comps Study Guide for Linear Algebra

Comps Study Guide for Linear Algebra Comps Study Guide for Linear Algebra Department of Mathematics and Statistics Amherst College September, 207 This study guide was written to help you prepare for the linear algebra portion of the Comprehensive

More information

GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory.

GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory. GRE Subject test preparation Spring 2016 Topic: Abstract Algebra, Linear Algebra, Number Theory. Linear Algebra Standard matrix manipulation to compute the kernel, intersection of subspaces, column spaces,

More information

Final A. Problem Points Score Total 100. Math115A Nadja Hempel 03/23/2017

Final A. Problem Points Score Total 100. Math115A Nadja Hempel 03/23/2017 Final A Math115A Nadja Hempel 03/23/2017 nadja@math.ucla.edu Name: UID: Problem Points Score 1 10 2 20 3 5 4 5 5 9 6 5 7 7 8 13 9 16 10 10 Total 100 1 2 Exercise 1. (10pt) Let T : V V be a linear transformation.

More information

The Jordan Normal Form and its Applications

The Jordan Normal Form and its Applications The and its Applications Jeremy IMPACT Brigham Young University A square matrix A is a linear operator on {R, C} n. A is diagonalizable if and only if it has n linearly independent eigenvectors. What happens

More information

Linear algebra II Tutorial solutions #1 A = x 1

Linear algebra II Tutorial solutions #1 A = x 1 Linear algebra II Tutorial solutions #. Find the eigenvalues and the eigenvectors of the matrix [ ] 5 2 A =. 4 3 Since tra = 8 and deta = 5 8 = 7, the characteristic polynomial is f(λ) = λ 2 (tra)λ+deta

More information

University of Colorado at Denver Mathematics Department Applied Linear Algebra Preliminary Exam With Solutions 16 January 2009, 10:00 am 2:00 pm

University of Colorado at Denver Mathematics Department Applied Linear Algebra Preliminary Exam With Solutions 16 January 2009, 10:00 am 2:00 pm University of Colorado at Denver Mathematics Department Applied Linear Algebra Preliminary Exam With Solutions 16 January 2009, 10:00 am 2:00 pm Name: The proctor will let you read the following conditions

More information

Linear Algebra 2 More on determinants and Evalues Exercises and Thanksgiving Activities

Linear Algebra 2 More on determinants and Evalues Exercises and Thanksgiving Activities Linear Algebra 2 More on determinants and Evalues Exercises and Thanksgiving Activities 2. Determinant of a linear transformation, change of basis. In the solution set of Homework 1, New Series, I included

More information

4.1 Eigenvalues, Eigenvectors, and The Characteristic Polynomial

4.1 Eigenvalues, Eigenvectors, and The Characteristic Polynomial Linear Algebra (part 4): Eigenvalues, Diagonalization, and the Jordan Form (by Evan Dummit, 27, v ) Contents 4 Eigenvalues, Diagonalization, and the Jordan Canonical Form 4 Eigenvalues, Eigenvectors, and

More information

Jordan Canonical Form Homework Solutions

Jordan Canonical Form Homework Solutions Jordan Canonical Form Homework Solutions For each of the following, put the matrix in Jordan canonical form and find the matrix S such that S AS = J. [ ]. A = A λi = λ λ = ( λ) = λ λ = λ =, Since we have

More information

Solution to Homework 1

Solution to Homework 1 Solution to Homework Sec 2 (a) Yes It is condition (VS 3) (b) No If x, y are both zero vectors Then by condition (VS 3) x = x + y = y (c) No Let e be the zero vector We have e = 2e (d) No It will be false

More information

JORDAN NORMAL FORM. Contents Introduction 1 Jordan Normal Form 1 Conclusion 5 References 5

JORDAN NORMAL FORM. Contents Introduction 1 Jordan Normal Form 1 Conclusion 5 References 5 JORDAN NORMAL FORM KATAYUN KAMDIN Abstract. This paper outlines a proof of the Jordan Normal Form Theorem. First we show that a complex, finite dimensional vector space can be decomposed into a direct

More information

Math Final December 2006 C. Robinson

Math Final December 2006 C. Robinson Math 285-1 Final December 2006 C. Robinson 2 5 8 5 1 2 0-1 0 1. (21 Points) The matrix A = 1 2 2 3 1 8 3 2 6 has the reduced echelon form U = 0 0 1 2 0 0 0 0 0 1. 2 6 1 0 0 0 0 0 a. Find a basis for the

More information

NONCOMMUTATIVE POLYNOMIAL EQUATIONS. Edward S. Letzter. Introduction

NONCOMMUTATIVE POLYNOMIAL EQUATIONS. Edward S. Letzter. Introduction NONCOMMUTATIVE POLYNOMIAL EQUATIONS Edward S Letzter Introduction My aim in these notes is twofold: First, to briefly review some linear algebra Second, to provide you with some new tools and techniques

More information

LINEAR ALGEBRA BOOT CAMP WEEK 1: THE BASICS

LINEAR ALGEBRA BOOT CAMP WEEK 1: THE BASICS LINEAR ALGEBRA BOOT CAMP WEEK 1: THE BASICS Unless otherwise stated, all vector spaces in this worksheet are finite dimensional and the scalar field F has characteristic zero. The following are facts (in

More information

Math 113 Homework 5. Bowei Liu, Chao Li. Fall 2013

Math 113 Homework 5. Bowei Liu, Chao Li. Fall 2013 Math 113 Homework 5 Bowei Liu, Chao Li Fall 2013 This homework is due Thursday November 7th at the start of class. Remember to write clearly, and justify your solutions. Please make sure to put your name

More information

1 Last time: least-squares problems

1 Last time: least-squares problems MATH Linear algebra (Fall 07) Lecture Last time: least-squares problems Definition. If A is an m n matrix and b R m, then a least-squares solution to the linear system Ax = b is a vector x R n such that

More information

Infinite-Dimensional Triangularization

Infinite-Dimensional Triangularization Infinite-Dimensional Triangularization Zachary Mesyan March 11, 2018 Abstract The goal of this paper is to generalize the theory of triangularizing matrices to linear transformations of an arbitrary vector

More information

MATH 115A: SAMPLE FINAL SOLUTIONS

MATH 115A: SAMPLE FINAL SOLUTIONS MATH A: SAMPLE FINAL SOLUTIONS JOE HUGHES. Let V be the set of all functions f : R R such that f( x) = f(x) for all x R. Show that V is a vector space over R under the usual addition and scalar multiplication

More information

A linear algebra proof of the fundamental theorem of algebra

A linear algebra proof of the fundamental theorem of algebra A linear algebra proof of the fundamental theorem of algebra Andrés E. Caicedo May 18, 2010 Abstract We present a recent proof due to Harm Derksen, that any linear operator in a complex finite dimensional

More information

A linear algebra proof of the fundamental theorem of algebra

A linear algebra proof of the fundamental theorem of algebra A linear algebra proof of the fundamental theorem of algebra Andrés E. Caicedo May 18, 2010 Abstract We present a recent proof due to Harm Derksen, that any linear operator in a complex finite dimensional

More information

Name: Solutions. Determine the matrix [T ] C B. The matrix is given by A = [T ] C B =

Name: Solutions. Determine the matrix [T ] C B. The matrix is given by A = [T ] C B = Name: Solutions 1. (5 + 5 + 15 points) Let F = F 7. Recall we defined for a positive integer n a vector space P n (F ) by P n (F ) = {f(x) F [x] : deg(f) n}. (a) What is the dimension of P n (F ) over

More information

2 Eigenvectors and Eigenvalues in abstract spaces.

2 Eigenvectors and Eigenvalues in abstract spaces. MA322 Sathaye Notes on Eigenvalues Spring 27 Introduction In these notes, we start with the definition of eigenvectors in abstract vector spaces and follow with the more common definition of eigenvectors

More information

BASIC ALGORITHMS IN LINEAR ALGEBRA. Matrices and Applications of Gaussian Elimination. A 2 x. A T m x. A 1 x A T 1. A m x

BASIC ALGORITHMS IN LINEAR ALGEBRA. Matrices and Applications of Gaussian Elimination. A 2 x. A T m x. A 1 x A T 1. A m x BASIC ALGORITHMS IN LINEAR ALGEBRA STEVEN DALE CUTKOSKY Matrices and Applications of Gaussian Elimination Systems of Equations Suppose that A is an n n matrix with coefficents in a field F, and x = (x,,

More information

Jordan Canonical Form

Jordan Canonical Form Jordan Canonical Form Suppose A is a n n matrix operating on V = C n. First Reduction (to a repeated single eigenvalue). Let φ(x) = det(x A) = r (x λ i ) e i (1) be the characteristic equation of A. Factor

More information

x 3y 2z = 6 1.2) 2x 4y 3z = 8 3x + 6y + 8z = 5 x + 3y 2z + 5t = 4 1.5) 2x + 8y z + 9t = 9 3x + 5y 12z + 17t = 7

x 3y 2z = 6 1.2) 2x 4y 3z = 8 3x + 6y + 8z = 5 x + 3y 2z + 5t = 4 1.5) 2x + 8y z + 9t = 9 3x + 5y 12z + 17t = 7 Linear Algebra and its Applications-Lab 1 1) Use Gaussian elimination to solve the following systems x 1 + x 2 2x 3 + 4x 4 = 5 1.1) 2x 1 + 2x 2 3x 3 + x 4 = 3 3x 1 + 3x 2 4x 3 2x 4 = 1 x + y + 2z = 4 1.4)

More information

Math 110 Linear Algebra Midterm 2 Review October 28, 2017

Math 110 Linear Algebra Midterm 2 Review October 28, 2017 Math 11 Linear Algebra Midterm Review October 8, 17 Material Material covered on the midterm includes: All lectures from Thursday, Sept. 1st to Tuesday, Oct. 4th Homeworks 9 to 17 Quizzes 5 to 9 Sections

More information

Jordan Normal Form and Singular Decomposition

Jordan Normal Form and Singular Decomposition University of Debrecen Diagonalization and eigenvalues Diagonalization We have seen that if A is an n n square matrix, then A is diagonalizable if and only if for all λ eigenvalues of A we have dim(u λ

More information

Minimal Polynomials and Jordan Normal Forms

Minimal Polynomials and Jordan Normal Forms Minimal Polynomials and Jordan Normal Forms 1. Minimal Polynomials Let A be an n n real matrix. M1. There is a polynomial p such that p(a) =. Proof. The space M n n (R) of n n real matrices is an n 2 -dimensional

More information

Linear Algebra- Final Exam Review

Linear Algebra- Final Exam Review Linear Algebra- Final Exam Review. Let A be invertible. Show that, if v, v, v 3 are linearly independent vectors, so are Av, Av, Av 3. NOTE: It should be clear from your answer that you know the definition.

More information

Linear Algebra II Lecture 22

Linear Algebra II Lecture 22 Linear Algebra II Lecture 22 Xi Chen University of Alberta March 4, 24 Outline Characteristic Polynomial, Eigenvalue, Eigenvector and Eigenvalue, Eigenvector and Let T : V V be a linear endomorphism. We

More information

Linear Algebra II Lecture 13

Linear Algebra II Lecture 13 Linear Algebra II Lecture 13 Xi Chen 1 1 University of Alberta November 14, 2014 Outline 1 2 If v is an eigenvector of T : V V corresponding to λ, then v is an eigenvector of T m corresponding to λ m since

More information

Direct Sums and Invariants. Direct Sums

Direct Sums and Invariants. Direct Sums Math 5327 Direct Sums and Invariants Direct Sums Suppose that W and W 2 are both subspaces of a vector space V Recall from Chapter 2 that the sum W W 2 is the set "u v u " W v " W 2 # That is W W 2 is

More information

Solutions to the Calculus and Linear Algebra problems on the Comprehensive Examination of January 28, 2011

Solutions to the Calculus and Linear Algebra problems on the Comprehensive Examination of January 28, 2011 Solutions to the Calculus and Linear Algebra problems on the Comprehensive Examination of January 8, Solutions to Problems 5 are omitted since they involve topics no longer covered on the Comprehensive

More information

The Cayley-Hamilton Theorem and the Jordan Decomposition

The Cayley-Hamilton Theorem and the Jordan Decomposition LECTURE 19 The Cayley-Hamilton Theorem and the Jordan Decomposition Let me begin by summarizing the main results of the last lecture Suppose T is a endomorphism of a vector space V Then T has a minimal

More information

Jordan Normal Form. Chapter Minimal Polynomials

Jordan Normal Form. Chapter Minimal Polynomials Chapter 8 Jordan Normal Form 81 Minimal Polynomials Recall p A (x) =det(xi A) is called the characteristic polynomial of the matrix A Theorem 811 Let A M n Then there exists a unique monic polynomial q

More information

(f + g)(s) = f(s) + g(s) for f, g V, s S (cf)(s) = cf(s) for c F, f V, s S

(f + g)(s) = f(s) + g(s) for f, g V, s S (cf)(s) = cf(s) for c F, f V, s S 1 Vector spaces 1.1 Definition (Vector space) Let V be a set with a binary operation +, F a field, and (c, v) cv be a mapping from F V into V. Then V is called a vector space over F (or a linear space

More information

MATH 205 HOMEWORK #3 OFFICIAL SOLUTION. Problem 1: Find all eigenvalues and eigenvectors of the following linear transformations. (a) F = R, V = R 3,

MATH 205 HOMEWORK #3 OFFICIAL SOLUTION. Problem 1: Find all eigenvalues and eigenvectors of the following linear transformations. (a) F = R, V = R 3, MATH 205 HOMEWORK #3 OFFICIAL SOLUTION Problem 1: Find all eigenvalues and eigenvectors of the following linear transformations. a F = R, V = R 3, b F = R or C, V = F 2, T = T = 9 4 4 8 3 4 16 8 7 0 1

More information

The converse is clear, since

The converse is clear, since 14. The minimal polynomial For an example of a matrix which cannot be diagonalised, consider the matrix ( ) 0 1 A =. 0 0 The characteristic polynomial is λ 2 = 0 so that the only eigenvalue is λ = 0. The

More information

Linear Algebra Lecture Notes-II

Linear Algebra Lecture Notes-II Linear Algebra Lecture Notes-II Vikas Bist Department of Mathematics Panjab University, Chandigarh-64 email: bistvikas@gmail.com Last revised on March 5, 8 This text is based on the lectures delivered

More information

Algebra Qualifying Exam August 2001 Do all 5 problems. 1. Let G be afinite group of order 504 = 23 32 7. a. Show that G cannot be isomorphic to a subgroup of the alternating group Alt 7. (5 points) b.

More information

Linear Algebra 2 Spectral Notes

Linear Algebra 2 Spectral Notes Linear Algebra 2 Spectral Notes In what follows, V is an inner product vector space over F, where F = R or C. We will use results seen so far; in particular that every linear operator T L(V ) has a complex

More information

0.1 Rational Canonical Forms

0.1 Rational Canonical Forms We have already seen that it is useful and simpler to study linear systems using matrices. But matrices are themselves cumbersome, as they are stuffed with many entries, and it turns out that it s best

More information

What is on this week. 1 Vector spaces (continued) 1.1 Null space and Column Space of a matrix

What is on this week. 1 Vector spaces (continued) 1.1 Null space and Column Space of a matrix Professor Joana Amorim, jamorim@bu.edu What is on this week Vector spaces (continued). Null space and Column Space of a matrix............................. Null Space...........................................2

More information

SPECTRAL THEORY EVAN JENKINS

SPECTRAL THEORY EVAN JENKINS SPECTRAL THEORY EVAN JENKINS Abstract. These are notes from two lectures given in MATH 27200, Basic Functional Analysis, at the University of Chicago in March 2010. The proof of the spectral theorem for

More information

Homework 6 Solutions. Solution. Note {e t, te t, t 2 e t, e 2t } is linearly independent. If β = {e t, te t, t 2 e t, e 2t }, then

Homework 6 Solutions. Solution. Note {e t, te t, t 2 e t, e 2t } is linearly independent. If β = {e t, te t, t 2 e t, e 2t }, then Homework 6 Solutions 1 Let V be the real vector space spanned by the functions e t, te t, t 2 e t, e 2t Find a Jordan canonical basis and a Jordan canonical form of T on V dened by T (f) = f Solution Note

More information

Math 554 Qualifying Exam. You may use any theorems from the textbook. Any other claims must be proved in details.

Math 554 Qualifying Exam. You may use any theorems from the textbook. Any other claims must be proved in details. Math 554 Qualifying Exam January, 2019 You may use any theorems from the textbook. Any other claims must be proved in details. 1. Let F be a field and m and n be positive integers. Prove the following.

More information

235 Final exam review questions

235 Final exam review questions 5 Final exam review questions Paul Hacking December 4, 0 () Let A be an n n matrix and T : R n R n, T (x) = Ax the linear transformation with matrix A. What does it mean to say that a vector v R n is an

More information

MAT2342 : Introduction to Applied Linear Algebra Mike Newman, fall Projections. introduction

MAT2342 : Introduction to Applied Linear Algebra Mike Newman, fall Projections. introduction MAT4 : Introduction to Applied Linear Algebra Mike Newman fall 7 9. Projections introduction One reason to consider projections is to understand approximate solutions to linear systems. A common example

More information

Online Exercises for Linear Algebra XM511

Online Exercises for Linear Algebra XM511 This document lists the online exercises for XM511. The section ( ) numbers refer to the textbook. TYPE I are True/False. Lecture 02 ( 1.1) Online Exercises for Linear Algebra XM511 1) The matrix [3 2

More information

CHARACTERS OF FINITE GROUPS.

CHARACTERS OF FINITE GROUPS. CHARACTERS OF FINITE GROUPS. ANDREI YAFAEV As usual we consider a finite group G and the ground field F = C. Let U be a C[G]-module and let g G. Then g is represented by a matrix [g] in a certain basis.

More information

Lecture Summaries for Linear Algebra M51A

Lecture Summaries for Linear Algebra M51A These lecture summaries may also be viewed online by clicking the L icon at the top right of any lecture screen. Lecture Summaries for Linear Algebra M51A refers to the section in the textbook. Lecture

More information

Lecture 21: The decomposition theorem into generalized eigenspaces; multiplicity of eigenvalues and upper-triangular matrices (1)

Lecture 21: The decomposition theorem into generalized eigenspaces; multiplicity of eigenvalues and upper-triangular matrices (1) Lecture 21: The decomposition theorem into generalized eigenspaces; multiplicity of eigenvalues and upper-triangular matrices (1) Travis Schedler Tue, Nov 29, 2011 (version: Tue, Nov 29, 1:00 PM) Goals

More information

The Jordan canonical form

The Jordan canonical form The Jordan canonical form Francisco Javier Sayas University of Delaware November 22, 213 The contents of these notes have been translated and slightly modified from a previous version in Spanish. Part

More information

Generalized Eigenvectors and Jordan Form

Generalized Eigenvectors and Jordan Form Generalized Eigenvectors and Jordan Form We have seen that an n n matrix A is diagonalizable precisely when the dimensions of its eigenspaces sum to n. So if A is not diagonalizable, there is at least

More information

Real representations

Real representations Real representations 1 Definition of a real representation Definition 1.1. Let V R be a finite dimensional real vector space. A real representation of a group G is a homomorphism ρ VR : G Aut V R, where

More information

Math 113 Winter 2013 Prof. Church Midterm Solutions

Math 113 Winter 2013 Prof. Church Midterm Solutions Math 113 Winter 2013 Prof. Church Midterm Solutions Name: Student ID: Signature: Question 1 (20 points). Let V be a finite-dimensional vector space, and let T L(V, W ). Assume that v 1,..., v n is a basis

More information

PRACTICE FINAL EXAM. why. If they are dependent, exhibit a linear dependence relation among them.

PRACTICE FINAL EXAM. why. If they are dependent, exhibit a linear dependence relation among them. Prof A Suciu MTH U37 LINEAR ALGEBRA Spring 2005 PRACTICE FINAL EXAM Are the following vectors independent or dependent? If they are independent, say why If they are dependent, exhibit a linear dependence

More information

NATIONAL UNIVERSITY OF SINGAPORE MA1101R

NATIONAL UNIVERSITY OF SINGAPORE MA1101R Student Number: NATIONAL UNIVERSITY OF SINGAPORE - Linear Algebra I (Semester 2 : AY25/26) Time allowed : 2 hours INSTRUCTIONS TO CANDIDATES. Write down your matriculation/student number clearly in the

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

Travis Schedler. Thurs, Oct 27, 2011 (version: Thurs, Oct 27, 1:00 PM)

Travis Schedler. Thurs, Oct 27, 2011 (version: Thurs, Oct 27, 1:00 PM) Lecture 13: Proof of existence of upper-triangular matrices for complex linear transformations; invariant subspaces and block upper-triangular matrices for real linear transformations (1) Travis Schedler

More information

Linear Algebra II. Course No Spring 2007 Michael Stoll

Linear Algebra II. Course No Spring 2007 Michael Stoll Linear Algebra II Course No. 100 222 Spring 2007 Michael Stoll With some additions by Ronald van Luijk, 2016 Contents 1. Review of Eigenvalues, Eigenvectors and Characteristic Polynomial 2 2. Direct Sums

More information

Honors Algebra II MATH251 Course Notes by Dr. Eyal Goren McGill University Winter 2007

Honors Algebra II MATH251 Course Notes by Dr. Eyal Goren McGill University Winter 2007 Honors Algebra II MATH251 Course Notes by Dr Eyal Goren McGill University Winter 2007 Last updated: April 4, 2014 c All rights reserved to the author, Eyal Goren, Department of Mathematics and Statistics,

More information

Dimension. Eigenvalue and eigenvector

Dimension. Eigenvalue and eigenvector Dimension. Eigenvalue and eigenvector Math 112, week 9 Goals: Bases, dimension, rank-nullity theorem. Eigenvalue and eigenvector. Suggested Textbook Readings: Sections 4.5, 4.6, 5.1, 5.2 Week 9: Dimension,

More information

Lecture 11: Finish Gaussian elimination and applications; intro to eigenvalues and eigenvectors (1)

Lecture 11: Finish Gaussian elimination and applications; intro to eigenvalues and eigenvectors (1) Lecture 11: Finish Gaussian elimination and applications; intro to eigenvalues and eigenvectors (1) Travis Schedler Tue, Oct 18, 2011 (version: Tue, Oct 18, 6:00 PM) Goals (2) Solving systems of equations

More information

Theorem 5.3. Let E/F, E = F (u), be a simple field extension. Then u is algebraic if and only if E/F is finite. In this case, [E : F ] = deg f u.

Theorem 5.3. Let E/F, E = F (u), be a simple field extension. Then u is algebraic if and only if E/F is finite. In this case, [E : F ] = deg f u. 5. Fields 5.1. Field extensions. Let F E be a subfield of the field E. We also describe this situation by saying that E is an extension field of F, and we write E/F to express this fact. If E/F is a field

More information

Linear Algebra Highlights

Linear Algebra Highlights Linear Algebra Highlights Chapter 1 A linear equation in n variables is of the form a 1 x 1 + a 2 x 2 + + a n x n. We can have m equations in n variables, a system of linear equations, which we want to

More information

EXERCISES AND SOLUTIONS IN LINEAR ALGEBRA

EXERCISES AND SOLUTIONS IN LINEAR ALGEBRA EXERCISES AND SOLUTIONS IN LINEAR ALGEBRA Mahmut Kuzucuoğlu Middle East Technical University matmah@metu.edu.tr Ankara, TURKEY March 14, 015 ii TABLE OF CONTENTS CHAPTERS 0. PREFACE..................................................

More information

JORDAN AND RATIONAL CANONICAL FORMS

JORDAN AND RATIONAL CANONICAL FORMS JORDAN AND RATIONAL CANONICAL FORMS MATH 551 Throughout this note, let V be a n-dimensional vector space over a field k, and let φ: V V be a linear map Let B = {e 1,, e n } be a basis for V, and let A

More information

A = 3 1. We conclude that the algebraic multiplicity of the eigenvalues are both one, that is,

A = 3 1. We conclude that the algebraic multiplicity of the eigenvalues are both one, that is, 65 Diagonalizable Matrices It is useful to introduce few more concepts, that are common in the literature Definition 65 The characteristic polynomial of an n n matrix A is the function p(λ) det(a λi) Example

More information

Lecture Notes for Math 414: Linear Algebra II Fall 2015, Michigan State University

Lecture Notes for Math 414: Linear Algebra II Fall 2015, Michigan State University Lecture Notes for Fall 2015, Michigan State University Matthew Hirn December 11, 2015 Beginning of Lecture 1 1 Vector Spaces What is this course about? 1. Understanding the structural properties of a wide

More information

Eigenspaces and Diagonalizable Transformations

Eigenspaces and Diagonalizable Transformations Chapter 2 Eigenspaces and Diagonalizable Transformations As we explored how heat states evolve under the action of a diffusion transformation E, we found that some heat states will only change in amplitude.

More information

Lecture 7: Positive Semidefinite Matrices

Lecture 7: Positive Semidefinite Matrices Lecture 7: Positive Semidefinite Matrices Rajat Mittal IIT Kanpur The main aim of this lecture note is to prepare your background for semidefinite programming. We have already seen some linear algebra.

More information