Math 5327 Fall 2018 Homework 7

1. For each of the following matrices, find the minimal polynomial and determine whether the matrix is diagonalizable.

(a) $A = \begin{pmatrix} 3 & 1 & 0 \\ 1 & 2 & 0 \\ 1 & 1 & 0 \end{pmatrix}$

Solution: We have
$$\det(xI - A) = \begin{vmatrix} x-3 & -1 & 0 \\ -1 & x-2 & 0 \\ -1 & -1 & x \end{vmatrix} = x\bigl((x-3)(x-2) - 1\bigr) = x(x^2 - 5x + 5).$$
The eigenvalues are $0$ and $\frac{5 \pm \sqrt{5}}{2}$. Since the minimal polynomial divides the characteristic polynomial and has the same zeros, $m_A(x) = x(x^2 - 5x + 5)$ as well.

As a check,
$$A^2 = \begin{pmatrix} 10 & 5 & 0 \\ 5 & 5 & 0 \\ 4 & 3 & 0 \end{pmatrix} \quad\text{and}\quad A^3 = \begin{pmatrix} 35 & 20 & 0 \\ 20 & 15 & 0 \\ 15 & 10 & 0 \end{pmatrix}.$$
We can see from this that $A^3 + 5A = 5A^2$, consistent with $m_A(x) = x^3 - 5x^2 + 5x$.

(b) $B = \begin{pmatrix} 0 & r & 0 & 0 \\ 0 & 0 & s & 0 \\ 0 & 0 & 0 & t \\ 0 & 0 & 0 & 0 \end{pmatrix}$, so $B^2 = \begin{pmatrix} 0 & 0 & rs & 0 \\ 0 & 0 & 0 & st \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}$ and $B^3 = \begin{pmatrix} 0 & 0 & 0 & rst \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}$.

Solution: The characteristic polynomial is $x^4$, so the minimal polynomial is a divisor of $x^4$. This is tricky: the minimal polynomial is $x^4$ if $r$, $s$, $t$ are all nonzero; otherwise it is $x^3$ if $s \ne 0$ and at least one of $r$, $t$ is nonzero; it is $x^2$ if $s = 0$ but at least one of $r$, $t$ is not, or if $s \ne 0$ but both $r$ and $t$ are $0$; finally, it is $x$ when $r = s = t = 0$. Only in this last case is $B$ diagonalizable.

(c) $C = \begin{pmatrix} 1 & 2 & 3 & 4 \\ 0 & 2 & 3 & 4 \\ 0 & 0 & 3 & 4 \\ 0 & 0 & 0 & 4 \end{pmatrix}$

Solution: The characteristic polynomial is $(x-1)(x-2)(x-3)(x-4)$, the same as the minimal polynomial. Since the minimal polynomial is a product of distinct linear factors, $C$ is diagonalizable.
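Remark (not part of the original solution): the claims in 1(a) and 1(c) are easy to sanity-check by machine. The sketch below, assuming numpy is available, verifies exactly (in integer arithmetic) that $x^3 - 5x^2 + 5x$ annihilates $A$, and computes the eigenvalues of $C$ numerically.

```python
import numpy as np

# Problem 1(a): check that m_A(x) = x^3 - 5x^2 + 5x annihilates A.
# Integer arrays, so the comparison is exact.
A = np.array([[3, 1, 0], [1, 2, 0], [1, 1, 0]])
A2 = A @ A
A3 = A2 @ A
assert np.array_equal(A3 + 5 * A, 5 * A2)  # A^3 + 5A = 5A^2

# Problem 1(c): C is triangular, so its eigenvalues are its diagonal
# entries 1, 2, 3, 4; since they are distinct, C is diagonalizable.
C = np.array([[1, 2, 3, 4], [0, 2, 3, 4], [0, 0, 3, 4], [0, 0, 0, 4]], dtype=float)
eigs = np.sort(np.linalg.eigvals(C).real)
print(np.round(eigs, 6))
```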

(d) $D = \begin{pmatrix} 2 & 0 & 0 & 0 \\ 1 & 1 & 0 & 0 \\ 3 & 2 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}$

Solution: The matrix is lower triangular, so $c_D(x) = (x-1)^3(x-2)$. This means $m_D(x) = (x-1)^k(x-2)$ for some $k$ between 1 and 3. We start by calculating
$$(D-I)(D-2I) = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 3 & 2 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}\begin{pmatrix} 0 & 0 & 0 & 0 \\ 1 & -1 & 0 & 0 \\ 3 & 2 & -1 & 0 \\ 0 & 0 & 0 & -1 \end{pmatrix} = \begin{pmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 2 & -2 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix},$$
and $(D-I)^2(D-2I) = 0$. This is enough to tell us that $m_D(x) = (x-1)^2(x-2)$. Since $m_D(x)$ has repeated zeros, $D$ is not diagonalizable.

2. Suppose $A = \begin{pmatrix} 0 & 0 & -c \\ 1 & 0 & -b \\ 0 & 1 & -a \end{pmatrix}$. Show that the minimal and characteristic polynomials of $A$ are the same. This shows that every monic cubic polynomial $p(x)$ has a matrix $A$ with $p(x)$ as both its minimal and characteristic polynomial.

Solution: We have
$$\det(xI - A) = \begin{vmatrix} x & 0 & c \\ -1 & x & b \\ 0 & -1 & x+a \end{vmatrix} = x\bigl(x(x+a) + b\bigr) + c = x^3 + ax^2 + bx + c.$$
For the minimal polynomial, consider the four matrices
$$I = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix},\quad A = \begin{pmatrix} 0 & 0 & -c \\ 1 & 0 & -b \\ 0 & 1 & -a \end{pmatrix},\quad A^2 = \begin{pmatrix} 0 & -c & ac \\ 0 & -b & ab-c \\ 1 & -a & a^2-b \end{pmatrix},\quad A^3 = \begin{pmatrix} -c & ac & -a^2c+bc \\ -b & ab-c & -a^2b+b^2+ac \\ -a & a^2-b & -a^3+2ab-c \end{pmatrix}.$$
Based on the first columns, we see that $I, A, A^2$ are linearly independent, which tells us $m_A(x)$ has degree at least 3. Since $A^3 + aA^2 + bA + cI = 0$, the minimal polynomial is $x^3 + ax^2 + bx + c$, the same as the characteristic polynomial.
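Remark (an added check, not part of the original solution): both claims can be verified symbolically. The sketch below, assuming sympy is available, confirms the minimal polynomial of $D$ from 1(d) and the companion-matrix identities from problem 2 with symbolic $a$, $b$, $c$.

```python
import sympy as sp

# Problem 1(d): m_D(x) = (x-1)^2 (x-2)
D = sp.Matrix([[2, 0, 0, 0], [1, 1, 0, 0], [3, 2, 1, 0], [0, 0, 0, 1]])
I4 = sp.eye(4)
assert (D - I4) * (D - 2 * I4) != sp.zeros(4, 4)       # (x-1)(x-2) does not annihilate D
assert (D - I4) ** 2 * (D - 2 * I4) == sp.zeros(4, 4)  # (x-1)^2 (x-2) does

# Problem 2: the companion matrix of x^3 + a x^2 + b x + c
a, b, c, x = sp.symbols('a b c x')
A = sp.Matrix([[0, 0, -c], [1, 0, -b], [0, 1, -a]])
char_A = A.charpoly(x).as_expr()
assert sp.expand(char_A - (x**3 + a * x**2 + b * x + c)) == 0
assert sp.simplify(A**3 + a * A**2 + b * A + c * sp.eye(3)) == sp.zeros(3, 3)
```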

3. Let $A = \begin{pmatrix} 1 & 1 \\ -4 & 1 \end{pmatrix}$.

(a) If $T : \mathbb{R}^2 \to \mathbb{R}^2$ is defined by $T(v) = Av$, show that the only $T$-invariant subspaces of $\mathbb{R}^2$ are $\{0\}$ and $\mathbb{R}^2$.

Solution: First, $\{0\}$ and $V$ are always $T$-invariant for $T : V \to V$, so $\{0\}$ and $\mathbb{R}^2$ are $T$-invariant. The only other subspaces of $\mathbb{R}^2$ are 1-dimensional. If $U$ were a 1-dimensional $T$-invariant subspace with basis vector $u$, then $T(u)$ must be in $U$, meaning that $T(u) = cu$ for some scalar $c$. That is, 1-dimensional $T$-invariant subspaces are eigenspaces. The characteristic polynomial of $A$ is $x^2 - 2x + 5$, which has no real roots, so $A$ has no real eigenvalues, and there can't be any 1-dimensional $T$-invariant subspaces.

(b) If $S : \mathbb{C}^2 \to \mathbb{C}^2$ is defined by $S(v) = Av$, show that $\mathbb{C}^2$ has two 1-dimensional $S$-invariant subspaces.

Solution: Since there are two complex eigenvalues ($1 + 2i$ and $1 - 2i$), there are two 1-dimensional $S$-invariant subspaces. Specifically, they are the 1-dimensional spaces spanned by the vectors $\begin{pmatrix} 1 \\ 2i \end{pmatrix}$ and $\begin{pmatrix} 1 \\ -2i \end{pmatrix}$.

4. Let $T : P_3 \to P_3$ be defined by $T(ax^3 + bx^2 + cx + d) = (a+b)x^3 + (b-a)x^2 + (a+b+d)x + (a-b+2c+d)$.

(a) Show that $P_1$ and $V = \{p(x) \mid p(1) = 0\}$ are both $T$-invariant subspaces of $P_3$.

Solution: For $P_1$, $T(cx + d) = dx + (2c + d)$, which is still in $P_1$. For $V$, suppose $q(x) = T(p(x))$. With $p(x) = ax^3 + bx^2 + cx + d$, we have $q(x) = (a+b)x^3 + (b-a)x^2 + (a+b+d)x + (a-b+2c+d)$. Thus $q(1) = 2a + 2b + 2c + 2d = 2p(1)$. So when $p(1) = 0$, $q(1) = 0$ as well, showing $V$ is $T$-invariant.

(b) Show that $V$ contains a 2-dimensional $T$-invariant subspace $W$ such that $P_3 = P_1 \oplus W$.

Solution: There is probably a better way to do this; I just did the following. A basis for $V$ is $\{x - 1,\ x^2 - 1,\ x^3 - 1\}$, and if we apply $T$ to each of these, we get $-(x - 1)$, $x^3 + x^2 - 2$, and $x^3 - x^2$. If the original basis vectors are $b_1, b_2, b_3$, then $T(b_1) = -b_1$, $T(b_2) = b_2 + b_3$, and $T(b_3) = b_3 - b_2$. This means that $W = \operatorname{span}\{b_2, b_3\} = \operatorname{span}\{x^2 - 1,\ x^3 - 1\}$ is $T$-invariant. Since $\{1, x, x^2 - 1, x^3 - 1\}$ is a basis for $P_3$, it follows that $P_3 = P_1 \oplus W$.
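Remark (an added numerical check, not part of the original solution): the eigenvector claims in problem 3 are quick to verify with numpy.

```python
import numpy as np

# Problem 3(b): (1, 2i) and (1, -2i) are eigenvectors of A
# for the eigenvalues 1 + 2i and 1 - 2i.
A = np.array([[1, 1], [-4, 1]], dtype=complex)
for lam, v in [(1 + 2j, np.array([1, 2j])), (1 - 2j, np.array([1, -2j]))]:
    assert np.allclose(A @ v, lam * v)

# Problem 3(a): the characteristic polynomial x^2 - 2x + 5 has complex
# roots only (discriminant 4 - 20 < 0), so A has no real eigenvalues.
roots = np.roots([1, -2, 5])
print(roots)
```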

(c) Use your answer to part (b) to find a basis for $P_3$ for which the matrix of $T$ is block diagonal.

Solution: One basis is the one listed above, $B = \{1, x, x^2 - 1, x^3 - 1\}$. With respect to this basis we have
$$[T]_B = \bigl([T(1)]_B\ [T(x)]_B\ [T(x^2-1)]_B\ [T(x^3-1)]_B\bigr) = \bigl([x+1]_B\ [2]_B\ [x^3+x^2-2]_B\ [x^3-x^2]_B\bigr) = \begin{pmatrix} 1 & 2 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & -1 \\ 0 & 0 & 1 & 1 \end{pmatrix},$$
which is block diagonal with two $2 \times 2$ blocks.

5. Let $T : F^{n \times n} \to F^{n \times n}$ be defined by $T(A) = BA$, where $B$ is some fixed matrix. Similarly, define $S$ by $S(A) = AB$.

(a) If $v$ is an eigenvector of $B$, show that $A = (v\ 0\ \cdots\ 0)$ is an eigenvector of $T$.

Solution: If $Bv = cv$, then $T(A) = BA = B(v\ 0\ \cdots\ 0) = (Bv\ B0\ \cdots\ B0) = (cv\ 0\ \cdots\ 0) = cA$.

(b) Prove that $m_T(x) = m_S(x) = m_B(x)$.

Solution: Note that $T^k(A) = B^kA$ and $S^k(A) = AB^k$. It follows that for any polynomial $p(x) = a_mx^m + a_{m-1}x^{m-1} + \cdots + a_1x + a_0$,
$$p(T)(A) = (a_mT^m + \cdots + a_1T + a_0I)(A) = a_mB^mA + \cdots + a_1BA + a_0A = (a_mB^m + \cdots + a_1B + a_0I)A = p(B)A.$$
Similarly, $p(S)(A) = A\,p(B)$. If $p(x)$ annihilates $B$, then $p(T)(A) = p(B)A = 0$ and $p(S)(A) = A\,p(B) = 0$ for every $A$, so $p(x)$ annihilates both $S$ and $T$ as well. And if $p(x)$ annihilates $T$, take $A = I$: we get $0 = p(T)(I) = p(B)I$, so $p(B) = 0$. Similarly, if $p(x)$ annihilates $S$, then $0 = p(S)(I) = I\,p(B)$, so $p(B) = 0$. What this means is that the annihilating polynomials for $B$ are exactly the same as the annihilating polynomials for $S$ and for $T$. Consequently, the smallest monic one is the same in each case, so $m_T(x) = m_S(x) = m_B(x)$.
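Remark (an added numerical check, not part of the original solution): the two facts about $T(A) = BA$ in 5(a) and 5(b) can be tested on a random matrix $B$. For 5(b) we use the characteristic polynomial of $B$ as the annihilating polynomial (Cayley-Hamilton), since $p(T)(A) = p(B)A$ means any annihilator of $B$ annihilates $T$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))

# Problem 5(a): if Bv = cv, then A = (v 0 ... 0) satisfies T(A) = BA = cA.
eigvals, eigvecs = np.linalg.eig(B)
c, v = eigvals[0], eigvecs[:, 0]
A = np.zeros((n, n), dtype=complex)
A[:, 0] = v
assert np.allclose(B @ A, c * A)

# Problem 5(b): take p = characteristic polynomial of B; by Cayley-Hamilton
# p(B) = 0, so p(T)(A) = p(B) A = 0 for every A.
p = np.poly(B)  # coefficients of det(xI - B), highest degree first
pB = sum(coef * np.linalg.matrix_power(B, n - k) for k, coef in enumerate(p))
assert np.allclose(pB, 0, atol=1e-6)
```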

(c) Prove that the characteristic polynomials of $S$ and $T$ are equal. (They have degree $n^2$.)

Solution: An example is probably helpful here. Suppose that $B = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$. Then with respect to the standard basis $\mathcal{S} = \{E_{1,1}, E_{1,2}, E_{2,1}, E_{2,2}\}$ we have
$$[S]_{\mathcal{S}} = \begin{pmatrix} 1 & 3 & 0 & 0 \\ 2 & 4 & 0 & 0 \\ 0 & 0 & 1 & 3 \\ 0 & 0 & 2 & 4 \end{pmatrix} \quad\text{and}\quad [T]_{\mathcal{S}} = \begin{pmatrix} 1 & 0 & 2 & 0 \\ 0 & 1 & 0 & 2 \\ 3 & 0 & 4 & 0 \\ 0 & 3 & 0 & 4 \end{pmatrix}.$$
More generally, with $\mathcal{S} = \{E_{1,1}, E_{1,2}, \dots, E_{1,n}, E_{2,1}, \dots, E_{n,n}\}$, we have $S(E_{i,j}) = E_{i,j}B$, and this is the matrix of all 0's except that its $i$th row is the $j$th row of $B$. The basis vector $E_{i,j}$ is in position $(i-1)n + j$ in the basis, which means we are dealing with that column number of $[S]_{\mathcal{S}}$. Its entries are the entries of a row of $B$, and they get put into the $i$th block as the $j$th column. What this means is that $[S]_{\mathcal{S}}$ is block diagonal, consisting of $n$ copies of $B^t$. Consequently, the characteristic polynomial of this matrix is $\bigl(\det(xI - B^t)\bigr)^n$. But $B$ and $B^t$ have the same characteristic polynomial, so $c_S(x) = (c_B(x))^n$.

Obviously, $T$ is more annoying. The trick is to use a slightly different basis. Instead of the standard basis, we use $\mathcal{B} = \{E_{1,1}, E_{2,1}, \dots, E_{n,1}, E_{1,2}, \dots, E_{n,n}\}$. Writing $B = (v_1\ v_2\ \cdots\ v_n)$ in terms of its columns, $BE_{i,j}$ is the matrix that has $v_i$ in column $j$ and 0's in all the other columns. What this means is that with respect to this basis, the matrix of $T$ is block diagonal with copies of $B$ on the diagonal. The characteristic polynomial of this matrix is again $(c_B(x))^n$. Hence $c_T(x) = c_S(x) = (c_B(x))^n$, which has degree $n^2$.

(d) Prove that $S$ and $T$ are diagonalizable if and only if $B$ is diagonalizable.

Solution: This is a nice problem to show the power of the theory we've built up. Since $S$, $T$, and $B$ all have the same minimal polynomial, and a matrix/transformation is diagonalizable if and only if its minimal polynomial factors into distinct linear terms, $S$ and $T$ have this property if and only if $B$ does.
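Remark (an added numerical check, not part of the original solution): in matrix form, the block-diagonal structure in 5(c) is exactly a Kronecker product. Stacking the columns of $A$ into a vector, $\operatorname{vec}(BA) = (I \otimes B)\operatorname{vec}(A)$ and $\operatorname{vec}(AB) = (B^t \otimes I)\operatorname{vec}(A)$, and both Kronecker products have characteristic polynomial $(c_B(x))^n$. The sketch below checks this on a random $3 \times 3$ matrix.

```python
import numpy as np

n = 3
rng = np.random.default_rng(1)
B = rng.standard_normal((n, n))

T_mat = np.kron(np.eye(n), B)    # matrix of T(A) = BA in the vec basis
S_mat = np.kron(B.T, np.eye(n))  # matrix of S(A) = AB in the vec basis

cB = np.poly(B)                  # coefficients of c_B(x), highest degree first
cBn = cB.copy()
for _ in range(n - 1):           # build c_B(x)^n by repeated polynomial multiplication
    cBn = np.polymul(cBn, cB)

assert np.allclose(np.poly(T_mat), cBn, atol=1e-8)
assert np.allclose(np.poly(S_mat), cBn, atol=1e-8)
```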

For extra credit:

6. Generalize problem 2 to the $n \times n$ case.

Solution: The result is that the companion matrix
$$M = \begin{pmatrix} 0 & 0 & \cdots & 0 & -a_0 \\ 1 & 0 & \cdots & 0 & -a_1 \\ 0 & 1 & \cdots & 0 & -a_2 \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & 1 & -a_{n-1} \end{pmatrix}$$
has both characteristic and minimal polynomial equal to $x^n + a_{n-1}x^{n-1} + \cdots + a_1x + a_0$. It is relatively easy to show this is true for the characteristic polynomial. For the minimal polynomial, it is not hard to show that $I, M, M^2, \dots, M^{n-1}$ are linearly independent, based on the first column of each matrix. It takes a little more work to get the exact form of the minimal polynomial.

7. Let $T : F^{n \times n} \to F^{n \times n}$ be defined by $T(A) = BA - AB$, where $B$ is some fixed matrix. Prove or give a counterexample: $T$ is diagonalizable if and only if $B$ is diagonalizable. I will give partial credit if you can give examples of eigenvalues and eigenvectors for $T$.

Solution: I will only give a partial solution here. It turns out that $T$ is diagonalizable if and only if $B$ is. I will only talk about the case where $B$ is diagonalizable. In that case, we try to get $n^2$ independent eigenvectors (matrices) for $T$. Since $B$ is diagonalizable, there are $n$ linearly independent eigenvectors $v_1, v_2, \dots, v_n$ for $B$, with eigenvalues $c_1, c_2, \dots, c_n$, not necessarily distinct. It takes a little effort, but if $B$ is diagonalizable, then $B^t$ is also diagonalizable, so $B^t$ also has $n$ linearly independent eigenvectors $w_1, \dots, w_n$; let these vectors have eigenvalues $d_1, \dots, d_n$. These vectors have the property that $B^tw = dw$, or, taking transposes, $w^tB = dw^t$. Now let $A_{i,j} = v_iw_j^t$. Each $A_{i,j}$ is an $n \times n$ matrix. Moreover,
$$T(A_{i,j}) = Bv_iw_j^t - v_iw_j^tB = c_iv_iw_j^t - d_jv_iw_j^t = (c_i - d_j)v_iw_j^t.$$
That is, each $A_{i,j}$ is an eigenvector of $T$, with eigenvalue $c_i - d_j$. It turns out that the $A_{i,j}$ are all linearly independent (I will leave this to you to check). Consequently, we have $n^2$ independent eigenvectors for $T$, so $T$ is diagonalizable.
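Remark (an added numerical check, not part of the original solution): the eigenvector construction in problem 7 can be tested by building a diagonalizable $B = P\,\mathrm{diag}(c)\,P^{-1}$ with known eigenvalues; then the columns of $P$ are eigenvectors of $B$ and the columns of $(P^{-1})^t$ are eigenvectors of $B^t$ with the same eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

# A diagonalizable B with known eigenvalues c_1, c_2, c_3
P = rng.standard_normal((n, n))
c = np.array([1.0, 2.0, 5.0])
B = P @ np.diag(c) @ np.linalg.inv(P)

V = P                      # columns v_i: eigenvectors of B, eigenvalues c_i
W = np.linalg.inv(P).T     # columns w_j: eigenvectors of B^t, eigenvalues d_j = c_j
d = c

# Each A_ij = v_i w_j^t satisfies T(A_ij) = BA_ij - A_ij B = (c_i - d_j) A_ij
for i in range(n):
    for j in range(n):
        A_ij = np.outer(V[:, i], W[:, j])
        assert np.allclose(B @ A_ij - A_ij @ B, (c[i] - d[j]) * A_ij)
```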