Eigenspaces and Diagonalizable Transformations


Chapter 2. Eigenspaces and Diagonalizable Transformations

As we explored how heat states evolve under the action of a diffusion transformation E, we found that some heat states only change in amplitude. That is, when we apply the diffusion operator to one of these heat states, the result is a scalar multiple of the original heat state. Mathematically, we write

    Ev = λv,    (2.1)

for some scalar λ and one of these special heat states v ∈ H_m(R). We also saw that these heat state vectors satisfy the matrix equation

    (E - λI)v = 0.    (2.2)

Since this is a homogeneous equation, we know that it has a solution: either only the trivial solution v = 0, or infinitely many solutions. If we begin with a zero heat state (all temperatures the same everywhere along the rod), then the diffusion is trivial because nothing happens. It would be nice to find a nonzero vector satisfying the matrix Equation (2.2), because that gets us closer to the possibility of having a basis of these vectors. By Theorem 7.6.1, we know that this equation has a nonzero solution as long as det(E - λI) = 0.

Let us remember why we want such a basis. In the case of heat states, we recognize that if B = {v_1, v_2, ..., v_m} is a basis of these special vectors, so that Ev_i = λ_i v_i, and u(0) is our initial heat state, then we can write u(0) in

coordinates relative to B. That is, there are scalars α_1, α_2, ..., α_m so that

    u(0) = α_1 v_1 + α_2 v_2 + ... + α_m v_m.

Then, when we apply the diffusion operator to find the heat state u(1) one time step later, we get

    u(1) = Eu(0) = E(α_1 v_1 + α_2 v_2 + ... + α_m v_m)
         = α_1 Ev_1 + α_2 Ev_2 + ... + α_m Ev_m
         = α_1 λ_1 v_1 + α_2 λ_2 v_2 + ... + α_m λ_m v_m.

So, if we want to find u(k) for some time step k far into the future, we get

    u(k) = E^k u(0) = α_1 E^k v_1 + α_2 E^k v_2 + ... + α_m E^k v_m
         = α_1 λ_1^k v_1 + α_2 λ_2^k v_2 + ... + α_m λ_m^k v_m.

This expression requires no matrix multiplications, so u(k) can be computed directly, without computing each intermediate time-step result. We can easily predict the long-term behavior of the diffusion.

2.1 Eigenvectors and Eigenvalues

In linear algebra, we give nonzero vectors that satisfy Equation (2.1) a name.

Definition 2.1.1. Let (V, +, ·) be a vector space, let L : V → V be a linear transformation, and let v ∈ V be nonzero. If Lv = λv for some scalar λ, then we say v is an eigenvector of L with eigenvalue λ.

As with the heat states, we see that eigenvectors of a linear transformation only change amplitude (and possibly sign) when the transformation is applied to the vector. This makes repeated application of a linear transformation to its eigenvectors very simple.
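To make the eigenbasis computation above concrete, here is a small numerical sketch. The matrix E and initial state u0 below are made-up stand-ins for the diffusion operator and heat state, not the book's actual matrices:

```python
import numpy as np

# A small symmetric stand-in for the diffusion operator E (hypothetical values).
E = np.array([[0.50, 0.25, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.25, 0.50]])
u0 = np.array([1.0, 0.0, -1.0])   # made-up initial heat state u(0)

lam, V = np.linalg.eigh(E)        # columns of V form an eigenbasis of E

# Coordinates alpha of u(0) relative to the eigenbasis: solve V @ alpha = u0.
alpha = np.linalg.solve(V, u0)

# u(k) = sum_i alpha_i * lam_i^k * v_i  -- no repeated matrix products needed.
k = 10
u_k = V @ (alpha * lam**k)

# Agrees with k explicit applications of E.
assert np.allclose(u_k, np.linalg.matrix_power(E, k) @ u0)
```

The eigenbasis route costs one decomposition up front; every later time step is then a componentwise scaling rather than a matrix multiply.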

Important Note: Throughout our discussion of eigenvectors and eigenvalues, we will assume that L is a linear operator on an n-dimensional vector space V; that is, L : V → V is linear. We will also let M be a matrix representation of L relative to some basis. The reader should be aware of whether we are working in the vector space V or in an associated coordinate space.

Example 2.1.1. Consider the linear transformation L : R² → R² defined by Lx = Ax, where

    A = [ 1  1 ]
        [ 3 -1 ].

The vector v = (1, 1)ᵀ is an eigenvector of L with eigenvalue 2 because

    Lv = Av = [ 1  1 ] [ 1 ] = [ 2 ] = 2v.
              [ 3 -1 ] [ 1 ]   [ 2 ]

The vector w = (1, 2)ᵀ is not an eigenvector of L because

    Lw = Aw = [ 1  1 ] [ 1 ] = [ 3 ] ≠ λ [ 1 ] = λw
              [ 3 -1 ] [ 2 ]   [ 1 ]     [ 2 ]

for any scalar λ.

Example 2.1.2. Consider the vector space of 7-bar LCD characters, V = D(Z_2), and the linear transformation T on V whose action is to flip any character upside-down. We see that any nonzero character x with up-down symmetry is an eigenvector of T with eigenvalue 1, because Tx = 1 · x = x.

Notice that in order to find the eigenvalues and eigenvectors of an operator, we need only solve the equation (L - λI)v = 0, or the equivalent matrix equation (M - λI)v = 0. The following definition and theorem provide a method for finding eigenvalues.
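As an aside, the arithmetic in Example 2.1.1 is easy to check numerically; a quick sketch:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [3.0, -1.0]])
v = np.array([1.0, 1.0])
w = np.array([1.0, 2.0])

# v is an eigenvector with eigenvalue 2: Av = (2, 2) = 2v.
assert np.allclose(A @ v, 2 * v)

# w is not an eigenvector: Aw = (3, 1) is not parallel to w = (1, 2),
# since the matrix with columns Aw and w has nonzero determinant.
Aw = A @ w
assert abs(np.linalg.det(np.column_stack([Aw, w]))) > 1e-9
```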

Definition 2.1.2. The function f(x) = det(M - xI) is called the characteristic polynomial of L (and of M).

Theorem 2.1.1. A scalar λ is an eigenvalue of L (and of M) if and only if λ is a zero of the characteristic polynomial of L.

Proof. Suppose λ is an eigenvalue of L. Then Lv = λv for some nonzero vector v, so (L - λI)v = 0 has nontrivial solutions. Since this homogeneous system has more than just the trivial solution, we must have det(L - λI) = 0. That is, λ is a zero of the characteristic polynomial det(L - λI). Now suppose that λ is a zero of the characteristic polynomial. Then det(L - λI) = 0, and (L - λI)v = 0 has some nontrivial solution v. That is, there exists a nonzero v such that Lv = λv, and λ is an eigenvalue of L.

It is critical to note that eigenvalues are elements of the scalar set over which scalar-vector multiplication is defined. Consider, for example, the characteristic polynomial (λ - 4)(λ² + 1). If the scalars are λ ∈ R, then there is only one zero, λ = 4. There are other zeros, λ = ±i, but not in the context of real-valued vector spaces. Finally, consider the characteristic polynomial (λ - 4)(λ - 1)(λ - 1). There are three real zeros, λ = 1, 1, 4.

A typical procedure for finding eigenvalues begins by finding the zeros of the characteristic polynomial. Then, eigenvectors are determined as solutions to Equation (2.2) for each given eigenvalue λ.

Example 2.1.3. Let us find the eigenvalues and eigenvectors of the matrix

    A = [  5   3 ]
        [ -6  -4 ].

First, we find the zeros of the characteristic polynomial:

    det(A - λI) = | 5-λ    3   |
                  | -6   -4-λ  |
                = (5 - λ)(-4 - λ) + 18
                = λ² - λ - 2
                = (λ + 1)(λ - 2) = 0
                ⟹  λ = -1, λ = 2.

So, now we have two eigenvalues, λ_1 = -1 and λ_2 = 2. (The subscripts on the λ's have nothing to do with the actual eigenvalues; we are just numbering them.) Using these eigenvalues and Equation (2.2), we can find the corresponding eigenvectors. Let us start with λ_1 = -1. We want to find v so that (A - (-1)I)v = 0. We can set this up as a system of equations:

    (A + I)v = [  6  3 ] [ x ] = [ 0 ]        6x + 3y = 0
               [ -6 -3 ] [ y ]   [ 0 ]   ⟹  -6x - 3y = 0

We can see that this system has infinitely many solutions (that is what we expected), and the solution space is

    E_1 = { (x, -2x)ᵀ : x ∈ R } = Span{ (1, -2)ᵀ }.

Using the same process, we can find the eigenvectors corresponding to λ_2 = 2:

    (A - 2I)v = [  3  3 ] [ x ] = [ 0 ]        3x + 3y = 0
                [ -6 -6 ] [ y ]   [ 0 ]   ⟹  -6x - 6y = 0.

So, the solution space of this system is

    E_2 = { (x, -x)ᵀ : x ∈ R } = Span{ (1, -1)ᵀ }.

We can verify that every nonzero vector in E_2 is an eigenvector of A with eigenvalue λ_2 = 2. Notice,

    Av = [  5   3 ] [  α ] = [  2α ] = 2v = λ_2 v.
         [ -6  -4 ] [ -α ]   [ -2α ]

The reader can verify that any nonzero vector in E_1 is an eigenvector of A with eigenvalue λ_1 = -1.

Example 2.1.4. Let us examine the linear transformation L : P_2(R) → P_2(R) defined by

    L(a + bx + cx²) = (a - b + c) + (a + b - c)x + (-a + b + c)x².

Consider the standard basis B = {1, x, x²} for P_2(R). We have the matrix representation of L relative to this basis:

    M = [  1  -1   1 ]
        [  1   1  -1 ]
        [ -1   1   1 ].

M has rank 3 and is invertible, so L is bijective. Thus, Null(L) = {0} and Ran(L) = P_2(R). Next, we seek any eigenvalues by finding the zeros of the characteristic polynomial:

    det(M - λI) = det [ 1-λ  -1    1  ]
                      [  1   1-λ  -1  ] = (1 - λ)(λ² - 2λ + 4) = 0.
                      [ -1    1   1-λ ]

We see that the characteristic polynomial has just one real root, λ = 1. To find eigenvectors with eigenvalue 1, we solve (M - λI)v = 0 with λ = 1:

    [  0  -1   1 ] [ x ]   [ 0 ]
    [  1   0  -1 ] [ y ] = [ 0 ].
    [ -1   1   0 ] [ z ]   [ 0 ]

This system of equations has solution set E_M = Span{ (1, 1, 1)ᵀ }. Any nonzero vector in E_M is an eigenvector of M. What are the eigenvectors of L? We used the standard basis to obtain M from L, so [v]_B = (1, 1, 1)ᵀ tells us that v = 1(1) + 1(x) + 1(x²) = 1 + x + x². Thus,

    E_L = Span{ 1 + x + x² }.

Indeed,

    Lv = L(1 + x + x²)
       = (1 - 1 + 1) + (1 + 1 - 1)x + (-1 + 1 + 1)x²
       = 1 + x + x²
       = 1 · v = λv.
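The hand computations in the two examples above can be verified numerically; a sketch (np.linalg.eig returns eigenvalues in no guaranteed order, so we sort before comparing):

```python
import numpy as np

# Example 2.1.3: A should have eigenvalues -1 and 2,
# with eigenspaces spanned by (1, -2) and (1, -1).
A = np.array([[5.0, 3.0],
              [-6.0, -4.0]])
lam = np.linalg.eigvals(A)
assert np.allclose(np.sort(lam.real), [-1.0, 2.0])
assert np.allclose(A @ np.array([1.0, -2.0]), -1 * np.array([1.0, -2.0]))
assert np.allclose(A @ np.array([1.0, -1.0]), 2 * np.array([1.0, -1.0]))

# Example 2.1.4: M represents L on P_2(R); [v]_B = (1, 1, 1), i.e. v = 1 + x + x²,
# is fixed by M, and λ = 1 is the only real root of the characteristic polynomial.
M = np.array([[1.0, -1.0, 1.0],
              [1.0, 1.0, -1.0],
              [-1.0, 1.0, 1.0]])
assert np.allclose(M @ np.ones(3), np.ones(3))
mu = np.linalg.eigvals(M).astype(complex)
real_roots = mu[np.isclose(mu.imag, 0.0)].real
assert np.allclose(real_roots, [1.0])
```

Filtering on the imaginary part mirrors the remark about scalar sets: the complex roots 1 ± i√3 are not eigenvalues over R.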

Suppose v is an eigenvector of a linear transformation L with eigenvalue λ = 0. Because Lv = λv = (0)v = 0, we have v ∈ Null(L). However, if w ∈ Null(L), it does not necessarily follow that w is an eigenvector of L. Why? Because 0 ∈ Null(L), but w = 0 cannot be an eigenvector by definition. Even so, in our examples thus far it has been useful to express eigenvectors as elements of subspaces.

Definition 2.1.3. Let λ be an eigenvalue of L. Then

    E_λ = { v ∈ V : Lv = λv } = Null(L - λI)

is called the eigenspace of L (and of M) corresponding to the eigenvalue λ.

Exercise 24 asks the reader to show that every eigenspace of a linear transformation L is a subspace of V.

2.2 Eigenbasis

Now that we can find the eigenvalues and eigenvectors of a linear transformation, we would like to determine whether a basis of eigenvectors exists. Computations, such as state evolution, are much simpler when using a basis of eigenvectors. Since a basis consists of linearly independent vectors, the next theorem is a giant step in the right direction.

Theorem 2.2.1. Let λ_1, λ_2, ..., λ_k be distinct eigenvalues of L with a set of corresponding eigenvectors v_1, v_2, ..., v_k. Then {v_1, v_2, ..., v_k} is linearly independent.

Proof (by induction). Suppose k = 1; then {v_1} is linearly independent because v_1 ≠ 0. Now, assume that {v_1, v_2, ..., v_k} is linearly independent for k ≥ 1 distinct eigenvalues λ_1, λ_2, ..., λ_k. We show that {v_1, v_2, ..., v_k, v_{k+1}} is linearly independent when λ_{k+1} ≠ λ_j for 1 ≤ j ≤ k. Since v_{k+1} ∈ Null(L - λ_{k+1}I), we need only show that no nonzero vector w ∈

Span{v_1, v_2, ..., v_k} is in Null(L - λ_{k+1}I). Let w = α_1 v_1 + α_2 v_2 + ... + α_k v_k. Then

    (L - λ_{k+1}I)w = (L - λ_{k+1}I)(α_1 v_1 + α_2 v_2 + ... + α_k v_k)
                    = (λ_1 - λ_{k+1})α_1 v_1 + (λ_2 - λ_{k+1})α_2 v_2 + ... + (λ_k - λ_{k+1})α_k v_k.

Since the eigenvalues are distinct, if any α_j ≠ 0 then w ∉ Null(L - λ_{k+1}I). Thus, any nonzero w is not in Null(L - λ_{k+1}I), so v_{k+1} ∉ Span{v_1, v_2, ..., v_k}, and {v_1, v_2, ..., v_k, v_{k+1}} is linearly independent.

Corollary 2.2.1. If L has n distinct eigenvalues λ_1, λ_2, ..., λ_n with corresponding eigenvectors v_1, v_2, ..., v_n, then {v_1, v_2, ..., v_n} is a basis for V.

Proof. See Exercise 23.

When we are searching for eigenvectors and eigenvalues of an n×n matrix M, we are really considering the linear transformation L : Rⁿ → Rⁿ defined by Lv = Mv. The eigenvectors are then vectors in the domain of L that are scaled (by the eigenvalue) when we apply the linear transformation L to them. That is, v is an eigenvector with corresponding eigenvalue λ ∈ R if v ∈ Rⁿ is nonzero and Lv = λv. An eigenbasis is just a basis for Rⁿ made up of eigenvectors. Similarly, when searching for eigenvectors and eigenvalues of a general linear transformation L : V → V, eigenvectors of L are domain vectors that are scaled by a scalar under the action of L. An eigenbasis for V is a basis made up of eigenvectors of L.

Definition 2.2.1. A basis for V consisting of eigenvectors of L is called an eigenbasis of L for V.

Let us look back at Example 2.1.3. Notice that if we create a set out of the basis elements for both of the eigenspaces E_1 and E_2, we get the set B = { (1, -2)ᵀ, (1, -1)ᵀ }, which is a basis for R².
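Since an eigenspace is the null space E_λ = Null(M - λI), a basis for it can be computed numerically from the singular value decomposition of M - λI. The helper below is a sketch; the name eigenspace_basis and the tolerance are my choices, not from the text:

```python
import numpy as np

def eigenspace_basis(M, lam, tol=1e-10):
    """Return a matrix whose columns form an orthonormal basis for
    Null(M - lam*I): the right singular vectors of M - lam*I whose
    singular values are numerically zero."""
    n = M.shape[0]
    _, s, Vt = np.linalg.svd(M - lam * np.eye(n))
    return Vt[s < tol].conj().T

# Sketch on Example 2.1.3: E_2 should be the line spanned by (1, -1).
A = np.array([[5.0, 3.0], [-6.0, -4.0]])
B = eigenspace_basis(A, 2.0)
assert B.shape == (2, 1)                      # one-dimensional eigenspace
assert np.allclose(A @ B[:, 0], 2 * B[:, 0])  # its vectors satisfy Av = 2v
```

Passing a value of lam that is not an eigenvalue simply returns a matrix with zero columns, since M - λI then has a trivial null space.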

In Example 2.1.4, we found that the transformation L had only one eigenvalue. As L is a transformation on a space of dimension 3, Corollary 2.2.1 does not apply, and we do not yet know whether we can obtain an eigenbasis.

Example 2.2.1. Consider the matrix

    A = [ 2 0 0 ]
        [ 0 3 1 ]
        [ 0 0 3 ].

We want to know whether A has an eigenbasis for R³. We begin by finding the eigenvalues and eigenvectors. That is, we want to know for which nonzero vectors v and scalars λ we have Av = λv. We know that this has a nonzero solution v if det(A - λI) = 0. So, we solve for λ in

    | 2-λ   0    0  |
    |  0   3-λ   1  | = (2 - λ)(3 - λ)(3 - λ) = 0.
    |  0    0   3-λ |

Thus, for λ_1 = 2 and λ_2 = 3, there is a nonzero v so that Av = λv. We now find the corresponding eigenvectors (really, the corresponding eigenspaces). That is, we want to find the solutions to (A - 2I)v = 0 and (A - 3I)v = 0.

(λ_1 = 2) In this case, we are solving the matrix equation

    (A - 2I)v = [ 0 0 0 ] [ a ]   [ 0 ]
                [ 0 1 1 ] [ b ] = [ 0 ].
                [ 0 0 1 ] [ c ]   [ 0 ]

Reducing the corresponding augmented matrix shows that a is a free variable and b = c = 0. Thus,

    E_1 = Span{ (1, 0, 0)ᵀ }.

(λ_2 = 3) Now, in this case, we are solving the matrix equation

    (A - 3I)v = [ -1 0 0 ] [ a ]   [ 0 ]
                [  0 0 1 ] [ b ] = [ 0 ].
                [  0 0 0 ] [ c ]   [ 0 ]

Reducing the corresponding augmented matrix shows that b is a free variable and a = c = 0. Thus,

    E_2 = Span{ (0, 1, 0)ᵀ }.

Notice that in this example we have only two linearly independent eigenvectors, v_1 = (1, 0, 0)ᵀ and v_2 = (0, 1, 0)ᵀ. Any additional eigenvector would need to be either in E_1 (and therefore a scalar multiple of v_1) or in E_2 (and therefore a scalar multiple of v_2). This means that we cannot find an eigenbasis of A for R³.

2.3 Diagonalizable Transformations

Our explorations in heat diffusion have shown us that if B = {v_1, v_2, ..., v_n} is an eigenbasis for Rⁿ corresponding to the diffusion matrix E, then we can write any initial heat state vector v ∈ Rⁿ as

    v = α_1 v_1 + α_2 v_2 + ... + α_n v_n.

Suppose these eigenvectors have eigenvalues λ_1, λ_2, ..., λ_n, respectively. Then, with this decomposition into eigenvectors, we can find the heat state at any later time (say, k time steps later) by multiplying the initial heat state by E^k. This becomes an easy computation because, using the linearity of matrix multiplication,

    E^k v = E^k(α_1 v_1 + α_2 v_2 + ... + α_n v_n) = α_1 λ_1^k v_1 + α_2 λ_2^k v_2 + ... + α_n λ_n^k v_n.    (2.3)

We can then apply our knowledge of limits from calculus to find the long-term behavior. That is, the long-term behavior is

    lim_{k→∞} ( α_1 λ_1^k v_1 + α_2 λ_2^k v_2 + ... + α_n λ_n^k v_n ).

We see that this limit really depends on the size of the eigenvalues. But we have also seen that if we change to an eigenbasis, the diffusion works out very nicely as well. Let us remind ourselves how we went about that. First, we notice that if we want all computations in the eigenbasis, we have to reconsider the diffusion matrix transformation as well. That means we want the matrix transformation that does the same thing that Lv = Ev does, but created using the eigenbasis. That is, we want the matrix representation of the linear transformation that takes a coordinate vector [v]_E (where E = {v_1, v_2, ..., v_n} is the eigenbasis) and maps it to [Ev]_E. Let us call this matrix Ẽ. What we are saying is that we want

    Ẽ[v]_E = [Ev]_E.

As always, the columns of this matrix are the vectors that result from applying the transformation to the current basis elements (in E). Thus, the columns of Ẽ are [Ev_1]_E, [Ev_2]_E, ..., [Ev_n]_E. But

    [Ev_1]_E = [λ_1 v_1]_E = λ_1 [v_1]_E = (λ_1, 0, ..., 0)ᵀ,
    [Ev_2]_E = [λ_2 v_2]_E = λ_2 [v_2]_E = (0, λ_2, 0, ..., 0)ᵀ,
    ...
    [Ev_n]_E = [λ_n v_n]_E = λ_n [v_n]_E = (0, ..., 0, λ_n)ᵀ.

So, we found

    Ẽ = [ λ_1   0   ...   0  ]
        [  0   λ_2  ...   0  ]
        [  ⋮         ⋱    ⋮  ]
        [  0    0   ...  λ_n ].
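Numerically, the claim is that the representation of E in its own eigenbasis is diagonal, with the eigenvalues on the diagonal. A sketch with a made-up symmetric matrix standing in for E:

```python
import numpy as np

# A made-up diagonalizable matrix standing in for the diffusion operator.
E = np.array([[0.6, 0.2],
              [0.2, 0.6]])
lam, V = np.linalg.eigh(E)          # columns of V form an eigenbasis of E

# Changing coordinates into the eigenbasis conjugates E by V:
E_tilde = np.linalg.inv(V) @ E @ V

# E_tilde is diagonal with the eigenvalues on its diagonal.
assert np.allclose(E_tilde, np.diag(lam))
```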

Knowing that a change of basis is a linear transformation (actually, an isomorphism), we can find its matrix representation (usually known as a change of basis matrix). Let us call this matrix Q and see how this works. We know that

    Q[v]_E = [v]_S.

This means that if we are given a coordinate vector with respect to the basis E, this transformation will output a coordinate vector with respect to the standard basis S. Recall, to get the coordinate vector in the new basis, we solve for the coefficients in

    v = α_1 v_1 + α_2 v_2 + ... + α_n v_n.

Then

    [v]_E = (α_1, α_2, ..., α_n)ᵀ.

Our favorite way to solve this is to set up the matrix equation

    [ v_1 v_2 ... v_n ] [ α_1 ]
                        [ α_2 ] = v.
                        [  ⋮  ]
                        [ α_n ]

Notice that this is the transformation written in matrix form. The matrix representation that takes a coordinate vector with respect to the basis E to a coordinate vector with respect to the standard basis is

    Q = [ v_1 v_2 ... v_n ].

So the matrix representation of the transformation that changes from the eigenbasis E to the standard basis is given by Q. Let us use that to rewrite Ẽ[u]_E = [v]_E. That is, [u]_E = Q⁻¹u and [v]_E = Q⁻¹v for some u and v in the standard basis. So, we have:

    Q⁻¹(u(t + Δt)) = Q⁻¹(Eu(t)) = Ẽ Q⁻¹ u(t),

    u(t + Δt) = Eu(t) = QẼQ⁻¹ u(t).

It is straightforward to show that for time step k:

    u(t + kΔt) = E^k u(t) = Q(Ẽ)^k Q⁻¹ u(t) = Q diag(λ_1^k, λ_2^k, ..., λ_n^k) Q⁻¹ u(t),

    Q⁻¹ u(t + kΔt) = diag(λ_1^k, λ_2^k, ..., λ_n^k) Q⁻¹ u(t),

    [u(t + kΔt)]_E = diag(λ_1^k, λ_2^k, ..., λ_n^k) [u(t)]_E,

where diag(λ_1^k, ..., λ_n^k) denotes the diagonal matrix with entries λ_1^k, ..., λ_n^k. We see that when vectors are represented as coordinate vectors with respect to an eigenbasis, the transformation is diagonal. Of course, all of this is dependent on having an eigenbasis for Rⁿ. Exercise 21 below gives the necessary tools to show that we indeed have an eigenbasis for the diffusion transformation.

Following the same procedure in a general setting, let us see what this means in any context. That is, we want to know when we can actually decompose a matrix A into a matrix product QDQ⁻¹, where Q is invertible and D is diagonal. Notice from above that to form the columns of Q we use the eigenvectors of A. This means that as long as we can find an eigenbasis {v_1, v_2, ..., v_n}, then

    Q = [ v_1 v_2 ... v_n ].

The invertibility of Q follows directly from the fact that Ran(Q) = Span{v_1, v_2, ..., v_n} = Rⁿ and Theorem 7.6.1.
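The k-step formula is easy to exercise numerically; here is a sketch using the 2×2 matrix from Example 2.1.3 (any diagonalizable matrix would do):

```python
import numpy as np

A = np.array([[5.0, 3.0],
              [-6.0, -4.0]])       # diagonalizable: eigenvalues -1 and 2
lam, Q = np.linalg.eig(A)          # columns of Q form an eigenbasis

# A^k = Q D^k Q^{-1}: one diagonal power plus two changes of basis.
k = 8
Ak = Q @ np.diag(lam**k) @ np.linalg.inv(Q)
assert np.allclose(Ak, np.linalg.matrix_power(A, k))
```

Raising the diagonal matrix to the kth power just raises each λ_i to the kth power, which is the computational payoff of diagonalization.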

Definition 2.3.1. L is called diagonalizable if there is an ordered basis B for V such that [L]_B is a diagonal matrix.

Definition 2.3.2. Given an n×n matrix A, we say that A is diagonalizable if there exist an invertible matrix Q and a diagonal matrix D so that A = QDQ⁻¹.

Before we look at some examples, we have the tools to make three very important statements about the diagonalizability of linear transformations. The first theorem provides an existence test for diagonalizability, requiring only that one compute and test the set of eigenvalues.

Theorem 2.3.1. If L has n distinct eigenvalues, then L is diagonalizable.

Proof. The proof follows from Corollary 2.2.1 and the discussion above. See Exercise 25.

The second theorem provides a somewhat less desirable test for diagonalizability. It tells us that if there are n eigenvalues (some as repeated roots of the characteristic equation), the only way to know whether L is diagonalizable is to determine whether an eigenbasis exists for V.

Theorem 2.3.2. L is diagonalizable if and only if L has n linearly independent eigenvectors.

Proof. See Exercise 26.

Corollary 2.3.1. If the characteristic equation of M has fewer than n (scalar) roots, then L is not diagonalizable.

Proof. If the characteristic equation of M has fewer than n (scalar) roots, then we cannot form a linearly independent set of n eigenvectors. Thus, by Theorem 2.3.2, L is not diagonalizable.

Example 2.3.1. Let

    A = [ 1 1 0 ]
        [ 1 1 0 ]
        [ 2 3 1 ].

We want to determine whether A is diagonalizable. To do this, we need to find the eigenvalues and eigenvectors of A. That is, we want to solve Av = λv for both v and λ. It tends to be easier to find λ first, so that is what we will do. (A - λI)v = 0 has infinitely many solutions when det(A - λI) = 0. In our case,

    det(A - λI) = | 1-λ   1    0  |
                  |  1   1-λ   0  |
                  |  2    3   1-λ |
                = (1 - λ)( (1 - λ)² - 1 )
                = (1 - λ)λ(λ - 2) = 0.

Thus, λ = 0, 1, 2. Now, because there are three distinct eigenvalues, Corollary 2.2.1 tells us that there are at least three eigenvectors that are linearly independent. Thus, A is diagonalizable. Let us find the eigenbasis and decompose A into QDQ⁻¹.

(λ = 0) We want to find v so that Av = 0. We will solve this equation by reducing an augmented matrix:

    [ 1 1 0 | 0 ]
    [ 1 1 0 | 0 ]  ⟹  v = c(1, -1, 1)ᵀ, c ∈ R.
    [ 2 3 1 | 0 ]

(λ = 1) We want to find v so that (A - I)v = 0. Again, we will solve this equation by reducing an augmented matrix:

    [ 0 1 0 | 0 ]
    [ 1 0 0 | 0 ]  ⟹  v = c(0, 0, 1)ᵀ, c ∈ R.
    [ 2 3 0 | 0 ]

(λ = 2) We want to find v so that (A - 2I)v = 0. Once again, we will solve this equation by reducing an augmented matrix:

    [ -1  1  0 | 0 ]
    [  1 -1  0 | 0 ]  ⟹  v = c(1/5, 1/5, 1)ᵀ, c ∈ R.
    [  2  3 -1 | 0 ]

From these, we can form the eigenbasis

    { (1, -1, 1)ᵀ, (0, 0, 1)ᵀ, (1/5, 1/5, 1)ᵀ }.

We also know

    Q = [  1  0  1/5 ]          [ 0 0 0 ]
        [ -1  0  1/5 ]  and D = [ 0 1 0 ].
        [  1  1   1  ]          [ 0 0 2 ]

We leave it to the reader to find Q⁻¹ and show that QDQ⁻¹ = A.

Notice that in Example 2.2.1 there are only two eigenvalues, and we saw there that we cannot form an eigenbasis for R³; thus that matrix is not diagonalizable. This might lead someone to think that we can just count the eigenvalues instead of the eigenvectors. Let us see an example where this is not the case.

Example 2.3.2. Let

    A = [ 3 1 0 ]
        [ 1 3 0 ]
        [ 2 2 2 ].

We can find the eigenvalues and eigenvectors of A to determine whether A is diagonalizable. Let us step through the same steps. First, we solve the characteristic equation det(A - λI) = 0:

    | 3-λ   1    0  |
    |  1   3-λ   0  | = (2 - λ)( (3 - λ)² - 1 ) = 0.
    |  2    2   2-λ |

So, λ = 2, 4. We only have two eigenvalues. Let us find the corresponding eigenspaces.

(λ = 4) We will solve (A - 4I)v = 0:

    [ -1  1  0 ]              [ 1 0 -1/2 ]
    [  1 -1  0 ]  reduces to  [ 0 1 -1/2 ].
    [  2  2 -2 ]              [ 0 0   0  ]

So, the eigenspace is

    Span{ (1, 1, 2)ᵀ }.

(λ = 2) We will solve (A - 2I)v = 0:

    [ 1 1 0 ]
    [ 1 1 0 ]  ⟹  x + y = 0, z free.
    [ 2 2 0 ]

So, the eigenspace is

    Span{ (-1, 1, 0)ᵀ, (0, 0, 1)ᵀ }.

Notice that even though we have only two eigenvalues, we still have a set of three linearly independent eigenvectors. So, A is diagonalizable with A = MDM⁻¹, where

    M = [ 1 -1 0 ]          [ 4 0 0 ]
        [ 1  1 0 ]  and D = [ 0 2 0 ].
        [ 2  0 1 ]          [ 0 0 2 ]

2.4 Applications: Harmonic Motion

This section will contain a basic development of solutions of ordinary differential equations that model harmonic motion.

2.5 Exercises

For each of the following, find the eigenvectors and their corresponding eigenvalues and eigenspaces.

1. A = [ 2 3 ]
       [ 5 6 ]

2. A = [ 2 3 ]
       [ 5 3 ]

3. A = 2 3

4. A = 2 5 4 2

5. A = 3 4 2 2 3 2

6. A = 2 5 4 2

   L( [ a  b ] ) = (a - d)x² + (b + d)x + c
      [ c  d ]

Determine which of the following matrices is diagonalizable. Whenever it is, write out the diagonalization of A.

7. A = [ 2 3 ]
       [ 5 6 ]

8. A = [ 2 3 ]
       [ 5 3 ]

9. A = 2 3

10. A = 2 5 4 2

11. A = 3 4 2 2 3 2

12. A = 2 5 4 2

Prove or disprove the following statements.

13. If v is an eigenvector of M, then vᵀ is an eigenvector of Mᵀ.

14. If λ is an eigenvalue of M, then λ is also an eigenvalue of Mᵀ.

15. If λ is an eigenvalue of M, then λ² is an eigenvalue of M².

16. If u and v are eigenvectors of M, then u + v is also an eigenvector of M.

17. If u is an eigenvector of M, then αu is also an eigenvector of M, for any α.

Additional Exercises.

18. Why does Theorem 2.2.1 tell us that, given a matrix with bases E_1, E_2, ..., E_k for k eigenspaces, the union E_1 ∪ E_2 ∪ ... ∪ E_k is a linearly independent set?

19. Show that QDQ⁻¹ = A in Example 2.3.1.

20. Prove that the eigenvalues of a diagonal matrix are the diagonal entries.

21. Consider the heat diffusion operator E : Rᵐ → Rᵐ with standard basis matrix representation

    E = [ 1-2δ    δ                     ]
        [   δ   1-2δ    δ               ]
        [          δ  1-2δ    δ         ]
        [                ⋱    ⋱    ⋱    ],

where 0 < δ < 1/4. Show that the kth eigenvector v_k (1 ≤ k ≤ m) is given by

    v_k = ( sin(πk/(m+1)), sin(2πk/(m+1)), sin(3πk/(m+1)), ..., sin((m-1)πk/(m+1)), sin(mπk/(m+1)) ),

and provide the kth eigenvalue. Discuss the relative sizes of the eigenvalues. Is the matrix diagonalizable?

22. Complete Example 2.1.2 by finding all eigenvalues and bases for each eigenspace. Is the transformation diagonalizable?

23. Prove Corollary 2.2.1.

24. Prove that every eigenspace E_λ of a linear transformation L : V → V is a subspace of V.

25. Prove Theorem 2.3.1.

26. Prove Theorem 2.3.2.

27. Consider a bijective diagonalizable transformation T = QDQ⁻¹, where D is diagonal. Show that the inverse transformation is T⁻¹ = QD⁻¹Q⁻¹.

28. Let L : F(R) → F(R) be defined by L(f(x)) = f′(x). What are the eigenvectors (eigenfunctions) of this transformation? (Here, we are letting F(R) = { f : R → R | f is differentiable and continuous }.)
