Sec 6.1 Eigenvalues and Eigenvectors

Definition. An eigenvector of an $n \times n$ matrix $A$ is a nonzero vector $x$ such that $Ax = \lambda x$ for some scalar $\lambda$. A scalar $\lambda$ is called an eigenvalue of $A$ if there is a nontrivial solution $x$ of $Ax = \lambda x$; such an $x$ is called an eigenvector corresponding to $\lambda$.

Remark. By definition an eigenvector must be a nonzero vector, but an eigenvalue may be zero.

Ex 1. Let $A = \begin{pmatrix} 3 & -2 \\ 1 & 0 \end{pmatrix}$ and $u = \begin{pmatrix} 2 \\ 1 \end{pmatrix}$; then $Au = \begin{pmatrix} 4 \\ 2 \end{pmatrix} = 2u$. Therefore $2$ is an eigenvalue of $A$, and $u$ is an eigenvector of $A$ corresponding to the eigenvalue $2$.

Remark. $u$ is not the only eigenvector of $A$ corresponding to the eigenvalue $2$. For example, $\begin{pmatrix} 6 \\ 3 \end{pmatrix}$ is another eigenvector of $A$ corresponding to $2$. In fact, if $u$ is an eigenvector of $A$ corresponding to an eigenvalue $\lambda$, then so is any nonzero multiple $cu$ of $u$.

Ex 2. Show that $\lambda = -1$ is an eigenvalue of the matrix $A = \begin{pmatrix} 2 & 3 \\ 3 & 2 \end{pmatrix}$.

Ex 3. Let $B = \begin{pmatrix} 1 & 6 \\ 5 & 2 \end{pmatrix}$. Is $u = \begin{pmatrix} 6 \\ -5 \end{pmatrix}$ an eigenvector of $B$? How about $v = \begin{pmatrix} 3 \\ -2 \end{pmatrix}$?

Remark. A matrix may have more than one eigenvalue.

Ex 4. In Ex 3 above, we observed that $\lambda = -4$ is an eigenvalue of $B$. However, $w = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ is an eigenvector of $B$ corresponding to the eigenvalue $7$, since $Bw = \begin{pmatrix} 1 & 6 \\ 5 & 2 \end{pmatrix} \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 7 \\ 7 \end{pmatrix} = 7w$.
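Checks like Ex 3 ("is $u$ an eigenvector of $B$?") amount to computing $Bu$ and testing whether the result is a scalar multiple of $u$. Here is a minimal Python sketch of that test; the helper names are my own, and the matrices and vectors are the ones from Ex 1 and Ex 3:

```python
def mat_vec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def is_eigenvector(A, x, tol=1e-12):
    """Return (True, lam) if A x = lam x for some scalar lam, else (False, None)."""
    if all(abs(xi) < tol for xi in x):
        return (False, None)              # eigenvectors must be nonzero
    y = mat_vec(A, x)
    # candidate lam read off from the first nonzero entry of x
    i = next(i for i, xi in enumerate(x) if abs(xi) >= tol)
    lam = y[i] / x[i]
    ok = all(abs(yi - lam * xi) < tol for xi, yi in zip(x, y))
    return (ok, lam if ok else None)

B = [[1, 6], [5, 2]]
print(is_eigenvector(B, [6, -5]))   # (True, -4.0): eigenvector for lambda = -4
print(is_eigenvector(B, [3, -2]))   # (False, None): Bv is not a multiple of v
```

The test mirrors the definition exactly: a single candidate $\lambda$ is computed from one coordinate, then every coordinate of $Ax$ is checked against $\lambda x$.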
Questions.
1. Why do we study eigenvalues and eigenvectors?
2. How do we find eigenvalues?
3. Given an eigenvalue, how do we find eigenvectors corresponding to that eigenvalue?

Let us begin with the last question.

Definition. Let $A$ be an $n \times n$ square matrix and $\lambda$ an eigenvalue of $A$. The null space of $A - \lambda I_n$ is called the eigenspace of $A$ corresponding to $\lambda$. The dimension of the eigenspace of $A$ corresponding to $\lambda$ is called the geometric multiplicity of $\lambda$.

Remark. The eigenspace of $A$ corresponding to $\lambda$ is a subspace of $\mathbb{R}^n$. Any nonzero vector in the eigenspace corresponding to $\lambda$ is an eigenvector corresponding to $\lambda$.

Ex 5. Let $A = \begin{pmatrix} 4 & -1 & 6 \\ 2 & 1 & 6 \\ 2 & -1 & 8 \end{pmatrix}$; then $2$ is an eigenvalue of $A$ with an eigenvector ____. Find the eigenspace of $A$ corresponding to $2$.

So every eigenvector of $A$ corresponding to $2$ can be written as a linear combination of ____ and ____. Conversely, every linear combination of these two vectors, except the zero vector, is an eigenvector corresponding to $2$.

Theorem. If $v_1, v_2, \ldots, v_k$ are eigenvectors that correspond to distinct eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_k$ of an $n \times n$ matrix $A$, then the vectors $v_1, v_2, \ldots, v_k$ are linearly independent.

Ex 6. Go back to Ex 4 above, where $B = \begin{pmatrix} 1 & 6 \\ 5 & 2 \end{pmatrix}$. We saw that $\begin{pmatrix} 6 \\ -5 \end{pmatrix}$ and $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$ are eigenvectors of $B$ corresponding to $-4$ and $7$, respectively. So the vectors $\begin{pmatrix} 6 \\ -5 \end{pmatrix}$, $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$ are linearly independent.
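Finding the eigenspace in Ex 5 means solving $(A - 2I_3)x = 0$, i.e. row-reducing $A - 2I_3$. The sketch below (my own helper, exact arithmetic via `Fraction`) computes only the rank of $A - 2I_3$; the geometric multiplicity then falls out as $n - \operatorname{rank}$, which here is $2$, matching the two-vector description of the eigenspace:

```python
from fractions import Fraction

def matrix_rank(M):
    """Rank via Gaussian elimination over the rationals."""
    M = [[Fraction(x) for x in row] for row in M]
    rank, rows, cols = 0, len(M), len(M[0])
    for c in range(cols):
        piv = next((r for r in range(rank, rows) if M[r][c] != 0), None)
        if piv is None:
            continue                       # no pivot in this column
        M[rank], M[piv] = M[piv], M[rank]
        for r in range(rows):
            if r != rank and M[r][c] != 0:
                f = M[r][c] / M[rank][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank

# A and the eigenvalue 2 from Ex 5
A = [[4, -1, 6], [2, 1, 6], [2, -1, 8]]
M = [[A[i][j] - (2 if i == j else 0) for j in range(3)] for i in range(3)]
geo_mult = 3 - matrix_rank(M)       # dim Nul(A - 2I) = n - rank(A - 2I)
print(geo_mult)                     # 2: the eigenspace is a plane in R^3
```

Here $A - 2I_3$ has three identical rows, so its rank is $1$ and the null space is two-dimensional.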
So, how can we find the eigenvalues of a given matrix?

Ex 7. Let $A = \begin{pmatrix} 2 & 3 \\ 3 & -6 \end{pmatrix}$. We want to find all the eigenvalues of $A$. By definition, $\lambda$ is an eigenvalue of $A$ if and only if $Ax = \lambda x$ for some nonzero vector $x$, which is the same as saying that the homogeneous system $(A - \lambda I_2)x = 0$ has a nontrivial solution. Therefore $\lambda$ is an eigenvalue of $A$ if and only if the matrix $A - \lambda I_2$ is not invertible, that is, if and only if $\det(A - \lambda I_2) = 0$. Now
$$A - \lambda I_2 = \begin{pmatrix} 2-\lambda & 3 \\ 3 & -6-\lambda \end{pmatrix},$$
and hence $\det(A - \lambda I_2) = (2-\lambda)(-6-\lambda) - 9 = \lambda^2 + 4\lambda - 21 = (\lambda - 3)(\lambda + 7)$. Therefore $3$ and $-7$, and only they, are eigenvalues of $A$.

In general, to find all the eigenvalues of a given $n \times n$ matrix $A$:
1. First, form $A - \lambda I_n$;
2. Second, compute $\det(A - \lambda I_n)$, which is a polynomial in $\lambda$ of degree $n$;
3. Third, solve the equation $\det(A - \lambda I_n) = 0$. The solution set of this equation is exactly the set of eigenvalues of $A$.

Definition. The polynomial $\det(A - \lambda I_n)$ is called the characteristic polynomial of $A$. The equation $\det(A - \lambda I_n) = 0$ is called the characteristic equation of $A$.

Ex 8. Find all the eigenvalues of $A = \begin{pmatrix} 5 & -2 & 6 & -1 \\ 0 & 3 & -8 & 0 \\ 0 & 0 & 5 & 4 \\ 0 & 0 & 0 & 1 \end{pmatrix}$.

In general, the eigenvalues of a triangular matrix are precisely the diagonal entries of the matrix.

Ex 9. True or false? Some $3 \times 3$ matrices can have $4$ distinct eigenvalues.

Ex 10. Find the eigenvalues of $A = \begin{pmatrix} 2 & 2 \\ 0 & 3 \end{pmatrix}$.
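For a $2 \times 2$ matrix the three-step recipe collapses to solving the quadratic $\lambda^2 - (\operatorname{tr} A)\lambda + \det A = 0$. A short Python sketch of that special case (the function name is mine), checked against Ex 7:

```python
import math

def eigenvalues_2x2(A):
    """Real eigenvalues of a 2x2 matrix via lambda^2 - (tr A) lambda + det A = 0."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det          # discriminant of the characteristic equation
    if disc < 0:
        return []                     # no real eigenvalues
    r = math.sqrt(disc)
    return sorted({(tr - r) / 2, (tr + r) / 2})

print(eigenvalues_2x2([[2, 3], [3, -6]]))   # [-7.0, 3.0], as found in Ex 7
```

Returning a sorted set means a repeated root shows up once; an empty list signals that the characteristic equation has no real solutions.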
Ex 11. Find the eigenvalues of $A = \begin{pmatrix} 7 & 2 \\ -4 & 1 \end{pmatrix}$.

Ex 12. Find the eigenvalues of $A = \begin{pmatrix} 3 & 2 \\ 2 & 3 \end{pmatrix}$.

Definition. Let $A$ and $B$ be $n \times n$ matrices. $A$ is said to be similar to $B$ if there is an invertible matrix $P$ such that $P^{-1}AP = B$. When $A$ is similar to $B$, we write $A \sim B$.

Ex 13. $A = \cdots$ is similar to $B = \cdots$: with $P = \cdots$ (so that $P^{-1} = \cdots$), one checks that $P^{-1}AP = B$.

Theorem. If $A$ and $B$ are similar $n \times n$ matrices, then they have the same characteristic polynomial and hence the same eigenvalues.

Proof.
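The theorem can be spot-checked numerically: for a $2 \times 2$ matrix the characteristic polynomial is $\lambda^2 - (\operatorname{tr} A)\lambda + \det A$, and both trace and determinant are unchanged by $A \mapsto P^{-1}AP$. A sketch using the matrices $A$ and $P$ that reappear in Sec 6.2 (helper names are mine; exact arithmetic with `Fraction`):

```python
from fractions import Fraction as F

def mat_mul(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv_2x2(P):
    (a, b), (c, d) = P
    det = F(a * d - b * c)            # must be nonzero for P to be invertible
    return [[d / det, -b / det], [-c / det, a / det]]

def char_poly_2x2(M):
    """Coefficients (1, -trace, det) of det(M - lambda I)."""
    (a, b), (c, d) = M
    return (1, -(a + d), a * d - b * c)

A = [[7, 2], [-4, 1]]
P = [[1, 1], [-1, -2]]
B = mat_mul(inv_2x2(P), mat_mul([[F(x) for x in row] for row in A], P))
print(char_poly_2x2(A) == char_poly_2x2(B))   # True: same characteristic polynomial
```

Here $B = P^{-1}AP$ turns out to be $\operatorname{diag}(5, 3)$, and both matrices have characteristic polynomial $\lambda^2 - 8\lambda + 15$.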
Summary. Suppose an $n \times n$ matrix $A$ is given.
1. To find all eigenvalues of $A$, solve the characteristic equation $\det(A - \lambda I_n) = 0$; the roots are precisely the eigenvalues.
2. For each eigenvalue $\lambda$, to get an eigenvector corresponding to $\lambda$, find the null space of $A - \lambda I_n$ (the eigenspace corresponding to $\lambda$) by solving the associated homogeneous system $(A - \lambda I_n)x = 0$; any nonzero vector in the null space of $A - \lambda I_n$ is an eigenvector corresponding to $\lambda$.
Sec 6.2 Diagonalization of Matrices

Motivation: In many applications we need to compute $A^k$ for large $k$. For a general $n \times n$ matrix $A$ this requires a lot of computation. When $A$ is diagonal, however, the computation is quite simple.

Ex 1. If $D = \begin{pmatrix} 5 & 0 \\ 0 & 3 \end{pmatrix}$, then $D^2 = \begin{pmatrix} 25 & 0 \\ 0 & 9 \end{pmatrix}$. In general, $D^n = \begin{pmatrix} 5^n & 0 \\ 0 & 3^n \end{pmatrix}$.

Ex 2. Compute $A^{10}$, where $A = \begin{pmatrix} 7 & 2 \\ -4 & 1 \end{pmatrix}$.

It takes too much time to compute $A^{10}$ with bare hands. Consider the matrix $P = \begin{pmatrix} 1 & 1 \\ -1 & -2 \end{pmatrix}$; then $P$ is invertible with $P^{-1} = \begin{pmatrix} 2 & 1 \\ -1 & -1 \end{pmatrix}$, and
$$P^{-1}AP = \begin{pmatrix} 2 & 1 \\ -1 & -1 \end{pmatrix} \begin{pmatrix} 7 & 2 \\ -4 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ -1 & -2 \end{pmatrix} = \begin{pmatrix} 5 & 0 \\ 0 & 3 \end{pmatrix}.$$
In other words, $A$ is similar to the diagonal matrix $D = \begin{pmatrix} 5 & 0 \\ 0 & 3 \end{pmatrix}$.

Take the 10th power of both sides.
LHS: $(P^{-1}AP)^{10} = \underbrace{(P^{-1}AP)(P^{-1}AP)\cdots(P^{-1}AP)}_{10 \text{ times}} = P^{-1}A(PP^{-1})A(PP^{-1})A\cdots(PP^{-1})AP$, which reduces to $P^{-1}A^{10}P$.
RHS: $D^{10} = \begin{pmatrix} 5^{10} & 0 \\ 0 & 3^{10} \end{pmatrix}$.
Therefore we conclude that $P^{-1}A^{10}P = D^{10}$, or
$$A^{10} = PD^{10}P^{-1} = \begin{pmatrix} 1 & 1 \\ -1 & -2 \end{pmatrix} \begin{pmatrix} 5^{10} & 0 \\ 0 & 3^{10} \end{pmatrix} \begin{pmatrix} 2 & 1 \\ -1 & -1 \end{pmatrix} = \begin{pmatrix} 2 \cdot 5^{10} - 3^{10} & 5^{10} - 3^{10} \\ -2 \cdot 5^{10} + 2 \cdot 3^{10} & -5^{10} + 2 \cdot 3^{10} \end{pmatrix}.$$
Similarly, one can compute
$$A^n = \begin{pmatrix} 2 \cdot 5^n - 3^n & 5^n - 3^n \\ -2 \cdot 5^n + 2 \cdot 3^n & -5^n + 2 \cdot 3^n \end{pmatrix}$$
for general $n$.

Question 1. Given $A$, is it always possible to find an invertible matrix $P$ such that $P^{-1}AP$ is diagonal? The answer is no. We will see an example later.
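The computation of $A^{10}$ above can be replayed in a few lines. The sketch below builds $PD^{10}P^{-1}$ with exact integer arithmetic and compares it against naive repeated multiplication (helper names are mine; $P^{-1}$ is hard-coded from the worked example):

```python
def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(M, k):
    """Naive repeated multiplication, for comparison."""
    R = [[int(i == j) for j in range(len(M))] for i in range(len(M))]
    for _ in range(k):
        R = mat_mul(R, M)
    return R

# A = P D P^{-1} with the data from Ex 2
P     = [[1, 1], [-1, -2]]
P_inv = [[2, 1], [-1, -1]]
k = 10
Dk = [[5**k, 0], [0, 3**k]]                  # powering a diagonal matrix is easy
Ak = mat_mul(P, mat_mul(Dk, P_inv))
print(Ak == mat_pow([[7, 2], [-4, 1]], k))   # True: both routes give A^10
```

The diagonal route needs only two matrix products and two scalar powers, versus ten full matrix products for the naive route, and the gap widens as $k$ grows.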
Definition. A square matrix $A$ is said to be diagonalizable if $A$ is similar to a diagonal matrix, that is, if $P^{-1}AP = D$ for some invertible matrix $P$ and some diagonal matrix $D$.

Ex 3. The matrix $A = \begin{pmatrix} 7 & 2 \\ -4 & 1 \end{pmatrix}$ in Ex 2 is diagonalizable.

Question 2. If $A$ is diagonalizable, how can we find a matrix $P$ such that $P^{-1}AP$ is diagonal?

The Diagonalization Theorem, Part 1. An $n \times n$ matrix $A$ is diagonalizable if and only if $A$ has $n$ linearly independent eigenvectors. In fact, $P^{-1}AP = D$, with $D$ diagonal, if and only if the columns of $P$ are $n$ linearly independent eigenvectors of $A$. In this case, the diagonal entries of $D$ are the eigenvalues of $A$ that correspond, respectively, to the eigenvectors (i.e., the columns) of $P$.

Remark. So if an $n \times n$ matrix $A$ has $n$ distinct eigenvalues, then $A$ is diagonalizable.

Ex 4. Go back to Ex 2 and see how we can obtain the matrix $P$. For $A = \begin{pmatrix} 7 & 2 \\ -4 & 1 \end{pmatrix}$, the eigenvalues of $A$ are the zeros of the characteristic polynomial of $A$:
$$\det(A - \lambda I_2) = (7-\lambda)(1-\lambda) + 8 = \lambda^2 - 8\lambda + 15 = (\lambda - 3)(\lambda - 5).$$
Find an eigenvector corresponding to $\lambda = 5$:
Find an eigenvector corresponding to $\lambda = 3$:
Therefore, we can take $P = $ ____.

Ex 5. Diagonalize (that is, find an invertible matrix $P$ and a diagonal matrix $D$ such that $P^{-1}AP = D$), if possible: $A = \begin{pmatrix} 1 & 3 & 3 \\ -3 & -5 & -3 \\ 3 & 3 & 1 \end{pmatrix}$.
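For Ex 5, the theorem says: put three linearly independent eigenvectors in the columns of $P$ and the matching eigenvalues, in the same order, on the diagonal of $D$. The check below uses one possible choice of eigenvectors (my own computation, stated without derivation: eigenvalues $1, -2, -2$) and verifies $AP = PD$, which is equivalent to $P^{-1}AP = D$ once $P$ is invertible:

```python
def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 3, 3], [-3, -5, -3], [3, 3, 1]]

# Columns of P: eigenvectors for eigenvalues 1, -2, -2 (one valid choice)
P = [[1, -1, -1],
     [-1, 1, 0],
     [1, 0, 1]]
D = [[1, 0, 0], [0, -2, 0], [0, 0, -2]]

print(mat_mul(A, P) == mat_mul(P, D))   # True, so P^{-1} A P = D
```

Checking $AP = PD$ column by column is exactly the statement "each column of $P$ is an eigenvector for the matching diagonal entry of $D$", and it sidesteps computing $P^{-1}$ by hand.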
Ex 6. Show that $B = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ is not diagonalizable.
Proof 1.
Proof 2.

Now we give another characterization of diagonalizable matrices.

The Diagonalization Theorem, Part 2. Let $A$ be an $n \times n$ matrix with distinct eigenvalues $\lambda_1, \ldots, \lambda_p$.
a. For $1 \le k \le p$, the dimension of the eigenspace corresponding to $\lambda_k$ (that is, the geometric multiplicity of $\lambda_k$) is always greater than or equal to $1$ and less than or equal to the algebraic multiplicity of $\lambda_k$ as a zero of the characteristic polynomial of $A$.
b. The matrix $A$ is diagonalizable if and only if the geometric multiplicity of $\lambda_k$ equals the algebraic multiplicity of $\lambda_k$ for each $k$.

Ex 7. Using the theorem above, show again that the matrix in Ex 5 is diagonalizable, and that the matrix in Ex 6 is not diagonalizable.

Ex 8. Check if $C = \begin{pmatrix} 2 & 2 \\ 3 & 3 \end{pmatrix}$ is diagonalizable.
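Part 2 gives a practical test for non-diagonalizability: exhibit an eigenvalue whose geometric multiplicity is strictly smaller than its algebraic multiplicity. The sketch below runs that test on the shear matrix $\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$, a standard example of a non-diagonalizable matrix (my choice of example; helper names mine):

```python
def rank_2x2(M, tol=1e-12):
    """Rank of a 2x2 matrix (enough for this check)."""
    (a, b), (c, d) = M
    if abs(a * d - b * c) > tol:
        return 2
    return 1 if any(abs(x) > tol for x in (a, b, c, d)) else 0

B = [[1, 1], [0, 1]]           # characteristic polynomial (1 - lambda)^2
lam = 1                        # the only eigenvalue; algebraic multiplicity 2
M = [[B[0][0] - lam, B[0][1]],
     [B[1][0], B[1][1] - lam]]
geo = 2 - rank_2x2(M)          # dim Nul(B - I) = n - rank(B - I)
print(geo)                     # 1 < 2, so B is not diagonalizable
```

Since the sole eigenvalue $1$ has algebraic multiplicity $2$ but its eigenspace is only a line, part (b) of the theorem rules out diagonalizability.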
Ex 9. Show that $A = \begin{pmatrix} 3 & 2 & 2 \\ 2 & 3 & 2 \\ 2 & 2 & 3 \end{pmatrix}$ is diagonalizable, and calculate $A^3$ using diagonalization.