
6.5. Diagonalizable Matrices

It is useful to introduce a few more concepts that are common in the literature.

Definition 6.5.1. The characteristic polynomial of an n × n matrix A is the function

p(λ) = det(A - λI).

Example 6.5.1: Find the characteristic polynomial of the matrix A = [1 3; 3 1].

Solution: We need to compute the determinant

p(λ) = det(A - λI) = det([1 - λ, 3; 3, 1 - λ]) = (1 - λ)² - 9 = λ² - 2λ + 1 - 9.

We conclude that the characteristic polynomial is p(λ) = λ² - 2λ - 8.

Since the matrix A in this example is 2 × 2, its characteristic polynomial has degree two. One can show that the characteristic polynomial of an n × n matrix has degree n. The eigenvalues of the matrix are the roots of the characteristic polynomial. Different matrices may have different types of roots, so we try to classify these roots in the following definition.

Definition 6.5.2. Given an n × n matrix A with real eigenvalues λᵢ, where i = 1, …, k ⩽ n, it is always possible to express the characteristic polynomial of A as

p(λ) = (λ - λ₁)^r₁ ⋯ (λ - λₖ)^rₖ.

The number rᵢ is called the algebraic multiplicity of the eigenvalue λᵢ. Furthermore, the geometric multiplicity of an eigenvalue λᵢ, denoted as sᵢ, is the maximum number of eigenvectors of λᵢ that form a linearly independent set.

Example 6.5.2: Find the eigenvalues, and the algebraic and geometric multiplicities, of the matrix A = [1 3; 3 1].

Solution: In order to find the algebraic multiplicity of the eigenvalues we first need to find the eigenvalues. We know that the characteristic polynomial of this matrix is given by

p(λ) = det([1 - λ, 3; 3, 1 - λ]) = (1 - λ)² - 9.

The roots of this polynomial are λ₁ = 4 and λ₂ = -2, so we know that p(λ) can be rewritten in the following way,

p(λ) = (λ - 4)(λ + 2).

We conclude that the algebraic multiplicity of each eigenvalue is one, that is,

λ₁ = 4, r₁ = 1, and λ₂ = -2, r₂ = 1.

In order to find the geometric multiplicities of the matrix eigenvalues we first need to find the matrix eigenvectors. This part of the work was already done in an example above, and the result is

λ₁ = 4, v^(1) = [1; 1], and λ₂ = -2, v^(2) = [-1; 1].

From this expression we conclude that the geometric multiplicity of each eigenvalue is also one, that is, λ₁ = 4, s₁ = 1, and λ₂ = -2, s₂ = 1.
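These computations can be cross-checked numerically; the following is a minimal sketch, assuming NumPy is available, applied to the matrix of Examples 6.5.1 and 6.5.2.

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [3.0, 1.0]])

# Coefficients of the characteristic polynomial, highest degree first:
# lambda^2 - 2*lambda - 8  ->  [1., -2., -8.]
print(np.poly(A))

# The eigenvalues are the roots of p; here 4 and -2 (in some order).
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)
```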

The following two examples show that two matrices can have the same eigenvalues, and so the same algebraic multiplicities, but different eigenvectors with different geometric multiplicities.

Example 6.5.3: Find the eigenvalues and eigenvectors of the matrix A = [3 0 1; 0 3 2; 0 0 1].

Solution: We start by finding the eigenvalues, the roots of the characteristic polynomial

p(λ) = det(A - λI) = (3 - λ)²(1 - λ)  ⇒  λ₁ = 1, r₁ = 1, and λ₂ = 3, r₂ = 2.

We now compute the eigenvector associated with the eigenvalue λ₁ = 1, which is the solution of the linear system (A - I)v^(1) = 0, that is,

[2 0 1; 0 2 2; 0 0 0] v^(1) = 0.

After a few Gauss elimination operations we obtain

v₁^(1) = -v₃^(1)/2,  v₂^(1) = -v₃^(1),  v₃^(1) free.

Therefore, choosing v₃^(1) = 2 we obtain

v^(1) = [-1; -2; 2],  λ₁ = 1, r₁ = 1, s₁ = 1.

In a similar way we now compute the eigenvectors for the eigenvalue λ₂ = 3, which are all solutions of the linear system (A - 3I)v^(2) = 0, that is,

[0 0 1; 0 0 2; 0 0 -2] v^(2) = 0.

After a few Gauss elimination operations we obtain

v₁^(2) free,  v₂^(2) free,  v₃^(2) = 0.

Therefore, we obtain two linearly independent solutions, the first one, v^(2), with the choice v₁^(2) = 1, v₂^(2) = 0, and the second one, w^(2), with the choice v₁^(2) = 0, v₂^(2) = 1, that is,

v^(2) = [1; 0; 0],  w^(2) = [0; 1; 0],  λ₂ = 3, r₂ = 2, s₂ = 2.

Summarizing, the matrix in this example has three linearly independent eigenvectors.
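The geometric multiplicities found above can be double-checked numerically, since sᵢ is the dimension of the null space of A - λᵢI, which equals n minus the rank of that matrix. A minimal sketch, assuming NumPy is available:

```python
import numpy as np

A = np.array([[3.0, 0.0, 1.0],
              [0.0, 3.0, 2.0],
              [0.0, 0.0, 1.0]])

n = A.shape[0]
for lam in (1.0, 3.0):
    # Geometric multiplicity s = dim N(A - lam*I) = n - rank(A - lam*I).
    s = n - np.linalg.matrix_rank(A - lam * np.eye(n))
    print(lam, s)   # s = 1 for lam = 1 and s = 2 for lam = 3
```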

Example 6.5.4: Find the eigenvalues and eigenvectors of the matrix A = [3 1 1; 0 3 2; 0 0 1].

Solution: Notice that this matrix differs from the matrix in the previous example only in the coefficient a₁₂. Again, we start by finding the eigenvalues, which are the roots of the characteristic polynomial

p(λ) = det(A - λI) = (3 - λ)²(1 - λ)  ⇒  λ₁ = 1, r₁ = 1, and λ₂ = 3, r₂ = 2.

So this matrix has the same eigenvalues and algebraic multiplicities as the matrix in the previous example. We now compute the eigenvector associated with the eigenvalue λ₁ = 1, which is the solution of the linear system (A - I)v^(1) = 0, that is,

[2 1 1; 0 2 2; 0 0 0] v^(1) = 0.

After a few Gauss elimination operations we obtain

v₁^(1) = 0,  v₂^(1) = -v₃^(1),  v₃^(1) free.

Therefore, choosing v₃^(1) = 1 we obtain

v^(1) = [0; -1; 1],  λ₁ = 1, r₁ = 1, s₁ = 1.

In a similar way we now compute the eigenvectors for the eigenvalue λ₂ = 3. However, in this case we obtain only one solution, as this calculation shows:

(A - 3I)v^(2) = 0  ⇔  [0 1 1; 0 0 2; 0 0 -2] v^(2) = 0.

After a few Gauss elimination operations we obtain

v₁^(2) free,  v₂^(2) = 0,  v₃^(2) = 0.

Therefore, we obtain only one linearly independent solution, which corresponds to the choice v₁^(2) = 1, that is,

v^(2) = [1; 0; 0],  λ₂ = 3, r₂ = 2, s₂ = 1.

Summarizing, the matrix in this example has only two linearly independent eigenvectors, and in the case of the eigenvalue λ₂ = 3 we have the strict inequality s₂ < r₂.
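The same rank computation makes the difference between Examples 6.5.3 and 6.5.4 explicit: the two matrices have the same eigenvalues, but summing the geometric multiplicities gives three independent eigenvectors for the first matrix and only two for the second. A minimal sketch, assuming NumPy is available:

```python
import numpy as np

A = np.array([[3., 0., 1.], [0., 3., 2.], [0., 0., 1.]])   # matrix of Example 6.5.3
B = np.array([[3., 1., 1.], [0., 3., 2.], [0., 0., 1.]])   # matrix of Example 6.5.4

def num_independent_eigenvectors(M, eigenvalues):
    # Sum of the geometric multiplicities over the given eigenvalues.
    n = M.shape[0]
    return sum(n - np.linalg.matrix_rank(M - lam * np.eye(n)) for lam in eigenvalues)

print(num_independent_eigenvectors(A, (1., 3.)))   # 3
print(num_independent_eigenvectors(B, (1., 3.)))   # 2: s < r for the eigenvalue 3
```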

We first introduce the notion of a diagonal matrix. Later on we define a diagonalizable matrix as a matrix that can be transformed into a diagonal matrix by a simple transformation.

Definition 6.5.3. An n × n matrix A is called diagonal iff

A = [a₁₁ 0 ⋯ 0; 0 a₂₂ ⋯ 0; ⋮ ⋮ ⋱ ⋮; 0 0 ⋯ aₙₙ].

That is, a matrix is diagonal iff every nondiagonal coefficient vanishes. From now on we use the following notation for a diagonal matrix A:

A = diag[a₁₁, a₂₂, …, aₙₙ].

This notation says that the matrix is diagonal and shows only the diagonal coefficients, since any other coefficient vanishes.

The next result says that the eigenvalues of a diagonal matrix are the matrix diagonal elements, and it gives the corresponding eigenvectors.

Theorem 6.5.4. If D = diag[d₁₁, …, dₙₙ], then the eigenpairs of D are

λ₁ = d₁₁, v^(1) = [1; 0; …; 0],  …,  λₙ = dₙₙ, v^(n) = [0; …; 0; 1].

Diagonal matrices are simple to manipulate, since they share many properties with numbers. For example, the product of two diagonal matrices is commutative. It is simple to compute power functions of a diagonal matrix. It is also simple to compute more involved functions of a diagonal matrix, like the exponential function.

Example 6.5.5: For every positive integer n find Aⁿ, where A = diag[a₁₁, a₂₂] is a 2 × 2 diagonal matrix.

Solution: We start computing A² as follows,

A² = AA = [a₁₁ 0; 0 a₂₂][a₁₁ 0; 0 a₂₂] = [a₁₁² 0; 0 a₂₂²].

We now compute A³,

A³ = A²A = [a₁₁² 0; 0 a₂₂²][a₁₁ 0; 0 a₂₂] = [a₁₁³ 0; 0 a₂₂³].

Using induction, it is simple to see that Aⁿ = [a₁₁ⁿ 0; 0 a₂₂ⁿ] = diag[a₁₁ⁿ, a₂₂ⁿ].
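The entrywise power rule of Example 6.5.5 is easy to verify numerically; below is a minimal sketch, assuming NumPy is available, with illustrative diagonal entries.

```python
import numpy as np

D = np.diag([2.0, -1.0])   # illustrative entries

n = 5
# For a diagonal matrix, the n-th power is taken entry by entry on the diagonal.
Dn_entrywise = np.diag(np.diag(D) ** n)
Dn_matrix    = np.linalg.matrix_power(D, n)

assert np.allclose(Dn_entrywise, Dn_matrix)
```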

Many properties of diagonal matrices are shared by diagonalizable matrices. These are matrices that can be transformed into a diagonal matrix by a simple transformation.

Definition 6.5.5. An n × n matrix A is called diagonalizable iff there exists an invertible matrix P and a diagonal matrix D such that

A = P D P⁻¹.

Remarks:

(a) Systems of linear differential equations are simple to solve in the case that the coefficient matrix is diagonalizable. One decouples the differential equations, solves the decoupled equations, and transforms the solutions back to the original unknowns.

(b) Not every square matrix is diagonalizable. For example, matrix A below is diagonalizable while B is not,

A = [1 3; 3 1],  B = (1/2) [3 1; -1 5].

Example 6.5.6: Show that the matrix A = [1 3; 3 1] is diagonalizable, where

P = [1 -1; 1 1]  and  D = [4 0; 0 -2].

Solution: That matrix P is invertible can be verified by computing its determinant, det(P) = 1·1 - (-1)·1 = 2. Since the determinant is nonzero, P is invertible. Using linear algebra methods one can find out that the inverse matrix is

P⁻¹ = (1/2) [1 1; -1 1].

Now we only need to verify that P D P⁻¹ is indeed A. A straightforward calculation shows

P D P⁻¹ = [1 -1; 1 1][4 0; 0 -2] (1/2)[1 1; -1 1] = [4 2; 4 -2] (1/2)[1 1; -1 1] = (1/2)[2 6; 6 2] = [1 3; 3 1]  ⇒  P D P⁻¹ = A.
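This verification can also be carried out numerically; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [3.0, 1.0]])
P = np.array([[1.0, -1.0],
              [1.0,  1.0]])
D = np.diag([4.0, -2.0])

# P is invertible (det(P) = 2), and P D P^{-1} reproduces A.
assert abs(np.linalg.det(P) - 2.0) < 1e-12
assert np.allclose(P @ D @ np.linalg.inv(P), A)
```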

There is a deep relation between the eigenpairs of a matrix and whether that matrix is diagonalizable.

Theorem 6.5.6 (Diagonalizable Matrix). An n × n matrix A is diagonalizable iff A has a linearly independent set of n eigenvectors. Furthermore, if λᵢ, v^(i), for i = 1, …, n, are eigenpairs of A, then

A = P D P⁻¹,  P = [v^(1), …, v^(n)],  D = diag[λ₁, …, λₙ].

Proof of Theorem 6.5.6:

(⇒) Since matrix A is diagonalizable, there exist an invertible matrix P and a diagonal matrix D such that A = P D P⁻¹. Multiplying this equation by P⁻¹ on the left and by P on the right, we get

D = P⁻¹ A P.   (6.5.1)

Since the n × n matrix D is diagonal, it has a linearly independent set of n eigenvectors, given by the column vectors of the identity matrix, that is,

D e^(i) = dᵢᵢ e^(i),  D = diag[d₁₁, …, dₙₙ],  I = [e^(1), …, e^(n)].

So, the pair dᵢᵢ, e^(i) is an eigenvalue-eigenvector pair of D, for i = 1, …, n. Using this information in Eq. (6.5.1) we get

dᵢᵢ e^(i) = D e^(i) = P⁻¹ A P e^(i)  ⇒  A (P e^(i)) = dᵢᵢ (P e^(i)),

where the last equation comes from multiplying the former equation by P on the left. This last equation says that the vectors v^(i) = P e^(i) are eigenvectors of A with eigenvalue dᵢᵢ. By definition, v^(i) is the i-th column of matrix P, that is,

P = [v^(1), …, v^(n)].

Since matrix P is invertible, the eigenvector set {v^(1), …, v^(n)} is linearly independent. This establishes this part of the Theorem.

(⇐) Let λᵢ, v^(i) be eigenvalue-eigenvector pairs of matrix A, for i = 1, …, n. Now use the eigenvectors to construct the matrix P = [v^(1), …, v^(n)]. This matrix is invertible, since the eigenvector set {v^(1), …, v^(n)} is linearly independent. We now show that the matrix P⁻¹ A P is diagonal. We start computing the product

A P = A [v^(1), …, v^(n)] = [A v^(1), …, A v^(n)] = [λ₁ v^(1), …, λₙ v^(n)],

that is,

P⁻¹ A P = P⁻¹ [λ₁ v^(1), …, λₙ v^(n)] = [λ₁ P⁻¹ v^(1), …, λₙ P⁻¹ v^(n)].

At this point it is useful to recall that P⁻¹ is the inverse of P,

I = P⁻¹ P  ⇔  [e^(1), …, e^(n)] = P⁻¹ [v^(1), …, v^(n)] = [P⁻¹ v^(1), …, P⁻¹ v^(n)].

So, e^(i) = P⁻¹ v^(i), for i = 1, …, n. Using these equations in the equation for P⁻¹ A P, we get

P⁻¹ A P = [λ₁ e^(1), …, λₙ e^(n)] = diag[λ₁, …, λₙ].

Denoting D = diag[λ₁, …, λₙ], we conclude that P⁻¹ A P = D, or equivalently,

A = P D P⁻¹,  P = [v^(1), …, v^(n)],  D = diag[λ₁, …, λₙ].

This means that A is diagonalizable. This establishes the Theorem.

Example 6.5.7: Show that the matrix A = [1 3; 3 1] is diagonalizable.

Solution: We know that the eigenvalue-eigenvector pairs are

λ₁ = 4, v^(1) = [1; 1]  and  λ₂ = -2, v^(2) = [-1; 1].

Introduce P and D as follows,

P = [1 -1; 1 1],  P⁻¹ = (1/2) [1 1; -1 1],  D = [4 0; 0 -2].

We must show that A = P D P⁻¹. This is indeed the case, since

P D P⁻¹ = [1 -1; 1 1][4 0; 0 -2] (1/2)[1 1; -1 1] = [4 2; 4 -2] (1/2)[1 1; -1 1] = [1 3; 3 1].

We conclude that P D P⁻¹ = A, that is, A is diagonalizable.
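The factorization of Theorem 6.5.6 is also what a numerical eigenvalue routine returns: the eigenvector matrix plays the role of P and the eigenvalues form D. A minimal sketch, assuming NumPy is available:

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [3.0, 1.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors,
# i.e. the D and P of Theorem 6.5.6 (up to ordering and scaling of the columns).
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

assert np.allclose(P @ D @ np.linalg.inv(P), A)
```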

With Theorem 6.5.6 we can also show that a matrix is not diagonalizable.

Example 6.5.8: Show that the matrix B = (1/2) [3 1; -1 5] is not diagonalizable.

Solution: We first compute the matrix eigenvalues. The characteristic polynomial is

p(λ) = det(B - λI) = (3/2 - λ)(5/2 - λ) + 1/4 = 15/4 - 4λ + λ² + 1/4 = λ² - 4λ + 4.

The roots of the characteristic polynomial are computed in the usual way,

λ = (1/2) [4 ± √(16 - 16)]  ⇒  λ = 2, r = 2.

We have obtained a single eigenvalue with algebraic multiplicity r = 2. The associated eigenvectors are computed as the solutions to the equation (B - 2I)v = 0. Then,

(B - 2I) = [3/2 - 2, 1/2; -1/2, 5/2 - 2] = (1/2) [-1 1; -1 1]  ⇒  v = [1; 1],  s = 1.

We conclude that the biggest linearly independent set of eigenvectors for the matrix B contains only one vector, instead of two. Therefore, matrix B is not diagonalizable.
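Numerically, the failure shows up as a deficient eigenspace: for the single eigenvalue λ = 2 the null space of B - 2I is one-dimensional, so there is no basis of eigenvectors. A minimal sketch, assuming NumPy is available:

```python
import numpy as np

B = 0.5 * np.array([[ 3.0, 1.0],
                    [-1.0, 5.0]])

n = B.shape[0]
lam = 2.0   # the only eigenvalue, algebraic multiplicity r = 2

# Geometric multiplicity s = dim N(B - lam*I) = n - rank(B - lam*I).
s = n - np.linalg.matrix_rank(B - lam * np.eye(n))
print(s)    # 1 < 2, so B does not have n independent eigenvectors
```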

Theorem 6.5.6 shows the importance of knowing whether an n × n matrix has a linearly independent set of n eigenvectors. However, more often than not, there is no simple way to check this property other than to compute all the matrix eigenvectors. But there is a simpler particular case: the case when an n × n matrix has n different eigenvalues. Then we do not need to compute the eigenvectors. The following result says that such a matrix always has a linearly independent set of n eigenvectors, hence, by Theorem 6.5.6, it is diagonalizable.

Theorem 6.5.7 (Different Eigenvalues). If an n × n matrix has n different eigenvalues, then this matrix has a linearly independent set of n eigenvectors.

Proof of Theorem 6.5.7: Let λ₁, …, λₙ be the eigenvalues of an n × n matrix A, all different from each other. Let v^(1), …, v^(n) be the corresponding eigenvectors, that is, A v^(i) = λᵢ v^(i), with i = 1, …, n. We have to show that the set {v^(1), …, v^(n)} is linearly independent. We assume that the opposite is true and we obtain a contradiction. Let us assume that the set above is linearly dependent, that is, there are constants c₁, …, cₙ, not all zero, such that

c₁ v^(1) + ⋯ + cₙ v^(n) = 0.   (6.5.2)

Let us name the eigenvalues and eigenvectors such that c₁ ≠ 0. Now, multiply the equation above by the matrix A; the result is

c₁ λ₁ v^(1) + ⋯ + cₙ λₙ v^(n) = 0.

Multiply Eq. (6.5.2) by the eigenvalue λₙ; the result is

c₁ λₙ v^(1) + ⋯ + cₙ λₙ v^(n) = 0.

Subtract the second from the first of the equations above; then the terms containing v^(n) cancel out, and we obtain

c₁ (λ₁ - λₙ) v^(1) + ⋯ + cₙ₋₁ (λₙ₋₁ - λₙ) v^(n-1) = 0.   (6.5.3)

Repeat the whole procedure starting with Eq. (6.5.3), that is, multiply this latter equation by matrix A and also by λₙ₋₁, then subtract the second from the first; the result is

c₁ (λ₁ - λₙ)(λ₁ - λₙ₋₁) v^(1) + ⋯ + cₙ₋₂ (λₙ₋₂ - λₙ)(λₙ₋₂ - λₙ₋₁) v^(n-2) = 0.

Repeating the whole procedure a total of n - 1 times, in the last step we obtain the equation

c₁ (λ₁ - λₙ)(λ₁ - λₙ₋₁) ⋯ (λ₁ - λ₃)(λ₁ - λ₂) v^(1) = 0.

Since all the eigenvalues are different and v^(1) ≠ 0, we conclude that c₁ = 0; however, this contradicts our assumption that c₁ ≠ 0. Therefore, the set of n eigenvectors must be linearly independent. This establishes the Theorem.

Example 6.5.9: Is matrix A diagonalizable?

Solution: We compute the matrix eigenvalues, starting with the characteristic polynomial, which factors as

p(λ) = det(A - λI) = λ(λ - λ₂),  λ₂ ≠ 0.

The roots of the characteristic polynomial are the matrix eigenvalues, λ₁ = 0 and λ₂. The eigenvalues are different, so by Theorem 6.5.7, matrix A is diagonalizable.
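Theorem 6.5.7 gives a cheap diagonalizability test: compute the eigenvalues and check that they are pairwise different. A minimal sketch, assuming NumPy is available and using an illustrative 2 × 2 matrix (its entries are not taken from the example above):

```python
import numpy as np

M = np.array([[1.0, 1.0],
              [1.0, 1.0]])   # illustrative matrix with eigenvalues 0 and 2

eigvals, P = np.linalg.eig(M)

# The two eigenvalues are different ...
assert abs(eigvals[0] - eigvals[1]) > 1e-9
# ... so the eigenvectors are independent, P is invertible, and M = P D P^{-1}.
assert np.allclose(P @ np.diag(eigvals) @ np.linalg.inv(P), M)
```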

6.5. Exercises.