Eigenpairs and Diagonalizability Math 401, Spring 2010, Professor David Levermore

1. Eigenpairs. Let $A$ be an $n \times n$ matrix. A number $\lambda$ (possibly complex, even when $A$ is real) is an eigenvalue of $A$ if there exists a nonzero vector $v$ (possibly complex) such that

(1) $Av = \lambda v$.

Each such vector is an eigenvector associated with $\lambda$, and $(\lambda, v)$ is an eigenpair of $A$.

Fact 1: If $(\lambda, v)$ is an eigenpair of $A$ then so is $(\lambda, \alpha v)$ for every complex $\alpha \neq 0$. In other words, if $v$ is an eigenvector associated with an eigenvalue $\lambda$ of $A$ then so is $\alpha v$ for every complex $\alpha \neq 0$. In particular, eigenvectors are not unique.

Reason. Because $(\lambda, v)$ is an eigenpair of $A$ you know that (1) holds. It follows that $A(\alpha v) = \alpha A v = \alpha \lambda v = \lambda (\alpha v)$. Because the scalar $\alpha$ and the vector $v$ are nonzero, the vector $\alpha v$ is also nonzero. Therefore $(\lambda, \alpha v)$ is also an eigenpair of $A$.

1.1. Finding Eigenvalues. The characteristic polynomial of $A$ is defined by

(2) $p_A(z) = \det(zI - A)$.

It has the form $p_A(z) = z^n + \pi_1 z^{n-1} + \pi_2 z^{n-2} + \cdots + \pi_{n-1} z + \pi_n$, where the coefficients $\pi_1, \pi_2, \ldots, \pi_n$ are real if $A$ is real. In other words, it is a monic polynomial of degree $n$. One can show that in general $\pi_1 = -\mathrm{tr}(A)$ and $\pi_n = (-1)^n \det(A)$. In particular, when $n = 2$ one has

$p_A(z) = z^2 - \mathrm{tr}(A)\,z + \det(A).$

Sometimes you will see the characteristic polynomial defined as $\det(A - zI)$. Because $\det(zI - A) = (-1)^n \det(A - zI)$, these two definitions coincide when $n$ is even, and are negatives of each other when $n$ is odd. Both conventions are common. We have chosen the convention that makes $p_A(z)$ monic. What matters most about $p_A(z)$ is its roots and their multiplicities, which are the same for both conventions.

Fact 2: A number $\lambda$ is an eigenvalue of $A$ if and only if $p_A(\lambda) = 0$. In other words, the eigenvalues of $A$ are the roots of $p_A(z)$.

Reason. If $\lambda$ is an eigenvalue of $A$ then by (1) there exists a nonzero vector $v$ such that $(\lambda I - A)v = \lambda v - Av = 0$. Because $\lambda I - A$ maps a nonzero vector to zero, it is not invertible, so $p_A(\lambda) = \det(\lambda I - A) = 0$. Conversely, if $p_A(\lambda) = \det(\lambda I - A) = 0$ then there exists a nonzero vector $v$ such that $(\lambda I - A)v = 0$. It follows that $\lambda v - Av = (\lambda I - A)v = 0$, whereby $\lambda$ and $v$ satisfy (1), which implies $\lambda$ is an eigenvalue of $A$.
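The relation between eigenvalues and the characteristic polynomial can be checked numerically. The sketch below uses a sample symmetric matrix (an assumed reconstruction of the first example treated later in these notes) and verifies that the roots of the monic polynomial $z^2 - \mathrm{tr}(A)z + \det(A)$ agree with the eigenvalues computed directly:

```python
import numpy as np

# Assumed example matrix (reconstruction of the notes' first 2x2 example).
A = np.array([[3.0, 2.0],
              [2.0, 3.0]])

# For n = 2 the monic characteristic polynomial is z^2 - tr(A) z + det(A),
# i.e. pi_1 = -tr(A) and pi_2 = (-1)^2 det(A) = det(A).
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]   # [1, -6, 5]
roots = np.sort(np.roots(coeffs))                # roots of p_A(z)

eigs = np.sort(np.linalg.eigvals(A))             # eigenvalues computed directly
print(roots, eigs)                               # both are [1. 5.]
```

By Fact 2 the two computations must agree up to rounding, which is exactly what the comparison shows.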

Fact 2 shows that the eigenvalues of an $n \times n$ matrix $A$ can be found if you can find all the roots of the characteristic polynomial of $A$. Because the degree of this characteristic polynomial is $n$, and because every polynomial of degree $n$ has exactly $n$ roots counting multiplicity, the $n \times n$ matrix $A$ therefore must have at least one eigenvalue and at most $n$ eigenvalues.

Example. Find the eigenvalues of $A = \begin{pmatrix} 3 & 2 \\ 2 & 3 \end{pmatrix}$.

Solution. The characteristic polynomial of $A$ is $p_A(z) = z^2 - 6z + 5 = (z - 1)(z - 5)$. By Fact 2 the eigenvalues of $A$ are $1$ and $5$.

Example. Find the eigenvalues of $A = \begin{pmatrix} 3 & 2 \\ -2 & 3 \end{pmatrix}$.

Solution. The characteristic polynomial of $A$ is $p_A(z) = z^2 - 6z + 13 = (z - 3)^2 + 4 = (z - 3)^2 + 2^2$. By Fact 2 the eigenvalues of $A$ are $3 + i2$ and $3 - i2$.

Example. Find the eigenvalues of $A = \begin{pmatrix} 4 & 1 \\ -1 & 2 \end{pmatrix}$.

Solution. The characteristic polynomial of $A$ is $p_A(z) = z^2 - 6z + 9 = (z - 3)^2$. By Fact 2 the only eigenvalue of $A$ is $3$.

Remark. The above examples show all the possibilities that can arise for $2 \times 2$ real matrices; namely, a $2 \times 2$ real matrix $A$ can have either two real eigenvalues, a conjugate pair of eigenvalues, or a single real eigenvalue.

1.2. Finding Eigenvectors. Once you have found the eigenvalues of a matrix, you can find all the eigenvectors associated with each eigenvalue by finding a general nonzero solution of (1).

You can quickly find the eigenvectors for any $2 \times 2$ matrix $A$ with help from the Cayley-Hamilton Theorem, which states that $p_A(A) = 0$. The eigenvalues $\lambda_1$ and $\lambda_2$ are the roots of $p_A(z)$, so $p_A(z) = (z - \lambda_1)(z - \lambda_2)$. Hence, by the Cayley-Hamilton Theorem

(3) $0 = p_A(A) = (A - \lambda_1 I)(A - \lambda_2 I) = (A - \lambda_2 I)(A - \lambda_1 I).$

It follows that every nonzero column of $A - \lambda_2 I$ is an eigenvector associated with $\lambda_1$, and that every nonzero column of $A - \lambda_1 I$ is an eigenvector associated with $\lambda_2$. This observation holds even in the case where $\lambda_1 = \lambda_2$.

Example. Find the eigenpairs of $A = \begin{pmatrix} 3 & 2 \\ 2 & 3 \end{pmatrix}$.

Solution. We have already shown that the eigenvalues of $A$ are $1$ and $5$. Compute

$A - I = \begin{pmatrix} 2 & 2 \\ 2 & 2 \end{pmatrix}, \qquad A - 5I = \begin{pmatrix} -2 & 2 \\ 2 & -2 \end{pmatrix}.$
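The Cayley-Hamilton recipe (3) is easy to test numerically. This sketch (using the same assumed reconstruction of the first example matrix) takes a nonzero column of $A - \lambda_2 I$ and confirms it is an eigenvector for $\lambda_1$:

```python
import numpy as np

# Assumed reconstruction of the notes' first example matrix.
A = np.array([[3.0, 2.0],
              [2.0, 3.0]])
lam1, lam2 = 1.0, 5.0                      # roots of z^2 - 6z + 5

B = A - lam2 * np.eye(2)                   # by (3), its columns are
v1 = B[:, 0]                               # eigenvectors for lam1; here [-2, 2]
print(np.allclose(A @ v1, lam1 * v1))      # True: A v1 = lam1 v1
```

Swapping the roles of $\lambda_1$ and $\lambda_2$ yields an eigenvector for $\lambda_2$ in the same way.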

Every column of $A - 5I$ has the form $\alpha \begin{pmatrix} 1 \\ -1 \end{pmatrix}$ for some $\alpha \neq 0$, while every column of $A - I$ has the form $\alpha \begin{pmatrix} 1 \\ 1 \end{pmatrix}$ for some $\alpha \neq 0$. It follows from (3) that the eigenpairs of $A$ are

$\left(1, \begin{pmatrix} 1 \\ -1 \end{pmatrix}\right), \qquad \left(5, \begin{pmatrix} 1 \\ 1 \end{pmatrix}\right).$

Example. Find the eigenpairs of $A = \begin{pmatrix} 4 & 1 \\ -1 & 2 \end{pmatrix}$.

Solution. We have already shown that the only eigenvalue of $A$ is $3$. Compute

$A - 3I = \begin{pmatrix} 1 & 1 \\ -1 & -1 \end{pmatrix}.$

Every column of $A - 3I$ has the form $\alpha \begin{pmatrix} 1 \\ -1 \end{pmatrix}$ for some $\alpha \neq 0$. It follows from (3) that an eigenpair of $A$ is $\left(3, \begin{pmatrix} 1 \\ -1 \end{pmatrix}\right)$.

Example. Find the eigenpairs of $A = \begin{pmatrix} 3 & 2 \\ -2 & 3 \end{pmatrix}$.

Solution. We have already shown that the eigenvalues of $A$ are $3 + i2$ and $3 - i2$. Compute

$A - (3 + i2)I = \begin{pmatrix} -i2 & 2 \\ -2 & -i2 \end{pmatrix}, \qquad A - (3 - i2)I = \begin{pmatrix} i2 & 2 \\ -2 & i2 \end{pmatrix}.$

Every column of $A - (3 - i2)I$ has the form $\alpha \begin{pmatrix} 1 \\ i \end{pmatrix}$ for some $\alpha \neq 0$, while every column of $A - (3 + i2)I$ has the form $\alpha \begin{pmatrix} 1 \\ -i \end{pmatrix}$ for some $\alpha \neq 0$. It follows from (3) that the eigenpairs of $A$ are

$\left(3 + i2, \begin{pmatrix} 1 \\ i \end{pmatrix}\right), \qquad \left(3 - i2, \begin{pmatrix} 1 \\ -i \end{pmatrix}\right).$
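Complex eigenpairs satisfy (1) just as real ones do, and NumPy handles the complex arithmetic directly. This check (matrix entries are an assumed reconstruction of the example above) verifies $Av = \lambda v$ for the pair $\lambda = 3 + 2i$, $v = (1, i)$, and also for its conjugate:

```python
import numpy as np

# Assumed reconstruction of the example with complex eigenvalues.
A = np.array([[3.0, 2.0],
              [-2.0, 3.0]])
lam = 3 + 2j
v = np.array([1.0, 1j])

print(np.allclose(A @ v, lam * v))                         # True
# The conjugate pair is an eigenpair as well (see Fact 3 below).
print(np.allclose(A @ v.conj(), np.conj(lam) * v.conj()))  # True
```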

1.3. Real Matrices. Notice that in the above example the eigenvectors associated with $3 - i2$ are complex conjugates of those associated with $3 + i2$. This illustrates a particular instance of the following general fact about real matrices.

Fact 3: If $(\lambda, v)$ is an eigenpair of a real matrix $A$ then so is $(\bar\lambda, \bar v)$.

Reason. Because $(\lambda, v)$ is an eigenpair of $A$ you know by (1) that $Av = \lambda v$. Because $A$ is real, the complex conjugate of this equation is $A \bar v = \bar\lambda \bar v$, where $\bar v$ is nonzero because $v$ is nonzero. It follows that $(\bar\lambda, \bar v)$ is an eigenpair of $A$.

The examples given in the previous section illustrate particular instances of the following general facts.

Fact 4: Let $\lambda$ be an eigenvalue of the real matrix $A$. If $\lambda$ is real then it has a real eigenvector. If $\lambda$ is not real then none of its eigenvectors are real.

Reason. Let $v$ be any eigenvector associated with $\lambda$, so that $(\lambda, v)$ is an eigenpair of $A$. Let $\lambda = \mu + i\nu$ and $v = u + iw$, where $\mu$ and $\nu$ are real numbers and $u$ and $w$ are real vectors. One then has

$Au + iAw = Av = \lambda v = (\mu + i\nu)(u + iw) = (\mu u - \nu w) + i(\mu w + \nu u),$

which is equivalent to $Au - \mu u = -\nu w$ and $Aw - \mu w = \nu u$. If $\nu = 0$ then $u$ and $w$ will be real eigenvectors associated with $\lambda$ whenever they are nonzero. But at least one of $u$ and $w$ must be nonzero because $v = u + iw$ is nonzero. Conversely, if $\nu \neq 0$ and $w = 0$ then the second equation above implies $u = 0$ too, which contradicts the fact that at least one of $u$ and $w$ must be nonzero. Hence, if $\nu \neq 0$ then $w \neq 0$.

2. Application: Solutions of First-Order Systems. We are now ready to use eigenvalues and eigenvectors to construct solutions of first-order differential systems with a constant coefficient matrix. The system we study is

(4) $\dfrac{dx}{dt} = Ax,$

where $x(t)$ is a vector and $A$ is a real $n \times n$ matrix. We begin with the following basic fact.

Fact 5: If $(\lambda, v)$ is an eigenpair of $A$ then a solution of (4) is

(5) $x(t) = e^{\lambda t} v.$

Reason. By direct calculation we see that

$\dfrac{dx}{dt} = \dfrac{d}{dt}\big(e^{\lambda t} v\big) = e^{\lambda t} \lambda v = e^{\lambda t} A v = A\big(e^{\lambda t} v\big) = Ax,$

whereby $x(t)$ given by (5) solves (4).

If $(\lambda, v)$ is a real eigenpair of $A$ then recipe (5) will yield a real solution of (4). But if $\lambda$ is an eigenvalue of $A$ that is not real then recipe (5) will not yield a real solution. However, if we also use the solution associated with the conjugate eigenpair $(\bar\lambda, \bar v)$ then we can construct two real solutions.
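Fact 5 can be tested numerically: differentiate $x(t) = e^{\lambda t} v$ by a central difference and compare with $Ax(t)$. The eigenpair below is an assumed reconstruction of the running symmetric example:

```python
import numpy as np

# Assumed reconstruction of the symmetric example and one of its eigenpairs.
A = np.array([[3.0, 2.0],
              [2.0, 3.0]])
lam = 5.0
v = np.array([1.0, 1.0])

def x(t):
    """Candidate solution x(t) = e^{lam t} v of dx/dt = A x."""
    return np.exp(lam * t) * v

t, h = 0.3, 1e-6
dxdt = (x(t + h) - x(t - h)) / (2 * h)          # central-difference derivative
print(np.allclose(dxdt, A @ x(t), rtol=1e-5))   # True: dx/dt = A x at t = 0.3
```

The same check passes at any $t$, since the identity $dx/dt = Ax$ holds exactly for the exponential solution.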

Fact 6: Let $(\lambda, v)$ be an eigenpair of $A$ with $\lambda = \mu + i\nu$ and $v = u + iw$, where $\mu$ and $\nu$ are real numbers while $u$ and $w$ are real vectors. Then two real solutions of (4) are

(6) $x_1(t) = \mathrm{Re}\big(e^{\lambda t} v\big) = e^{\mu t}\big(u \cos(\nu t) - w \sin(\nu t)\big), \qquad x_2(t) = \mathrm{Im}\big(e^{\lambda t} v\big) = e^{\mu t}\big(w \cos(\nu t) + u \sin(\nu t)\big).$

Reason. Because $(\lambda, v)$ is an eigenpair of $A$, by Fact 3 so is $(\bar\lambda, \bar v)$. By recipe (5) two solutions of (4) are $e^{\lambda t} v$ and $e^{\bar\lambda t} \bar v$, which are complex conjugates of each other. Because equation (4) is linear, it follows that two real solutions of (4) are given by

$x_1(t) = \mathrm{Re}\big(e^{\lambda t} v\big) = \dfrac{e^{\lambda t} v + e^{\bar\lambda t} \bar v}{2}, \qquad x_2(t) = \mathrm{Im}\big(e^{\lambda t} v\big) = \dfrac{e^{\lambda t} v - e^{\bar\lambda t} \bar v}{i2}.$

Because $\lambda = \mu + i\nu$ and $v = u + iw$ we see that

$e^{\lambda t} v = e^{\mu t}\big(\cos(\nu t) + i \sin(\nu t)\big)(u + iw) = e^{\mu t}\big[\big(u \cos(\nu t) - w \sin(\nu t)\big) + i\big(w \cos(\nu t) + u \sin(\nu t)\big)\big],$

whereby $x_1(t)$ and $x_2(t)$ are read off from the real and imaginary parts, yielding (6).

Example. Find two linearly independent real solutions of $\dfrac{dx}{dt} = Ax$, where $A = \begin{pmatrix} 3 & 2 \\ 2 & 3 \end{pmatrix}$.

Solution. By a previous example we know that $A$ has the real eigenpairs

$\left(1, \begin{pmatrix} 1 \\ -1 \end{pmatrix}\right), \qquad \left(5, \begin{pmatrix} 1 \\ 1 \end{pmatrix}\right).$

By recipe (5) the equation has the real solutions

$x_1(t) = e^{t} \begin{pmatrix} 1 \\ -1 \end{pmatrix}, \qquad x_2(t) = e^{5t} \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$

These solutions are linearly independent because

$\det\big(x_1(0) \;\; x_2(0)\big) = \det \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix} = 2 \neq 0.$

Example. Find two linearly independent real solutions of $\dfrac{dx}{dt} = Ax$, where $A = \begin{pmatrix} 3 & 2 \\ -2 & 3 \end{pmatrix}$.

Solution. By a previous example we know that $A$ has the conjugate eigenpairs

$\left(3 + i2, \begin{pmatrix} 1 \\ i \end{pmatrix}\right), \qquad \left(3 - i2, \begin{pmatrix} 1 \\ -i \end{pmatrix}\right).$

Because

$e^{(3 + i2)t} \begin{pmatrix} 1 \\ i \end{pmatrix} = e^{3t}\big(\cos(2t) + i \sin(2t)\big) \begin{pmatrix} 1 \\ i \end{pmatrix} = e^{3t} \begin{pmatrix} \cos(2t) + i \sin(2t) \\ -\sin(2t) + i \cos(2t) \end{pmatrix},$

by recipe (6) the equation has the real solutions

$x_1(t) = e^{3t} \begin{pmatrix} \cos(2t) \\ -\sin(2t) \end{pmatrix}, \qquad x_2(t) = e^{3t} \begin{pmatrix} \sin(2t) \\ \cos(2t) \end{pmatrix}.$
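Recipe (6) is just the real and imaginary parts of $e^{\lambda t} v$, which can be confirmed at a sample time. The eigenpair below is an assumed reconstruction of the complex example:

```python
import numpy as np

# Assumed reconstruction of the complex eigenpair: lambda = 3 + 2i, v = (1, i).
lam = 3 + 2j
v = np.array([1.0, 1j])
mu, nu = lam.real, lam.imag      # lambda = mu + i nu
u, w = v.real, v.imag            # v = u + i w

t = 0.7
z = np.exp(lam * t) * v          # the complex solution e^{lambda t} v

# The two real solutions from recipe (6):
x1 = np.exp(mu * t) * (u * np.cos(nu * t) - w * np.sin(nu * t))
x2 = np.exp(mu * t) * (w * np.cos(nu * t) + u * np.sin(nu * t))
print(np.allclose(z.real, x1), np.allclose(z.imag, x2))  # True True
```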

These solutions are linearly independent because

$\det\big(x_1(0) \;\; x_2(0)\big) = \det \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = 1 \neq 0.$

3. Diagonalizable Matrices. We call an $n \times n$ matrix $A$ diagonalizable when there exists an invertible matrix $V$ and a diagonal matrix $D$ such that $A = V D V^{-1}$. To diagonalize $A$ means to find such a $V$ and $D$. We first show that $A$ is diagonalizable if it has $n$ linearly independent eigenvectors.

Fact 7: If a real $n \times n$ matrix $A$ has $n$ eigenpairs $(\lambda_1, v_1), (\lambda_2, v_2), \ldots, (\lambda_n, v_n)$ such that the eigenvectors $v_1, v_2, \ldots, v_n$ are linearly independent then

(7) $A = V D V^{-1},$

where $V$ is the $n \times n$ matrix whose columns are the vectors $v_1, v_2, \ldots, v_n$, i.e.

(8) $V = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix},$

while $D$ is the $n \times n$ diagonal matrix

(9) $D = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \cdots & 0 & \lambda_n \end{pmatrix}.$

Reason. Underlying this result is the fact that

(10) $AV = A \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix} = \begin{pmatrix} Av_1 & Av_2 & \cdots & Av_n \end{pmatrix} = \begin{pmatrix} \lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n \end{pmatrix} = VD.$

Once we show that $V$ is invertible, (7) follows upon multiplying the above relation on the right by $V^{-1}$.

We claim that $\det(V) \neq 0$ because the vectors $v_1, v_2, \ldots, v_n$ are linearly independent. Suppose otherwise. Because $\det(V) = 0$, there exists a nonzero vector $c$ such that $Vc = 0$. This means that

$0 = Vc = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix} \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{pmatrix} = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n.$

Because the vectors $v_1, v_2, \ldots, v_n$ are linearly independent, this implies $c_1 = c_2 = \cdots = c_n = 0$, which contradicts the fact that $c$ is nonzero. Therefore $\det(V) \neq 0$. Hence, the matrix $V$ is invertible and (7) follows upon multiplying relation (10) on the right by $V^{-1}$.

Fact 7 states that $A$ is diagonalizable when it has $n$ linearly independent eigenvectors. The converse of this statement is also true.
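The construction in Fact 7 is exactly what `numpy.linalg.eig` returns: the columns of its eigenvector matrix play the role of $V$, and the eigenvalues fill $D$. A minimal sketch, using the assumed reconstruction of the symmetric example matrix:

```python
import numpy as np

# Assumed reconstruction of the symmetric example matrix.
A = np.array([[3.0, 2.0],
              [2.0, 3.0]])

lams, V = np.linalg.eig(A)       # columns of V are eigenvectors of A
D = np.diag(lams)                # diagonal matrix of eigenvalues, as in (9)

# Fact 7: A = V D V^{-1}.
print(np.allclose(A, V @ D @ np.linalg.inv(V)))  # True
```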

Fact 8: If an $n \times n$ matrix $A$ is diagonalizable then it has $n$ linearly independent eigenvectors.

Reason. Because $A$ is diagonalizable it has the form $A = V D V^{-1}$, where the matrix $V$ is invertible and the matrix $D$ is diagonal. Let the vectors $v_1, v_2, \ldots, v_n$ be the columns of $V$. We claim these vectors are linearly independent. Indeed, if $0 = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n$ then because $V = (v_1 \; v_2 \; \cdots \; v_n)$ we see that

$0 = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix} \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{pmatrix} = Vc.$

Because $V$ is invertible, this implies that $c = 0$. The vectors $v_1, v_2, \ldots, v_n$ are therefore linearly independent.

Because $V = (v_1 \; v_2 \; \cdots \; v_n)$ and because $A = V D V^{-1}$ where $D$ has the form (9), we see that

$\begin{pmatrix} Av_1 & Av_2 & \cdots & Av_n \end{pmatrix} = AV = V D V^{-1} V = VD = \begin{pmatrix} \lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n \end{pmatrix}.$

Because the vectors $v_1, v_2, \ldots, v_n$ are linearly independent, they are all nonzero. It then follows from the above relation that $(\lambda_1, v_1), (\lambda_2, v_2), \ldots, (\lambda_n, v_n)$ are eigenpairs of $A$ such that the eigenvectors $v_1, v_2, \ldots, v_n$ are linearly independent.

Example. Show that $A = \begin{pmatrix} 3 & 2 \\ 2 & 3 \end{pmatrix}$ is diagonalizable, and diagonalize it.

Solution. By a previous example we know that $A$ has the real eigenpairs

$\left(1, \begin{pmatrix} 1 \\ -1 \end{pmatrix}\right), \qquad \left(5, \begin{pmatrix} 1 \\ 1 \end{pmatrix}\right).$

Because we also know the eigenvectors are linearly independent, $A$ is diagonalizable. Then (8) and (9) yield

$V = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}, \qquad D = \begin{pmatrix} 1 & 0 \\ 0 & 5 \end{pmatrix}.$

Because $\det(V) = 2$, one has

$V^{-1} = \dfrac{1}{2} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}.$

It follows from (7) that $A$ is diagonalized as

$A = V D V^{-1} = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 5 \end{pmatrix} \dfrac{1}{2} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}.$
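The worked diagonalization above can be checked by multiplying the factors back out (the matrices used are the reconstructed ones, which is an assumption):

```python
import numpy as np

# V, D, and V^{-1} from the worked example, as reconstructed above.
V = np.array([[1.0, 1.0],
              [-1.0, 1.0]])
D = np.diag([1.0, 5.0])
Vinv = 0.5 * np.array([[1.0, -1.0],
                       [1.0, 1.0]])

print(np.allclose(V @ Vinv, np.eye(2)))                    # True: Vinv inverts V
print(np.allclose(V @ D @ Vinv, [[3.0, 2.0],
                                 [2.0, 3.0]]))             # True: V D V^{-1} = A
```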

Remark. While not every matrix is diagonalizable, most matrices are. Here we give four criteria that ensure a real $n \times n$ matrix $A$ is diagonalizable.

- If $A$ has $n$ distinct eigenvalues then it is diagonalizable.
- If $A$ is Hermitian ($A^H = A$) then its eigenvalues are real ($\bar\lambda_j = \lambda_j$), and it will have $n$ real eigenvectors $v_j$ that can be normalized so that $v_j^H v_k = \delta_{jk}$. With this normalization $V^{-1} = V^H$.
- If $A$ is skew-Hermitian ($A^H = -A$) then its eigenvalues are imaginary ($\bar\lambda_j = -\lambda_j$), and it will have $n$ eigenvectors $v_j$ that can be normalized so that $v_j^H v_k = \delta_{jk}$. With this normalization $V^{-1} = V^H$.
- If $A$ is normal ($A^H A = A A^H$) then it will have $n$ eigenvectors $v_j$ that can be normalized so that $v_j^H v_k = \delta_{jk}$. With this normalization $V^{-1} = V^H$.

Matrices that are either symmetric or skew-symmetric are also normal. There are normal matrices that are neither symmetric nor skew-symmetric. Because the normal criterion is harder to verify than the symmetric and skew-symmetric criteria, it should be checked last.

Both of the examples we have given above have distinct eigenvalues. The first example is symmetric. The second is normal, but is neither symmetric nor skew-symmetric.
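These criteria are cheap to verify in code. The sketch below (matrix entries are assumed reconstructions of the two running examples) confirms that the first example is symmetric and that the second is normal but neither symmetric nor skew-symmetric:

```python
import numpy as np

# Assumed reconstructions of the two running example matrices.
A1 = np.array([[3.0, 2.0],
               [2.0, 3.0]])
A2 = np.array([[3.0, 2.0],
               [-2.0, 3.0]])

print(np.allclose(A1, A1.T))                          # True: A1 is symmetric
print(np.allclose(A2.T @ A2, A2 @ A2.T))              # True: A2 is normal
print(np.allclose(A2, A2.T), np.allclose(A2, -A2.T))  # False False: A2 is
                                                      # neither symmetric nor
                                                      # skew-symmetric
```

For real matrices the transpose coincides with the Hermitian transpose, so these are exactly the checks in the criteria above.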