Eigenvalues and Eigenvectors


Week 1–2, Fall 26

1 Eigenvalues and eigenvectors

The simplest linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^n$ is perhaps a transformation of the form
$$T(x_1, x_2, \dots, x_n) = (\lambda_1 x_1, \lambda_2 x_2, \dots, \lambda_n x_n), \qquad T\begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} \lambda_1 x_1 \\ \vdots \\ \lambda_n x_n \end{bmatrix}.$$

Example 1.1. The linear transformation $T : \mathbb{R}^2 \to \mathbb{R}^2$ defined by
$$T(x) = \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}$$
dilates the first coordinate two times and the second coordinate three times.

Example 1.2. Let $T : \mathbb{R}^2 \to \mathbb{R}^2$ be the linear transformation defined by
$$T(x) = \begin{bmatrix} 4 & -2 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}.$$
What can we say about $T$ geometrically? Consider the basis $\mathcal{B} = \{u_1, u_2\}$ of $\mathbb{R}^2$, where
$$u_1 = \begin{bmatrix} 2 \\ 1 \end{bmatrix}, \qquad u_2 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}.$$
Then
$$T(u_1) = \begin{bmatrix} 6 \\ 3 \end{bmatrix} = 3u_1, \qquad T(u_2) = \begin{bmatrix} 2 \\ 2 \end{bmatrix} = 2u_2.$$
For any vector $v = c_1 u_1 + c_2 u_2$, we have $[v]_{\mathcal{B}} = \begin{bmatrix} c_1 \\ c_2 \end{bmatrix}$, and
$$T(v) = c_1 T(u_1) + c_2 T(u_2) = 3c_1 u_1 + 2c_2 u_2, \qquad [T(v)]_{\mathcal{B}} = \begin{bmatrix} 3c_1 \\ 2c_2 \end{bmatrix} = \begin{bmatrix} 3 & 0 \\ 0 & 2 \end{bmatrix}\begin{bmatrix} c_1 \\ c_2 \end{bmatrix}.$$
If one uses the basis $\mathcal{B}$ to describe a vector $v$ by its coordinate vector $[v]_{\mathcal{B}}$, then the coordinate vector of $T(v)$ under the basis $\mathcal{B}$ is simply described as
$$[T(v)]_{\mathcal{B}} = \begin{bmatrix} 3 & 0 \\ 0 & 2 \end{bmatrix} [v]_{\mathcal{B}}.$$
This means that the matrix of $T$ relative to the basis $\mathcal{B}$ is as simple as a diagonal matrix.
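The change-of-basis computation above is easy to check numerically. A minimal NumPy sketch, using the $2 \times 2$ matrix and basis vectors assumed in the example, verifies that $P^{-1}AP$ is the diagonal matrix $\mathrm{diag}(3, 2)$, where the columns of $P$ are the basis vectors:

```python
import numpy as np

# Matrix of T in the standard basis, and P whose columns are u1, u2
# (values as assumed in the example above).
A = np.array([[4.0, -2.0],
              [1.0,  1.0]])
P = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# Matrix of T relative to the basis B = {u1, u2}: P^{-1} A P
M = np.linalg.inv(P) @ A @ P
print(np.allclose(M, np.diag([3.0, 2.0])))   # True
```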

The above discussion demonstrates that the nonzero vectors $v$ satisfying the condition
$$T(v) = \lambda v \tag{1.1}$$
for some scalar $\lambda$ are important for describing a linear transformation $T$.

Definition 1.1. Let $T : \mathbb{R}^n \to \mathbb{R}^n$, $T(x) = Ax$, be a linear transformation. A nonzero vector $v$ in $\mathbb{R}^n$ is called an eigenvector of $T$ (of the matrix $A$) if there exists a scalar $\lambda$ such that
$$T(v) = Av = \lambda v. \tag{1.2}$$
The scalar $\lambda$ is called an eigenvalue of $T$ (of the matrix $A$), and the nonzero vector $v$ is called an eigenvector of $T$ (of the matrix $A$) corresponding to the eigenvalue $\lambda$.

Remark. By definition, an eigenvector must be a nonzero vector, but an eigenvalue could be zero.

Example 1.3. Let $A = \begin{bmatrix} 1 & 6 \\ 5 & 2 \end{bmatrix}$. Then $u = \begin{bmatrix} 6 \\ -5 \end{bmatrix}$ is an eigenvector of $A$, since $Au = -4u$. However, $v = \begin{bmatrix} 3 \\ -2 \end{bmatrix}$ is not an eigenvector of $A$, since $Av = \begin{bmatrix} -9 \\ 11 \end{bmatrix}$ is not a scalar multiple of $v$.

Proposition 1.2. For any $n \times n$ matrix $A$, the value $0$ is an eigenvalue of $A$ if and only if $\det A = 0$.

Proof. Note that the set of eigenvectors of $A$ corresponding to the eigenvalue $0$ is the set $\mathrm{Nul}\,A \smallsetminus \{0\}$, and that $A$ is invertible if and only if $\mathrm{Nul}\,A = \{0\}$. The proposition follows from these two facts.

Theorem 1.3. If $v_1, v_2, \dots, v_p$ are eigenvectors of a matrix $A$ corresponding to distinct eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_p$, respectively, then $v_1, v_2, \dots, v_p$ are linearly independent.

Proof. Let $k$ be the largest positive integer such that $v_1, v_2, \dots, v_k$ are linearly independent. If $k = p$, nothing is to be proved. If $k < p$, then $v_{k+1}$ is a linear combination of $v_1, \dots, v_k$; that is, there exist constants $c_1, c_2, \dots, c_k$ such that
$$v_{k+1} = c_1 v_1 + c_2 v_2 + \dots + c_k v_k.$$
Applying the matrix $A$ to both sides, we have
$$A v_{k+1} = \lambda_{k+1} v_{k+1} = \lambda_{k+1}(c_1 v_1 + c_2 v_2 + \dots + c_k v_k) = c_1 \lambda_{k+1} v_1 + c_2 \lambda_{k+1} v_2 + \dots + c_k \lambda_{k+1} v_k.$$
On the other hand,
$$A v_{k+1} = A(c_1 v_1 + c_2 v_2 + \dots + c_k v_k) = c_1 A v_1 + c_2 A v_2 + \dots + c_k A v_k = c_1 \lambda_1 v_1 + c_2 \lambda_2 v_2 + \dots + c_k \lambda_k v_k.$$
Subtracting the two expressions gives
$$c_1(\lambda_{k+1} - \lambda_1) v_1 + c_2(\lambda_{k+1} - \lambda_2) v_2 + \dots + c_k(\lambda_{k+1} - \lambda_k) v_k = 0.$$
Since $v_1, v_2, \dots, v_k$ are linearly independent, we have
$$c_1(\lambda_{k+1} - \lambda_1) = c_2(\lambda_{k+1} - \lambda_2) = \dots = c_k(\lambda_{k+1} - \lambda_k) = 0.$$
Note that the eigenvalues are distinct. Hence $c_1 = c_2 = \dots = c_k = 0$, which implies that $v_{k+1}$ is the zero vector. This contradicts the fact that $v_{k+1}$, being an eigenvector, is nonzero.
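The defining condition $Av = \lambda v$ can be tested numerically. A short NumPy sketch, using the $2 \times 2$ matrix and the two candidate vectors from the example above, checks one eigenvector and one non-eigenvector:

```python
import numpy as np

A = np.array([[1.0, 6.0],
              [5.0, 2.0]])
u = np.array([6.0, -5.0])   # candidate eigenvector
v = np.array([3.0, -2.0])   # not an eigenvector

Au = A @ u                  # equals -4 * u, so u is an eigenvector for lambda = -4
Av = A @ v                  # [-9, 11], not a scalar multiple of v

print(np.allclose(Au, -4 * u))   # True
# Two vectors in R^2 are parallel iff the matrix with them as columns is singular:
print(np.isclose(np.linalg.det(np.column_stack([Av, v])), 0.0))   # False
```

The determinant test in the last line is just a convenient way to ask "is $Av$ a scalar multiple of $v$?" without guessing the scalar.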

2 How to find eigenvectors?

To find eigenvectors means to find nonzero vectors $x$ and scalars $\lambda$ such that
$$Ax = \lambda x, \tag{2.1}$$
that is,
$$(\lambda I - A)x = 0. \tag{2.2}$$
Since $x$ is required to be nonzero, the system (2.2) is required to have nonzero solutions; we thus have
$$\det(\lambda I - A) = 0. \tag{2.3}$$
Expanding $\det(\lambda I - A)$, we see that
$$p(\lambda) = \det(\lambda I - A)$$
is a polynomial of degree $n$ in $\lambda$, called the characteristic polynomial of $A$. To find the eigenvalues of $A$ means to find all roots of the polynomial $p(\lambda)$. The polynomial equation (2.3) in $\lambda$ is called the characteristic equation of $A$. For an eigenvalue $\lambda$ of $A$, the system $(\lambda I - A)x = 0$ is called the eigensystem for the eigenvalue $\lambda$; its solution set $\mathrm{Nul}\,(\lambda I - A)$ is called the eigenspace corresponding to the eigenvalue $\lambda$.

Theorem 2.1. The eigenvalues of a triangular matrix are the entries on its main diagonal.

Example 2.1. The matrix
$$A = \begin{bmatrix} 2 & 0 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 2 \end{bmatrix}$$
has the characteristic polynomial
$$p(\lambda) = \begin{vmatrix} \lambda-2 & 0 & 0 \\ 0 & \lambda-5 & 0 \\ 0 & 0 & \lambda-2 \end{vmatrix} = (\lambda-2)^2(\lambda-5).$$
Then there are two eigenvalues, $\lambda_1 = 2$ and $\lambda_2 = 5$. For $\lambda_1 = 2$, the eigensystem $(2I - A)x = 0$ has two linearly independent eigenvectors
$$v_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \qquad v_2 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}.$$
For $\lambda_2 = 5$, the eigensystem $(5I - A)x = 0$ has one linearly independent eigenvector
$$v_3 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}.$$
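In NumPy, the characteristic polynomial and its roots can be computed directly. The sketch below uses the diagonal matrix from the example above and compares the root-finding route with `np.linalg.eig`, which returns the eigenvalues together with a matrix whose columns are eigenvectors:

```python
import numpy as np

A = np.diag([2.0, 5.0, 2.0])

coeffs = np.poly(A)        # coefficients of det(lambda*I - A): [1, -9, 24, -20]
roots = np.roots(coeffs)   # roots of p(lambda): 2 (twice) and 5
print(np.allclose(np.sort(roots.real), [2.0, 2.0, 5.0]))   # True

# np.linalg.eig finds the same eigenvalues, plus eigenvectors as columns.
vals, vecs = np.linalg.eig(A)
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)   # each column satisfies Av = lambda v
```

For large matrices the root-finding route is numerically fragile; `np.linalg.eig` is the practical tool, and the polynomial route is shown here only to mirror the derivation above.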

Example 2.2. The matrix
$$A = \begin{bmatrix} 5 & 0 & 0 \\ 1 & 2 & 0 \\ 0 & 1 & 2 \end{bmatrix}$$
has the characteristic polynomial
$$p(\lambda) = \begin{vmatrix} \lambda-5 & 0 & 0 \\ -1 & \lambda-2 & 0 \\ 0 & -1 & \lambda-2 \end{vmatrix} = (\lambda-2)^2(\lambda-5).$$
We obtain two eigenvalues, $\lambda_1 = 2$ and $\lambda_2 = 5$. For $\lambda_1 = 2$ (though it is of multiplicity 2), the eigensystem $(2I - A)x = 0$ has only one linearly independent eigenvector
$$v_1 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}.$$
For $\lambda_2 = 5$, the eigensystem $(5I - A)x = 0$ has one linearly independent eigenvector
$$v_2 = \begin{bmatrix} 9 \\ 3 \\ 1 \end{bmatrix}.$$

Example 2.3. Find the eigenvalues and the eigenvectors of the matrix
$$A = \begin{bmatrix} 1 & 3 & 3 \\ 3 & 1 & 3 \\ 3 & 3 & 1 \end{bmatrix}.$$
The characteristic equation of $A$ is
$$\det(\lambda I - A) = \begin{vmatrix} \lambda-1 & -3 & -3 \\ -3 & \lambda-1 & -3 \\ -3 & -3 & \lambda-1 \end{vmatrix} = \begin{vmatrix} \lambda-1 & -3 & -3 \\ -(\lambda+2) & \lambda+2 & 0 \\ -3 & -3 & \lambda-1 \end{vmatrix} \quad (R_2 - R_1)$$
$$= (\lambda-1)(\lambda+2)(\lambda-4) - 18(\lambda+2) = (\lambda+2)^2(\lambda-7) = 0.$$
Then $A$ has two eigenvalues, $\lambda_1 = -2$ and $\lambda_2 = 7$. For $\lambda_1 = -2$ (its multiplicity is 2), the eigensystem
$$\begin{bmatrix} -3 & -3 & -3 \\ -3 & -3 & -3 \\ -3 & -3 & -3 \end{bmatrix} x = 0$$

has two linearly independent eigenvectors
$$v_1 = \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix}, \qquad v_2 = \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}.$$
For $\lambda_2 = 7$, the eigensystem
$$\begin{bmatrix} 6 & -3 & -3 \\ -3 & 6 & -3 \\ -3 & -3 & 6 \end{bmatrix} x = 0$$
has one linearly independent eigenvector
$$v_3 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}.$$

Theorem 2.2. Let $\lambda$, $\mu$, and $\nu$ be distinct eigenvalues of a matrix $A$. Let $u_1, u_2, \dots, u_p$ be linearly independent eigenvectors for the eigenvalue $\lambda$; let $v_1, v_2, \dots, v_q$ be linearly independent eigenvectors for the eigenvalue $\mu$; and let $w_1, w_2, \dots, w_r$ be linearly independent eigenvectors for the eigenvalue $\nu$. Then the vectors
$$u_1, u_2, \dots, u_p,\ v_1, v_2, \dots, v_q,\ w_1, w_2, \dots, w_r$$
together are linearly independent.

Proof. Suppose there are scalars $a_1, \dots, a_p$, $b_1, \dots, b_q$, $c_1, \dots, c_r$ such that
$$(a_1 u_1 + \dots + a_p u_p) + (b_1 v_1 + \dots + b_q v_q) + (c_1 w_1 + \dots + c_r w_r) = 0. \tag{2.4}$$
It suffices to show that all the scalars $a_1, \dots, a_p$, $b_1, \dots, b_q$, $c_1, \dots, c_r$ are $0$. Set
$$u = a_1 u_1 + \dots + a_p u_p, \qquad v = b_1 v_1 + \dots + b_q v_q, \qquad w = c_1 w_1 + \dots + c_r w_r.$$
Note that
$$Au = a_1 A u_1 + \dots + a_p A u_p = a_1 \lambda u_1 + \dots + a_p \lambda u_p = \lambda u.$$
Similarly, $Av = \mu v$ and $Aw = \nu w$. If $u = 0$, then the linear independence of $u_1, \dots, u_p$ implies that $a_1 = \dots = a_p = 0$. Similarly, $v = 0$ implies $b_1 = \dots = b_q = 0$, and $w = 0$ implies $c_1 = \dots = c_r = 0$.

Now we claim that $u = v = w = 0$. If not, there are the following three types.

Type 1: exactly one is nonzero, say $u \neq 0$ and $v = w = 0$. Since $v = w = 0$, it follows from (2.4) that $u = 0$, a contradiction.

Type 2: exactly two are nonzero, say $u \neq 0$, $v \neq 0$, $w = 0$. Then $u$ is an eigenvector of $A$ for the eigenvalue $\lambda$ and $v$ is an eigenvector of $A$ for the eigenvalue $\mu$; they are eigenvectors for distinct eigenvalues, so $u$ and $v$ are linearly independent. But (2.4) shows that $u + v = 0$, which means that $u$ and $v$ are linearly dependent, a contradiction.

Type 3: $u \neq 0$, $v \neq 0$, $w \neq 0$. This means that $u$, $v$, $w$ are eigenvectors of $A$ for the distinct eigenvalues $\lambda$, $\mu$, $\nu$, respectively, so they are linearly independent. However, (2.4) shows that $u + v + w = 0$, which means that $u$, $v$, $w$ are linearly dependent, a contradiction again.

Note. The above theorem is also true for more than three distinct eigenvalues.

3 Diagonalization

Definition 3.1. An $n \times n$ matrix $A$ is said to be similar to an $n \times n$ matrix $B$ if there is an invertible matrix $P$ such that
$$P^{-1} A P = B.$$
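Theorem 2.2 can be illustrated numerically: collecting independent eigenvectors from distinct eigenvalues as the columns of one matrix gives a matrix of full rank. A sketch with a symmetric $3 \times 3$ matrix whose eigenvalues are $-2$, $-2$, and $7$ (the values used in the worked example above):

```python
import numpy as np

# 1s on the diagonal, 3s elsewhere: eigenvalues -2 (twice) and 7.
A = np.full((3, 3), 3.0) - 2.0 * np.eye(3)

V = np.array([[-1.0, -1.0, 1.0],
              [ 1.0,  0.0, 1.0],
              [ 0.0,  1.0, 1.0]])      # columns: eigenvectors
lams = np.array([-2.0, -2.0, 7.0])     # matching eigenvalues

assert np.allclose(A @ V, V * lams)    # column-wise check: A v_j = lambda_j v_j
print(np.linalg.matrix_rank(V))        # 3: the combined set is independent
```

Note `V * lams` scales the $j$-th column of `V` by `lams[j]` via broadcasting, which is exactly the column-wise statement $Av_j = \lambda_j v_j$.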

Theorem 3.2. Similar matrices have the same characteristic polynomial and hence have the same eigenvalues.

Note. Similar matrices may have different eigenvectors. For instance, the matrices
$$A = \begin{bmatrix} 2 & 0 \\ 0 & 5 \end{bmatrix} \quad \text{and} \quad B = \begin{bmatrix} 2 & 2 \\ 0 & 5 \end{bmatrix}$$
have the same eigenvalues $\lambda_1 = 2$ and $\lambda_2 = 5$, but $A$ and $B$ have different eigenvectors.

A square matrix $A$ is called diagonal if all non-diagonal entries are zero, that is, $A = \mathrm{diag}(a_1, a_2, \dots, a_n)$. It is easy to see that for any $k \geq 1$,
$$A^k = \mathrm{diag}(a_1^k, a_2^k, \dots, a_n^k).$$

Definition 3.3. A square matrix $A$ is said to be diagonalizable if $A$ is similar to a diagonal matrix, that is, there exist an invertible matrix $P$ and a diagonal matrix $D$ such that
$$P^{-1} A P = D.$$

Theorem 3.4 (Diagonalization Theorem). An $n \times n$ matrix $A$ is diagonalizable if and only if $A$ has $n$ linearly independent eigenvectors.

Proof. We demonstrate the proof for the case $n = 3$. If $A$ is diagonalizable, there exist an invertible matrix $P$ and a diagonal matrix $D$ such that $P^{-1}AP = D$, where
$$P = [u, v, w] = \begin{bmatrix} u_1 & v_1 & w_1 \\ u_2 & v_2 & w_2 \\ u_3 & v_3 & w_3 \end{bmatrix}, \qquad D = \begin{bmatrix} \lambda & 0 & 0 \\ 0 & \mu & 0 \\ 0 & 0 & \nu \end{bmatrix}.$$
Note that $P^{-1}AP = D$ is equivalent to $AP = PD$. Since
$$AP = A[u, v, w] = [Au, Av, Aw]$$
and
$$PD = \begin{bmatrix} u_1 & v_1 & w_1 \\ u_2 & v_2 & w_2 \\ u_3 & v_3 & w_3 \end{bmatrix} \begin{bmatrix} \lambda & 0 & 0 \\ 0 & \mu & 0 \\ 0 & 0 & \nu \end{bmatrix} = \begin{bmatrix} \lambda u_1 & \mu v_1 & \nu w_1 \\ \lambda u_2 & \mu v_2 & \nu w_2 \\ \lambda u_3 & \mu v_3 & \nu w_3 \end{bmatrix} = [\lambda u, \mu v, \nu w],$$
we have $[Au, Av, Aw] = [\lambda u, \mu v, \nu w]$. Thus
$$Au = \lambda u, \qquad Av = \mu v, \qquad Aw = \nu w.$$
Since $P$ is invertible, the vectors $u$, $v$, $w$ are linearly independent (in particular, nonzero). It follows that $u$, $v$, $w$ are three linearly independent eigenvectors of $A$.

Conversely, suppose $A$ has three linearly independent eigenvectors $u$, $v$, $w$ corresponding to the eigenvalues $\lambda$, $\mu$, $\nu$, respectively. Then $Au = \lambda u$, $Av = \mu v$, $Aw = \nu w$. Let
$$P = [u, v, w] = \begin{bmatrix} u_1 & v_1 & w_1 \\ u_2 & v_2 & w_2 \\ u_3 & v_3 & w_3 \end{bmatrix}, \qquad D = \begin{bmatrix} \lambda & 0 & 0 \\ 0 & \mu & 0 \\ 0 & 0 & \nu \end{bmatrix}.$$

Then
$$AP = A[u, v, w] = [Au, Av, Aw] = [\lambda u, \mu v, \nu w] = \begin{bmatrix} \lambda u_1 & \mu v_1 & \nu w_1 \\ \lambda u_2 & \mu v_2 & \nu w_2 \\ \lambda u_3 & \mu v_3 & \nu w_3 \end{bmatrix} = \begin{bmatrix} u_1 & v_1 & w_1 \\ u_2 & v_2 & w_2 \\ u_3 & v_3 & w_3 \end{bmatrix} \begin{bmatrix} \lambda & 0 & 0 \\ 0 & \mu & 0 \\ 0 & 0 & \nu \end{bmatrix} = PD.$$
Since $u$, $v$, $w$ are linearly independent, the matrix $P$ is invertible. Therefore
$$P^{-1} A P = D.$$
This means that $A$ is diagonalizable.

Example 3.1. Diagonalize the matrix
$$A = \begin{bmatrix} 1 & 1 & 1 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{bmatrix}$$
and compute $A^8$.

The characteristic polynomial of $A$ is
$$\det(\lambda I - A) = \begin{vmatrix} \lambda-1 & -1 & -1 \\ 0 & \lambda-2 & 0 \\ 0 & 0 & \lambda-2 \end{vmatrix} = (\lambda-2)^2(\lambda-1).$$
There are two eigenvalues, $\lambda_1 = 2$ and $\lambda_2 = 1$. For $\lambda_1 = 2$, the eigensystem
$$\begin{bmatrix} 1 & -1 & -1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} x = 0$$
has two linearly independent eigenvectors
$$v_1 = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, \qquad v_2 = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}.$$
For $\lambda_2 = 1$, the eigensystem
$$\begin{bmatrix} 0 & -1 & -1 \\ 0 & -1 & 0 \\ 0 & 0 & -1 \end{bmatrix} x = 0$$
has one linearly independent eigenvector
$$v_3 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}.$$
Set
$$P = [v_1, v_2, v_3] = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}, \qquad D = \begin{bmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{bmatrix}.$$

Then
$$P^{-1} A P = D, \quad \text{or equivalently,} \quad A = P D P^{-1}.$$
Thus
$$A^8 = \underbrace{(PDP^{-1})(PDP^{-1}) \cdots (PDP^{-1})}_{8} = P D^8 P^{-1} = \begin{bmatrix} 1 & 1 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} 2^8 & 0 & 0 \\ 0 & 2^8 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & -1 & -1 \end{bmatrix} = \begin{bmatrix} 1 & 2^8 - 1 & 2^8 - 1 \\ 0 & 2^8 & 0 \\ 0 & 0 & 2^8 \end{bmatrix}.$$

Example 3.2. Compute the matrix $A^8$, where
$$A = \begin{bmatrix} 1 & 0 & 1 \\ 0 & 0 & 2 \\ -2 & 0 & 4 \end{bmatrix}.$$
The characteristic polynomial of $A$ is
$$\det(\lambda I - A) = \begin{vmatrix} \lambda-1 & 0 & -1 \\ 0 & \lambda & -2 \\ 2 & 0 & \lambda-4 \end{vmatrix} = (\lambda-1)(\lambda^2 - 4\lambda) + 2\lambda = \lambda(\lambda-2)(\lambda-3).$$
We have eigenvalues $\lambda_1 = 0$, $\lambda_2 = 2$, and $\lambda_3 = 3$. For $\lambda_1 = 0$, the eigensystem
$$\begin{bmatrix} -1 & 0 & -1 \\ 0 & 0 & -2 \\ 2 & 0 & -4 \end{bmatrix} x = 0$$
has eigenvector
$$v_1 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}.$$
For $\lambda_2 = 2$, the eigensystem
$$\begin{bmatrix} 1 & 0 & -1 \\ 0 & 2 & -2 \\ 2 & 0 & -2 \end{bmatrix} x = 0$$
has eigenvector
$$v_2 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}.$$
For $\lambda_3 = 3$, the eigensystem
$$\begin{bmatrix} 2 & 0 & -1 \\ 0 & 3 & -2 \\ 2 & 0 & -1 \end{bmatrix} x = 0$$
has eigenvector
$$v_3 = \begin{bmatrix} 3 \\ 4 \\ 6 \end{bmatrix}.$$
Set
$$P = [v_1, v_2, v_3] = \begin{bmatrix} 0 & 1 & 3 \\ 1 & 1 & 4 \\ 0 & 1 & 6 \end{bmatrix}, \qquad D = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3 \end{bmatrix}.$$

Then $P^{-1}AP = D$. Thus
$$A^8 = \underbrace{(PDP^{-1})(PDP^{-1}) \cdots (PDP^{-1})}_{8} = P D^8 P^{-1} = \begin{bmatrix} 0 & 1 & 3 \\ 1 & 1 & 4 \\ 0 & 1 & 6 \end{bmatrix} \begin{bmatrix} 0 & 0 & 0 \\ 0 & 2^8 & 0 \\ 0 & 0 & 3^8 \end{bmatrix} \begin{bmatrix} -2/3 & 1 & -1/3 \\ 2 & 0 & -1 \\ -1/3 & 0 & 1/3 \end{bmatrix} = \begin{bmatrix} 2^9 - 3^8 & 0 & 3^8 - 2^8 \\ 2^9 - 4 \cdot 3^7 & 0 & 4 \cdot 3^7 - 2^8 \\ 2^9 - 2 \cdot 3^8 & 0 & 2 \cdot 3^8 - 2^8 \end{bmatrix}.$$

4 Complex eigenvalues

Theorem 4.1. For a real $2 \times 2$ matrix $A$, if one of the eigenvalues of $A$ is not a real number, then the other eigenvalue must be conjugate to this complex eigenvalue. Let $\lambda = a - bi$ with $b \neq 0$ be a complex eigenvalue of $A$, and let $x = u + iv$ be a complex eigenvector of $A$ for $\lambda$, that is,
$$A(u + iv) = (a - bi)(u + iv).$$
Let $P = [u, v]$. Then
$$P^{-1} A P = \begin{bmatrix} a & -b \\ b & a \end{bmatrix}.$$

Proof. On the one hand, $Ax = Au + iAv$. On the other hand,
$$Ax = \lambda x = (a - bi)(u + iv) = (au + bv) + i(-bu + av).$$
It follows that
$$Au = au + bv, \qquad Av = -bu + av.$$
Thus
$$AP = A[u, v] = [u, v]\begin{bmatrix} a & -b \\ b & a \end{bmatrix} = P\begin{bmatrix} a & -b \\ b & a \end{bmatrix},$$
and hence
$$P^{-1} A P = \begin{bmatrix} a & -b \\ b & a \end{bmatrix}.$$

Example 4.1. Let $A = \begin{bmatrix} 5 & -2 \\ 1 & 3 \end{bmatrix}$. Find an invertible matrix $P$ such that $P^{-1}AP$ has the form $\begin{bmatrix} a & -b \\ b & a \end{bmatrix}$, the sum of a diagonal and an antisymmetric matrix.

Solution. Since
$$\det(\lambda I - A) = \begin{vmatrix} \lambda-5 & 2 \\ -1 & \lambda-3 \end{vmatrix} = \lambda^2 - 8\lambda + 17,$$
the eigenvalues of $A$ are the complex numbers
$$\lambda = \frac{8 \pm \sqrt{64 - 4 \cdot 17}}{2} = 4 \pm i.$$
For $\lambda = 4 - i$, we have the eigensystem
$$\begin{bmatrix} -1-i & 2 \\ -1 & 1-i \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}.$$

Solving the system, we have the eigenvector
$$x = \begin{bmatrix} 2 \\ 1+i \end{bmatrix} = \begin{bmatrix} 2 \\ 1 \end{bmatrix} + i\begin{bmatrix} 0 \\ 1 \end{bmatrix}.$$
Let
$$P = \begin{bmatrix} 2 & 0 \\ 1 & 1 \end{bmatrix}.$$
Then
$$P^{-1} A P = \frac{1}{2}\begin{bmatrix} 1 & 0 \\ -1 & 2 \end{bmatrix} \begin{bmatrix} 5 & -2 \\ 1 & 3 \end{bmatrix} \begin{bmatrix} 2 & 0 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 4 & -1 \\ 1 & 4 \end{bmatrix}.$$
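This construction is easy to reproduce numerically. The sketch below uses the $2 \times 2$ matrix assumed in the example: it picks the eigenvalue $\lambda = a - bi$ with $b > 0$, splits a computed eigenvector into real and imaginary parts $u$ and $v$, and checks that $P = [u, v]$ brings $A$ to the rotation-scaling form $\begin{bmatrix} a & -b \\ b & a \end{bmatrix}$ (the relation holds for any complex scaling of the eigenvector, so NumPy's normalization does not matter):

```python
import numpy as np

A = np.array([[5.0, -2.0],
              [1.0,  3.0]])

vals, vecs = np.linalg.eig(A)          # complex conjugate pair 4 + i, 4 - i
idx = np.argmin(vals.imag)             # pick lambda = a - b*i with b > 0
lam = vals[idx]
x = vecs[:, idx]                       # eigenvector for that eigenvalue
P = np.column_stack([x.real, x.imag])  # P = [u, v]

a, b = lam.real, -lam.imag             # here a = 4, b = 1
C = np.linalg.inv(P) @ A @ P
print(np.allclose(C, [[a, -b], [b, a]]))   # True
```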