Announcements Monday, November 06


Announcements Monday, November 06

This week's quiz covers Sections 5.1 and 5.2. Midterm 3 is on November 17th (next Friday). The exam covers Sections 3.1, 3.2, 5.1, 5.2, 5.3, and 5.5.

Section 5.3 Diagonalization

Motivation: Difference equations

Many real-world linear algebra problems start with a given situation (a vector v_0) and ask what happens after some time, i.e. after iterating a transformation:

    v_n = A v_{n-1} = ... = A^n v_0.

The ultimate question is what happens in the long run: find v_n as n goes to infinity.

Old example: recall our example about rabbit populations. Using eigenvectors was easier than repeated matrix multiplication. Taking powers of diagonal matrices is easy, and working with diagonalizable matrices is almost as easy: we need to use the eigenvalues and eigenvectors of the dynamics.

Powers of Diagonal Matrices

If D is diagonal, then D^n is also diagonal, and the diagonal entries of D^n are the nth powers of the diagonal entries of D.

Example:

    D = [ 2 0 ; 0 3 ],   D^2 = [ 4 0 ; 0 9 ],   D^n = [ 2^n 0 ; 0 3^n ]

    M = [ 1 0 0 ; 0 2 0 ; 0 0 3 ],   M^2 = [ 1 0 0 ; 0 4 0 ; 0 0 9 ],   M^n = [ 1 0 0 ; 0 2^n 0 ; 0 0 3^n ]
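The entrywise rule above is easy to check numerically. A minimal sketch in plain Python (the helpers matmul and matpow are our own, not from any library):

```python
def matmul(A, B):
    """Multiply two matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matpow(A, n):
    """Compute A^n by repeated multiplication (n >= 1)."""
    result = A
    for _ in range(n - 1):
        result = matmul(result, A)
    return result

D = [[2, 0], [0, 3]]
# Shortcut for a diagonal matrix: just take nth powers of the diagonal entries.
assert matpow(D, 5) == [[2**5, 0], [0, 3**5]]  # [[32, 0], [0, 243]]
```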

Powers of Matrices that are Similar to Diagonal Ones

What if A is not diagonal?

Example: Let A = [ 1 2 ; -1 4 ]. Compute A^n, using that A = PDP^{-1} where

    P = [ 2 1 ; 1 1 ]   and   D = [ 2 0 ; 0 3 ].

From the first expression:

    A^2 = (PDP^{-1})(PDP^{-1}) = PD(P^{-1}P)DP^{-1} = PDIDP^{-1} = PD^2P^{-1}
    A^3 = (PDP^{-1})(PD^2P^{-1}) = PD(P^{-1}P)D^2P^{-1} = PDID^2P^{-1} = PD^3P^{-1}
    ...
    A^n = PD^nP^{-1}

Plug in P and D:

    A^n = [ 2 1 ; 1 1 ] [ 2^n 0 ; 0 3^n ] [ 1 -1 ; -1 2 ]
        = [ 2^{n+1} - 3^n   -2^{n+1} + 2*3^n ; 2^n - 3^n   -2^n + 2*3^n ]

This is a closed formula in terms of n: easy to compute.
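The closed formula for A^n can be verified directly in plain Python. The matrices come from the example above (matmul is our own helper; note det P = 1, so P^{-1} has integer entries):

```python
def matmul(A, B):
    """Multiply two matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [-1, 4]]
P = [[2, 1], [1, 1]]
Pinv = [[1, -1], [-1, 2]]   # inverse of P, exact since det P = 1

n = 6
Dn = [[2**n, 0], [0, 3**n]]
via_diag = matmul(matmul(P, Dn), Pinv)

# Closed formula from the slide.
closed = [[2**(n+1) - 3**n, -2**(n+1) + 2 * 3**n],
          [2**n - 3**n,     -2**n + 2 * 3**n]]

# Repeated multiplication, the slow way.
An = A
for _ in range(n - 1):
    An = matmul(An, A)

assert via_diag == closed == An
```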

Diagonalizable Matrices

Definition: An n x n matrix A is diagonalizable if it is similar to a diagonal matrix: A = PDP^{-1} for some invertible P and diagonal D.

Important: If A = PDP^{-1} for D = diag(d_11, d_22, ..., d_nn), then

    A^k = PD^kP^{-1} = P diag(d_11^k, d_22^k, ..., d_nn^k) P^{-1}.

So diagonalizable matrices are easy to raise to any power.

Diagonalization

The Diagonalization Theorem: An n x n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors. In this case, A = PDP^{-1} for

    P = [ v_1 v_2 ... v_n ]   (eigenvectors as columns),
    D = diag(λ_1, λ_2, ..., λ_n),

where v_1, v_2, ..., v_n are linearly independent eigenvectors and λ_1, λ_2, ..., λ_n are the corresponding eigenvalues (in the same order).

Important: If A has n distinct eigenvalues, then A is diagonalizable. (Fact 2 in the Section 5.1 lecture notes: eigenvectors with distinct eigenvalues are always linearly independent.) A diagonalizable matrix need not have n distinct eigenvalues, though.

Diagonalization Example

Problem: Diagonalize A = [ 1 2 ; -1 4 ].

The characteristic polynomial is

    f(λ) = λ^2 - Tr(A)λ + det(A) = λ^2 - 5λ + 6 = (λ - 2)(λ - 3).

Therefore the eigenvalues are 2 and 3. Let's compute some eigenvectors:

    (A - 2I)x = 0:  [ -1 2 ; -1 2 ] x = 0,   rref:  [ 1 -2 ; 0 0 ] x = 0.

The parametric form is x = 2y, so v_1 = ( 2, 1 ) is an eigenvector with eigenvalue 2.

    (A - 3I)x = 0:  [ -2 2 ; -1 1 ] x = 0,   rref:  [ 1 -1 ; 0 0 ] x = 0.

The parametric form is x = y, so v_2 = ( 1, 1 ) is an eigenvector with eigenvalue 3.

The eigenvectors v_1, v_2 are linearly independent, so the Diagonalization Theorem says A = PDP^{-1} for

    P = [ 2 1 ; 1 1 ],   D = [ 2 0 ; 0 3 ].
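A quick sanity check of this example in plain Python: multiply A against each claimed eigenvector and compare with the scaled vector (matvec is our own helper):

```python
def matvec(A, v):
    """Apply a matrix (nested lists) to a vector (list)."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

A = [[1, 2], [-1, 4]]
v1, v2 = [2, 1], [1, 1]
assert matvec(A, v1) == [2 * x for x in v1]   # A v1 = 2 v1
assert matvec(A, v2) == [3 * x for x in v2]   # A v2 = 3 v2
```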

Diagonalization Example 2

Problem: Diagonalize A = [ 4 -3 0 ; 2 -1 0 ; 1 -1 1 ].

The characteristic polynomial is

    f(λ) = det(A - λI) = -λ^3 + 4λ^2 - 5λ + 2 = -(λ - 1)^2 (λ - 2).

Therefore the eigenvalues are 1 and 2, with respective algebraic multiplicities 2 and 1.

First compute the 1-eigenspace:

    (A - I)x = 0:  [ 3 -3 0 ; 2 -2 0 ; 1 -1 0 ] x = 0,   rref:  [ 1 -1 0 ; 0 0 0 ; 0 0 0 ] x = 0.

The parametric vector form is ( x, y, z ) = y ( 1, 1, 0 ) + z ( 0, 0, 1 ). Hence a basis for the 1-eigenspace is B = { v_1, v_2 }, where v_1 = ( 1, 1, 0 ) and v_2 = ( 0, 0, 1 ).

Diagonalization Example 2, continued

Now let's compute the 2-eigenspace:

    (A - 2I)x = 0:  [ 2 -3 0 ; 2 -3 0 ; 1 -1 -1 ] x = 0,   rref:  [ 1 0 -3 ; 0 1 -2 ; 0 0 0 ] x = 0.

The parametric form is x = 3z, y = 2z, so an eigenvector with eigenvalue 2 is v_3 = ( 3, 2, 1 ).

Note that v_1, v_2 form a basis for the 1-eigenspace, and v_3 has a distinct eigenvalue. Thus the eigenvectors v_1, v_2, v_3 are linearly independent, and the Diagonalization Theorem says A = PDP^{-1} for

    P = [ 1 0 3 ; 1 0 2 ; 0 1 1 ],   D = [ 1 0 0 ; 0 1 0 ; 0 0 2 ].

In this case there are 3 linearly independent eigenvectors but only 2 distinct eigenvalues.
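The same eigenvector check works for this 3 x 3 example (matvec is our own helper, as before):

```python
def matvec(A, v):
    """Apply a matrix (nested lists) to a vector (list)."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

A = [[4, -3, 0], [2, -1, 0], [1, -1, 1]]
v1, v2, v3 = [1, 1, 0], [0, 0, 1], [3, 2, 1]
assert matvec(A, v1) == v1                    # eigenvalue 1
assert matvec(A, v2) == v2                    # eigenvalue 1
assert matvec(A, v3) == [2 * x for x in v3]   # eigenvalue 2
```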

Diagonalization Procedure

How to diagonalize a matrix A:

1. Find the eigenvalues of A using the characteristic polynomial.
2. Compute a basis B_λ for each λ-eigenspace of A.
3. If there are fewer than n total vectors in the union of all of the eigenspace bases B_λ, then the matrix is not diagonalizable.
4. Otherwise, the n vectors v_1, v_2, ..., v_n in your eigenspace bases are linearly independent, and A = PDP^{-1} for

    P = [ v_1 v_2 ... v_n ]   and   D = diag(λ_1, λ_2, ..., λ_n),

where λ_i is the eigenvalue for v_i.
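For 2 x 2 matrices with two distinct real eigenvalues, the procedure above can be sketched directly: the characteristic polynomial is λ^2 - Tr(A)λ + det(A), and one row of A - λI yields an eigenvector. This is an illustrative sketch under those assumptions (diagonalize_2x2 is our own name, not a library function), not a robust general implementation:

```python
import math

def diagonalize_2x2(A):
    """Steps 1-4 of the procedure for a 2x2 matrix with distinct real eigenvalues."""
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    if disc <= 0:
        return None  # repeated or complex eigenvalues: not handled in this sketch
    lams = [(tr + math.sqrt(disc)) / 2, (tr - math.sqrt(disc)) / 2]

    def eigvec(lam):
        # One row of (A - lam*I) suffices: (a - lam)x + b*y = 0.
        if b != 0:
            return [b, lam - a]
        if c != 0:
            return [lam - d, c]
        return [1.0, 0.0] if a == lam else [0.0, 1.0]

    return lams, [eigvec(lam) for lam in lams]

lams, vecs = diagonalize_2x2([[1, 2], [-1, 4]])
assert lams == [3.0, 2.0]
assert vecs == [[2, 2.0], [2, 1.0]]   # multiples of (1, 1) and (2, 1)
```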

Diagonalization: A Non-Diagonalizable Matrix

Problem: Show that A = [ 1 1 ; 0 1 ] is not diagonalizable.

The characteristic polynomial is

    f(λ) = det(A - λI) = (λ - 1)^2.

Let's compute the 1-eigenspace:

    (A - I)x = 0:  [ 0 1 ; 0 0 ] x = 0.

A basis for the 1-eigenspace is { ( 1, 0 ) }: the solution has only one free variable!

Conclusion: all eigenvectors of A are multiples of ( 1, 0 ), so A has only one linearly independent eigenvector. If A were diagonalizable, there would be two linearly independent eigenvectors!

Poll

Which of the following matrices are diagonalizable, and why?

    A = [ 1 2 ; 0 1 ]   B = [ 1 2 ; 0 2 ]   C = [ 2 2 ; 0 2 ]   D = [ 2 0 ; 0 2 ]

Matrix D is already diagonal! Matrix B is diagonalizable because it has two distinct eigenvalues. Matrices A and C are not diagonalizable: by the same argument as on the previous slide, all of their eigenvectors are multiples of ( 1, 0 ).

Non-Distinct Eigenvalues

Definition: Let λ be an eigenvalue of a square matrix A. The geometric multiplicity of λ is the dimension of the λ-eigenspace.

Theorem: Let λ be an eigenvalue of a square matrix A. Then

    (the geometric multiplicity of λ) ≤ (the algebraic multiplicity of λ).

Note: if λ is an eigenvalue, then the λ-eigenspace has dimension at least 1, but it might be smaller than what the characteristic polynomial suggests. The intuition/visualisation is beyond the scope of this course.

Multiplicities all one: if there are n eigenvalues, each with algebraic multiplicity 1 (so the geometric multiplicities are all 1 as well), then the corresponding eigenvectors are linearly independent. Therefore A is diagonalizable.

Non-Distinct Eigenvalues: (Good) Examples

From previous exercises we know:

Example: The matrix A = [ 4 -3 0 ; 2 -1 0 ; 1 -1 1 ] has characteristic polynomial f(λ) = -(λ - 1)^2 (λ - 2). The matrix B = [ 1 2 ; -1 4 ] has characteristic polynomial f(λ) = (1 - λ)(4 - λ) + 2 = (λ - 2)(λ - 3).

    Matrix A:  λ = 1: geometric multiplicity 2, algebraic multiplicity 2
               λ = 2: geometric multiplicity 1, algebraic multiplicity 1

    Matrix B:  λ = 2: geometric multiplicity 1, algebraic multiplicity 1
               λ = 3: geometric multiplicity 1, algebraic multiplicity 1

Thus both matrices are diagonalizable.

Non-Distinct Eigenvalues: (Bad) Example

Example: The matrix A = [ 1 1 ; 0 1 ] has characteristic polynomial f(λ) = (λ - 1)^2. We showed before that the 1-eigenspace has dimension 1 and that A is not diagonalizable. The geometric multiplicity is smaller than the algebraic:

    λ = 1: geometric multiplicity 1, algebraic multiplicity 2.

The Diagonalization Theorem (Alternate Form): Let A be an n x n matrix. The following are equivalent:

1. A is diagonalizable.
2. The sum of the geometric multiplicities of the eigenvalues of A equals n.
3. The sum of all algebraic multiplicities is n, and for each eigenvalue, the geometric and algebraic multiplicities are equal.
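The failure in the bad example can be seen numerically: the geometric multiplicity of λ = 1 is dim Nul(A - I) = 2 - rank(A - I). A small sketch (rank2x2 is our own helper, valid for 2 x 2 matrices only):

```python
def rank2x2(M):
    """Rank of a 2x2 matrix: 2 if invertible, 0 if zero, else 1."""
    (a, b), (c, d) = M
    if a * d - b * c != 0:
        return 2
    return 0 if (a, b, c, d) == (0, 0, 0, 0) else 1

# A - I for the shear matrix A = [[1, 1], [0, 1]]
AmI = [[0, 1], [0, 0]]
geom_mult = 2 - rank2x2(AmI)
assert geom_mult == 1   # strictly less than the algebraic multiplicity, which is 2
```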

Applications to Difference Equations

Let D = [ 1 0 ; 0 1/2 ]. Start with a vector v_0, and let v_1 = Dv_0, v_2 = Dv_1, ..., v_n = D^n v_0.

Question: What happens to the v_i's for different starting vectors v_0?

Answer: Note that D is diagonal, so

    D^n ( a, b ) = [ 1 0 ; 0 1/2^n ] ( a, b ) = ( a, b/2^n ).

If we start with v_0 = ( a, b ), then the x-coordinate stays equal to the initial coordinate, while the y-coordinate gets halved every time.
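The halving behaviour can be iterated directly, as a sketch in plain Python (the starting vector is an arbitrary choice of ours):

```python
# Iterate v_{n+1} = D v_n for the diagonal matrix D = [[1, 0], [0, 1/2]].
D = [[1.0, 0.0], [0.0, 0.5]]
v = [3.0, 8.0]          # an arbitrary starting vector v_0
for _ in range(4):
    v = [D[0][0] * v[0], D[1][1] * v[1]]   # diagonal action: scale each coordinate
assert v == [3.0, 0.5]  # x-coordinate fixed; y-coordinate halved 4 times: 8/16
```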

Applications to Difference Equations: Picture

    D ( a, b ) = [ 1 0 ; 0 1/2 ] ( a, b ) = ( a, b/2 )

[Figure: the vectors v_0, v_1, v_2, v_3, v_4 moving toward the x-axis; the x-axis is the 1-eigenspace and the y-axis is the 1/2-eigenspace.]

So all vectors get collapsed onto the x-axis, which is the 1-eigenspace.

Applications to Difference Equations: A More Complicated Example

Let A = [ 3/4 1/4 ; 1/4 3/4 ]. Start with a vector v_0, and let v_1 = Av_0, v_2 = Av_1, ..., v_n = A^n v_0.

Question: What happens to the v_i's for different starting vectors v_0?

Matrix powers: this is a diagonalization question. Bottom line: A = PDP^{-1} for

    P = [ 1 1 ; 1 -1 ]   and   D = [ 1 0 ; 0 1/2 ].

Hence v_n = PD^nP^{-1} v_0.

Details: The characteristic polynomial is

    f(λ) = λ^2 - Tr(A)λ + det(A) = λ^2 - (3/2)λ + 1/2 = (λ - 1)(λ - 1/2).

We compute eigenvectors with eigenvalues 1 and 1/2 to be, respectively,

    w_1 = ( 1, 1 )   and   w_2 = ( 1, -1 ).
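The collapse onto the 1-eigenspace can be checked exactly with rational arithmetic: writing v_0 = (1, 0) = (1/2)w_1 + (1/2)w_2 gives v_n = (1/2)w_1 + (1/2)(1/2)^n w_2. A sketch (the starting vector is our own choice):

```python
from fractions import Fraction as F

A = [[F(3, 4), F(1, 4)], [F(1, 4), F(3, 4)]]
v = [F(1), F(0)]        # v_0 = (1/2)w_1 + (1/2)w_2 with w_1 = (1,1), w_2 = (1,-1)
for _ in range(5):
    v = [A[0][0] * v[0] + A[0][1] * v[1],
         A[1][0] * v[0] + A[1][1] * v[1]]
# After 5 steps: v_5 = (1/2)(1, 1) + (1/2)(1/2)^5 (1, -1)
assert v == [F(33, 64), F(31, 64)]
```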

Applications to Difference Equations: Picture of the More Complicated Example

A^n = PD^nP^{-1} acts on the usual coordinates of v_0 in the same way that D^n acts on the B-coordinates, where B = { w_1, w_2 }.

[Figure: the vectors v_0, v_1, v_2, v_3, v_4 moving toward the line spanned by w_1, the 1-eigenspace; the line spanned by w_2 is the 1/2-eigenspace.]

So all vectors get collapsed onto the 1-eigenspace.

Extra: Proof of the Diagonalization Theorem

Why is the Diagonalization Theorem true?

A diagonalizable implies A has n linearly independent eigenvectors: Suppose A = PDP^{-1}, where D is diagonal with diagonal entries λ_1, λ_2, ..., λ_n. Let v_1, v_2, ..., v_n be the columns of P. They are linearly independent because P is invertible. Since Pe_i = v_i, we have P^{-1}v_i = e_i, so

    Av_i = PDP^{-1}v_i = PDe_i = P(λ_i e_i) = λ_i Pe_i = λ_i v_i.

Hence v_i is an eigenvector of A with eigenvalue λ_i. So the columns of P form n linearly independent eigenvectors of A, and the diagonal entries of D are the eigenvalues.

A has n linearly independent eigenvectors implies A is diagonalizable: Suppose A has n linearly independent eigenvectors v_1, v_2, ..., v_n, with eigenvalues λ_1, λ_2, ..., λ_n. Let P be the invertible matrix with columns v_1, v_2, ..., v_n, and let D = P^{-1}AP. Then

    De_i = P^{-1}APe_i = P^{-1}Av_i = P^{-1}(λ_i v_i) = λ_i P^{-1}v_i = λ_i e_i.

Hence D is diagonal, with diagonal entries λ_1, λ_2, ..., λ_n. Solving D = P^{-1}AP for A gives A = PDP^{-1}.