Math 205, Summer I, 2016, Week 4b (continued): Chapter 5, Section 8

5.8 Eigenspaces, Diagonalization  [reprint from Week 4: Eigenvalues and Eigenvectors, plus diagonalization]

A vector v ≠ 0 in R^n (or in C^n) is an eigenvector with eigenvalue λ of an n-by-n matrix A if Av = λv. We re-write the vector equation as (A - λI_n)v = 0, which is a homogeneous system with coef matrix (A - λI), and we want λ so that the system has a non-trivial solution. So the eigenvalues are the roots of the characteristic equation P(λ) = 0, where P(λ) = det(A - λI). To find the eigenvectors we find the distinct roots λ = λ_i, and for each i solve (A - λ_i I)v = 0.

Problem 1. Find the eigenvalues and eigenvectors of

  A = [ 3   1 ]
      [ 5  -1 ].

Solution.

  P(λ) = det(A - λI_2) = | 3-λ    1   |
                         |  5   -1-λ  |
       = (3-λ)(-1-λ) - 5 = λ² - 2λ - 3 - 5 = λ² - 2λ - 8 = (λ - 4)(λ + 2),

so λ_1 = 4, λ_2 = -2. For λ_1 = 4, we reduce the coef matrix of the system (A - 4I)x = 0:

  A - 4I = [ 3-4    1   ]   [ -1   1 ]    [ 1  -1 ]
           [  5   -1-4  ] = [  5  -5 ] →  [ 0   0 ],

so for x_2 = a we get x_1 = a, and the eigenvectors for λ_1 = 4 are the vectors (x_1, x_2) = a(1, 1) with a ≠ 0, from the eigenspace with basis v_1 = (1, 1).

For λ_2 = -2, we start over with coef matrix

  A + 2I = [ 3+2    1   ]   [ 5  1 ]    [ 5  1 ]
           [  5   -1+2  ] = [ 5  1 ] →  [ 0  0 ].

To find a spanning set without fractions, we anticipate that finding x_1 will use division by 5, and take x_2 = 5b. Then 5x_1 + x_2 = 0 gives 5x_1 = -5b, so x_1 = -b, and (x_1, x_2) = b(-1, 5), spanned by v_2 = (-1, 5) [or, if you must, (-1/5, 1)].

Example 1 (diagonalization): We write v_1, v_2 as columns; then S⁻¹AS = D with

  S = [ 1  -1 ]                           [ 4   0 ]
      [ 1   5 ]   and   D = diag(4, -2) = [ 0  -2 ].
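Aside (not in the original notes): a quick numerical check of Problem 1 / Example 1, assuming NumPy is available. The columns of S are the eigenvectors v_1 = (1, 1) and v_2 = (-1, 5) found above, and S⁻¹AS should come out as diag(4, -2).

    # Numerical check of Problem 1 / Example 1 (illustration only).
    import numpy as np

    A = np.array([[3.0, 1.0],
                  [5.0, -1.0]])
    S = np.array([[1.0, -1.0],        # columns: v1 = (1, 1), v2 = (-1, 5)
                  [1.0,  5.0]])

    print(np.linalg.inv(S) @ A @ S)   # approximately diag(4, -2)
    print(A @ np.array([1.0, 1.0]))   # equals 4 * (1, 1)
    print(A @ np.array([-1.0, 5.0]))  # equals -2 * (-1, 5)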

Problem 2. Find the eigenvalues and eigenvectors of

  A = [ 10   12   8 ]
      [  0    2   0 ]
      [ -8  -12  -6 ].

Solution. Since A is 3-by-3, the characteristic polynomial P(λ) is cubic; expanding det(A - λI) on the 2nd row gives

  P(λ) = (2-λ) | 10-λ    8   |
               |  -8   -6-λ  |
       = (2-λ)[(10-λ)(-6-λ) + 64] = (2-λ)[λ² - 4λ - 60 + 64] = (2-λ)(λ - 2)² = -(λ - 2)³.

Anyway, this matrix has only one distinct eigenvalue, λ_1 = 2. To find the eigenvectors, we reduce the coef matrix

  A - 2I = [ 10-2   12     8  ]   [  8   12   8 ]    [ 2  3  2 ]
           [  0     2-2    0  ] = [  0    0   0 ] →  [ 0  0  0 ]
           [ -8    -12   -6-2 ]   [ -8  -12  -8 ]    [ 0  0  0 ].

We see that x_2 and x_3 will be free variables, and anticipating that finding x_1 will involve division by 2, we take x_2 = 2s and x_3 = t, so x_1 = -3s - t; then

  (x_1, x_2, x_3) = (-3s - t, 2s, t) = s(-3, 2, 0) + t(-1, 0, 1)

has LI spanning vectors (-3, 2, 0) and (-1, 0, 1), giving eigenvectors except when s = t = 0.

Example 2 (diagonalization): This 3-by-3 matrix is NOT diagonalizable, since either (1) we need 3 LI eigenvectors, but only have 2; OR (2) the repeated eigenvalue λ_1 = 2 has multiplicity m_1 = 3, but the dimension of the eigenspace E_2 is d_1 = 2, with d_1 < m_1.
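Aside (not in the original notes): the same conclusion checked with SymPy, assuming SymPy is available. The only eigenvalue is 2, but the nullspace of A - 2I is 2-dimensional, so A has only 2 linearly independent eigenvectors and cannot be diagonalized.

    # Check of Problem 2 / Example 2 (illustration only).
    import sympy as sp

    A = sp.Matrix([[10, 12, 8],
                   [0,   2, 0],
                   [-8, -12, -6]])

    lam = sp.symbols('lam')
    print(sp.factor((A - lam * sp.eye(3)).det()))  # -(lam - 2)**3
    print(len((A - 2 * sp.eye(3)).nullspace()))    # dim E_2 = 2 < 3
    print(A.is_diagonalizable())                   # False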

Problem 3. Find the eigenvalues and eigenvectors of

  A = [ 3  -2 ]
      [ 4  -1 ].

Solution.

  P(λ) = | 3-λ   -2  |
         |  4   -1-λ | = (3-λ)(-1-λ) + 8 = λ² - 2λ - 3 + 8 = λ² - 2λ + 5,

with roots λ = 1 ± 2i. So there are no real λ for which there is a non-trivial real solution x ∈ R². But there is a complex solution x ≠ 0 in C², and we solve for x by reducing the coef matrix. The eigenvalues are complex conjugates, say λ_1 = 1 - 2i and its conjugate λ_2 = 1 + 2i. We only need to solve one of the systems. For λ_1 = 1 - 2i,

  A - (1-2i)I = [ 2+2i    -2   ]    [ 2   -1+i ]
                [  4    -2+2i  ] →  [ 0    0   ].

The above step is semi-legal cheating; we know the result without doing the computation.

Presuming that the roots are correct(!), we know that there is a non-trivial solution, so there's a free variable; then there must be a row in the RREF without a leading 1, and so a 0-row. We pick the row with real first entry, and know that the other row of the RREF is 0. We can also do the calculation without cheating: after swapping the two rows and dividing by 4, the first row will be ( 1   (1/2)(-1+i) ); we then check that (2+2i) times this first row is the 2nd row (so subtracting gives a 0-row), for which we need (and check) (2+2i) · (1/2)(-1+i) = -2.

As basis for the solution space we take v_1 = (1-i, 2), and then get v_2 = (1+i, 2) for λ_2, since an eigenvector for the conjugate eigenvalue is the conjugate vector (when A has real entries).

Example 3 (diagonalization): The eigenvalues are distinct, so this matrix is diagonalizable, and the matrix S uses the (complex) eigenvectors v_1, v_2 just as in Example 1.

Recall that an n-by-n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors; and, as a definition, matrices A and B are similar if there is an n-by-n matrix S so that S has an inverse and B = S⁻¹AS. We also recall that a diagonal matrix D = diag(d_1, ..., d_n) is a square matrix with all entries 0 except (possibly) on the main diagonal, where the entries are d_1, ..., d_n (in that order). Finally, a matrix A is diagonalizable if there is a matrix S so that S⁻¹AS = D is a diagonal matrix.
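Aside (not in the original notes): a NumPy check of Problem 3 / Example 3, assuming NumPy is available. Over C the matrix is diagonalizable: with the complex eigenvectors v_1 = (1-i, 2) and v_2 = (1+i, 2) as columns of S, S⁻¹AS comes out as diag(1-2i, 1+2i).

    # Check of Problem 3 / Example 3 (illustration only).
    import numpy as np

    A = np.array([[3.0, -2.0],
                  [4.0, -1.0]])
    print(np.linalg.eigvals(A))       # approximately 1+2j, 1-2j

    S = np.array([[1 - 1j, 1 + 1j],   # columns: v1 = (1-i, 2), v2 = (1+i, 2)
                  [2 + 0j, 2 + 0j]])
    print(np.linalg.inv(S) @ A @ S)   # approximately diag(1-2i, 1+2i)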

Theorem. The matrix A is diagonalizable exactly when A has n linearly independent eigenvectors. Further, (1) the diagonal entries in the diagonalization are the eigenvalues, each occurring as many times on the diagonal as its multiplicity m_j; (2) the columns of the matrix S that diagonalizes A are n LI eigenvectors of A; and (3) the j-th column of S is an eigenvector for the eigenvalue that is the j-th entry on the diagonal.

Problem 4. For each eigenvalue, find the multiplicity and a basis for the eigenspace, and determine whether

  A = [  4   1  -6 ]
      [ -4   0   7 ]
      [  0   0  -3 ]

is diagonalizable.

Solution. Expanding det(A - λI) on the 3rd row, we have

  P(λ) = (-3-λ) | 4-λ    1 |
                |  -4   -λ | = -(3+λ)[-4λ + λ² + 4] = -(3+λ)(λ - 2)²,

so λ_1 = 2 is an eigenvalue with multiplicity 2, and λ_2 = -3 is a simple root (multiplicity 1). For λ_1 = 2,

  A - 2I = [  2   1  -6 ]    [  2   1   0 ]    [ 2  1  0 ]
           [ -4  -2   7 ] →  [ -4  -2   0 ] →  [ 0  0  1 ]
           [  0   0  -5 ]    [  0   0   1 ]    [ 0  0  0 ].

We already have sufficient info to determine that A is not diagonalizable: rank(A - 2I) = 2, so the dimension of the eigenspace (the nullspace!) is 3 - 2 = 1, less than the multiplicity. We still need bases of the eigenspaces; we see that v_1 = (-1, 2, 0) is a basis for λ_1 = 2.

For λ_2 = -3,

  A - (-3)I = A + 3I = [  7   1  -6 ]    [ -1    7    8 ]    [ 1  0  -1 ]
                       [ -4   3   7 ] →  [  0  -25  -25 ] →  [ 0  1   1 ]
                       [  0   0   0 ]    [  0    0    0 ]    [ 0  0   0 ]

(for the first step we use R_1 → R_1 + 2R_2, update, then R_2 → R_2 - 4R_1; for the second, R_1 → -R_1 and R_2 → -(1/25)R_2, update, then R_1 → R_1 + 7R_2). So a basis is v_2 = (1, -1, 1), and dim(E_{-3}) = 3 - 2 = 1.

Problem 5. Determine whether

  A = [  1  -3   1 ]
      [ -1  -1   1 ]
      [ -1  -3   3 ]

is diagonalizable or not, given P(λ) = (λ - 2)²(λ + 1).

Solution. Since λ_2 = -1 is a simple root it is non-defective, and we only need to check λ_1 = 2. We have

  A - 2I = [ -1  -3   1 ]    [ 1  3  -1 ]
           [ -1  -3   1 ] →  [ 0  0   0 ]
           [ -1  -3   1 ]    [ 0  0   0 ],

so dim(E_2) = 3 - 1 = 2, and A is diagonalizable. We're not asked for a basis here, but for practice: x_2 = a, x_3 = b and x_1 + 3x_2 - x_3 = 0 gives x_1 = -3a + b, so

  (x_1, x_2, x_3) = (-3a + b, a, b) = (-3a, a, 0) + (b, 0, b) = a(-3, 1, 0) + b(1, 0, 1),

and a basis is v_1 = (-3, 1, 0) and v_2 = (1, 0, 1).

Example 5 (diagonalization): We write v_1, v_2 as columns, and also v_3 = (1, 1, 1) as a column for λ = -1; then S = (v_1 v_2 v_3) gives S⁻¹AS = D, with D = diag(2, 2, -1).
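Aside (not in the original notes): a SymPy check of Problems 4 and 5 using the matrices as printed above, assuming SymPy is available. eigenvects() returns, for each eigenvalue, its algebraic multiplicity together with a basis of its eigenspace, so the defective eigenvalue λ = 2 in Problem 4 (and the non-defective one in Problem 5) shows up directly.

    # Multiplicity check for Problems 4 and 5 (illustration only).
    import sympy as sp

    A4 = sp.Matrix([[4, 1, -6], [-4, 0, 7], [0, 0, -3]])     # Problem 4
    A5 = sp.Matrix([[1, -3, 1], [-1, -1, 1], [-1, -3, 3]])   # Problem 5

    for A in (A4, A5):
        for val, alg_mult, basis in A.eigenvects():
            print(val, 'algebraic:', alg_mult, 'geometric:', len(basis))
        print('diagonalizable:', A.is_diagonalizable())      # False for A4, True for A5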

Finally, we collect the terminology and results used. If P(λ) written in factored form is

  P(λ) = (λ - λ_1)^{m_1} ··· (λ - λ_r)^{m_r},

where the λ_j are the distinct roots, we say that λ_j has (algebraic) multiplicity m_j, or that λ_j is a simple root (multiplicity 1) if m_j = 1. Let d_j = dim(E_{λ_j}) be the dimension of the eigenspace (sometimes called the geometric multiplicity); then the main facts are

1. 1 ≤ d_j ≤ m_j;
2. A is non-diagonalizable exactly when there is some eigenvalue (which must be a repeated root) that is defective, d_j < m_j;
3. A is diagonalizable exactly when every eigenvalue is non-defective, d_j = m_j, for j = 1, ..., r;
4. In particular, if P(λ) has distinct roots then A is always diagonalizable (1 ≤ d_j ≤ m_j = 1).

We recall that m_j, the algebraic multiplicity, is the number of times the j-th distinct eigenvalue is a root of the characteristic polynomial, and d_j, the geometric multiplicity, is the dimension of the λ_j-eigenspace (that is, the nullspace of A - λ_j I).
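Aside (not in the original notes): facts 1-3 translate directly into a small SymPy test, assuming SymPy is available; the helper name is_diagonalizable_over_C below is just for illustration. For each distinct eigenvalue λ_j we compute d_j = n - rank(A - λ_j I) and compare it with m_j.

    # d_j versus m_j test (illustration only).
    import sympy as sp

    def is_diagonalizable_over_C(A):
        """True exactly when every eigenvalue is non-defective (d_j = m_j)."""
        n = A.shape[0]
        for lam_j, m_j in A.eigenvals().items():       # distinct roots with algebraic multiplicity m_j
            d_j = n - (A - lam_j * sp.eye(n)).rank()   # d_j = dim E_{lam_j}
            if d_j < m_j:                              # defective eigenvalue
                return False
        return True

    # e.g. the matrix of Problem 2 has a defective eigenvalue, so this returns False:
    print(is_diagonalizable_over_C(sp.Matrix([[10, 12, 8], [0, 2, 0], [-8, -12, -6]])))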

Problem 6. Compare the eigenspaces and the defective condition for

  A = [ 4  0  0 ]            A_1 = [ 4  0  1 ]
      [ 0  2  3 ]    and           [ 0  2  3 ]
      [ 0  2  1 ]                  [ 0  2  1 ].

Solution. Both matrices have characteristic polynomial P(λ) = (λ - 4)²(λ + 1). Since λ = -1 is a simple root (not repeated), it is non-defective. We have

  A - 4I = [ 0   0   0 ]    [ 0  2  -3 ]
           [ 0  -2   3 ] →  [ 0  0   0 ]
           [ 0   2  -3 ]    [ 0  0   0 ]

and

  A_1 - 4I = [ 0   0   1 ]    [ 0  1  0 ]
             [ 0  -2   3 ] →  [ 0  0  1 ]
             [ 0   2  -3 ]    [ 0  0  0 ].

These matrices test our eigenvector-finding skills. For the 2nd, the equations say x_2 = x_3 = 0, while we must have a nontrivial solution. Do you see one? Don't think! Our reflex is: x_2, x_3 bound, x_1 = a free, so (x_1, x_2, x_3) = (a, 0, 0) = a(1, 0, 0). We also have v_1 = (1, 0, 0) as an eigenvector for the first matrix, but there's also a 2nd LI solution, v_2 = (0, 3, 2). So the roots and multiplicities are the same and only one entry is changed, but A is diagonalizable while A_1 is non-diagonalizable.
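Aside (not in the original notes): the same comparison in SymPy, assuming SymPy is available and using the two matrices as printed above. Both have the same characteristic polynomial, but the eigenspace for λ = 4 is 2-dimensional for A and only 1-dimensional for A_1.

    # Check of Problem 6 (illustration only).
    import sympy as sp

    A  = sp.Matrix([[4, 0, 0], [0, 2, 3], [0, 2, 1]])
    A1 = sp.Matrix([[4, 0, 1], [0, 2, 3], [0, 2, 1]])   # only the (1,3) entry differs

    for M in (A, A1):
        print(sp.factor((sp.symbols('lam') * sp.eye(3) - M).det()))  # (lam - 4)**2 * (lam + 1) for both
        print(len((M - 4 * sp.eye(3)).nullspace()))                  # dim E_4: 2 for A, 1 for A_1
        print(M.is_diagonalizable())                                 # True for A, False for A_1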