
Math 205, Summer I, 2016, Week 4b: Chapter 5, Sections 6, 7 and 8 (5.5 is NOT on the syllabus)

5.6 Eigenvalues and Eigenvectors
5.7 Eigenspaces, nondefective matrices
5.8 Diagonalization [*** See next slide ***]

A vector v ≠ 0 in R^n (or in C^n) is an eigenvector with eigenvalue λ of an n-by-n matrix A if A v = λ v. We rewrite the vector equation as (A - λI_n) v = 0, which is a homogeneous system with coefficient matrix (A - λI), and we want λ so that the system has a non-trivial solution. We see that the eigenvalues are the roots of the characteristic polynomial P(λ) = det(A - λI); that is, the solutions of P(λ) = 0. To find the eigenvectors, we find the distinct roots λ = λ_i and, for each i, solve (A - λ_i I) v = 0.
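As a quick aside (not part of the original notes), this two-step recipe can be carried out symbolically. The sketch below assumes the SymPy library is available; the example matrix in it is arbitrary, chosen only for illustration.

```python
# A minimal sketch (not from the notes) of the two-step recipe, assuming SymPy
# is installed. The example matrix here is arbitrary.
import sympy as sp

lam = sp.symbols('lamda')
A = sp.Matrix([[2, 1], [1, 2]])

# Step 1: eigenvalues = roots of the characteristic polynomial det(A - lambda*I).
P = sp.factor((A - lam * sp.eye(2)).det())
roots = sp.solve(P, lam)                      # [1, 3] for this example

# Step 2: for each root, a basis of the eigenspace = nullspace of (A - lambda_i*I).
for r in roots:
    basis = (A - r * sp.eye(2)).nullspace()
    print(r, [list(v) for v in basis])
```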

Problem 1. Find the eigenvalues and eigenvectors of
A = \begin{pmatrix} 3 & 1 \\ 5 & -1 \end{pmatrix}.

P(λ) = det(A - λI_2) = \begin{vmatrix} 3-λ & 1 \\ 5 & -1-λ \end{vmatrix} = (-1-λ)(3-λ) - 5 = λ^2 - 2λ - 3 - 5 = λ^2 - 2λ - 8 = (λ-4)(λ+2),
so λ_1 = 4, λ_2 = -2.

For λ_1 = 4, we reduce the coefficient matrix of the system (A - 4I)x = 0,
A - 4I = \begin{pmatrix} 3-4 & 1 \\ 5 & -1-4 \end{pmatrix} = \begin{pmatrix} -1 & 1 \\ 5 & -5 \end{pmatrix} → \begin{pmatrix} 1 & -1 \\ 0 & 0 \end{pmatrix},
so for x_2 = a we get x_1 = a, and the eigenvectors for λ_1 = 4 are the vectors (x_1, x_2) = a(1, 1) with a ≠ 0, from the space with basis (1, 1).

For λ_2 = -2, we start over with coefficient matrix
A + 2I = \begin{pmatrix} 3+2 & 1 \\ 5 & -1+2 \end{pmatrix} = \begin{pmatrix} 5 & 1 \\ 5 & 1 \end{pmatrix} → \begin{pmatrix} 5 & 1 \\ 0 & 0 \end{pmatrix}.
To find a spanning set without fractions, we anticipate that finding x_1 will use division by 5, and take x_2 = 5b. Then 5x_1 + x_2 = 0 gives 5x_1 = -5b, so x_1 = -b, and (x_1, x_2) = b(-1, 5), spanned by (-1, 5) [or, if you must, (-1/5, 1)].
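For readers who want a machine check of Problem 1, here is a minimal sketch (not part of the original notes), assuming SymPy is available:

```python
# Sketch: check Problem 1 with SymPy (assumes SymPy is installed).
import sympy as sp

A = sp.Matrix([[3, 1], [5, -1]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, basis of eigenspace).
for ev, mult, basis in A.eigenvects():
    print(ev, mult, [list(v) for v in basis])
# Expect eigenvalue 4 with eigenspace spanned by a multiple of (1, 1), and
# eigenvalue -2 with eigenspace spanned by a multiple of (-1, 5).

# Direct verification that A v = lambda v for the hand-computed vectors:
assert A * sp.Matrix([1, 1]) == 4 * sp.Matrix([1, 1])
assert A * sp.Matrix([-1, 5]) == -2 * sp.Matrix([-1, 5])
```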

Problem 2. Find the eigenvalues and eigenvectors of
A = \begin{pmatrix} 10 & 12 & 8 \\ 0 & 2 & 0 \\ -8 & -12 & -6 \end{pmatrix}.

Since A is 3-by-3, the characteristic polynomial P(λ) is cubic, and we think strategically: how should P(λ) be given? For example, do we need the coefficient of λ^2? We're looking for the roots, so we need the factors (λ - λ_i). For that objective there is only one good method of computing the determinant: expansion on the 2nd row. Look why the 2nd-row expansion is best: we get
P(λ) = (2-λ) \begin{vmatrix} 10-λ & 8 \\ -8 & -6-λ \end{vmatrix} = (2-λ)[(10-λ)(-6-λ) + 64] = (2-λ)[λ^2 - 4λ - 60 + 64] = -(λ-2)^3.
Can we agree that P(λ) = -λ^3 + 6λ^2 - 12λ + 8 [which we would have gotten from the other methods] is not as useful as having a factor [since the remaining factors only need a quadratic, not solving a cubic]? Anyway, this matrix has only one distinct eigenvalue, λ_1 = 2. To find the eigenvectors, we reduce the coefficient matrix
A - 2I = \begin{pmatrix} 10-2 & 12 & 8 \\ 0 & 2-2 & 0 \\ -8 & -12 & -6-2 \end{pmatrix} → \begin{pmatrix} 2 & 3 & 2 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}.
We see that x_2 and x_3 will be free variables, and anticipating that finding x_1 will involve division by 2, take x_2 = 2s and x_3 = t, so x_1 = -3s - t; and
(x_1, x_2, x_3) = (-3s - t, 2s, t) = s(-3, 2, 0) + t(-1, 0, 1)
has LI spanning vectors (-3, 2, 0) and (-1, 0, 1), giving eigenvectors except when s = t = 0.
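A similar sketch (again assuming SymPy; not part of the notes) confirms the repeated root and the two-dimensional eigenspace of Problem 2:

```python
# Sketch: check Problem 2 (assumes SymPy).
import sympy as sp

lam = sp.symbols('lamda')
A = sp.Matrix([[10, 12, 8], [0, 2, 0], [-8, -12, -6]])

print(sp.factor((A - lam * sp.eye(3)).det()))   # should print -(lamda - 2)**3
print((A - 2 * sp.eye(3)).nullspace())          # two basis vectors, spanning the
                                                # same plane as (-3,2,0), (-1,0,1)
# Verify the hand-computed spanning vectors directly:
for v in (sp.Matrix([-3, 2, 0]), sp.Matrix([-1, 0, 1])):
    assert A * v == 2 * v
```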

Problem 3. Find the eigenvalues and eigenvectors of A = \begin{pmatrix} 3 & -2 \\ 4 & -1 \end{pmatrix}.

P(λ) = \begin{vmatrix} 3-λ & -2 \\ 4 & -1-λ \end{vmatrix} = (3-λ)(-1-λ) + 8 = λ^2 - 2λ - 3 + 8 = λ^2 - 2λ + 5, with roots λ = 1 ± 2i. So there are no real λ for which there is a non-trivial real solution x ∈ R^2, but there is a complex solution 0 ≠ x ∈ C^2. We solve for x by reducing the coefficient matrix as usual.

An n-by-n matrix A is defined to be nondefective if A has n linearly independent eigenvectors. A related term is diagonalizable: the matrix A is diagonalizable if there is an invertible n-by-n matrix P so that P^{-1}AP = D is a diagonal matrix. We will see that A is diagonalizable if and only if A has n linearly independent eigenvectors.
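A sketch of the same computation (assuming SymPy; not part of the notes) shows the complex pair directly and confirms that over C the matrix still has two LI eigenvectors:

```python
# Sketch: Problem 3 has no real eigenvalues, but SymPy finds the complex pair
# (assumes SymPy; sp.I is the imaginary unit).
import sympy as sp

A = sp.Matrix([[3, -2], [4, -1]])
print(A.eigenvals())        # {1 - 2*I: 1, 1 + 2*I: 1}

# The two eigenvalues are distinct, so A has 2 LI (complex) eigenvectors:
for ev, mult, basis in A.eigenvects():
    v = basis[0]
    assert sp.simplify(A * v - ev * v) == sp.zeros(2, 1)
```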

Problem 4. For each eigenvalue, find the multiplicity and a basis for the eigenspace, and determine whether
A = \begin{pmatrix} 4 & 1 & 6 \\ -4 & 0 & -7 \\ 0 & 0 & -3 \end{pmatrix}
is nondefective.

Expanding det(A - λI) on the 3rd row, we have
P(λ) = (-3-λ) \begin{vmatrix} 4-λ & 1 \\ -4 & -λ \end{vmatrix} = -(3+λ)[-4λ + λ^2 + 4] = -(3+λ)(λ-2)^2,
so λ_1 = 2 is an eigenvalue with multiplicity 2, and λ_2 = -3 is a simple root (multiplicity 1).

For λ_1 = 2,
A - 2I = \begin{pmatrix} 2 & 1 & 6 \\ -4 & -2 & -7 \\ 0 & 0 & -5 \end{pmatrix} → \begin{pmatrix} 2 & 1 & 0 \\ -4 & -2 & 0 \\ 0 & 0 & 1 \end{pmatrix} → \begin{pmatrix} 2 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix}.
We already have sufficient info to determine that A is not diagonalizable: Rank(A - 2I) = 2, so the dimension of the eigenspace (the nullspace!) is 3 - 2 = 1, less than the multiplicity. We still need bases of the eigenspaces, and we see that v_1 = (-1, 2, 0) is a basis for λ_1 = 2.

For λ_2 = -3,
A - (-3)I = A + 3I = \begin{pmatrix} 7 & 1 & 6 \\ -4 & 3 & -7 \\ 0 & 0 & 0 \end{pmatrix} → \begin{pmatrix} -1 & 7 & -8 \\ 0 & -25 & 25 \\ 0 & 0 & 0 \end{pmatrix} (we use R_1 → R_1 + 2R_2; update, then use R_2 → R_2 - 4R_1) → \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{pmatrix} (R_1 → -R_1, R_2 → -(1/25)R_2; update, then R_1 → R_1 + 7R_2).
So a basis is v_2 = (-1, 1, 1), and dim(E_{-3}) = 3 - 2 = 1.
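The rank argument that settles Problem 4 can be checked as follows (a sketch assuming SymPy; not part of the notes):

```python
# Sketch: Problem 4 (assumes SymPy). lambda = 2 has algebraic multiplicity 2
# but a 1-dimensional eigenspace, so A is defective (not diagonalizable).
import sympy as sp

A = sp.Matrix([[4, 1, 6], [-4, 0, -7], [0, 0, -3]])

print(A.eigenvals())                        # {2: 2, -3: 1}
print((A - 2 * sp.eye(3)).rank())           # 2, so dim(E_2) = 3 - 2 = 1 < 2
print((A + 3 * sp.eye(3)).nullspace())      # one vector, a multiple of (-1, 1, 1)
print(A.is_diagonalizable())                # False
```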

Problem 5. Determine whether
A = \begin{pmatrix} 1 & -3 & 1 \\ -1 & -1 & 1 \\ -1 & -3 & 3 \end{pmatrix}
is diagonalizable or not, given P(λ) = -(λ-2)^2(λ+1).

Since λ_2 = -1 is a simple root, it is non-defective, and we only need to check λ_1 = 2. We have
A - 2I = \begin{pmatrix} -1 & -3 & 1 \\ -1 & -3 & 1 \\ -1 & -3 & 1 \end{pmatrix} → \begin{pmatrix} 1 & 3 & -1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix},
so dim(E_2) = 3 - 1 = 2, and A is diagonalizable.

We're not asked for a basis here, but for practice: with x_2 = a and x_3 = b, the equation x_1 + 3x_2 - x_3 = 0 gives x_1 = -3a + b, so
(x_1, x_2, x_3) = (-3a + b, a, b) = (-3a, a, 0) + (b, 0, b) = a(-3, 1, 0) + b(1, 0, 1),
and a basis is (-3, 1, 0) and (1, 0, 1).
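The corresponding check for Problem 5 (again a sketch assuming SymPy; not part of the notes):

```python
# Sketch: Problem 5 (assumes SymPy). Here dim(E_2) = 2 matches the algebraic
# multiplicity, so A is diagonalizable.
import sympy as sp

A = sp.Matrix([[1, -3, 1], [-1, -1, 1], [-1, -3, 3]])

print(A.eigenvals())                      # {2: 2, -1: 1}
print((A - 2 * sp.eye(3)).rank())         # 1, so dim(E_2) = 3 - 1 = 2
print(A.is_diagonalizable())              # True
P, D = A.diagonalize()                    # columns of P are LI eigenvectors
assert P.inv() * A * P == D
```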

We collect the terminology and results used. If P(λ), written in factored form, is
P(λ) = ±(λ - λ_1)^{m_1} ⋯ (λ - λ_r)^{m_r},
where the λ_j are the distinct roots, we say that λ_j has (algebraic) multiplicity m_j, or that λ_j is a simple root (multiplicity 1) if m_j = 1. Let d_j = dim(E_{λ_j}) be the dimension of the eigenspace (sometimes called the geometric multiplicity). Then the main facts are:

1. 1 ≤ d_j ≤ m_j;
2. A is non-diagonalizable exactly when there is some eigenvalue (which must be a repeated root) that is defective, d_j < m_j;
3. A is diagonalizable exactly when every eigenvalue is non-defective, d_j = m_j for j = 1, ..., r;
4. In particular, if P(λ) has distinct roots then A is always diagonalizable (1 ≤ d_j ≤ m_j = 1).

We recall that m_j, the algebraic multiplicity, is the number of times the j-th distinct eigenvalue is a root of the characteristic polynomial, and d_j, the geometric multiplicity, is the dimension of the eigenspace for λ_j (that is, the nullspace of A - λ_j I).
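These facts are easy to test on examples; the sketch below (assuming SymPy; not part of the notes) extracts m_j and d_j for a square matrix and applies fact 3. The helper names multiplicities and diagonalizable are mine, not from the notes.

```python
# Sketch (assumes SymPy): compare algebraic multiplicity m_j with geometric
# multiplicity d_j = dim(E_{lambda_j}) and apply fact 3 above.
import sympy as sp

def multiplicities(A):
    """Return a list of (eigenvalue, m_j, d_j) for a square SymPy Matrix A."""
    return [(ev, m, len(basis)) for ev, m, basis in A.eigenvects()]

def diagonalizable(A):
    """Fact 3: diagonalizable exactly when d_j == m_j for every eigenvalue."""
    return all(d == m for _, m, d in multiplicities(A))

# Example: the defective matrix from Problem 4 vs. the matrix from Problem 5.
A4 = sp.Matrix([[4, 1, 6], [-4, 0, -7], [0, 0, -3]])
A5 = sp.Matrix([[1, -3, 1], [-1, -1, 1], [-1, -3, 3]])
print(multiplicities(A4), diagonalizable(A4))   # d < m at lambda = 2 -> False
print(multiplicities(A5), diagonalizable(A5))   # d = m everywhere    -> True
```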

[*Starting 5.8*] First, we have a new definition: matrices A and B are similar if there is an n-by-n matrix P so that P has an inverse and B = P^{-1}AP. We also recall that a diagonal matrix D = diag(d_1, ..., d_n) is a square matrix with all entries 0 except (possibly) on the main diagonal, where the entries are d_1, ..., d_n (in that order). Finally, a matrix A is diagonalizable if there is a matrix P so that P^{-1}AP = D is a diagonal matrix.

Theorem. The n-by-n matrix A is diagonalizable exactly when A has n linearly independent eigenvectors. Further, (1) the diagonal entries in the diagonalization D are the eigenvalues of A, each occurring on the diagonal as many times as its multiplicity m_j; (2) the columns of the matrix P that diagonalizes A are n LI eigenvectors of A; and (3) the j-th column of P is an eigenvector whose eigenvalue is the j-th entry on the diagonal of D.
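To illustrate the theorem concretely (a sketch assuming SymPy; not part of the notes), take the matrix of Problem 1, put its eigenvectors in the columns of P, and put the eigenvalues, in the matching order, on the diagonal of D:

```python
# Sketch (assumes SymPy): build P from the eigenvectors of Problem 1's matrix
# and check P^{-1} A P = D, with eigenvalues in the matching column order.
import sympy as sp

A = sp.Matrix([[3, 1], [5, -1]])

# Columns of P: eigenvector (1, 1) for lambda = 4, eigenvector (-1, 5) for lambda = -2.
P = sp.Matrix([[1, -1], [1, 5]])
D = sp.diag(4, -2)

assert P.inv() * A * P == D          # the j-th column of P matches D[j, j]
```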

Finally, Problem 6. Compare the eigenspaces and the defective condition for
A = \begin{pmatrix} 4 & 0 & 0 \\ 0 & 2 & 3 \\ 0 & 2 & 1 \end{pmatrix} and A_1 = \begin{pmatrix} 4 & 0 & 1 \\ 0 & 2 & 3 \\ 0 & 2 & 1 \end{pmatrix}.

Both matrices have characteristic polynomial P(λ) = -(λ-4)^2(λ+1). Since λ = -1 is a simple root (not repeated), it is non-defective. We have
A - 4I = \begin{pmatrix} 0 & 0 & 0 \\ 0 & -2 & 3 \\ 0 & 2 & -3 \end{pmatrix} → \begin{pmatrix} 0 & 2 & -3 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix} and A_1 - 4I = \begin{pmatrix} 0 & 0 & 1 \\ 0 & -2 & 3 \\ 0 & 2 & -3 \end{pmatrix} → \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix}.
These matrices test our eigenvector-finding skills. For the second, the equations say x_2 = x_3 = 0, while we must have a nontrivial solution. Do you see one? Don't think! Our reflex: x_2, x_3 are bound and x_1 = a is free, so (x_1, x_2, x_3) = (a, 0, 0) = a(1, 0, 0). We also have v_1 = (1, 0, 0) as an eigenvector for the first matrix, but there is also a 2nd LI solution, v_2 = (0, 3, 2). So the roots and multiplicities are the same and only one entry is changed, but A is diagonalizable, while A_1 is non-diagonalizable.
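A final sketch (assuming SymPy; not part of the notes) confirms the comparison: same characteristic polynomial, but different eigenspace dimensions at λ = 4.

```python
# Sketch (assumes SymPy): Problem 6. Same roots and multiplicities, but only
# the first matrix has a 2-dimensional eigenspace for lambda = 4.
import sympy as sp

lam = sp.symbols('lamda')
A  = sp.Matrix([[4, 0, 0], [0, 2, 3], [0, 2, 1]])
A1 = sp.Matrix([[4, 0, 1], [0, 2, 3], [0, 2, 1]])

for M in (A, A1):
    P = sp.factor((M - lam * sp.eye(3)).det())
    d4 = len((M - 4 * sp.eye(3)).nullspace())   # geometric multiplicity at 4
    print(P, d4, M.is_diagonalizable())
# A : -(lamda - 4)**2*(lamda + 1), d4 = 2, diagonalizable
# A1: -(lamda - 4)**2*(lamda + 1), d4 = 1, not diagonalizable
```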