Announcements Wednesday, November 01

Announcements Wednesday, November 01. WeBWorK 3.1, 3.2 are due today at 11:59pm. The quiz on Friday covers 3.1, 3.2. My office is Skiles 244. Rabinoffice hours are Monday, 1–3pm and Tuesday, 9–11am.

Section 5.2 The Characteristic Equation

The Invertible Matrix Theorem: Addenda

We have a couple of new ways of saying "A is invertible" now.

The Invertible Matrix Theorem. Let A be a square n × n matrix, and let T : R^n → R^n be the linear transformation T(x) = Ax. The following statements are equivalent.
1. A is invertible.
2. T is invertible.
3. A is row equivalent to I_n.
4. A has n pivots.
5. Ax = 0 has only the trivial solution.
6. The columns of A are linearly independent.
7. T is one-to-one.
8. Ax = b is consistent for all b in R^n.
9. The columns of A span R^n.
10. T is onto.
11. A has a left inverse (there exists B such that BA = I_n).
12. A has a right inverse (there exists B such that AB = I_n).
13. A^T is invertible.
14. The columns of A form a basis for R^n.
15. Col A = R^n.
16. dim Col A = n.
17. rank A = n.
18. Nul A = {0}.
19. dim Nul A = 0.
20. The determinant of A is not equal to zero.
21. The number 0 is not an eigenvalue of A.

The Characteristic Polynomial

Let A be a square matrix. Then λ is an eigenvalue of A
⟺ Ax = λx has a nontrivial solution
⟺ (A − λI)x = 0 has a nontrivial solution
⟺ A − λI is not invertible
⟺ det(A − λI) = 0.

This gives us a way to compute the eigenvalues of A.

Definition. Let A be a square matrix. The characteristic polynomial of A is f(λ) = det(A − λI). The characteristic equation of A is the equation f(λ) = det(A − λI) = 0.

Important: The eigenvalues of A are the roots of the characteristic polynomial f(λ) = det(A − λI).

The Characteristic Polynomial: Example

Question: What are the eigenvalues of A = [ 5 2 ; 2 1 ]?

Answer: First we find the characteristic polynomial:

f(λ) = det(A − λI) = det( [ 5 2 ; 2 1 ] − [ λ 0 ; 0 λ ] ) = det [ 5−λ 2 ; 2 1−λ ]
     = (5 − λ)(1 − λ) − 2·2 = λ² − 6λ + 1.

The eigenvalues are the roots of the characteristic polynomial, which we can find using the quadratic formula:

λ = ( 6 ± √(36 − 4) ) / 2 = 3 ± 2√2.
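As a quick sanity check (added here, not part of the original slides), here is a short NumPy sketch that computes the same eigenvalues numerically; the variable names are my own.

import numpy as np

A = np.array([[5.0, 2.0],
              [2.0, 1.0]])

print(np.sort(np.linalg.eigvals(A)))        # approx [0.1716  5.8284]
print(3 - 2*np.sqrt(2), 3 + 2*np.sqrt(2))   # the same values, 3 ± 2√2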

The Characteristic Polynomial: Example

Question: What is the characteristic polynomial of A = [ a b ; c d ]?

Answer:

f(λ) = det(A − λI) = det [ a−λ b ; c d−λ ] = (a − λ)(d − λ) − bc = λ² − (a + d)λ + (ad − bc).

What do you notice about f(λ)? The constant term is det(A), which is zero if and only if λ = 0 is a root. The coefficient of the linear term, −(a + d), is the negative of the sum of the diagonal entries of A.

Definition. The trace of a square matrix A is Tr(A) = the sum of the diagonal entries of A.

Shortcut: The characteristic polynomial of a 2 × 2 matrix A is f(λ) = λ² − Tr(A)·λ + det(A).
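A small sketch (added as an illustration, not from the slides) comparing the 2 × 2 shortcut with a direct eigenvalue computation in NumPy:

import numpy as np

A = np.array([[5.0, 2.0],
              [2.0, 1.0]])

tr, det = np.trace(A), np.linalg.det(A)
# shortcut: f(λ) = λ² − Tr(A)·λ + det(A); its roots are the eigenvalues
print(np.sort(np.roots([1.0, -tr, det])))   # approx [0.1716  5.8284]
print(np.sort(np.linalg.eigvals(A)))        # same values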

The Characteristic Polynomial: Example

Question: What are the eigenvalues of the rabbit population matrix

A = [ 0 6 8 ; 1/2 0 0 ; 0 1/2 0 ]?

Answer: First we find the characteristic polynomial, expanding along the third column:

f(λ) = det(A − λI) = det [ −λ 6 8 ; 1/2 −λ 0 ; 0 1/2 −λ ]
     = 8( 1/4 − 0·(−λ) ) − λ( (−λ)(−λ) − 6·(1/2) )
     = −λ³ + 3λ + 2.

We know from before that one eigenvalue is λ = 2: indeed, f(2) = −8 + 6 + 2 = 0. Doing polynomial long division, we get

(−λ³ + 3λ + 2) / (λ − 2) = −λ² − 2λ − 1 = −(λ + 1)².

Hence λ = −1 is also an eigenvalue.
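A quick numerical check in NumPy (added here, not from the lecture): the eigenvalues of the rabbit matrix should come out to 2 and −1.

import numpy as np

A = np.array([[0.0, 6.0, 8.0],
              [0.5, 0.0, 0.0],
              [0.0, 0.5, 0.0]])

# roots of the characteristic polynomial −λ³ + 3λ + 2 (coefficients listed highest degree first)
print(np.roots([-1.0, 0.0, 3.0, 2.0]))   # approximately [ 2., -1., -1.]
print(np.linalg.eigvals(A))              # the same eigenvalues, 2 and −1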

Algebraic Multiplicity

Definition. The (algebraic) multiplicity of an eigenvalue λ is its multiplicity as a root of the characteristic polynomial.

This is not a very interesting notion yet. It will become interesting when we also define geometric multiplicity later.

Example: For the rabbit population matrix, f(λ) = −(λ − 2)(λ + 1)², so the algebraic multiplicity of the eigenvalue 2 is 1, and the algebraic multiplicity of the eigenvalue −1 is 2.

Example: For the matrix [ 5 2 ; 2 1 ], f(λ) = (λ − (3 − 2√2))(λ − (3 + 2√2)), so the algebraic multiplicity of 3 + 2√2 is 1, and the algebraic multiplicity of 3 − 2√2 is 1.
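If you want to see algebraic multiplicities computed symbolically, SymPy reports them directly. This snippet is an added illustration (not from the slides), applied to the rabbit matrix above.

from sympy import Matrix, Rational, symbols, factor

A = Matrix([[0, 6, 8],
            [Rational(1, 2), 0, 0],
            [0, Rational(1, 2), 0]])

lam = symbols('lambda')
# SymPy's charpoly computes det(λI − A), so it may differ from det(A − λI) by a sign
print(factor(A.charpoly(lam).as_expr()))   # (lambda - 2)*(lambda + 1)**2
print(A.eigenvals())                       # {2: 1, -1: 2}  (eigenvalue: algebraic multiplicity)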

The Characteristic Polynomial: Poll

Fact: If A is an n × n matrix, the characteristic polynomial f(λ) = det(A − λI) turns out to be a polynomial of degree n, and its roots are the eigenvalues of A:

f(λ) = (−1)^n λ^n + a_{n−1} λ^{n−1} + a_{n−2} λ^{n−2} + ··· + a_1 λ + a_0.

Poll: If you count the eigenvalues of A, with their algebraic multiplicities, you will get:
A. Always n.
B. Always at most n, but sometimes less.
C. Always at least n, but sometimes more.
D. None of the above.

The answer depends on whether you allow complex eigenvalues. If you only allow real eigenvalues, the answer is B. Otherwise it is A, because any degree-n polynomial has exactly n complex roots, counted with multiplicity. Stay tuned.
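For instance (an added example, not from the slides), the 90° rotation matrix has no real eigenvalues, but over the complex numbers it has exactly n = 2, counted with multiplicity; a NumPy check:

import numpy as np

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # rotation by 90°; characteristic polynomial λ² + 1

print(np.roots([1.0, 0.0, 1.0]))   # [0.+1.j  0.-1.j]  (no real roots)
print(np.linalg.eigvals(R))        # two complex eigenvalues, ±i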

The B-basis: Review

Recall: If B = {v_1, v_2, ..., v_m} is a basis for a subspace V and x is in V, then the B-coordinates of x are the (unique) coefficients c_1, c_2, ..., c_m such that

x = c_1 v_1 + c_2 v_2 + ··· + c_m v_m.

In this case, the B-coordinate vector of x is [x]_B = (c_1, c_2, ..., c_m), written as a column vector.

Example: The vectors v_1 = (1, 1) and v_2 = (1, −1) form a basis for R² because they are not collinear.

Coordinate Systems on R^n

Recall: A set of n vectors {v_1, v_2, ..., v_n} forms a basis for R^n if and only if the matrix C with columns v_1, v_2, ..., v_n is invertible.

If x = c_1 v_1 + c_2 v_2 + ··· + c_n v_n, then [x]_B = (c_1, c_2, ..., c_n), and

x = c_1 v_1 + c_2 v_2 + ··· + c_n v_n = C [x]_B.

Since x = C [x]_B, we have [x]_B = C⁻¹ x.

Translation: Let B be the basis of columns of C. Multiplying by C changes from the B-coordinates to the usual coordinates, and multiplying by C⁻¹ changes from the usual coordinates to the B-coordinates:

[x]_B = C⁻¹ x    and    x = C [x]_B.
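A small NumPy sketch of this change of coordinates (added here as an illustration), using the basis v_1 = (1, 1), v_2 = (1, −1) from the previous slide:

import numpy as np

C = np.array([[1.0,  1.0],
              [1.0, -1.0]])   # columns are the basis vectors v1, v2

x = np.array([3.0, 1.0])
x_B = np.linalg.solve(C, x)   # [x]_B = C⁻¹x, computed by solving C[x]_B = x
print(x_B)                    # [2. 1.]  so x = 2*v1 + 1*v2
print(C @ x_B)                # back to the usual coordinates: [3. 1.]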

Similarity

Definition. Two n × n matrices A and B are similar if there is an invertible n × n matrix C such that A = CBC⁻¹.

What does this mean? It gives you a different way of thinking about multiplication by A. Let B be the basis of columns of C.

[diagram: in B-coordinates, [x]_B ↦ B[x]_B; in the usual coordinates, x ↦ Ax; multiply by C⁻¹ to pass from usual coordinates to B-coordinates, and by C to pass back]

To compute Ax, you:
1. multiply x by C⁻¹ to change to the B-coordinates: [x]_B = C⁻¹ x;
2. multiply this by B: B[x]_B = BC⁻¹ x;
3. multiply this by C to change back to the usual coordinates: Ax = CBC⁻¹ x = C · B[x]_B.

In other words: if A = CBC⁻¹, then A and B do the same thing, but B operates on the B-coordinates, where B is the basis of columns of C.
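A short NumPy sketch of this three-step recipe (added here, not part of the slides), using the 2 × 2 example on the next slide, with B = [ 2 0 ; 0 −1 ] and C = [ 1 1 ; 1 −1 ]:

import numpy as np

B = np.array([[2.0,  0.0],
              [0.0, -1.0]])
C = np.array([[1.0,  1.0],
              [1.0, -1.0]])
A = C @ B @ np.linalg.inv(C)   # [[0.5, 1.5], [1.5, 0.5]]

x = np.array([2.0, 0.0])
x_B  = np.linalg.solve(C, x)   # step 1: change to B-coordinates
Bx_B = B @ x_B                 # step 2: apply B in B-coordinates
Ax   = C @ Bx_B                # step 3: change back to the usual coordinates
print(Ax, A @ x)               # both give [1. 3.]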

Similarity: Example

A = [ 1/2 3/2 ; 3/2 1/2 ],  B = [ 2 0 ; 0 −1 ],  C = [ 1 1 ; 1 −1 ],  and A = CBC⁻¹.

What does B do geometrically? It scales the x-direction by 2 and the y-direction by −1.

To compute Ax, first change to the B-coordinates, then multiply by B, then change back to the usual coordinates, where B = { (1, 1), (1, −1) } = { v_1, v_2 } (the columns of C).

[diagram: in B-coordinates, [x]_B ↦ B[x]_B, which scales the first coordinate by 2 and the second by −1; in the usual coordinates, x ↦ Ax; multiply by C⁻¹ and by C to pass between the two pictures. The 2-eigenspace is the line through v_1, and the (−1)-eigenspace is the line through v_2.]
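To see the eigenvalue/eigenvector relationship numerically (an added check, not from the slides):

import numpy as np

A = np.array([[0.5, 1.5],
              [1.5, 0.5]])
v1 = np.array([1.0,  1.0])   # first column of C
v2 = np.array([1.0, -1.0])   # second column of C

print(A @ v1)   # [2. 2.]  =  2·v1, so v1 spans the 2-eigenspace of A
print(A @ v2)   # [-1. 1.] = −1·v2, so v2 spans the (−1)-eigenspace of A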

Similarity: Example

What does A do geometrically? B scales the e_1-direction by 2 and the e_2-direction by −1. A scales the v_1-direction by 2 and the v_2-direction by −1, where v_1, v_2 are the columns of C.

[figure: e_1, e_2 and their images Be_1, Be_2 under B; v_1, v_2 and their images Av_1, Av_2 under A]

Since B is simpler than A, this makes it easier to understand A. Note the relationship between the eigenvalues/eigenvectors of A and B.

Similarity: Example (3 × 3)

A = [ 3 5 3 ; 2 4 3 ; 3 5 2 ],  B = [ 2 0 0 ; 0 1 0 ; 0 0 1 ],  C = [ 1 1 0 ; 1 1 1 ; 1 0 1 ],  and A = CBC⁻¹.

What do A and B do geometrically? B scales the e_1-direction by 2, the e_2-direction by 1, and fixes e_3. A scales the v_1-direction by 2, the v_2-direction by 1, and fixes v_3. Here v_1, v_2, v_3 are the columns of C.

Similar Matrices Have the Same Characteristic Polynomial

Fact: If A and B are similar, then they have the same characteristic polynomial.

Why? Suppose A = CBC⁻¹. Then

A − λI = CBC⁻¹ − λI = CBC⁻¹ − C(λI)C⁻¹ = C(B − λI)C⁻¹.

Therefore,

det(A − λI) = det( C(B − λI)C⁻¹ ) = det(C) · det(B − λI) · det(C⁻¹) = det(B − λI),

because det(C⁻¹) = det(C)⁻¹.

Consequence: similar matrices have the same eigenvalues! (But different eigenvectors, in general.)
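A quick numerical illustration of this fact (added here as a sketch), reusing B and C from the 2 × 2 similarity example:

import numpy as np

B = np.array([[2.0,  0.0],
              [0.0, -1.0]])
C = np.array([[1.0,  1.0],
              [1.0, -1.0]])
A = C @ B @ np.linalg.inv(C)

# np.poly(M) returns the coefficients of det(λI − M), highest degree first
print(np.poly(A))   # approx [ 1. -1. -2.]  i.e. λ² − λ − 2
print(np.poly(B))   # the same polynomial, so the same eigenvalues: 2 and −1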

Similarity Caveats

Warning:

1. Matrices with the same eigenvalues need not be similar. For instance,

[ 2 1 ; 0 2 ]  and  [ 2 0 ; 0 2 ]

both have only the eigenvalue 2, but they are not similar.

2. Similarity has nothing to do with row equivalence. For instance,

[ 2 1 ; 0 2 ]  and  [ 1 0 ; 0 1 ]

are row equivalent, but they have different eigenvalues.

Summary

We did two different things today. First we talked about characteristic polynomials:

- We learned to find the eigenvalues of a matrix by computing the roots of the characteristic polynomial f(λ) = det(A − λI).
- For a 2 × 2 matrix A, the characteristic polynomial is just f(λ) = λ² − Tr(A)·λ + det(A).
- The algebraic multiplicity of an eigenvalue is its multiplicity as a root of the characteristic polynomial.

Then we talked about similar matrices:

- Two square matrices A, B of the same size are similar if there is an invertible matrix C such that A = CBC⁻¹.
- Geometrically, similar matrices A and B do the same thing, except B operates on the coordinate system B defined by the columns of C: B[x]_B = [Ax]_B.
- This is useful when we can find a similar matrix B which is simpler than A (e.g., a diagonal matrix).