Announcements Wednesday, November 01

Similar documents
Announcements Wednesday, November 01

Announcements Monday, October 29

Dimension. Eigenvalue and eigenvector

Announcements Monday, November 26

Announcements Monday, November 26

Announcements Wednesday, November 7

Announcements Wednesday, November 7

Study Guide for Linear Algebra Exam 2

Chapter 5. Eigenvalues and Eigenvectors

Announcements Monday, November 06

Announcements Wednesday, October 04

DIAGONALIZATION. In order to see the implications of this definition, let us consider the following example Example 1. Consider the matrix

Announcements Monday, November 13

MTH 464: Computational Linear Algebra

MATH 20F: LINEAR ALGEBRA LECTURE B00 (T. KEMP)

Announcements: Nov 10

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

Systems of Algebraic Equations and Systems of Differential Equations

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

Announcements Monday, October 02

Summer Session Practice Final Exam

(a) only (ii) and (iv) (b) only (ii) and (iii) (c) only (i) and (ii) (d) only (iv) (e) only (i) and (iii)

1. Linear systems of equations. Chapters 7-8: Linear Algebra. Solution(s) of a linear system of equations (continued)

MAT 1302B Mathematical Methods II

Announcements Monday, September 18

Announcements Monday, November 20

Remark By definition, an eigenvector must be a nonzero vector, but eigenvalue could be zero.

LINEAR ALGEBRA REVIEW

4. Linear transformations as a vector space 17

Assignment 1 Math 5341 Linear Algebra Review. Give complete answers to each of the following questions. Show all of your work.

Warm-up. True or false? Baby proof. 2. The system of normal equations for A x = y has solutions iff A x = y has solutions

MATH 2331 Linear Algebra. Section 2.1 Matrix Operations. Definition: A : m n, B : n p. Example: Compute AB, if possible.

Announcements Monday, November 13

Chapter 2 Notes, Linear Algebra 5e Lay

Glossary of Linear Algebra Terms. Prepared by Vince Zaccone For Campus Learning Assistance Services at UCSB

Math 54 HW 4 solutions

235 Final exam review questions

Review problems for MA 54, Fall 2004.

Properties of Linear Transformations from R n to R m

Computationally, diagonal matrices are the easiest to work with. With this idea in mind, we introduce similarity:

Announcements Wednesday, October 25

Remark 1 By definition, an eigenvector must be a nonzero vector, but eigenvalue could be zero.

Eigenvalues, Eigenvectors, and Diagonalization

Conceptual Questions for Review

Math 4A Notes. Written by Victoria Kala Last updated June 11, 2017

Question: Given an n x n matrix A, how do we find its eigenvalues? Idea: Suppose c is an eigenvalue of A, then what is the determinant of A-cI?

Final Review Written by Victoria Kala SH 6432u Office Hours R 12:30 1:30pm Last Updated 11/30/2015

2. Every linear system with the same number of equations as unknowns has a unique solution.

Linear Algebra Final Exam Study Guide Solutions Fall 2012

Review Notes for Linear Algebra True or False Last Updated: February 22, 2010

Eigenvalues and Eigenvectors

Math 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008

Math 18, Linear Algebra, Lecture C00, Spring 2017 Review and Practice Problems for Final Exam

First of all, the notion of linearity does not depend on which coordinates are used. Recall that a map T : R n R m is linear if

Chapter 5 Eigenvalues and Eigenvectors

Announcements Wednesday, November 15

Mon Mar matrix eigenspaces. Announcements: Warm-up Exercise:

Math 3191 Applied Linear Algebra

Announcements Wednesday, September 27

Calculating determinants for larger matrices

Math 205, Summer I, Week 4b:

Announcements Wednesday, September 20

(a) II and III (b) I (c) I and III (d) I and II and III (e) None are true.

Announcements Monday, September 25

MATH 2210Q MIDTERM EXAM I PRACTICE PROBLEMS

Announcements Wednesday, October 10

Diagonalization of Matrix

Announcements Monday, September 17

MA 265 FINAL EXAM Fall 2012

EK102 Linear Algebra PRACTICE PROBLEMS for Final Exam Spring 2016

1. In this problem, if the statement is always true, circle T; otherwise, circle F.

5.3.5 The eigenvalues are 3, 2, 3 (i.e., the diagonal entries of D) with corresponding eigenvalues. Null(A 3I) = Null( ), 0 0

ft-uiowa-math2550 Assignment OptionalFinalExamReviewMultChoiceMEDIUMlengthForm due 12/31/2014 at 10:36pm CST

Linear Algebra: Sample Questions for Exam 2

(b) If a multiple of one row of A is added to another row to produce B then det(b) =det(a).

Practice problems for Exam 3 A =

Final Review Sheet. B = (1, 1 + 3x, 1 + x 2 ) then 2 + 3x + 6x 2

Linear Algebra- Final Exam Review

Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 2015

MATH 310, REVIEW SHEET 2

LINEAR ALGEBRA SUMMARY SHEET.

Eigenspaces and Diagonalizable Transformations

MATH 240 Spring, Chapter 1: Linear Equations and Matrices

TMA Calculus 3. Lecture 21, April 3. Toke Meier Carlsen Norwegian University of Science and Technology Spring 2013

Solving a system by back-substitution, checking consistency of a system (no rows of the form

MATH 221, Spring Homework 10 Solutions

Econ Slides from Lecture 7

Recall : Eigenvalues and Eigenvectors

Eigenvalues for Triangular Matrices. ENGI 7825: Linear Algebra Review Finding Eigenvalues and Diagonalization

Math 21b. Review for Final Exam

Section 4.5. Matrix Inverses

PRACTICE FINAL EXAM. why. If they are dependent, exhibit a linear dependence relation among them.

From Lay, 5.4. If we always treat a matrix as defining a linear transformation, what role does diagonalisation play?

ft-uiowa-math2550 Assignment NOTRequiredJustHWformatOfQuizReviewForExam3part2 due 12/31/2014 at 07:10pm CST

MATH 304 Linear Algebra Lecture 33: Bases of eigenvectors. Diagonalization.

MATH 423 Linear Algebra II Lecture 20: Geometry of linear transformations. Eigenvalues and eigenvectors. Characteristic polynomial.

Eigenvalues and Eigenvectors

Math 323 Exam 2 Sample Problems Solution Guide October 31, 2013

Announcements Monday, November 19

1 Last time: inverses

Transcription:

Announcements Wednesday, November 01

WeBWorK 3.1, 3.2 are due today at 11:59pm.

The quiz on Friday covers 3.1, 3.2.

My office is Skiles 244. Rabinoff office hours are Monday, 1-3pm and Tuesday, 9-11am.

Section 5.2 The Characteristic Equation

The Invertible Matrix Theorem: Addenda

We have a couple of new ways of saying "A is invertible" now:

The Invertible Matrix Theorem. Let A be a square n × n matrix, and let T : R^n → R^n be the linear transformation T(x) = Ax. The following statements are equivalent.

1. A is invertible.
2. T is invertible.
3. A is row equivalent to I_n.
4. A has n pivots.
5. Ax = 0 has only the trivial solution.
6. The columns of A are linearly independent.
7. T is one-to-one.
8. Ax = b is consistent for all b in R^n.
9. The columns of A span R^n.
10. T is onto.
11. A has a left inverse (there exists B such that BA = I_n).
12. A has a right inverse (there exists B such that AB = I_n).
13. A^T is invertible.
14. The columns of A form a basis for R^n.
15. Col A = R^n.
16. dim Col A = n.
17. rank A = n.
18. Nul A = {0}.
19. dim Nul A = 0.
20. The determinant of A is not equal to zero.
21. The number 0 is not an eigenvalue of A.
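Several of these equivalences can be sanity-checked numerically. A minimal sketch using NumPy (the matrix and the library calls here are illustrative, not part of the slides):

```python
import numpy as np

# A hypothetical 2x2 matrix; it is invertible since its determinant is 1.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

n = A.shape[0]
det_nonzero = not np.isclose(np.linalg.det(A), 0.0)               # det(A) != 0
full_rank = np.linalg.matrix_rank(A) == n                         # rank A = n
zero_not_eig = not np.any(np.isclose(np.linalg.eigvals(A), 0.0))  # 0 is not an eigenvalue
trivial_null = np.allclose(np.linalg.solve(A, np.zeros(n)), 0.0)  # Ax = 0 only trivially

# The theorem says these conditions stand or fall together.
print(det_nonzero, full_rank, zero_not_eig, trivial_null)
```

For an invertible matrix all four flags agree; replacing A by a singular matrix would flip them all at once.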

The Characteristic Polynomial

Let A be a square matrix. Then:

λ is an eigenvalue of A
  ⟺ Ax = λx has a nontrivial solution
  ⟺ (A − λI)x = 0 has a nontrivial solution
  ⟺ A − λI is not invertible
  ⟺ det(A − λI) = 0.

This gives us a way to compute the eigenvalues of A.

Definition: Let A be a square matrix. The characteristic polynomial of A is f(λ) = det(A − λI). The characteristic equation of A is the equation f(λ) = det(A − λI) = 0.

Important: The eigenvalues of A are the roots of the characteristic polynomial f(λ) = det(A − λI).
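As a numerical illustration (not from the slides), NumPy can compute the characteristic polynomial and confirm that its roots are the eigenvalues. Note that np.poly returns the coefficients of det(λI − A), which has the same roots as det(A − λI):

```python
import numpy as np

A = np.array([[5.0, 2.0],
              [2.0, 1.0]])

# Coefficients of the (monic) polynomial det(lambda*I - A); its roots
# coincide with those of f(lambda) = det(A - lambda*I).
coeffs = np.poly(A)                   # [1, -6, 1], i.e. lambda^2 - 6*lambda + 1
roots = np.sort(np.roots(coeffs))     # roots of the characteristic polynomial
eigs = np.sort(np.linalg.eigvals(A))  # eigenvalues computed directly

print(np.allclose(roots, eigs))       # the roots are exactly the eigenvalues
```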

The Characteristic Polynomial: Example

Question: What are the eigenvalues of

    A = ( 5  2
          2  1 ) ?

The Characteristic Polynomial: Example

Question: What is the characteristic polynomial of

    A = ( a  b
          c  d ) ?

Expanding det(A − λI) gives

    f(λ) = (a − λ)(d − λ) − bc = λ² − (a + d)λ + (ad − bc).

What do you notice about f(λ)?

The constant term is det(A), which is zero if and only if λ = 0 is a root.

The coefficient of λ is −(a + d), the negative of the sum of the diagonal entries of A.

Definition: The trace of a square matrix A is Tr(A) = sum of the diagonal entries of A.

Shortcut: The characteristic polynomial of a 2 × 2 matrix A is

    f(λ) = λ² − Tr(A)λ + det(A).
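The 2 × 2 shortcut is easy to check numerically; a sketch (assuming NumPy, not part of the slides), applied to the matrix from the previous example:

```python
import numpy as np

A = np.array([[5.0, 2.0],
              [2.0, 1.0]])

tr = np.trace(A)         # a + d = 6
det = np.linalg.det(A)   # ad - bc = 1

# Roots of f(lambda) = lambda^2 - Tr(A)*lambda + det(A), by the quadratic formula.
disc = np.sqrt(tr**2 - 4.0 * det)
shortcut_roots = np.sort([(tr - disc) / 2.0, (tr + disc) / 2.0])

print(shortcut_roots)    # approximately 3 - 2*sqrt(2) and 3 + 2*sqrt(2)
```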

The Characteristic Polynomial: Example

Question: What are the eigenvalues of the rabbit population matrix

    A = (  0    6    8
          1/2   0    0
           0   1/2   0 ) ?
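A quick numerical check (assuming the rabbit matrix entries below; its characteristic polynomial factors as −(λ − 2)(λ + 1)²):

```python
import numpy as np

# The rabbit population matrix, as assumed here.
A = np.array([[0.0, 6.0, 8.0],
              [0.5, 0.0, 0.0],
              [0.0, 0.5, 0.0]])

eigs = np.linalg.eigvals(A)
print(np.sort(eigs.real))   # approximately -1, -1, 2
```

The eigenvalue −1 is a double root of the characteristic polynomial, which is exactly the notion of algebraic multiplicity defined on the next slide.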

Algebraic Multiplicity

Definition: The (algebraic) multiplicity of an eigenvalue λ is its multiplicity as a root of the characteristic polynomial.

This is not a very interesting notion yet. It will become interesting when we also define geometric multiplicity later.

Example: For the rabbit population matrix, f(λ) = −(λ − 2)(λ + 1)², so the algebraic multiplicity of the eigenvalue 2 is 1, and the algebraic multiplicity of the eigenvalue −1 is 2.

Example: For the matrix

    A = ( 5  2
          2  1 ),

f(λ) = (λ − (3 − 2√2))(λ − (3 + 2√2)), so the algebraic multiplicity of 3 + 2√2 is 1, and the algebraic multiplicity of 3 − 2√2 is 1.

The Characteristic Polynomial: Poll

Fact: If A is an n × n matrix, the characteristic polynomial f(λ) = det(A − λI) turns out to be a polynomial of degree n, and its roots are the eigenvalues of A:

    f(λ) = (−1)^n λ^n + a_{n−1} λ^{n−1} + a_{n−2} λ^{n−2} + ··· + a_1 λ + a_0.
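The degree-n claim and the leading coefficient (−1)^n can be illustrated on a random matrix (a sketch assuming NumPy; the seed and size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))

# np.poly(A) is the monic polynomial det(lambda*I - A); multiplying by
# (-1)^n gives f(lambda) = det(A - lambda*I).
coeffs = (-1.0)**n * np.poly(A)

print(len(coeffs) - 1)   # degree is n = 4
print(coeffs[0])         # leading coefficient is (-1)^n = 1.0
```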

The B-basis: Review

Recall: If B = {v_1, v_2, ..., v_m} is a basis for a subspace V and x is in V, then the B-coordinates of x are the (unique) coefficients c_1, c_2, ..., c_m such that

    x = c_1 v_1 + c_2 v_2 + ··· + c_m v_m.

In this case, the B-coordinate vector of x is

    [x]_B = ( c_1
              c_2
              ...
              c_m ).

Example: The vectors v_1 = (1, 1) and v_2 = (1, −1) form a basis for R² because they are not collinear. [interactive]

Coordinate Systems on R^n

Recall: A set of n vectors {v_1, v_2, ..., v_n} forms a basis for R^n if and only if the matrix C with columns v_1, v_2, ..., v_n is invertible.

Translation: Let B be the basis of columns of C. Multiplying by C changes from the B-coordinates to the usual coordinates, and multiplying by C⁻¹ changes from the usual coordinates to the B-coordinates:

    [x]_B = C⁻¹x        x = C[x]_B.
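A short numerical sketch of this change of coordinates (assuming the basis v_1 = (1, 1), v_2 = (1, −1) from the earlier example; the vector x is arbitrary):

```python
import numpy as np

# C has the basis vectors v1, v2 as its columns.
C = np.array([[1.0, 1.0],
              [1.0, -1.0]])
x = np.array([3.0, 1.0])

x_B = np.linalg.solve(C, x)   # [x]_B = C^{-1} x, without forming C^{-1}
back = C @ x_B                # x = C [x]_B

print(x_B)    # [2. 1.], i.e. x = 2*v1 + 1*v2
print(back)   # recovers the original x
```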

Similarity

Definition: Two n × n matrices A and B are similar if there is an invertible n × n matrix C such that A = CBC⁻¹.

What does this mean? It gives you a different way of thinking about multiplication by A. Let B be the basis of columns of C. To compute Ax, you:

1. multiply x by C⁻¹ to change to the B-coordinates: [x]_B = C⁻¹x;
2. multiply this by B: B[x]_B = BC⁻¹x;
3. multiply this by C to change back to the usual coordinates: Ax = CBC⁻¹x = CB[x]_B.

[Diagram: x → [x]_B (multiply by C⁻¹) → B[x]_B (multiply by B) → Ax (multiply by C).]
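The three-step recipe can be checked directly; a sketch assuming B = diag(2, −1) and C with columns (1, 1), (1, −1) (the matrices of the upcoming example), plus an arbitrary test vector x:

```python
import numpy as np

B = np.array([[2.0, 0.0],
              [0.0, -1.0]])
C = np.array([[1.0, 1.0],
              [1.0, -1.0]])
A = C @ B @ np.linalg.inv(C)     # A = C B C^{-1}, so A and B are similar

x = np.array([1.0, 2.0])

x_B = np.linalg.solve(C, x)      # step 1: [x]_B = C^{-1} x
Bx_B = B @ x_B                   # step 2: multiply by B
Ax = C @ Bx_B                    # step 3: back to the usual coordinates

print(np.allclose(Ax, A @ x))    # True: the three steps compute Ax
```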

Similarity

Definition: Two n × n matrices A and B are similar if there is an invertible n × n matrix C such that A = CBC⁻¹.

If A = CBC⁻¹, then A and B do the same thing, but B operates on the B-coordinates, where B is the basis of columns of C.

Similarity: Example

    A = ( 1/2  3/2        B = ( 2   0        C = ( 1   1
          3/2  1/2 )            0  −1 )            1  −1 )

and A = CBC⁻¹.

What does B do geometrically? It scales the x-direction by 2 and the y-direction by −1.

To compute Ax, first change to the B-coordinates, then multiply by B, then change back to the usual coordinates, where

    B = { (1, 1), (1, −1) } = { v_1, v_2 }

(the columns of C).
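Verifying the claimed factorization numerically, taking A = (1/2 3/2; 3/2 1/2), B = diag(2, −1), and C = (1 1; 1 −1) as in the example (an illustration, assuming NumPy):

```python
import numpy as np

A = np.array([[0.5, 1.5],
              [1.5, 0.5]])
B = np.array([[2.0, 0.0],
              [0.0, -1.0]])
C = np.array([[1.0, 1.0],
              [1.0, -1.0]])

# B is diagonal: it scales the x-direction by 2 and the y-direction by -1.
print(np.allclose(A, C @ B @ np.linalg.inv(C)))   # True: A = C B C^{-1}
```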

[Interactive figures: the same example, with the 2-eigenspace (the v_1-direction) and the (−1)-eigenspace (the v_2-direction) highlighted in both the B-coordinates and the usual coordinates.]

Similarity: Example

What does A do geometrically? B scales the e_1-direction by 2 and the e_2-direction by −1. A scales the v_1-direction by 2 and the v_2-direction by −1, where v_1, v_2 are the columns of C. [interactive]

Since B is simpler than A, this makes it easier to understand A. Note the relationship between the eigenvalues/eigenvectors of A and B.
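The eigenvalue/eigenvector relationship noted here can be confirmed directly (assuming v_1 = (1, 1) and v_2 = (1, −1), the columns of C from the example):

```python
import numpy as np

A = np.array([[0.5, 1.5],
              [1.5, 0.5]])
v1 = np.array([1.0, 1.0])    # first column of C
v2 = np.array([1.0, -1.0])   # second column of C

print(A @ v1)   # equals  2 * v1: v1 spans the 2-eigenspace of A
print(A @ v2)   # equals -1 * v2: v2 spans the (-1)-eigenspace of A
```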

Similarity: Example (3 × 3)

    A = ( 3  5  3        B = ( 2   0  0        C = ( 1  1  0
          2  4  3              0  −1  0              1  1  1
          3  5  2 )            0   0  1 )            1  0  1 )

and A = CBC⁻¹.

What do A and B do geometrically? B scales the e_1-direction by 2, scales the e_2-direction by −1, and fixes e_3. A scales the v_1-direction by 2, scales the v_2-direction by −1, and fixes v_3. Here v_1, v_2, v_3 are the columns of C. [interactive]

Similar Matrices Have the Same Characteristic Polynomial

Fact: If A and B are similar, then they have the same characteristic polynomial.

Why? Suppose A = CBC⁻¹. Then

    A − λI = CBC⁻¹ − λCC⁻¹ = C(B − λI)C⁻¹,

so

    det(A − λI) = det(C) det(B − λI) det(C⁻¹) = det(B − λI),

because det(C⁻¹) = 1/det(C).

Consequence: similar matrices have the same eigenvalues! (But different eigenvectors, in general.)
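A numerical illustration of the fact (the random B and C are placeholders; a random C is invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 3))
A = C @ B @ np.linalg.inv(C)     # A is similar to B by construction

# Similar matrices share the characteristic polynomial (np.poly returns
# its coefficients), hence the same eigenvalues.
print(np.allclose(np.poly(A), np.poly(B)))   # True
```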

Similarity: Caveats

Warning:

1. Matrices with the same eigenvalues need not be similar. For instance,

    ( 2  1 )      and      ( 2  0 )
    ( 0  2 )               ( 0  2 )

both have only the eigenvalue 2, but they are not similar.

2. Similarity has nothing to do with row equivalence. For instance,

    ( 2  1 )      and      ( 1  0 )
    ( 0  2 )               ( 0  1 )

are row equivalent, but they have different eigenvalues.
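Both caveats can be seen numerically (an illustration; the non-similarity argument is in the comment, since equality of eigenvalues alone cannot decide it):

```python
import numpy as np

P = np.array([[2.0, 1.0],    # shear-and-scale matrix
              [0.0, 2.0]])
Q = np.array([[2.0, 0.0],    # 2 times the identity
              [0.0, 2.0]])

# Caveat 1: same eigenvalues, yet not similar. For any invertible C,
# C Q C^{-1} = 2 C C^{-1} = Q, so the only matrix similar to Q is Q itself.
print(np.linalg.eigvals(P), np.linalg.eigvals(Q))   # both are [2, 2]

# Caveat 2: row-equivalent matrices can have different eigenvalues.
R = np.array([[1.0, 0.0],
              [0.0, 1.0]])   # P row-reduces to the identity R
print(np.linalg.eigvals(R))  # [1, 1], not [2, 2]
```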

Summary

We did two different things today. First we talked about characteristic polynomials:

- We learned to find the eigenvalues of a matrix by computing the roots of the characteristic polynomial f(λ) = det(A − λI).
- For a 2 × 2 matrix A, the characteristic polynomial is just f(λ) = λ² − Tr(A)λ + det(A).
- The algebraic multiplicity of an eigenvalue is its multiplicity as a root of the characteristic polynomial.

Then we talked about similar matrices:

- Two square matrices A, B of the same size are similar if there is an invertible matrix C such that A = CBC⁻¹.
- Geometrically, similar matrices A and B do the same thing, except B operates on the coordinate system B defined by the columns of C: B[x]_B = [Ax]_B.
- This is useful when we can find a similar matrix B which is simpler than A (e.g., a diagonal matrix).