Travis Schedler. Thurs, Oct 27, 2011 (version: Thurs, Oct 27, 1:00 PM)


Lecture 13: Proof of existence of upper-triangular matrices for complex linear transformations; invariant subspaces and block upper-triangular matrices for real linear transformations (1)

Goals (2)

Theorem on existence of upper-triangular matrices for F = C, when V is f.d. and nonzero.
Theorem on existence of block upper-triangular matrices for F = R, with diagonal blocks of size 2 or 1, when V is f.d. and nonzero.
Corollary: existence of eigenvalues when V is odd-dimensional.
Read Chapter 5, and do PS 6. Then, start on Chapter 6 and PS 7.

Warm-up exercise (3)

Let A ∈ Mat(n, n, F) be an upper-triangular (square) matrix. Show that A is invertible if and only if its diagonal entries are all nonzero (use Gaussian elimination!). Conclude that A − λI is not injective if and only if λ appears on the diagonal of A. Conclude that the eigenvalues of A (i.e., λ such that Av = λv has a nonzero solution v ∈ Mat(n, 1, F)) are precisely the diagonal entries. Remark: A somewhat different proof, without using Gaussian elimination, is given in Axler, Proposition 5.16.

Solution to warm-up exercise (4)

If A has nonzero diagonal entries, then we can rescale all the rows so that the diagonal entries equal one. The result is in row-echelon form with no zero rows, so the rank of the matrix equals n, i.e., A is invertible. Conversely, if A has some diagonal entries that are zero, then, performing Gaussian elimination, the zero diagonal entries will yield free columns (and the nonzero entries will yield pivot columns). So not all columns will be pivot columns, and the number of pivot columns is the rank of A, which is then strictly less than n. So A is not invertible.
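The warm-up can be sanity-checked numerically; a minimal sketch with hypothetical example matrices, using NumPy's rank computation in place of hand Gaussian elimination:

```python
import numpy as np

# Upper-triangular with all diagonal entries nonzero -> full rank (invertible).
A = np.array([[2.0, 5.0, 1.0],
              [0.0, 3.0, 4.0],
              [0.0, 0.0, 7.0]])

# Upper-triangular with a zero on the diagonal -> rank deficient (singular).
B = np.array([[2.0, 5.0, 1.0],
              [0.0, 0.0, 4.0],
              [0.0, 0.0, 7.0]])

rank_A = np.linalg.matrix_rank(A)   # full rank: 3
rank_B = np.linalg.matrix_rank(B)   # strictly less than 3
```

Consistent with the proposition, `np.linalg.eigvals(A)` would return exactly the diagonal entries 2, 3, 7.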

Upper-triangular matrices (5)

In general one has no eigenbasis, i.e., there is no basis in which M(T) is diagonal. However, much more generally:

Proposition (Proposition 5.12). The following are equivalent for a basis (v_1, ..., v_n) of V and T ∈ L(V):
(a) M(T) is upper-triangular;
(b) Tv_k ∈ Span(v_1, ..., v_k) for all k;
(c) Span(v_1, ..., v_k) is T-invariant for all k.

Proof (on board): Just write out M(T).

Theorem (Theorem 5.13). If F = C and V is finite-dimensional, then there exists a basis such that M(T) is upper-triangular.

Proof (on board; see next slide).
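Theorem 5.13 is, in effect, the existence statement behind the complex Schur decomposition, which `scipy.linalg.schur` computes numerically (it even makes the new basis orthonormal). A sketch with a hypothetical 3×3 matrix:

```python
import numpy as np
from scipy.linalg import schur

# Hypothetical real matrix, viewed as a complex operator on C^3.
A = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 2.0]])

# Complex Schur form: T = Z^{-1} A Z is upper-triangular, Z unitary.
T, Z = schur(A.astype(complex), output='complex')

assert np.allclose(np.tril(T, -1), 0)       # T is upper-triangular
assert np.allclose(Z @ T @ Z.conj().T, A)   # same operator in the new basis
```

The diagonal of T consists of the eigenvalues of A (here ±i and 2), as the next slides explain.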

Proof of Theorem 5.13 (on board) (6)

Proof: by induction on dim V.
Base case: dim V = 1 (or 0).
Inductive step: let v_1 be a nonzero eigenvector.
Write V = Span(v_1) ⊕ U for some U. Let S ∈ L(U) be S(u) = P_{U, Span(v_1)} T(u) for all u.
By induction, there is a basis (v_2, ..., v_n) of U in which M(S) is upper-triangular.
Now P_{U, Span(v_1)} T(Span(v_2, ..., v_k)) ⊆ Span(v_2, ..., v_k). This implies T(Span(v_2, ..., v_k)) ⊆ Span(v_1, v_2, ..., v_k).
Hence Span(v_1, ..., v_k) is T-invariant for all k.
Note that (v_1, ..., v_n) is a basis since it is put together from bases of Span(v_1) and U.
By the proposition, M(T, (v_1, ..., v_n)) is upper-triangular.
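The inductive proof translates directly into a recursive algorithm. A sketch (not numerically robust, and the basis-completion step is one hypothetical choice among many): pick an eigenvector v_1, complete it to a basis, and recurse on the induced operator on a complement.

```python
import numpy as np

def triangularize(A):
    """Return C with C^{-1} A C upper-triangular, following the inductive
    proof of Theorem 5.13 over F = C.  A sketch, not production code."""
    n = A.shape[0]
    if n <= 1:
        return np.eye(n, dtype=complex)   # base case: dim V = 1 (or 0)
    # An eigenvector exists since F = C.
    _, vecs = np.linalg.eig(A.astype(complex))
    v1 = vecs[:, 0]
    # Complete v1 to a basis: swap in v1 for the standard basis vector
    # on which v1 has its largest coordinate, so columns stay independent.
    P = np.eye(n, dtype=complex)
    j = int(np.argmax(np.abs(v1)))
    P[:, [0, j]] = P[:, [j, 0]]
    P[:, 0] = v1
    B = np.linalg.inv(P) @ A @ P   # first column is (lambda, 0, ..., 0)^T
    S = B[1:, 1:]                  # induced operator on the complement U
    Q = np.eye(n, dtype=complex)
    Q[1:, 1:] = triangularize(S)   # inductive step: triangularize S
    return P @ Q
```

As in the proof, conjugating the block matrix [[λ, *], [0, S]] by diag(1, Q) triangularizes it once Q triangularizes S.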

Eigenvalues and upper-triangular matrices (7)

Recall that the eigenvalues of a diagonal matrix are the diagonal entries.

Proposition (Proposition 5.18). The eigenvalues of an upper-triangular matrix are exactly the diagonal entries.

Ex: A = (2 1; 0 1): eigenvalues 2 and 1, with eigenvectors (1, 0) and (1, −1).

If not upper-triangular, this is not true: e.g., for the 90° rotation, zero is not an eigenvalue even though the diagonal entries are zero; the complex eigenvalues are ±i.

Proof (from the warm-up!): The eigenvalues are exactly the λ ∈ F such that null(A − λI) ≠ {0}. So we have to show: an upper-triangular matrix is invertible if and only if its diagonal entries are nonzero (Prop. 5.16). We proved this in the warm-up using Gaussian elimination.
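The rotation counterexample can be checked directly; a minimal sketch (NumPy returns the complex eigenvalues of the real matrix):

```python
import numpy as np

# 90-degree rotation: diagonal entries are 0, but 0 is NOT an eigenvalue.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

vals = np.linalg.eigvals(R)   # complex eigenvalues: +i and -i
assert np.allclose(np.sort_complex(vals), [-1j, 1j])
assert not np.any(np.isclose(vals, 0))   # 0 is not among them
```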

The real case (8)

Now let F := R. Then T need not have an eigenvalue. Example: rotation matrices!

Theorem (Theorem 5.24). If V ≠ 0, then V admits a T-invariant subspace of dimension 1 or 2.

Proof. As we proved, there is a polynomial f(x) such that f(T) = 0. Factor f(x) (over R) using:

Theorem (Theorem 4.14). Every polynomial with real coefficients factors into linear and quadratic factors.

Thus, for some linear or quadratic factor g(x), we must have null g(T) ≠ 0. Now, say v ∈ null g(T) is nonzero. Then Span(v, Tv) is an invariant subspace of dimension 2 (or 1).
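The mechanism of the proof can be seen concretely for the rotation R: the quadratic g(x) = x² + 1 kills R, so for any v we get T²v = −v ∈ Span(v, Tv), making Span(v, Tv) invariant. A sketch with a hypothetical test vector:

```python
import numpy as np

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # 90-degree rotation, no real eigenvalues

# g(x) = x^2 + 1 satisfies g(R) = 0:
assert np.allclose(R @ R + np.eye(2), 0)

# Hence for any v, T^2 v = -v lies in Span(v, Tv), so Span(v, Tv) is invariant.
v = np.array([1.0, 2.0])     # hypothetical choice of v
assert np.allclose(R @ (R @ v), -v)
```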

Real case: block upper-triangular form (9)

Theorem. Let F = R and T ∈ L(V). Then, in some basis, M(T) has the block upper-triangular form

  [ A_1  *    ...  *   ]
  [ 0    A_2  ...  *   ]
  [ ...            ... ]
  [ 0    0    ...  A_n ]

where each A_j is either 1 × 1 or 2 × 2; in the latter case, A_j has no real eigenvalues.

By Gaussian elimination, A − λI is noninjective if and only if A_i − λI is for some i (even if λ ∈ C!). So the real eigenvalues are the entries of the 1 × 1 matrices A_i.

The non-real complex eigenvalues are the eigenvalues of the 2 × 2 blocks A_i. Using PS 6, #10 you see that the complex eigenvalues of each of these are of the form a ± bi for some a, b ∈ R. (This is because the eigenvalues of A_i are fixed by complex conjugation, since A_i itself is.)
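This theorem is the existence statement behind the "real Schur form", which `scipy.linalg.schur` computes with `output='real'` (again with an orthonormal basis). A sketch with a hypothetical matrix whose eigenvalues are ±i and 2:

```python
import numpy as np
from scipy.linalg import schur

A = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 2.0]])   # eigenvalues: +/- i (a 2x2 block) and 2

# Real Schur form: T = Z^T A Z with Z orthogonal; T is block upper-triangular
# with 1x1 blocks (real eigenvalues) and 2x2 blocks (conjugate pairs a +/- bi).
T, Z = schur(A, output='real')

assert np.allclose(np.tril(T, -2), 0)   # nothing below the 2x2 diagonal blocks
assert np.allclose(Z @ T @ Z.T, A)      # same operator in the new basis
```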

Proof of the theorem (on board) (10)

We prove this by induction on dim V, beginning with zero.
For V ≠ 0, we showed there exists an invariant subspace U ⊆ V of dimension 1 or 2. Let this be 1 if possible; otherwise let it be 2.
Pick a basis of U. Then M(T|_U) is either 1 × 1 or, if it is 2 × 2, it has no real eigenvalue.
Write V = U ⊕ U' for some complement U' ⊆ V.
Inductively, pick a basis of U' so that S := P_{U', U} T|_{U'}, viewed as a map U' → U', has block upper-triangular form with diagonal blocks A_2, ..., A_n.
Put this together with our basis of U to get a basis of V. Then M(T) is as desired.

Operators on odd-dimensional vector spaces have eigenvalues (11)

Corollary (Theorem 5.26). Every operator on an odd-dimensional real vector space has an eigenvalue.

Proof. Pick a basis so that the matrix M(T) is in the above block upper-triangular form. Since dim V is odd, the 2 × 2 blocks cannot account for the whole dimension, so at least one block must be (λ_j) for some λ_j ∈ R. So λ_j is an eigenvalue.

Remark: we will give another proof much later: the eigenvalues are the roots of the characteristic polynomial det(xI − A), which has real coefficients. Its degree equals dim V, and if this is odd, it must have a real root.
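The corollary can be stress-tested numerically; a sketch that draws random real 3 × 3 matrices and checks that each has at least one real eigenvalue:

```python
import numpy as np

# Every real operator on an odd-dimensional space has a real eigenvalue.
rng = np.random.default_rng(0)
for _ in range(100):
    A = rng.standard_normal((3, 3))      # random real 3x3 matrix
    vals = np.linalg.eigvals(A)          # eigenvalues (possibly complex)
    # At least one eigenvalue has (numerically) zero imaginary part.
    assert any(abs(v.imag) < 1e-8 for v in vals)
```

The non-real eigenvalues always occur in conjugate pairs a ± bi, so an odd count forces a real one, matching the characteristic-polynomial argument.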

Similarity of matrices (12)

Recall from class and the homework: Two matrices A and B can be the matrix of the same linear transformation if and only if they are conjugate: A = CBC^{-1}.

This is because if (v_1' ... v_n') = (v_1 ... v_n) C, then M(T, (v_1, ..., v_n)) = C M(T, (v_1', ..., v_n')) C^{-1}.

Example: if (v_1, ..., v_n) is the standard basis of V = Mat(n, 1, F), then C = (v_1' ... v_n') is the matrix whose columns are the vectors v_1', ..., v_n' put together.
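Conjugate matrices represent the same operator, so in particular they have the same eigenvalues. A minimal sketch with hypothetical example matrices:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 1.0]])           # the matrix of T in the standard basis
C = np.array([[1.0, 1.0],
              [0.0, 1.0]])           # an invertible change-of-basis matrix

B = np.linalg.inv(C) @ A @ C         # matrix of the same T in the new basis

# Same operator, so the same eigenvalues (here 2 and 1).
assert np.allclose(np.sort(np.linalg.eigvals(A).real),
                   np.sort(np.linalg.eigvals(B).real))
```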

Corollaries of upper-triangular theorems (13)

Corollary. If A is a square complex matrix, then there exists an invertible complex matrix C such that B = C^{-1} A C is upper-triangular. The eigenvalues of A (or B) are then exactly the diagonal entries.

Corollary. If A is a square real matrix, then there exists an invertible real matrix C such that B = C^{-1} A C is block upper-triangular with diagonal blocks 1 × 1 or 2 × 2; in the latter case the block has no real eigenvalues. The real eigenvalues of A (or B) are the entries of the 1 × 1 diagonal blocks of B, and the non-real complex eigenvalues are the eigenvalues of the 2 × 2 diagonal blocks of B (which are always of the form a ± bi, a, b ∈ R).