NOTES ON SIMPLIFICATION OF MATRICES

JONATHAN LUK

These notes discuss how to simplify an $(n \times n)$ matrix. In particular, we expand on some of the material from the textbook (with some repetition). Part of the exposition is inspired by some notes of Eliashberg, which you may easily find on the internet. From the point of view of the larger context of the course, the goal is to have a way to compute $e^A$ for a given matrix $A$, which is useful in solving linear ODEs. I will likely be modifying these notes as I lecture. Meanwhile, if you have any comments or corrections, even very minor ones, please let me know.

Before we proceed, let us start with a few observations:

(1) To compute $e^A$, it is useful to find some $B$ which is similar to $A$, i.e., $B = S^{-1}AS$ for some invertible $S$, such that $e^B$ is easier to compute. This is because (Exercise) $e^A = S e^B S^{-1}$.

(2) Now, in terms of computing exponentials, the easiest matrices are diagonal ones. If $\Lambda = \mathrm{diag}(\lambda_1, \dots, \lambda_n)$, then (Exercise) $e^\Lambda = \mathrm{diag}(e^{\lambda_1}, \dots, e^{\lambda_n})$. Combining this with the first observation, we have an easy way to compute the exponentials of diagonalizable matrices (cf. Definition 1.4).

(3) Another class of matrices whose exponentials are easy to compute are nilpotent matrices. Here, we say that $N$ is nilpotent if $N^k = 0$ for some $k \in \mathbb{N}$. For a nilpotent matrix, the power series that defines $e^N$ becomes a finite sum and is therefore easier to compute.

Not all matrices are diagonalizable. (Why? Exercise) The main result that we will get to is that every matrix can be decomposed as the sum of a diagonalizable matrix $B$ and a nilpotent matrix $N$ such that $B$ and $N$ commute. This gives a way of computing the exponential of any matrix.

Even if one is only interested in real matrices, as we will see, complex numbers enter naturally (cf. Remark 1.2). From now on, we work with complex matrices and vectors.

1. Some results on diagonalizability

We begin by recalling the following definitions from 61CM:

Definition 1.1 (Eigenvalues and eigenvectors). Let $A$ be an $(n \times n)$ matrix. We say that $\lambda$ is an eigenvalue of $A$ if there exists $v \neq 0$ such that $Av = \lambda v$. If $\lambda$ is an eigenvalue, any $v$ satisfying $Av = \lambda v$ (including $0$) is called an eigenvector.

Remark 1.2. Some remarks are in order:

(1) $\lambda$ is an eigenvalue iff $\ker(\lambda I - A) \neq \{0\}$ iff $\det(\lambda I - A) = 0$.

(2) (Complex) eigenvalues always exist. This is because $\det(\lambda I - A) = 0$ is a polynomial equation in $\lambda$, which always has a root in $\mathbb{C}$ by the fundamental theorem of algebra. In fact, for an $(n \times n)$ matrix $A$, we always have $n$ roots if we count multiplicity. This is the main reason we work over $\mathbb{C}$ instead of $\mathbb{R}$.

Definition 1.3 (Similarity). Two $(n \times n)$ matrices $A$ and $B$ are similar if there exists an invertible $(n \times n)$ matrix $S$ such that $S^{-1}AS = B$.

Definition 1.4 (Diagonalizability). A matrix is diagonalizable if it is similar to a diagonal matrix.
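To make observations (1) and (2) concrete, here is a minimal numerical sketch (assuming Python with numpy and scipy; the test matrix is an arbitrary choice, not one from these notes). It diagonalizes a matrix with numpy.linalg.eig and compares $S e^{\Lambda} S^{-1}$ against scipy.linalg.expm.

```python
import numpy as np
from scipy.linalg import expm

# An arbitrary matrix with distinct eigenvalues (hence diagonalizable, cf. Corollary 1.7).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, -1.0]])

# Columns of S are eigenvectors; lam contains the corresponding eigenvalues.
lam, S = np.linalg.eig(A)

# e^A = S e^Lambda S^{-1}, where e^Lambda is diagonal with entries e^{lambda_i}.
expA = S @ np.diag(np.exp(lam)) @ np.linalg.inv(S)

# Compare against the general-purpose matrix exponential.
print(np.allclose(expA, expm(A)))   # True, up to rounding
```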

One important (sufficient but not necessary) criterion for diagonalizability is given by the spectral theorem from 61CM:

Theorem 1.5 (Spectral theorem). A real symmetric matrix is diagonalizable. Moreover, given a real symmetric $(n \times n)$ matrix $A$, one can find a real $(n \times n)$ matrix $Q$ such that $Q^{-1}AQ$ is diagonal and $Q^T Q = I$.

We will not repeat the proof here. In general, we have the following characterization:

Lemma 1.6. An $(n \times n)$ matrix $A$ is diagonalizable if and only if $\mathbb{C}^n$ admits a basis which consists of eigenvectors of $A$.

Proof. "Only if": Suppose $A$ is diagonalizable. Then $S^{-1}AS = D$ for some diagonal matrix
\[
D = \begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{pmatrix}
\]
and some invertible matrix $S$. Let $v_i \in \mathbb{C}^n$, $i = 1, \dots, n$, be the columns of $S$, i.e.,
\[
S = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix}.
\]
Since $S$ is invertible, $\{v_i\}_{i=1}^n$ forms a basis. Moreover,
\[
\begin{pmatrix} Av_1 & Av_2 & \cdots & Av_n \end{pmatrix} = AS = SD = \begin{pmatrix} \lambda_1 v_1 & \lambda_2 v_2 & \cdots & \lambda_n v_n \end{pmatrix}.
\]
Hence, comparing each column, we have $Av_i = \lambda_i v_i$. In other words, the basis $v_1, \dots, v_n$ consists of eigenvectors.

"If": This is similar. Given a basis $\{v_1, \dots, v_n\}$ of $\mathbb{C}^n$ consisting of eigenvectors of $A$, define
\[
S = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix}.
\]
Then a computation similar to the "only if" part shows that $S^{-1}AS$ is diagonal.
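As a quick numerical illustration of Theorem 1.5 and Lemma 1.6 (a sketch assuming numpy; the symmetric matrix below is an arbitrary choice), numpy.linalg.eigh returns an orthogonal matrix whose columns are eigenvectors of a real symmetric matrix:

```python
import numpy as np

# An arbitrary real symmetric matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

lam, Q = np.linalg.eigh(A)   # columns of Q are orthonormal eigenvectors

print(np.allclose(Q.T @ Q, np.eye(3)))          # Q^T Q = I
print(np.allclose(Q.T @ A @ Q, np.diag(lam)))   # Q^{-1} A Q = Q^T A Q is diagonal
```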

A corollary is the following (sufficient but not necessary) criterion for diagonalizability:

Corollary 1.7. Suppose all the eigenvalues of an $(n \times n)$ matrix $A$ are distinct. Then $A$ is diagonalizable.

Proof. Let $\{\lambda_i\}_{i=1}^n$ be the distinct eigenvalues and let $\{v_i\}_{i=1}^n \subset \mathbb{C}^n \setminus \{0\}$ be corresponding eigenvectors. We prove that $\{v_i\}_{i=1}^n$ is linearly independent by induction.

Base case: Since $v_1 \neq 0$, $\{v_1\}$ is linearly independent.

Induction step: Suppose $\{v_1, \dots, v_{k-1}\}$ are linearly independent. Let
\[
\sum_{i=1}^{k} c_i v_i = 0.
\]
Applying $A$, we get
\[
\sum_{i=1}^{k} c_i \lambda_i v_i = 0.
\]
Multiply the first equation by $\lambda_k$ and subtract the second equation from it. We obtain
\[
\sum_{i=1}^{k-1} c_i (\lambda_k - \lambda_i) v_i = 0.
\]
By linear independence of $\{v_1, \dots, v_{k-1}\}$, $c_i(\lambda_k - \lambda_i) = 0$ for all $i \in \{1, \dots, k-1\}$. Since $\lambda_k \neq \lambda_i$, we have $c_i = 0$ for all $i \in \{1, \dots, k-1\}$. This then implies also that $c_k = 0$. Hence, $\{v_1, \dots, v_k\}$ are linearly independent.

By induction, $\{v_1, \dots, v_n\}$ is linearly independent. It therefore forms a basis of $\mathbb{C}^n$. By Lemma 1.6, $A$ is diagonalizable.

In general, not all matrices are diagonalizable (see Homework 3 for an example). Nevertheless, we have Propositions 1.8 and 1.9 below.

Proposition 1.8. Every matrix is similar to an upper triangular matrix.

Proof. Let us prove this by induction on the size of the matrix. Clearly all $(1 \times 1)$ matrices are upper triangular. Suppose that for some $k \geq 2$, every $(k-1) \times (k-1)$ matrix is similar to an upper triangular matrix. Let $A$ be a $k \times k$ matrix. By Remark 1.2, $A$ has an eigenvector, say $x_1$, with eigenvalue $\lambda_1$. Complete $x_1$ to a basis $\{x_1, y_2, \dots, y_k\}$. Consider now the matrix
\[
S_1 = \begin{pmatrix} x_1 & y_2 & \cdots & y_k \end{pmatrix}.
\]
Then
\[
S_1^{-1} A S_1 = \begin{pmatrix} \lambda_1 & * \\ 0 & B_1 \end{pmatrix},
\]
where $B_1$ is a $(k-1) \times (k-1)$ matrix (which we do not know much about) and $*$ denotes entries that we have no information about. By the induction hypothesis, there exists an invertible $(k-1) \times (k-1)$ matrix $\tilde{S}_2$ such that $\tilde{S}_2^{-1} B_1 \tilde{S}_2$ is upper triangular. Now let $S_2$ be the $k \times k$ matrix defined by
\[
S_2 = \begin{pmatrix} 1 & 0 \\ 0 & \tilde{S}_2 \end{pmatrix}.
\]
Let $S = S_1 S_2$. We compute
\[
S^{-1} A S = S_2^{-1} S_1^{-1} A S_1 S_2 = \begin{pmatrix} 1 & 0 \\ 0 & \tilde{S}_2^{-1} \end{pmatrix} \begin{pmatrix} \lambda_1 & * \\ 0 & B_1 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & \tilde{S}_2 \end{pmatrix} = \begin{pmatrix} \lambda_1 & * \\ 0 & \tilde{S}_2^{-1} B_1 \tilde{S}_2 \end{pmatrix},
\]
which is an upper triangular matrix.
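Numerically, an upper triangular matrix similar to a given $A$ can be obtained from the complex Schur decomposition, which is in fact a stronger statement than Proposition 1.8 since the similarity is unitary. A minimal sketch, assuming scipy and an arbitrary random test matrix:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Complex Schur form: A = Q T Q^*, with Q unitary and T upper triangular,
# so T = Q^{-1} A Q is an upper triangular matrix similar to A.
T, Q = schur(A, output='complex')

print(np.allclose(np.tril(T, -1), 0))        # T is upper triangular
print(np.allclose(Q @ T @ Q.conj().T, A))    # A = Q T Q^*
```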

Proposition 1.9. The set of diagonalizable matrices is dense in the set of all matrices.

Proof. Given an $(n \times n)$ matrix $A$ and an $\epsilon > 0$, we want to find a matrix $B$ with $\|A - B\|_{op} < \epsilon$ such that $B$ is diagonalizable. To do this, first, we find $S$ such that $S^{-1}AS$ is upper triangular. Then the eigenvalues of $A$ are the diagonal entries of $S^{-1}AS$. (Why? Exercise) Now one can find an upper triangular matrix $\tilde{B}$ by perturbing the diagonal entries of $S^{-1}AS$ such that
\[
\|\tilde{B} - S^{-1}AS\|_{op} < \frac{\epsilon}{\|S\|_{op}\,\|S^{-1}\|_{op}}
\]
and all the eigenvalues of $\tilde{B}$ are distinct. Define $B = S\tilde{B}S^{-1}$. We check that
\[
\|A - B\|_{op} = \|S(S^{-1}AS - \tilde{B})S^{-1}\|_{op} \leq \|S\|_{op}\,\|S^{-1}AS - \tilde{B}\|_{op}\,\|S^{-1}\|_{op} < \epsilon.
\]
Moreover, $B$ is diagonalizable: $\tilde{B}$ is diagonalizable by Corollary 1.7, and since $B$ is similar to $\tilde{B}$, $B$ is diagonalizable.

2. Cayley-Hamilton theorem

Definition 2.1. Let $p(\lambda) = a_n \lambda^n + \dots + a_0$ with $a_i \in \mathbb{C}$ for $i = 0, \dots, n$ be a polynomial. Then for an $(n \times n)$ matrix $A$, $p(A)$ is the $(n \times n)$ matrix
\[
p(A) = a_n A^n + \dots + a_0 I.
\]

The following is a fundamental result:

Theorem 2.2 (Cayley-Hamilton). Let $\chi_A(\lambda) = \det(\lambda I - A)$ be the characteristic polynomial of $A$. Then $\chi_A(A) = 0$.

Proof. Step 1: First, this is true for diagonal matrices. To see that, let $A$ be a diagonal matrix,
\[
A = \begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{pmatrix}.
\]
It is easy to compute that $\chi_A(\lambda) = \prod_{i=1}^n (\lambda - \lambda_i)$. We now check that
\[
\chi_A(A) = \prod_{i=1}^n (A - \lambda_i I) = \prod_{i=1}^n \mathrm{diag}(\lambda_1 - \lambda_i, \dots, \lambda_n - \lambda_i) = \mathrm{diag}\Big(\prod_{i=1}^n (\lambda_1 - \lambda_i), \dots, \prod_{i=1}^n (\lambda_n - \lambda_i)\Big) = 0,
\]
since the $j$-th diagonal entry of the product contains the factor $\lambda_j - \lambda_j = 0$.

Step 2: We claim that a consequence of Step 1 is that $\chi_A(A) = 0$ for all diagonalizable $A$. To see this, suppose $S^{-1}AS = D$ for some diagonal matrix $D$. Then
\[
\det(\lambda I - A) = \det(S^{-1})\det(\lambda I - A)\det(S) = \det(S^{-1}(\lambda I - A)S) = \det(\lambda I - D).
\]
Hence $\chi_A(\lambda) = \chi_D(\lambda)$. Now suppose
\[
\chi_A(\lambda) = \chi_D(\lambda) = \lambda^n + a_{n-1}\lambda^{n-1} + \dots + a_0.
\]
Then
\[
\chi_A(A) = A^n + a_{n-1}A^{n-1} + \dots + a_0 I = S S^{-1}\big(A^n + a_{n-1}A^{n-1} + \dots + a_0 I\big) S S^{-1} = S\big((S^{-1}AS)^n + a_{n-1}(S^{-1}AS)^{n-1} + \dots + a_0 I\big)S^{-1} = S\chi_D(D)S^{-1}.
\]
By Step 1, $\chi_D(D) = 0$. Hence, $\chi_A(A) = 0$.

Step 3: $\chi_A(A)$ depends on $A$ continuously. Therefore, the conclusion of the theorem follows from the density of diagonalizable matrices (Proposition 1.9).
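A quick numerical sanity check of the Cayley-Hamilton theorem (a sketch assuming numpy; the random matrix is arbitrary): numpy.poly returns the coefficients of the characteristic polynomial of a matrix, which we then evaluate at $A$ by Horner's rule.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))

# Coefficients of det(lambda I - A), highest degree first (leading coefficient 1).
coeffs = np.poly(A)

# Evaluate chi_A(A) by Horner's rule, with scalars acting as multiples of the identity.
chiA = np.zeros_like(A)
for c in coeffs:
    chiA = chiA @ A + c * np.eye(5)

print(np.linalg.norm(chiA))   # ~ 0, up to floating-point error
```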

3. Generalized eigenspaces

The Cayley-Hamilton theorem gives us a way to decompose $\mathbb{C}^n$ into generalized eigenspaces.

Definition 3.1. Let $A$ be an $(n \times n)$ matrix. We say that $v$ is a generalized eigenvector if there exist an eigenvalue $\lambda$ and a $k \in \mathbb{N}$ such that $v \in \ker\big((\lambda I - A)^k\big)$.

Definition 3.2. Let $A$ be an $(n \times n)$ matrix and $\lambda$ be an eigenvalue. The generalized eigenspace associated to $\lambda$ is given by
\[
\{v \in \mathbb{C}^n : (\lambda I - A)^k v = 0 \text{ for some } k \in \mathbb{N}\}.
\]

Definition 3.3. Given two polynomials $f$ and $g$, we say that $h$ is a greatest common divisor of $f$ and $g$ if $h$ is a non-zero polynomial of maximal degree which divides both $f$ and $g$.

Proposition 3.4 (Bézout's identity). Let $f, g$ be polynomials. Then there exists a greatest common divisor, which is unique up to multiplication by constants. Moreover, for any greatest common divisor $h$, there exist polynomials $p$ and $q$ such that
\[
pf + qg = h.
\]

Proof. Consider the set
\[
S = \{pf + qg : p, q \text{ polynomials}\}.
\]
Let $h = p_h f + q_h g$ be a non-zero polynomial in $S$ with minimal degree. We claim that $h$ divides any polynomial in $S$. Suppose not. Then there are some $p_d$ and $q_d$ such that
\[
p_d f + q_d g = sh + r
\]
for some polynomials $s$ and $r$, with $r \neq 0$ and $\deg(r) < \deg(h)$. But then
\[
r = (p_d - p_h s)f + (q_d - q_h s)g
\]
is a non-zero element in $S$ with a smaller degree than $h$, which is a contradiction.

We now claim that $h$ is a greatest common divisor of $f$ and $g$. First, note that $h$ divides both $f$ and $g$, since $f = 1 \cdot f + 0 \cdot g$ and $g = 0 \cdot f + 1 \cdot g$ both belong to $S$. Suppose now that $c$ is a common divisor of $f$ and $g$, say $f = c p_c$ and $g = c q_c$ for some polynomials $p_c$ and $q_c$. Then
\[
h = p_h f + q_h g = p_h p_c c + q_h q_c c = (p_h p_c + q_h q_c)c.
\]
Hence, $c$ divides $h$. In other words, any other common divisor of $f$ and $g$ must divide $h$, and in particular has degree at most $\deg(h)$. This shows that $h$ is a greatest common divisor. The same argument also shows the uniqueness of the greatest common divisor up to a multiplicative constant. Finally, Bézout's identity follows from taking $p = p_h$ and $q = q_h$.

Definition 3.5. We denote by $(f, g)$ the unique monic (i.e., with leading coefficient $1$) polynomial which is a greatest common divisor of $f$ and $g$.

Theorem 3.6. Let $V$ be a finite dimensional vector space, $A : V \to V$ be a linear map and $f, g$ be polynomials such that $(f, g) = 1$ and $f(A)g(A) = 0$. Then
\[
V = \ker(f(A)) \oplus \ker(g(A)),
\]
i.e., for every $v \in V$, there exist unique $x \in \ker(f(A))$ and $y \in \ker(g(A))$ such that $v = x + y$.

Proof. By Proposition 3.4, there exist polynomials $p$ and $q$ such that $pf + qg = 1$. Now given $v \in V$, define
\[
x = q(A)g(A)v, \qquad y = p(A)f(A)v.
\]
By definition,
\[
v = Iv = (p(A)f(A) + q(A)g(A))v = y + x
\]
and
\[
f(A)x = q(A)f(A)g(A)v = 0, \qquad g(A)y = p(A)f(A)g(A)v = 0.
\]
Notice that we have used that any two polynomials of $A$ commute. We have thus shown that for every $v \in V$, there exist $x \in \ker(f(A))$ and $y \in \ker(g(A))$ such that $v = x + y$.

Finally, suppose $v = x + y = x' + y'$, where $x, x' \in \ker(f(A))$ and $y, y' \in \ker(g(A))$. Then $x - x' = y' - y \in \ker(f(A)) \cap \ker(g(A))$. Then
\[
x - x' = (p(A)f(A) + q(A)g(A))(x - x') = 0.
\]
Hence, $x = x'$ and consequently $y = y'$.
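The greatest common divisor and the polynomials $p$, $q$ in Proposition 3.4 can be computed by the extended Euclidean algorithm. Below is a minimal sketch, assuming sympy; the helper poly_bezout and the test polynomials are illustrative choices and not part of the notes.

```python
import sympy as sp

x = sp.symbols('x')

def poly_bezout(f, g):
    """Extended Euclidean algorithm for polynomials in x:
    returns (p, q, h) with p*f + q*g = h, where h is a greatest common divisor of f and g."""
    r0, r1 = sp.Poly(f, x, domain='QQ'), sp.Poly(g, x, domain='QQ')
    p0, p1 = sp.Poly(1, x, domain='QQ'), sp.Poly(0, x, domain='QQ')
    q0, q1 = sp.Poly(0, x, domain='QQ'), sp.Poly(1, x, domain='QQ')
    while not r1.is_zero:
        quo, rem = r0.div(r1)              # division with remainder
        r0, r1 = r1, rem
        p0, p1 = p1, p0 - quo * p1
        q0, q1 = q1, q0 - quo * q1
    return p0.as_expr(), q0.as_expr(), r0.as_expr()

# Two coprime polynomials; since they have no common root, their gcd is a nonzero constant.
f, g = x**2 + 1, (x - 1)**2
p, q, h = poly_bezout(f, g)
print(sp.simplify(p*f + q*g - h))   # 0, i.e. p*f + q*g = h
```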

Iterating this, we have:

Corollary 3.7. Let $V$ be a finite dimensional vector space, $A : V \to V$ be a linear map and $f = f_1 \cdots f_k$, where the $f_i$ are polynomials with $(f_i, f_j) = 1$ whenever $i \neq j$ and $f(A) = 0$. Then
\[
V = \ker(f_1(A)) \oplus \cdots \oplus \ker(f_k(A)).
\]

In particular, using the Cayley-Hamilton theorem, we can decompose $\mathbb{C}^n$ into generalized eigenspaces:

Corollary 3.8. Let $A$ be an $(n \times n)$ matrix and suppose the characteristic polynomial can be written as
\[
\det(\lambda I - A) = \prod_{i=1}^k (\lambda - \lambda_i)^{\nu_i},
\]
where the $\lambda_i$'s are distinct and $\sum_{i=1}^k \nu_i = n$. Then
\[
\mathbb{C}^n = \ker(A - \lambda_1 I)^{\nu_1} \oplus \ker(A - \lambda_2 I)^{\nu_2} \oplus \cdots \oplus \ker(A - \lambda_k I)^{\nu_k}. \tag{3.1}
\]

This also implies the following:

Corollary 3.9. $\mathbb{C}^n$ admits a basis of generalized eigenvectors.

4. Decomposition into diagonalizable and nilpotent matrices

Theorem 4.1. Let $A$ be an $(n \times n)$ matrix. Then there exist $(n \times n)$ matrices $L$ and $N$ such that
(1) $A = L + N$,
(2) $L$ is diagonalizable,
(3) $N$ is nilpotent; in fact, $N^n = 0$,
(4) $LN = NL$.

Proof. Recall that $\mathbb{C}^n$ admits the decomposition (3.1). Define $L$ so that for any $v \in \ker(A - \lambda_i I)^{\nu_i}$, $Lv = \lambda_i v$. It is easy to check that $L$ is linear and is therefore given by a matrix. We now check the following claims:

Claim 1: $L$ is diagonalizable. By definition $\mathbb{C}^n$ admits a basis of eigenvectors of $L$. Hence, $L$ is diagonalizable by Lemma 1.6.

Claim 2: $A - L$ is nilpotent. We will show that for $\nu = \max\{\nu_1, \dots, \nu_k\}$, $(A - L)^{\nu} v = 0$ for all $v \in \mathbb{C}^n$. This then implies $(A - L)^{\nu} = 0$. Since $\nu \leq n$, we have $N^n = 0$. By (3.1), it suffices to consider $v \in \ker(A - \lambda_i I)^{\nu_i}$ for some $i$. By definition of $L$, $(A - L)v = (A - \lambda_i I)v$. Now clearly $(A - \lambda_i I)v \in \ker(A - \lambda_i I)^{\nu_i}$, so we have $(A - L)^2 v = (A - \lambda_i I)^2 v$. Iterating this, we have $(A - L)^{\nu} v = (A - \lambda_i I)^{\nu} v = 0$, since $\nu \geq \nu_i$ and $v \in \ker(A - \lambda_i I)^{\nu_i}$.

Claim 3: $N = A - L$ and $L$ commute. As before, by (3.1), it suffices to check that $NLv = LNv$ for $v \in \ker(A - \lambda_i I)^{\nu_i}$ for some $i$. Now, using again that $(A - \lambda_i I)v \in \ker(A - \lambda_i I)^{\nu_i}$,
\[
NLv = (A - \lambda_i I)\lambda_i v = \lambda_i (A - \lambda_i I)v = LNv.
\]
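In practice, the pair $(L, N)$ of Theorem 4.1 can be read off from the Jordan canonical form discussed in Section 6. The following sketch (assuming sympy; the helper name dn_decomposition is ours, and the test matrix is the one from the example in Section 5 below) computes the decomposition and checks its properties.

```python
import sympy as sp

def dn_decomposition(A):
    """Return (L, N) with A = L + N, L diagonalizable, N nilpotent, LN = NL."""
    P, J = A.jordan_form()                              # A = P J P^{-1}
    D = sp.diag(*[J[i, i] for i in range(J.rows)])      # diagonal part of the Jordan form
    L = sp.simplify(P * D * P.inv())
    return L, sp.simplify(A - L)

# The matrix from the example in Section 5 below.
A = sp.Matrix([[0, 1, 1, 0],
               [-1, 0, 0, 1],
               [0, 0, 0, 1],
               [0, 0, -1, 0]])
L, N = dn_decomposition(A)

print(sp.simplify(N**A.rows) == sp.zeros(4, 4))   # N is nilpotent
print(sp.simplify(L*N - N*L) == sp.zeros(4, 4))   # L and N commute
```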

Theorem 4.2. The decomposition in Theorem 4.1 is unique.

Proof. Recall that $\mathbb{C}^n$ admits the decomposition (3.1). We want to prove that if $A = L + N$, where $L$ is diagonalizable, $N^k = 0$ for some $k \in \mathbb{N}$ and $NL = LN$, then $Lv = \lambda_i v$ for any $v \in \ker(A - \lambda_i I)^{\nu_i}$. (This determines $L$ on each summand of (3.1), and hence determines both $L$ and $N = A - L$.) We compute, for $v \in \ker(A - \lambda_i I)^{\nu_i}$, that
\[
(L - \lambda_i I)^{n+k} v = (A - \lambda_i I - N)^{n+k} v = \sum_{j=0}^{n+k} \frac{(n+k)!}{j!(n+k-j)!} (-N)^j (A - \lambda_i I)^{n+k-j} v = \sum_{j=0}^{k-1} \frac{(n+k)!}{j!(n+k-j)!} (-N)^j (A - \lambda_i I)^{n+k-j} v = 0.
\]
Here, we have used:
(1) $N$ and $A$ commute, since $NA = N(L + N) = NL + N^2 = LN + N^2 = (L + N)N = AN$ (this justifies the binomial expansion);
(2) $N^k = 0$, so the terms with $j \geq k$ vanish;
(3) since $(A - \lambda_i I)^{\nu_i} v = 0$ and $\nu_i \leq n$, we have $(A - \lambda_i I)^l v = 0$ for all $l \geq n$; in particular the remaining terms, which have $n + k - j \geq n + 1$, also vanish.
Now, $(L - \lambda_i I)^{n+k} v = 0$. Since $L$ is diagonalizable, $(L - \lambda_i I)v = 0$. This is what we wanted to show.

5. Example

Consider
\[
A = \begin{pmatrix} 0 & 1 & 1 & 0 \\ -1 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & -1 & 0 \end{pmatrix}.
\]

Eigenvalues. We first find its eigenvalues:
\[
\det(\lambda I - A) = \det\begin{pmatrix} \lambda & -1 & -1 & 0 \\ 1 & \lambda & 0 & -1 \\ 0 & 0 & \lambda & -1 \\ 0 & 0 & 1 & \lambda \end{pmatrix} = \lambda \cdot \lambda(\lambda^2 + 1) - (-1)(\lambda^2 + 1) = (\lambda^2 + 1)^2.
\]
Therefore, the eigenvalues are $\pm i$, each with multiplicity $2$.

Generalized eigenvectors. We now find the generalized eigenvectors. We compute
\[
iI - A = \begin{pmatrix} i & -1 & -1 & 0 \\ 1 & i & 0 & -1 \\ 0 & 0 & i & -1 \\ 0 & 0 & 1 & i \end{pmatrix}, \qquad (iI - A)^2 = \begin{pmatrix} -2 & -2i & -2i & 2 \\ 2i & -2 & -2 & -2i \\ 0 & 0 & -2 & -2i \\ 0 & 0 & 2i & -2 \end{pmatrix}.
\]
By inspection,
\[
\ker\big((iI - A)^2\big) = \mathrm{span}\left\{ \begin{pmatrix} 1 \\ i \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 1 \\ i \end{pmatrix} \right\}. \tag{5.1}
\]
Similarly,
\[
-iI - A = \begin{pmatrix} -i & -1 & -1 & 0 \\ 1 & -i & 0 & -1 \\ 0 & 0 & -i & -1 \\ 0 & 0 & 1 & -i \end{pmatrix}, \qquad (-iI - A)^2 = \begin{pmatrix} -2 & 2i & 2i & 2 \\ -2i & -2 & -2 & 2i \\ 0 & 0 & -2 & 2i \\ 0 & 0 & -2i & -2 \end{pmatrix},
\]
and by inspection,
\[
\ker\big((-iI - A)^2\big) = \mathrm{span}\left\{ \begin{pmatrix} 1 \\ -i \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 1 \\ -i \end{pmatrix} \right\}. \tag{5.2}
\]
It is easy to check that indeed
\[
\mathbb{C}^4 = \ker\big((iI - A)^2\big) \oplus \ker\big((-iI - A)^2\big).
\]
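A quick numerical check of (5.1) and (5.2) (a sketch assuming numpy, with $A$ as above):

```python
import numpy as np

A = np.array([[0, 1, 1, 0],
              [-1, 0, 0, 1],
              [0, 0, 0, 1],
              [0, 0, -1, 0]], dtype=complex)
I = np.eye(4)

M_plus = np.linalg.matrix_power(1j*I - A, 2)     # (iI - A)^2
M_minus = np.linalg.matrix_power(-1j*I - A, 2)   # (-iI - A)^2

V = np.array([[1, 1j, 0, 0], [0, 0, 1, 1j]]).T       # spanning vectors in (5.1), as columns
W = np.array([[1, -1j, 0, 0], [0, 0, 1, -1j]]).T     # spanning vectors in (5.2), as columns

print(np.allclose(M_plus @ V, 0))    # (iI - A)^2 kills the vectors in (5.1)
print(np.allclose(M_minus @ W, 0))   # (-iI - A)^2 kills the vectors in (5.2)

# The four vectors together form a basis of C^4:
print(np.linalg.matrix_rank(np.hstack([V, W])) == 4)   # True
```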

We now decompose $A$ into its diagonalizable and nilpotent parts. Define $L$ according to the proof of Theorem 4.1, i.e., $Lv = iv$ for $v \in \ker((iI - A)^2)$ and $Lv = -iv$ for $v \in \ker((-iI - A)^2)$. Writing the standard basis vectors in terms of the basis vectors in (5.1) and (5.2), we check that
\[
L e_1 = L\Big(\tfrac{1}{2}\begin{pmatrix} 1 \\ i \\ 0 \\ 0 \end{pmatrix} + \tfrac{1}{2}\begin{pmatrix} 1 \\ -i \\ 0 \\ 0 \end{pmatrix}\Big) = \tfrac{i}{2}\begin{pmatrix} 1 \\ i \\ 0 \\ 0 \end{pmatrix} - \tfrac{i}{2}\begin{pmatrix} 1 \\ -i \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ -1 \\ 0 \\ 0 \end{pmatrix},
\]
\[
L e_2 = L\Big(-\tfrac{i}{2}\begin{pmatrix} 1 \\ i \\ 0 \\ 0 \end{pmatrix} + \tfrac{i}{2}\begin{pmatrix} 1 \\ -i \\ 0 \\ 0 \end{pmatrix}\Big) = \tfrac{1}{2}\begin{pmatrix} 1 \\ i \\ 0 \\ 0 \end{pmatrix} + \tfrac{1}{2}\begin{pmatrix} 1 \\ -i \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix},
\]
and similarly $L e_3 = (0, 0, 0, -1)^T$ and $L e_4 = (0, 0, 1, 0)^T$. Therefore,
\[
L = \begin{pmatrix} 0 & 1 & 0 & 0 \\ -1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & -1 & 0 \end{pmatrix}.
\]
We check that
\[
N = A - L = \begin{pmatrix} 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix},
\]
which is obviously nilpotent; indeed $N^2 = 0$.
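A short numerical verification of this decomposition (assuming numpy, with $A$, $L$ and $N$ as above):

```python
import numpy as np

A = np.array([[0, 1, 1, 0], [-1, 0, 0, 1], [0, 0, 0, 1], [0, 0, -1, 0]], dtype=float)
L = np.array([[0, 1, 0, 0], [-1, 0, 0, 0], [0, 0, 0, 1], [0, 0, -1, 0]], dtype=float)
N = A - L

print(np.allclose(N @ N, 0))        # N^2 = 0
print(np.allclose(L @ N, N @ L))    # L and N commute

# L is diagonalizable: its eigenvectors form a basis (the eigenvector matrix is invertible).
_, S = np.linalg.eig(L)
print(np.linalg.matrix_rank(S) == 4)   # True
```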

Taking exponentials using the previous decomposition. $L$ is diagonalizable. More precisely, given the above computations, we know that for
\[
S = \begin{pmatrix} 1 & 0 & 1 & 0 \\ i & 0 & -i & 0 \\ 0 & 1 & 0 & 1 \\ 0 & i & 0 & -i \end{pmatrix}, \qquad S^{-1} = \begin{pmatrix} \tfrac{1}{2} & -\tfrac{i}{2} & 0 & 0 \\ 0 & 0 & \tfrac{1}{2} & -\tfrac{i}{2} \\ \tfrac{1}{2} & \tfrac{i}{2} & 0 & 0 \\ 0 & 0 & \tfrac{1}{2} & \tfrac{i}{2} \end{pmatrix},
\]
we have
\[
S^{-1} L S = \begin{pmatrix} i & 0 & 0 & 0 \\ 0 & i & 0 & 0 \\ 0 & 0 & -i & 0 \\ 0 & 0 & 0 & -i \end{pmatrix}.
\]
Hence,
\[
e^{Lt} = S \begin{pmatrix} e^{it} & & & \\ & e^{it} & & \\ & & e^{-it} & \\ & & & e^{-it} \end{pmatrix} S^{-1} = \begin{pmatrix} \cos t & \sin t & 0 & 0 \\ -\sin t & \cos t & 0 & 0 \\ 0 & 0 & \cos t & \sin t \\ 0 & 0 & -\sin t & \cos t \end{pmatrix}.
\]
Also, since $N^2 = 0$, we have
\[
e^{tN} = I + tN = \begin{pmatrix} 1 & 0 & t & 0 \\ 0 & 1 & 0 & t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}.
\]
Putting all these together,
\[
e^{tA} = e^{tL} e^{tN} = \begin{pmatrix} \cos t & \sin t & 0 & 0 \\ -\sin t & \cos t & 0 & 0 \\ 0 & 0 & \cos t & \sin t \\ 0 & 0 & -\sin t & \cos t \end{pmatrix} \begin{pmatrix} 1 & 0 & t & 0 \\ 0 & 1 & 0 & t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} = \begin{pmatrix} \cos t & \sin t & t\cos t & t\sin t \\ -\sin t & \cos t & -t\sin t & t\cos t \\ 0 & 0 & \cos t & \sin t \\ 0 & 0 & -\sin t & \cos t \end{pmatrix}.
\]
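This closed form can be sanity-checked against a general-purpose matrix exponential (a sketch assuming numpy and scipy, at an arbitrarily chosen value of $t$):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0, 1, 1, 0], [-1, 0, 0, 1], [0, 0, 0, 1], [0, 0, -1, 0]], dtype=float)

t = 0.7   # arbitrary test value
c, s = np.cos(t), np.sin(t)
closed_form = np.array([[c,  s,  t*c, t*s],
                        [-s, c, -t*s, t*c],
                        [0,  0,  c,   s],
                        [0,  0, -s,   c]])

print(np.allclose(expm(t*A), closed_form))   # True
```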

6. Jordan canonical form

It is useful to further analyse nilpotent matrices. This will also lead us to the notion of Jordan canonical forms, which gives us another way to compute exponentials of matrices.

We begin with the analysis of nilpotent matrices. One example of a nilpotent matrix is the following $m \times m$ matrix, with $1$'s on the superdiagonal and $0$'s everywhere else:
\[
J_m := \begin{pmatrix} 0 & 1 & & & \\ & 0 & 1 & & \\ & & \ddots & \ddots & \\ & & & 0 & 1 \\ & & & & 0 \end{pmatrix}.
\]
An easy computation shows that:

Lemma 6.1. $J_m^m = 0$.

Proposition 6.2. Let $N$ be an $(n \times n)$ nilpotent matrix. Then $N$ is similar to a matrix $J$, which is a block diagonal matrix with blocks $J_{n_1+1}, \dots, J_{n_k+1}$, where $(n_1 + 1) + \dots + (n_k + 1) = n$.

Proof. It suffices to find a basis
\[
v_1, Nv_1, \dots, N^{n_1}v_1, v_2, Nv_2, \dots, N^{n_2}v_2, \dots, v_k, \dots, N^{n_k}v_k,
\]
where
\[
N^{n_1+1}v_1 = N^{n_2+1}v_2 = \dots = N^{n_k+1}v_k = 0. \tag{6.1}
\]
Then for
\[
S = \begin{pmatrix} N^{n_1}v_1 & \cdots & Nv_1 & v_1 & N^{n_2}v_2 & \cdots & v_2 & \cdots & N^{n_k}v_k & \cdots & v_k \end{pmatrix},
\]
$S^{-1}NS$ is in the desired form.

Step 1: Choosing $v_1$. Let $v_1$ be a non-zero vector. Since $N$ is nilpotent, there exists $n_1 \in \mathbb{N} \cup \{0\}$ such that $N^{n_1}v_1 \neq 0$ and $N^{n_1+1}v_1 = 0$. We claim that $v_1, Nv_1, \dots, N^{n_1}v_1$ are linearly independent. We prove this by induction. First, $N^{n_1}v_1$ is linearly independent since it is non-zero. Now, suppose $N^{n_1}v_1, N^{n_1-1}v_1, \dots, N^{n_1-j}v_1$ are linearly independent. Suppose
\[
c_{n_1-j-1}N^{n_1-j-1}v_1 + c_{n_1-j}N^{n_1-j}v_1 + \dots + c_{n_1}N^{n_1}v_1 = 0. \tag{6.2}
\]
Multiplying by $N$ on the left, we have
\[
c_{n_1-j-1}N^{n_1-j}v_1 + c_{n_1-j}N^{n_1-j+1}v_1 + \dots + c_{n_1-1}N^{n_1}v_1 = 0,
\]
since $N^{n_1+1}v_1 = 0$. By linear independence of $N^{n_1}v_1, N^{n_1-1}v_1, \dots, N^{n_1-j}v_1$, we have $c_{n_1-j-1} = c_{n_1-j} = \dots = c_{n_1-1} = 0$. Using (6.2), we also have $c_{n_1} = 0$. Hence, $N^{n_1}v_1, N^{n_1-1}v_1, \dots, N^{n_1-j}v_1, N^{n_1-j-1}v_1$ are linearly independent. The claim follows by induction.

If $v_1, Nv_1, \dots, N^{n_1}v_1$ form a basis, we are done. Otherwise, proceed to Step 2.

Step 2: Choosing $v_2$. Take $v_2$ so that $v_1, Nv_1, \dots, N^{n_1}v_1, v_2$ are linearly independent. Take $n_2$ such that $N^{n_2}v_2 \neq 0$ but $N^{n_2+1}v_2 = 0$. Without loss of generality, assume $n_2 \leq n_1$ (otherwise, relabel). There are now two cases: either
(1) $N^{n_1}v_1$ and $N^{n_2}v_2$ are linearly independent; or
(2) $N^{n_1}v_1 = a N^{n_2}v_2$ for some $a \in \mathbb{C} \setminus \{0\}$.

In case (1), we claim that $v_1, \dots, N^{n_1}v_1, v_2, \dots, N^{n_2}v_2$ are linearly independent. We will prove inductively in $j$ that $v_1, \dots, N^{n_1}v_1, N^{n_2-j}v_2, \dots, N^{n_2}v_2$ are linearly independent. We already know from above that $v_1, \dots, N^{n_1}v_1$ are linearly independent. Suppose that $v_1, \dots, N^{n_1}v_1, N^{n_2-j}v_2, \dots, N^{n_2}v_2$ are linearly independent for some $j$ (with $j = -1$ meaning that none of the $v_2$ terms are present). Let
\[
c_0 v_1 + \dots + c_{n_1}N^{n_1}v_1 + d_{n_2-j-1}N^{n_2-j-1}v_2 + d_{n_2-j}N^{n_2-j}v_2 + \dots + d_{n_2}N^{n_2}v_2 = 0.
\]
Applying $N$ and using the induction hypothesis, we have
\[
c_0 = \dots = c_{n_1-1} = 0, \qquad d_{n_2-j-1} = \dots = d_{n_2-1} = 0.
\]
This implies
\[
c_{n_1}N^{n_1}v_1 + d_{n_2}N^{n_2}v_2 = 0.
\]
Since $N^{n_1}v_1$ and $N^{n_2}v_2$ are linearly independent, $c_{n_1} = d_{n_2} = 0$. Hence, $v_1, \dots, N^{n_1}v_1, N^{n_2-j-1}v_2, \dots, N^{n_2}v_2$ are linearly independent. This completes the induction.

In case (2), we let $v_2' = v_2 - a^{-1}N^{n_1-n_2}v_1$. Notice that $v_2'$ is still linearly independent with $v_1, \dots, N^{n_1}v_1$, and there exists $n_2' < n_2$ such that $N^{n_2'}v_2' \neq 0$ and $N^{n_2'+1}v_2' = 0$. Now we consider the following cases: either
(1') $N^{n_1}v_1$ and $N^{n_2'}v_2'$ are linearly independent; or
(2') $N^{n_1}v_1 = a'N^{n_2'}v_2'$ for some $a' \in \mathbb{C} \setminus \{0\}$.
In case (1'), we then argue as above to show that $v_1, \dots, N^{n_1}v_1, v_2', \dots, N^{n_2'}v_2'$ are linearly independent. In case (2'), we then define $v_2'' = v_2' - (a')^{-1}N^{n_1-n_2'}v_1$ and repeat this process. Since $n_2 > n_2' > \dots \geq 0$, this process must come to an end. At the end, we will obtain a vector (which we again call $v_2$, abusing notation, with the corresponding exponent again called $n_2$) such that $v_1, \dots, N^{n_1}v_1, v_2, \dots, N^{n_2}v_2$ are linearly independent.

Now if $v_1, \dots, N^{n_1}v_1, v_2, \dots, N^{n_2}v_2$ form a basis, we are done. Otherwise, we continue this process by finding $v_3, v_4$, etc., and obtain a basis.

Remark 6.3. Let us note that the proof above also suggests an algorithm for finding a basis
\[
v_1, Nv_1, \dots, N^{n_1}v_1, v_2, Nv_2, \dots, N^{n_2}v_2, \dots, v_k, \dots, N^{n_k}v_k
\]
satisfying (6.1).

Definition 6.4. The following $(m \times m)$ matrix is known as a Jordan block:
\[
\lambda I + J_m := \begin{pmatrix} \lambda & 1 & & & \\ & \lambda & 1 & & \\ & & \ddots & \ddots & \\ & & & \lambda & 1 \\ & & & & \lambda \end{pmatrix}.
\]

Definition 6.5. A matrix is said to be in Jordan canonical form if it is a block diagonal matrix with Jordan blocks.

Corollary 6.6. Any $(n \times n)$ matrix $A$ is similar to a matrix in Jordan canonical form.

Proof. We pick a basis of generalized eigenvectors as follows. First recall (3.1). Since $A - \lambda_i I$ is nilpotent on $\ker((A - \lambda_i I)^{\nu_i})$, by Proposition 6.2 we can find a basis of $\ker((A - \lambda_i I)^{\nu_i})$ which takes the form
\[
v_{i,1}, (A - \lambda_i I)v_{i,1}, \dots, (A - \lambda_i I)^{n_{i,1}}v_{i,1}, \dots, v_{i,k_i}, (A - \lambda_i I)v_{i,k_i}, \dots, (A - \lambda_i I)^{n_{i,k_i}}v_{i,k_i}.
\]
By (3.1), putting all these bases together for all generalized eigenspaces $\ker((A - \lambda_i I)^{\nu_i})$, we obtain a basis of $\mathbb{C}^n$. Finally, defining $S$ by
\[
S = \begin{pmatrix} (A - \lambda_1 I)^{n_{1,1}}v_{1,1} & \cdots & v_{1,1} & \cdots & (A - \lambda_m I)^{n_{m,k_m}}v_{m,k_m} & \cdots & v_{m,k_m} \end{pmatrix},
\]
one checks that $S^{-1}AS$ is in Jordan canonical form.
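In practice, the Jordan canonical form and the change-of-basis matrix can also be computed symbolically (a sketch assuming sympy; exact arithmetic matters here since the Jordan form is not a continuous function of the matrix entries). Up to the ordering and scaling of the columns of $P$, this matches the hand computation in Section 6.2 below.

```python
import sympy as sp

# The matrix from the example in Section 5.
A = sp.Matrix([[0, 1, 1, 0],
               [-1, 0, 0, 1],
               [0, 0, 0, 1],
               [0, 0, -1, 0]])

P, J = A.jordan_form()   # A = P J P^{-1}, with J in Jordan canonical form
sp.pprint(J)             # two 2x2 Jordan blocks, with eigenvalues i and -i
print(sp.simplify(P * J * P.inv() - A) == sp.zeros(4, 4))   # True
```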

6.1. Exponentials of matrices in Jordan canonical form. First, we note that
\[
e^{J_m t} = \begin{pmatrix} 1 & t & \frac{t^2}{2!} & \cdots & \frac{t^{m-1}}{(m-1)!} \\ & 1 & t & \ddots & \vdots \\ & & \ddots & \ddots & \frac{t^2}{2!} \\ & & & 1 & t \\ & & & & 1 \end{pmatrix}.
\]
Take a Jordan block $\lambda I + J_m$. Notice that $(\lambda I)J_m = J_m(\lambda I)$. Hence,
\[
e^{(\lambda I + J_m)t} = e^{\lambda t} e^{J_m t} = \begin{pmatrix} e^{\lambda t} & t e^{\lambda t} & \frac{t^2}{2!}e^{\lambda t} & \cdots & \frac{t^{m-1}}{(m-1)!}e^{\lambda t} \\ & e^{\lambda t} & t e^{\lambda t} & \ddots & \vdots \\ & & \ddots & \ddots & \frac{t^2}{2!}e^{\lambda t} \\ & & & e^{\lambda t} & t e^{\lambda t} \\ & & & & e^{\lambda t} \end{pmatrix}.
\]

6.2. Example with Jordan canonical form. Let us go back to the example in Section 5 and find the Jordan canonical form of $A$. Let us first find a basis of $\mathbb{C}^4$ following the algorithm suggested in the proof of Proposition 6.2. Take
\[
v_1 = \begin{pmatrix} 0 \\ 0 \\ 1 \\ i \end{pmatrix},
\]
an element in the generalized eigenspace in (5.1). We compute
\[
(A - iI)v_1 = (A - iI)\begin{pmatrix} 0 \\ 0 \\ 1 \\ i \end{pmatrix} = \begin{pmatrix} 1 \\ i \\ 0 \\ 0 \end{pmatrix} \neq 0, \qquad (A - iI)^2 v_1 = 0.
\]
Now consider the vector
\[
v_2 = \begin{pmatrix} 0 \\ 0 \\ 1 \\ -i \end{pmatrix}
\]
in the generalized eigenspace in (5.2). We compute
\[
(A + iI)v_2 = (A + iI)\begin{pmatrix} 0 \\ 0 \\ 1 \\ -i \end{pmatrix} = \begin{pmatrix} 1 \\ -i \\ 0 \\ 0 \end{pmatrix} \neq 0, \qquad (A + iI)^2 v_2 = 0.
\]
Therefore, for $S$ defined as
\[
S = \begin{pmatrix} (A - iI)v_1 & v_1 & (A + iI)v_2 & v_2 \end{pmatrix} = \begin{pmatrix} 1 & 0 & 1 & 0 \\ i & 0 & -i & 0 \\ 0 & 1 & 0 & 1 \\ 0 & i & 0 & -i \end{pmatrix},
\]
we have
\[
S^{-1}AS = \begin{pmatrix} i & 1 & 0 & 0 \\ 0 & i & 0 & 0 \\ 0 & 0 & -i & 1 \\ 0 & 0 & 0 & -i \end{pmatrix}.
\]
Let us note that this gives another way to compute $e^{At}$. First, we compute $S^{-1}$:
\[
S^{-1} = \begin{pmatrix} \tfrac{1}{2} & -\tfrac{i}{2} & 0 & 0 \\ 0 & 0 & \tfrac{1}{2} & -\tfrac{i}{2} \\ \tfrac{1}{2} & \tfrac{i}{2} & 0 & 0 \\ 0 & 0 & \tfrac{1}{2} & \tfrac{i}{2} \end{pmatrix}.
\]

Then, we have
\[
e^{At} = S\begin{pmatrix} e^{it} & t e^{it} & 0 & 0 \\ 0 & e^{it} & 0 & 0 \\ 0 & 0 & e^{-it} & t e^{-it} \\ 0 & 0 & 0 & e^{-it} \end{pmatrix}S^{-1} = \begin{pmatrix} \cos t & \sin t & t\cos t & t\sin t \\ -\sin t & \cos t & -t\sin t & t\cos t \\ 0 & 0 & \cos t & \sin t \\ 0 & 0 & -\sin t & \cos t \end{pmatrix},
\]
which agrees with the answer obtained in Section 5.
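As a final check, the whole computation can be reproduced symbolically (a sketch assuming sympy, whose Matrix.exp computes matrix exponentials via the Jordan form; depending on the version, the output may need extra simplification to reach the trigonometric form above):

```python
import sympy as sp

t = sp.symbols('t', real=True)
A = sp.Matrix([[0, 1, 1, 0],
               [-1, 0, 0, 1],
               [0, 0, 0, 1],
               [0, 0, -1, 0]])

expAt = (A*t).exp()
# Rewrite the complex exponentials in terms of cos and sin and simplify entrywise.
expAt = expAt.applyfunc(lambda entry: sp.simplify(entry.rewrite(sp.cos)))
sp.pprint(expAt)   # should reproduce the closed form obtained above
```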