Jordan normal form notes (version date: 11/21/07)

If $A$ has an eigenbasis $\{u_1, \dots, u_n\}$, i.e. a basis made up of eigenvectors, so that $A u_j = \lambda_j u_j$, then $A$ is diagonal with respect to that basis. To see this, let $U = (u_1 \cdots u_n)$, i.e. let $U$ denote the matrix with $j$-th column $u_j$, $j = 1, \dots, n$. Then
\[
A U = (A u_1 \cdots A u_n) = (\lambda_1 u_1 \cdots \lambda_n u_n) = U \, \mathrm{diag}(\lambda_1, \dots, \lambda_n)
\]
implies that $U^{-1} A U = \mathrm{diag}(\lambda_1, \dots, \lambda_n)$.

What happens if $A$ doesn't have an eigenbasis?

Important example: Let $e_j$ denote the $j$-th Euclidean basis vector, with $1$ in the $j$-th position and $0$ elsewhere. For example,
\[
e_1 = (1, 0, \dots, 0)^T, \quad e_2 = (0, 1, 0, \dots, 0)^T, \quad \text{and} \quad e_n = (0, \dots, 0, 1)^T.
\]
Define
\[
N_n := (0 \;\, e_1 \;\, \cdots \;\, e_{n-1}) =
\begin{pmatrix}
0 & 1 & & \\
& \ddots & \ddots & \\
& & \ddots & 1 \\
& & & 0
\end{pmatrix}.
\]
The characteristic polynomial of $N_n$ is $\chi_{N_n}(\lambda) = \lambda^n$; hence zero is the only eigenvalue of $N_n$.
\[
N_n \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}
= x_2 e_1 + \cdots + x_n e_{n-1}
= \begin{pmatrix} x_2 \\ \vdots \\ x_n \\ 0 \end{pmatrix}
\]
implies that $\ker N_n = \mathrm{span}\{e_1\}$. For any $n \times n$ matrix $B = (b_1 \cdots b_n)$,
\[
B N_n = (B\,0 \;\, B e_1 \;\, \cdots \;\, B e_{n-1}) = (0 \;\, b_1 \;\, \cdots \;\, b_{n-1});
\]
the columns of $B$ are shifted right, with a zero column being introduced on the left. In particular,
\[
N_n^2 = (0 \;\, 0 \;\, e_1 \;\, \cdots \;\, e_{n-2}), \quad
N_n^3 = (0 \;\, 0 \;\, 0 \;\, e_1 \;\, \cdots \;\, e_{n-3}), \quad \dots, \quad
N_n^n = O.
\]
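These shift relations are easy to check numerically. The following is a minimal sketch (not part of the original notes) assuming numpy is available; the size $n = 4$ is an arbitrary illustrative choice.

```python
import numpy as np

n = 4
# N_n: ones on the first superdiagonal, zeros elsewhere
N = np.diag(np.ones(n - 1), k=1)

# Each power shifts the band of ones one superdiagonal further out;
# N^n is the zero matrix, confirming that N_n is nilpotent.
for k in range(1, n + 1):
    print(f"N^{k} =\n{np.linalg.matrix_power(N, k)}")
```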

Hence $\ker N_n^j = \mathrm{span}\{e_1, \dots, e_j\}$. If we let $S_{n,k}$ denote the $n \times n$ matrix with $1$ on the $k$-th super-diagonal and $0$ elsewhere, i.e. with $ij$-th entry equal to $1$ if $j = i + k$, $0$ otherwise, then $N_n^k = S_{n,k}$. The exponential of $N_n$ can be computed using the power series for $\exp$, since $S_{n,j} = O$ if $j \geq n$ implies that
\[
\exp(t N_n) = \sum_{j=0}^{\infty} \frac{t^j}{j!} N_n^j = \sum_{j=0}^{n-1} \frac{t^j}{j!} S_{n,j}. \tag{1}
\]
We can use this example as a guide in handling arbitrary repeated eigenvalues with insufficient eigenspace, i.e. eigenvalues $\lambda$ whose algebraic multiplicity (the power to which $(x - \lambda)$ appears in the characteristic polynomial) is greater than the geometric multiplicity (the dimension of the eigenspace).

Define $s : \mathbb{R}^{n \times n} \times \mathbb{R} \to \mathbb{R}^{n \times n}$ by $s(A, \lambda) := A - \lambda I$. If $\lambda$ is an eigenvalue of $A$, then the eigenspace of $\lambda$ is the kernel of $s(A, \lambda)$:
\[
\ker s(A, \lambda) = \{x \in \mathbb{R}^n : s(A, \lambda) x = 0\} = \{x \in \mathbb{R}^n : A x = \lambda x\}.
\]
If $\lambda$ is an eigenvalue of $A$ with (algebraic) multiplicity $\ell$, then the generalized eigenspace of $\lambda$ is the kernel of $s(A, \lambda)^\ell$.

Example:
\[
A = \begin{pmatrix} 2 & 1 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3 \end{pmatrix}.
\]
The characteristic polynomial of $A$ is $\chi_A(\lambda) = (\lambda - 2)^2 (\lambda - 3)$. $s(A, 2) = (0 \;\, e_1 \;\, e_3)$, with $\ker s(A, 2) = \mathrm{span}\{e_1\}$. $s(A, 2)^2 = (0 \;\, 0 \;\, e_3)$, with $\ker s(A, 2)^2 = \mathrm{span}\{e_1, e_2\}$.

Consider the case in which $\lambda$ has algebraic multiplicity $k$, but geometric multiplicity $1$ (the eigenspace of $\lambda$ is one dimensional). Let $u_1$ be an eigenvector of $A$ with eigenvalue $\lambda$. If there are $u_2, \dots, u_k$ such that
\[
s(A, \lambda)\, u_{j+1} = u_j \quad (\text{equivalently } A u_{j+1} = u_j + \lambda u_{j+1}), \qquad j = 1, \dots, k - 1,
\]
then
\[
A (u_1 \; u_2 \; \cdots \; u_k) = (A u_1 \;\, A u_2 \;\, \cdots \;\, A u_k)
= (\lambda u_1 \;\;\, u_1 + \lambda u_2 \;\;\, \cdots \;\;\, u_{k-1} + \lambda u_k)
= \lambda (u_1 \; \cdots \; u_k) + (0 \;\, u_1 \;\, \cdots \;\, u_{k-1})
= (u_1 \; \cdots \; u_k)(\lambda I + N_k).
\]
(Here $I$ is the $k \times k$ identity matrix.)
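The eigenspace and generalized eigenspace in the example above can be checked symbolically. A minimal sketch, assuming sympy is available:

```python
import sympy as sp

A = sp.Matrix([[2, 1, 0], [0, 2, 0], [0, 0, 3]])
s = A - 2 * sp.eye(3)          # s(A, 2)

print(s.nullspace())           # eigenspace of 2: span{e_1}
print((s**2).nullspace())      # generalized eigenspace: span{e_1, e_2}
```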

Example:
\[
A = \begin{pmatrix} -1 & 1 & 1 \\ -2 & 2 & 1 \\ -1 & 1 & 1 \end{pmatrix}.
\]
The characteristic polynomial of $A$ is $\chi_A(\lambda) = \lambda(\lambda - 1)^2$, so $0$ is an eigenvalue with multiplicity one and $1$ is an eigenvalue with algebraic multiplicity $2$. Summing all three columns of
\[
s(A, 1) = \begin{pmatrix} -2 & 1 & 1 \\ -2 & 1 & 1 \\ -1 & 1 & 0 \end{pmatrix}
\]
gives $0$, so $u_1 = (1, 1, 1)^T$ is an eigenvector of $A$ with eigenvalue $1$. One can verify that the eigenspace of $1$ is one dimensional, so we need to determine the generalized eigenspace. The second column of $s(A, 1)$ equals $u_1$, so $s(A, 1)\, e_2 = u_1$; hence we can take $u_2 = e_2$. The generalized eigenspace equals $\mathrm{span}\{u_1, u_2\}$. Summing the first two columns of $A$ gives $0$, so $u_3 = (1, 1, 0)^T$ is an eigenvector of $A$ with eigenvalue $0$.

Note: This basis is a permutation of the one I used in lecture; here I've put the double eigenvalue first and the single one last. The intermediate matrices are different, but the final answer is, as it has to be, the same. The choice of basis, and hence of $U$, is not unique!

If we set
\[
U = (u_1 \; u_2 \; u_3) = \begin{pmatrix} 1 & 0 & 1 \\ 1 & 1 & 1 \\ 1 & 0 & 0 \end{pmatrix},
\quad \text{then} \quad
U^{-1} A U = \begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}.
\]
$U^{-1} A U$ is almost diagonal, and almost as easy to exponentiate as a diagonal matrix. One way of computing $\exp(t A)$ is the following: $U^{-1} A U = C + D$, where
\[
C = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}
\quad \text{and} \quad
D = \mathrm{diag}(1, 1, 0).
\]
$C D = D C$ and $C^2 = O$ imply that
\[
\exp(t\, U^{-1} A U) = \exp(t (C + D)) = \exp(t C) \exp(t D) = (I + t C)\, \mathrm{diag}(e^t, e^t, 1)
= \begin{pmatrix} e^t & t e^t & 0 \\ 0 & e^t & 0 \\ 0 & 0 & 1 \end{pmatrix}.
\]
Hence
\[
\exp(t A) = U \exp(t\, U^{-1} A U)\, U^{-1}
= \begin{pmatrix}
1 - t e^t & t e^t & e^t - 1 \\
1 - (1 + t) e^t & (1 + t) e^t & e^t - 1 \\
-t e^t & t e^t & e^t
\end{pmatrix}.
\]
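Both the change of basis and the final exponential can be verified symbolically. A minimal sketch, again assuming sympy:

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[-1, 1, 1], [-2, 2, 1], [-1, 1, 1]])
U = sp.Matrix([[1, 0, 1], [1, 1, 1], [1, 0, 0]])

# U^{-1} A U is the almost-diagonal matrix C + D computed above
print(U.inv() * A * U)                    # [[1, 1, 0], [0, 1, 0], [0, 0, 0]]

# exp(tA) computed directly agrees with U exp(t(C + D)) U^{-1}
expCD = sp.Matrix([[sp.exp(t), t * sp.exp(t), 0],
                   [0, sp.exp(t), 0],
                   [0, 0, 1]])
print(sp.simplify((t * A).exp() - U * expCD * U.inv()))   # zero matrix
```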

A block diagonal matrix $B$ consists of a collection of smaller square matrices straddling the diagonal, with zeroes elsewhere:
\[
B = \mathrm{block}(B_1, \dots, B_k) =
\begin{pmatrix}
B_1 & O & \cdots & O \\
O & B_2 & \ddots & \vdots \\
\vdots & \ddots & \ddots & O \\
O & \cdots & O & B_k
\end{pmatrix},
\]
where $B_j$ is a $d_j \times d_j$ matrix, $j = 1, \dots, k$. $\mathrm{block}(B_1, \dots, B_k)^j = \mathrm{block}(B_1^j, \dots, B_k^j)$; hence
\[
\exp(t\, \mathrm{block}(B_1, \dots, B_k)) = \mathrm{block}(\exp(t B_1), \dots, \exp(t B_k)).
\]
Exponentiating several little matrices is generally easier than exponentiating one big one; for example, $\exp(t\, \mathrm{diag}(\lambda_1, \dots, \lambda_n)) = \mathrm{diag}(e^{\lambda_1 t}, \dots, e^{\lambda_n t})$ is a very easy calculation. Hence the following construction, called the Jordan normal form, is very convenient. It guarantees that every matrix can be block diagonalized by a change of basis, with the blocks being matrices whose exponentials are explicitly known.

A Jordan block is a complex $k \times k$ matrix of the form
\[
B = \lambda I + N_k =
\begin{pmatrix}
\lambda & 1 & & O \\
& \ddots & \ddots & \\
& & \ddots & 1 \\
O & & & \lambda
\end{pmatrix},
\]
i.e. $b_{jj} = \lambda$, $j = 1, \dots, k$, $b_{j,j+1} = 1$, $j = 1, \dots, k - 1$, and all other entries are $0$. A Jordan block is easily exponentiated: Since $\lambda I$ commutes with everything,
\[
\exp(t (\lambda I + N_k)) = \exp(t \lambda I) \exp(t N_k) = e^{\lambda t} \sum_{j=0}^{k-1} \frac{t^j}{j!} S_{k,j}. \tag{2}
\]
(Recall that $S_{k,j}$ has $il$-th entry equal to $1$ if $l = i + j$, $0$ otherwise; see (1).)

Jordan normal form theorem: If $A$ is an $n \times n$ complex matrix with eigenvalues $\lambda_1, \dots, \lambda_l$ (possibly repeated), then there is an invertible matrix $U$ such that $U^{-1} A U = \mathrm{block}(B_1, \dots, B_r)$ for some Jordan blocks $B_1, \dots, B_r$. The blocks are uniquely determined, but the ordering depends on the choice of $U$.

Using the Jordan normal form, we can solve any linear homogeneous initial value problem $\dot{x} = A x$, $x(0) = x_0$, as follows: We know that the solution has the form $x(t) = \exp(t A) x_0$, so it suffices to compute $\exp(t A)$. Let $U$ be a matrix that takes $A$ to Jordan normal form, with blocks $B_1, \dots, B_r$. (The columns of $U$ are (generalized) eigenvectors of $A$.) Then
\[
\exp(t A) = U \exp(t\, U^{-1} A U)\, U^{-1} = U \exp(t\, \mathrm{block}(B_1, \dots, B_r))\, U^{-1}
= U\, \mathrm{block}(\exp(t B_1), \dots, \exp(t B_r))\, U^{-1}.
\]
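sympy can produce the Jordan normal form directly, which gives another route to $\exp(t A)$ for the running example. A minimal sketch; `jordan_form` returns $U$ and $J$ with $A = U J U^{-1}$:

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[-1, 1, 1], [-2, 2, 1], [-1, 1, 1]])

U, J = A.jordan_form()      # A = U J U^{-1}, J block diagonal
print(J)                    # a 1x1 block for eigenvalue 0, a 2x2 block for 1

# exp(tA) = U exp(tJ) U^{-1}, exactly as in the theorem
print(sp.simplify(U * (t * J).exp() * U.inv() - (t * A).exp()))  # zero matrix
```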

The exponential of each of the Jordan blocks is given by (2).

Note: If we set $y(t) = U^{-1} x(t)$, then
\[
y(t) = U^{-1} \exp(t A)\, x_0
= U^{-1} \left( U\, \mathrm{block}(\exp(t B_1), \dots, \exp(t B_r))\, U^{-1} \right) x_0
= \mathrm{block}(\exp(t B_1), \dots, \exp(t B_r))\, y(0),
\]
so $y(t)$ satisfies $\dot{y} = \mathrm{block}(B_1, \dots, B_r)\, y$.
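As a final numerical check (not in the original notes), one can integrate the initial value problem directly and compare with $\exp(t A)\, x_0$. A sketch assuming scipy is available; the initial condition is an arbitrary illustrative choice:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

A = np.array([[-1.0, 1.0, 1.0],
              [-2.0, 2.0, 1.0],
              [-1.0, 1.0, 1.0]])
x0 = np.array([1.0, 2.0, 0.0])   # illustrative initial condition

# Integrate xdot = A x numerically and compare with x(1) = exp(A) x0
sol = solve_ivp(lambda t, x: A @ x, (0.0, 1.0), x0, rtol=1e-10, atol=1e-12)
print(np.allclose(sol.y[:, -1], expm(A) @ x0, atol=1e-6))   # True
```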