The Jordan Normal Form and its Applications


The Jordan Normal Form and its Applications. Jeremy, IMPACT, Brigham Young University.

A square matrix A is a linear operator on R^n or C^n. A is diagonalizable if and only if it has n linearly independent eigenvectors. What happens if A does not have n linearly independent eigenvectors? When does this happen? What general form can we obtain in this case?

The Jordan Normal Form. The Jordan Normal Form is one decomposition of a matrix, A = P^(-1) J P, where J is the normal form. It has the advantage of corresponding to the eigenspaces and of being as close to diagonal as possible. More specifically, if a matrix is diagonalizable, then its Jordan Normal Form is its diagonalization.

Complementary Subspaces. Definition. Two subspaces U, W of a vector space V are complementary if U ∩ W = {0} and for all v ∈ V there exist u ∈ U, w ∈ W such that v = u + w. In fact, u and w are the unique vectors that satisfy this property. We denote this V = U ⊕ W.

Complementary Subspaces. Remark. This idea extends to finite collections: V = W_1 ⊕ W_2 ⊕ ... ⊕ W_m. Remark. If U, W are subspaces of V with dim U + dim W = dim V and U ∩ W = {0}, then it can be shown that U ⊕ W = V.

The Index of a Matrix. Recall that N(A) ⊆ N(A^2) ⊆ N(A^3) ⊆ ... and R(A) ⊇ R(A^2) ⊇ R(A^3) ⊇ ...

The Index of a Matrix. Definition. The index of a matrix A is the smallest nonnegative integer k = Ind(A) such that N(A^k) = N(A^(k+1)) = ... and R(A^k) = R(A^(k+1)) = ..., where A^0 = I. Note that Ind(A) = 0 if A is invertible.
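The index can be computed numerically from ranks of successive powers, since N(A^k) = N(A^(k+1)) exactly when rank(A^k) = rank(A^(k+1)) by rank-nullity. Here is a minimal Python sketch (the helper names are my own, not from the slides), using exact rational arithmetic so ranks are computed without floating-point error:

```python
from fractions import Fraction

def mat_mul(A, B):
    # product of two matrices given as lists of rows
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def rank(M):
    # Gaussian elimination over the rationals (exact arithmetic)
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def index(A):
    # smallest k >= 0 with rank(A^k) == rank(A^(k+1)),
    # i.e. N(A^k) = N(A^(k+1)) by rank-nullity
    n = len(A)
    Ak = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]  # A^0 = I
    k = 0
    while True:
        Anext = mat_mul(Ak, A)
        if rank(Ak) == rank(Anext):
            return k
        Ak, k = Anext, k + 1

N = [[0, 1, 0], [0, 0, 1], [0, 0, 0]]   # nilpotent: N^3 = 0, N^2 != 0
print(index(N))                          # -> 3
print(index([[2, 0], [0, 2]]))           # invertible, so index 0
```

Once rank(A^k) = rank(A^(k+1)), A restricted to R(A^k) is a bijection onto R(A^(k+1)) = R(A^k), so all later ranks agree as well; this is why the loop may stop at the first repeat.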

The Index of a Matrix. Theorem. Let A be an n × n matrix and let k = Ind(A). Then V = N(A^k) ⊕ R(A^k).

The Index of a Matrix. Proof. Suppose x ∈ N(A^k) ∩ R(A^k). Then A^k x = 0 and there exists y such that x = A^k y. Therefore A^k A^k y = A^(2k) y = 0, so y ∈ N(A^(2k)). But N(A^(2k)) = N(A^k), so x = A^k y = 0. The rank-nullity theorem implies dim N(A^k) + dim R(A^k) = n = dim(V), so V = N(A^k) ⊕ R(A^k).

Invariant Subspaces. Definition. A subspace W ⊆ V is said to be invariant (with respect to a matrix A) if AW ⊆ W.

Invariant Subspaces. Example. Notice that for any matrix A, the range R(A) is invariant, since for x ∈ R(A), Ax ∈ R(A) by definition. It follows that R(A^k) is invariant for any k. Also, N(A) is invariant, since x ∈ N(A) implies Ax = 0 ∈ N(A); likewise N(A^k). Another example is an eigenspace N(A − λI), because any x ∈ N(A − λI) satisfies Ax = λx ∈ N(A − λI).

Decomposing a Matrix. If V = U ⊕ W and U and W are A-invariant subspaces, then there exists an invertible matrix P such that

A = P^(-1) [ A_U   0  ] P
           [  0   A_W ]

In fact, P = [p_1, ..., p_r, p_(r+1), ..., p_n], where {p_1, ..., p_r} is a basis for U and {p_(r+1), ..., p_n} is a basis for W. Furthermore, A_U = A|_U is the restriction of A to the subspace U.

Matrix Diagonalization. When we diagonalize A, we are simply using complementary invariant subspaces. These are the eigenspaces: V = N(A − λ_1 I) ⊕ N(A − λ_2 I) ⊕ ... ⊕ N(A − λ_r I). The matrix that diagonalizes A is the P containing bases for the eigenspaces (its columns are eigenvectors), and the blocks A_(λ_i) are diagonal because on the space N(A − λ_i I) the action of A is simply that of λ_i I.
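As a small sanity check of this viewpoint (my own example, not from the slides), here is a 2 × 2 matrix whose two eigenspaces are complementary invariant subspaces:

```python
def matvec(A, x):
    # multiply matrix A (list of rows) by vector x
    return [sum(a * b for a, b in zip(row, x)) for row in A]

A = [[2, 0], [1, 3]]        # eigenvalues 2 and 3
v1 = [1, -1]                # spans N(A - 2I)
v2 = [0, 1]                 # spans N(A - 3I)

# A acts as 2I on N(A - 2I) and as 3I on N(A - 3I):
print(matvec(A, v1))        # -> [2, -2] = 2 * v1
print(matvec(A, v2))        # -> [0, 3]  = 3 * v2
# v1, v2 are linearly independent, so R^2 = N(A - 2I) ⊕ N(A - 3I)
```

With P = [v1 v2] as columns, the representation of A in this basis is the diagonal matrix diag(2, 3).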

Matrix Diagonalization. How do we know that V = N(A − λ_1 I) ⊕ ... ⊕ N(A − λ_r I)?

Matrix Diagonalization. Example. Consider the matrix

A = [ 2 1 ]
    [ 0 2 ]

Since A is upper triangular, its only eigenvalue is 2. What are the eigenvectors?

Matrix Diagonlization Clearly V N(A 2I ) because the dimensions do not match. This matrix cannot be diagonalized because it doesn t have a full set of linearly independent eigenvectors.

Generalized Eigenspaces Notice that we had repeated eigenvalues. Remember that if we have n distinct eigenvalues we know there are n linearly independent eigenvectors. This problem only occurs when we have repeated eigenvalues.

Generalized Eigenspaces. What if we could make N(A − 2I) bigger so that it covered all of V? Let's try N(A − 2I)^2, for example. It is easy to show that V = N(A − 2I)^2. Notice that

( [ 2 1 ]  -  [ 2 0 ] ) ( 0 )  =  ( 1 )
  [ 0 2 ]     [ 0 2 ]   ( 1 )     ( 0 )

So we found a vector not in N(A − 2I) such that (A − 2I)x ∈ N(A − 2I). This is called a generalized eigenvector of second order.
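This computation can be checked in a few lines of Python (a sketch with hand-rolled helpers; the names are assumptions, not from the slides):

```python
def matvec(A, x):
    # multiply matrix A (list of rows) by vector x
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def matmul(A, B):
    # product of two matrices given as lists of rows
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 1], [0, 2]]
B = [[0, 1], [0, 0]]          # B = A - 2I
x2 = [0, 1]                   # candidate generalized eigenvector
x1 = matvec(B, x2)            # (A - 2I)x2
print(x1)                     # -> [1, 0], an ordinary eigenvector
print(matmul(B, B))           # -> [[0, 0], [0, 0]]: (A - 2I)^2 = 0
```

Since (A − 2I)^2 = 0, its null space is all of R^2, confirming that x2 is a generalized eigenvector of second order even though it is not an eigenvector.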

Generalized Eigenspaces. This can be repeated. In fact, we can show that if λ_1, ..., λ_r are the distinct eigenvalues of A and k_i = Ind(A − λ_i I), then V = N(A − λ_1 I)^(k_1) ⊕ ... ⊕ N(A − λ_r I)^(k_r). N(A − λ_i I)^(k_i) is called the generalized eigenspace of A corresponding to λ_i.

Diagonalization Revisited. If we can't diagonalize a matrix, how do we choose a basis that gets us close? Remember that if x is a generalized eigenvector of order k, then (A − λI)x is a generalized eigenvector of order k − 1. Repeating, we may obtain a sequence x_1, x_2, ..., x_k such that

0 = (A − λI)x_1
x_1 = (A − λI)x_2
...
x_(k-1) = (A − λI)x_k
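The chain construction above can be sketched directly: starting from a generalized eigenvector x of order k, repeatedly apply A − λI and then reverse the list so x_1 comes first (the helper names are my own, not from the slides):

```python
def matvec(A, x):
    # multiply matrix A (list of rows) by vector x
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def chain(A, lam, x, k):
    # from a generalized eigenvector x of order k, build the
    # Jordan chain x_1, ..., x_k with x_(j-1) = (A - lam*I) x_j
    n = len(A)
    B = [[A[i][j] - lam * (i == j) for j in range(n)] for i in range(n)]
    vecs = [x]
    for _ in range(k - 1):
        vecs.append(matvec(B, vecs[-1]))
    return vecs[::-1]   # x_1 first

A = [[2, 1, 0], [0, 2, 1], [0, 0, 2]]   # single 3x3 Jordan block
print(chain(A, 2, [0, 0, 1], 3))        # -> [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```

Here e_3 has order 3, and the chain it generates is exactly the standard basis, which is why this A is already in Jordan form.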

Diagonalization Revisited. What is the action of A on the space spanned by {x_1, ..., x_k}? Well, we know what A − λI looks like relative to this basis:

A − λI = [ 0 1 0 ... 0 ]
         [ 0 0 1 ... 0 ]
         [ ...         ]
         [ 0 0 0 ... 1 ]
         [ 0 0 0 ... 0 ]

Diagonalization Revisited. So then A, relative to this basis, must be

A = [ λ 1 0 ... 0 ]
    [ 0 λ 1 ... 0 ]
    [ ...         ]
    [ 0 0 0 ... 1 ]
    [ 0 0 0 ... λ ]

Diagonalization Revisited. Of course, {x_1, ..., x_k} may not span all of N(A − λI)^k. So, to get a basis for N(A − λI)^k, we follow this same idea. Take a basis {x_1, ..., x_(d_1)} for N(A − λI). Extend this to a basis {x_1, ..., x_(d_1), x_(d_1+1), ..., x_(d_2)} for N(A − λI)^2, chosen so that (A − λI){x_(d_1+1), ..., x_(d_2)} = {x_1, ..., x_(d_2-d_1)}, and so on. Then the portion of P corresponding to N(A − λI) is [x_1, x_(d_1+1), x_(d_2+1), ..., x_2, x_(d_1+2), ..., x_(d_1)], i.e., the vectors grouped chain by chain.

If we choose our basis this way, we can decompose A into the following form:

A = [ J(λ_1)   0     ...   0    ]
    [   0    J(λ_2)  ...   0    ]
    [ ...                       ]
    [   0      0     ... J(λ_r) ]

The block J(λ_i) is called a Jordan segment for λ_i.

A Jordan segment is a matrix of the form

J(λ_i) = [ J_1(λ_i)    0      ...     0      ]
         [    0     J_2(λ_i)  ...     0      ]
         [ ...                               ]
         [    0        0      ... J_(r_i)(λ_i) ]

Each J_l(λ_i) is called a Jordan block for λ_i.

A Jordan block for λ_i is a matrix

J_l(λ_i) = [ λ_i  1   0  ...  0  ]
           [  0  λ_i  1  ...  0  ]
           [ ...                 ]
           [  0   0   0  ...  1  ]
           [  0   0   0  ... λ_i ]
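Putting the pieces together, here is a small check on a hypothetical 3 × 3 example (my own, not from the slides) with the defective eigenvalue 2: the chain relations A v_1 = 2 v_1 and A v_2 = v_1 + 2 v_2 are exactly what a 2 × 2 Jordan block for λ = 2 encodes, and v_3 carries a separate 1 × 1 block.

```python
def matvec(A, x):
    # multiply matrix A (list of rows) by vector x
    return [sum(a * b for a, b in zip(row, x)) for row in A]

A = [[3, 1, 0], [-1, 1, 0], [0, 0, 2]]   # eigenvalue 2 with algebraic mult. 3
v1 = [1, -1, 0]   # eigenvector:             (A - 2I)v1 = 0
v2 = [1, 0, 0]    # generalized eigenvector: (A - 2I)v2 = v1
v3 = [0, 0, 1]    # eigenvector for the separate 1x1 block

# Chain relations encoded by J = [[2,1,0],[0,2,0],[0,0,2]]:
assert matvec(A, v1) == [2 * c for c in v1]                  # A v1 = 2 v1
assert matvec(A, v2) == [a + 2 * b for a, b in zip(v1, v2)]  # A v2 = v1 + 2 v2
assert matvec(A, v3) == [2 * c for c in v3]                  # A v3 = 2 v3
print("chain relations hold")
```

Relative to the basis {v_1, v_2, v_3}, A is therefore represented by a Jordan segment for λ = 2 consisting of one 2 × 2 block and one 1 × 1 block.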