Jordan Normal Form and Singular Value Decomposition

University of Debrecen

Diagonalization and eigenvalues. We have seen that if A is an n × n square matrix, then A is diagonalizable if and only if dim(U_λ) = mult(λ) for every eigenvalue λ of A, i.e. the sum of the geometric multiplicities of the eigenvalues is n. Equivalently, A is diagonalizable if and only if it has n linearly independent eigenvectors.
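As a quick illustration of this criterion (not part of the original slides), here is a minimal Python/SymPy sketch that compares algebraic and geometric multiplicities; the matrix M below is an arbitrary choice, not one from the lecture.

```python
from sympy import Matrix

# Arbitrary illustrative matrix (not from the lecture): eigenvalue 2 has
# algebraic multiplicity 2 but only a 1-dimensional eigenspace.
M = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 3]])

# eigenvects() returns triples (eigenvalue, algebraic multiplicity, eigenspace basis).
for lam, alg_mult, basis in M.eigenvects():
    print(lam, alg_mult, len(basis))   # len(basis) = dim(U_lambda)

# Diagonalizable iff dim(U_lambda) = mult(lambda) for every eigenvalue.
print(M.is_diagonalizable())           # False
```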

Example. Consider the matrix

A = \begin{pmatrix} -2 & -4 & 0 & 0 \\ 1 & 2 & 0 & 0 \\ 2 & 4 & -1 & 0 \\ 3 & 6 & 0 & -1 \end{pmatrix}.

Computing det(A − λI₄), we get λ²(−1 − λ)², thus the eigenvalues of A are λ₁ = −1 and λ₂ = 0, with mult(−1) = 2 and mult(0) = 2.

Example. The eigenvectors corresponding to λ₁ = −1 are v₁ = (0, 0, 1, 0)ᵗ and v₂ = (0, 0, 0, 1)ᵗ. However, the only eigenvector we find for λ₂ = 0 is v₃ = (−2, 1, 0, 0)ᵗ. What happens now?
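For readers who want to verify these numbers, here is a small Python/SymPy check (an illustrative addition, assuming the matrix A as reconstructed above):

```python
from sympy import Matrix, symbols, factor

lam = symbols('lambda')

A = Matrix([[-2, -4,  0,  0],
            [ 1,  2,  0,  0],
            [ 2,  4, -1,  0],
            [ 3,  6,  0, -1]])

# Characteristic polynomial factors as lambda**2 * (lambda + 1)**2.
print(factor(A.charpoly(lam).as_expr()))

# Eigenvalue -1 has two independent eigenvectors, eigenvalue 0 only one.
for ev, mult, basis in A.eigenvects():
    print(ev, mult, [list(v) for v in basis])
```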

Jordan Normal Form. For a given n × n square matrix A, there exists an invertible matrix S such that S⁻¹AS = diag(J₁, ..., Jₚ), where Jᵢ is the square matrix

Jᵢ = \begin{pmatrix} \lambda_i & 1 & & \\ & \lambda_i & 1 & \\ & & \ddots & \ddots \\ & & & \lambda_i \end{pmatrix}

and is called the Jordan block corresponding to the i-th eigenvalue.

Example. So for the matrix A in our previous example, we should have

S⁻¹AS = \begin{pmatrix} -1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{pmatrix}.

Now we just have to find a suitable matrix S.
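SymPy can produce the Jordan form directly; the following sketch (again an illustrative addition, not from the slides) assumes the reconstructed matrix A above. Note that SymPy may order the Jordan blocks differently than the lecture does.

```python
from sympy import Matrix

A = Matrix([[-2, -4,  0,  0],
            [ 1,  2,  0,  0],
            [ 2,  4, -1,  0],
            [ 3,  6,  0, -1]])

# jordan_form() returns S and J with J = S**-1 * A * S;
# the block ordering may differ from the one shown in the lecture.
S, J = A.jordan_form()
print(J)
print(S.inv() * A * S == J)   # True
```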

Generalized eigenvectors. Definition: Suppose that λ is an eigenvalue of the n × n square matrix A, with multiplicity k ≥ 1. Then the vectors v ∈ Rⁿ satisfying (A − λIₙ)ᵏ v = 0 are called generalized eigenvectors of A. An important property: if mult(λ) = k, then there are exactly k linearly independent generalized eigenvectors corresponding to λ. Another important property: if the columns of S are n linearly independent generalized eigenvectors of A (suitably ordered), then S⁻¹AS will be in Jordan form.

How can we find the generalized eigenvectors? From our previous example we see that AS = SJ, where J is the Jordan form of A. If S = (v₁ v₂ v₃ v₄), this means that

A(v₁ v₂ v₃ v₄) = (v₁ v₂ v₃ v₄) \begin{pmatrix} -1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{pmatrix}.

So we have (A + 1·I₄)v₁ = 0, (A + 1·I₄)v₂ = 0, (A − 0·I₄)v₃ = 0 and (A − 0·I₄)v₄ = v₃. So (A − 0·I₄)²v₄ = (A − 0·I₄)v₃ = 0.

How can we find the generalized eigenvectors? So we have to compute (A − 0·I₄)² = A², and solve the linear equation A²v = 0. Doing so yields

v = t₁ (1, 0, 2, 3)ᵗ + t₂ (0, 1, 4, 6)ᵗ,   t₁, t₂ ∈ R.

Now we pick a vector v₄ from the solution set for which (A − 0·I₄)v₄ ≠ 0. Then we can choose v₃ = (A − 0·I₄)v₄.

How can we find the generalized eigenvectors? In this case, pick v₄ = (1, 0, 2, 3)ᵗ, and then v₃ = (A − 0·I₄)v₄ = (−2, 1, 0, 0)ᵗ.
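The same computation in Python/SymPy (illustrative, assuming the reconstructed A): the null space of A² is the generalized eigenspace of 0, and the choice v₄ = (1, 0, 2, 3)ᵗ lies in it while Av₄ ≠ 0.

```python
from sympy import Matrix, zeros

A = Matrix([[-2, -4,  0,  0],
            [ 1,  2,  0,  0],
            [ 2,  4, -1,  0],
            [ 3,  6,  0, -1]])

# Generalized eigenspace of the eigenvalue 0: solutions of (A - 0*I)**2 v = 0.
print([list(v) for v in (A**2).nullspace()])   # a 2-dimensional solution set

# The pick from the lecture: v4 = (1, 0, 2, 3)^t, then v3 = A*v4.
v4 = Matrix([1, 0, 2, 3])
assert (A**2) * v4 == zeros(4, 1) and A * v4 != zeros(4, 1)
v3 = A * v4
print(list(v3))                                # [-2, 1, 0, 0]
```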

In general: Chains. Suppose that λ is an eigenvalue of A of multiplicity k ≥ 2, and suppose that we have found a generalized eigenvector vₖ of λ (or, in other words, solved the equation (A − λIₙ)ᵏ v = 0). Then we have the following Jordan chain:

vₖ₋₁ = (A − λIₙ)vₖ,  ...,  vᵢ₋₁ = (A − λIₙ)vᵢ,  ...,  v₁ = (A − λIₙ)v₂.

The vectors v₁, v₂, ..., vₖ are generalized eigenvectors corresponding to λ, and they generate the generalized eigenspace corresponding to λ.
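A generic sketch of this chain construction in Python/SymPy (an illustrative addition; the helper jordan_chain is hypothetical and assumes, for simplicity, a single Jordan block of size k for the chosen eigenvalue):

```python
from sympy import Matrix, eye, zeros

def jordan_chain(A, lam, k):
    """Build a Jordan chain v1, ..., vk for the eigenvalue lam, assuming a
    single Jordan block of size k (a simplifying assumption for illustration)."""
    n = A.shape[0]
    N = A - lam * eye(n)
    # Pick v_k with N**k v_k = 0 but N**(k-1) v_k != 0.
    vk = next(v for v in (N**k).nullspace() if (N**(k - 1)) * v != zeros(n, 1))
    chain = [vk]
    for _ in range(k - 1):
        chain.append(N * chain[-1])       # v_{i-1} = (A - lam*I) v_i
    return list(reversed(chain))          # v1, ..., vk

A = Matrix([[-2, -4,  0,  0],
            [ 1,  2,  0,  0],
            [ 2,  4, -1,  0],
            [ 3,  6,  0, -1]])
print([list(v) for v in jordan_chain(A, 0, 2)])
```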

Summing it up. For our example A, we can choose

S = \begin{pmatrix} 0 & 0 & -2 & 1 \\ 0 & 0 & 1 & 0 \\ 1 & 0 & 0 & 2 \\ 0 & 1 & 0 & 3 \end{pmatrix},  so  S⁻¹ = \begin{pmatrix} -2 & -4 & 1 & 0 \\ -3 & -6 & 0 & 1 \\ 0 & 1 & 0 & 0 \\ 1 & 2 & 0 & 0 \end{pmatrix}.

And this way

S⁻¹AS = \begin{pmatrix} -1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{pmatrix}.
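A final symbolic check of this last step (illustrative, assuming the reconstructed A and S above):

```python
from sympy import Matrix

A = Matrix([[-2, -4,  0,  0],
            [ 1,  2,  0,  0],
            [ 2,  4, -1,  0],
            [ 3,  6,  0, -1]])

# Columns of S: the two eigenvectors for -1, then the Jordan chain v3, v4 for 0.
S = Matrix([[0, 0, -2, 1],
            [0, 0,  1, 0],
            [1, 0,  0, 2],
            [0, 1,  0, 3]])

print(S.inv() * A * S)   # diag(-1, -1) followed by the 2x2 Jordan block for 0
```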

Schur Decomposition. Since the Jordan form is numerically unstable, in computations the Schur decomposition is usually used instead. Schur Decomposition: If A is an n × n square matrix, then A can be expressed as A = QUQ⁻¹, where Q is an orthogonal matrix (i.e. QᵗQ = Iₙ) and U is an upper triangular matrix (if A has non-real eigenvalues, U is only block upper triangular, the real Schur form). The Schur decomposition can be calculated, for example, with the QR algorithm.
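In practice one calls a library routine, for example SciPy's schur (an illustrative sketch, not from the slides; it reuses the reconstructed A, whose eigenvalues are all real, so the real Schur factor is genuinely upper triangular):

```python
import numpy as np
from scipy.linalg import schur

A = np.array([[-2., -4.,  0.,  0.],
              [ 1.,  2.,  0.,  0.],
              [ 2.,  4., -1.,  0.],
              [ 3.,  6.,  0., -1.]])

# Real Schur form: A = Q U Q^T with Q orthogonal, U (quasi-)upper triangular.
U, Q = schur(A, output='real')
print(np.allclose(Q @ U @ Q.T, A))       # True
print(np.allclose(Q.T @ Q, np.eye(4)))   # True: Q is orthogonal
print(np.round(np.diag(U), 6))           # the eigenvalues of A appear on the diagonal
```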

Singular Value Decomposition. If A is an m × n real matrix, then the singular value decomposition of A is the product A = USVᵗ, where U is an m × m orthogonal matrix, S is an m × n rectangular diagonal matrix with non-negative entries, and Vᵗ is an n × n orthogonal matrix. Nomenclature: the diagonal entries of S are called the singular values of A; the columns of U are called the left-singular vectors of A; the columns of V are called the right-singular vectors of A.
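A minimal NumPy illustration (the small rectangular matrix here is an arbitrary example, not from the lecture):

```python
import numpy as np

A = np.array([[1., 0., 2.],
              [0., 3., 0.]])

U, s, Vt = np.linalg.svd(A)           # U: 2x2, s: singular values, Vt: 3x3
S = np.zeros_like(A)
S[:len(s), :len(s)] = np.diag(s)      # embed the singular values in an m x n matrix

print(np.allclose(U @ S @ Vt, A))     # True
print(np.allclose(U.T @ U, np.eye(2)), np.allclose(Vt @ Vt.T, np.eye(3)))  # True True
```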

How to compute the SVD? The computation of the singular value decomposition is done through eigenvalue and eigenvector calculations. Namely: the non-zero singular values of A (i.e. the non-zero diagonal entries of S) are the square roots of the non-zero eigenvalues of both AᵗA and AAᵗ; the left-singular vectors of A (i.e. the columns of U) are eigenvectors of AAᵗ; the right-singular vectors of A (i.e. the columns of V) are eigenvectors of AᵗA.
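This connection can be checked numerically (an illustrative sketch with the same arbitrary 2 × 3 matrix as above; the clipping only guards against tiny negative round-off in the zero eigenvalue):

```python
import numpy as np

A = np.array([[1., 0., 2.],
              [0., 3., 0.]])

# Eigenvalues of the symmetric matrices A^T A (n x n) and A A^T (m x m).
evals_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
evals_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]

# Their square roots give the singular values (the extra eigenvalue of A^T A is 0).
print(np.sqrt(np.clip(evals_AtA, 0, None)))   # [3., 2.236..., 0.]
print(np.sqrt(np.clip(evals_AAt, 0, None)))   # [3., 2.236...]
print(np.linalg.svd(A, compute_uv=False))     # [3., 2.236...]
```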