Lecture 19: Polar and singular value decompositions; generalized eigenspaces; the decomposition theorem (1)


Travis Schedler
Thurs, Nov 17, 2011 (version: Thurs, Nov 17, 1:00 PM)

Goals (2)

- Polar decomposition and singular value decomposition.
- Generalized eigenspaces and the decomposition theorem.
- Read Chapter 7, begin Chapter 8, and do PS 9.

Warm-up exercise (3)

(a) Let T be an invertible operator on a f.d. i.p.s. and set P := √(T^*T) and S := T P^{-1}. Show that S is an isometry. Recall that P is positive, so T = SP is a polar decomposition (i.e., S is an isometry and P is positive).

(b) Now suppose T = 0. Show that the polar decompositions T = SP are exactly T = S·0 for every isometry S, i.e., we always have P = 0 but S can be anything.

One-dimensional analogue: either z ∈ C is invertible, in which case z = (z/|z|)·|z| = sp, or else z is zero, in which case z = s·0 for any s of absolute value one.

Solution to warm-up exercise (4)

(a) S^*S = (T P^{-1})^* T P^{-1} = (P^{-1})^* T^* T P^{-1} = P^{-1} P^2 P^{-1} = I. Here we used that P^* = P, and hence (P^{-1})^* = P^{-1} as well.

(b) Since isometries are invertible, 0 = SP for S an isometry implies P = S^{-1}·0 = 0. On the other hand, clearly S·0 = 0 for all S.
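The computation in part (a) can be checked numerically. The following sketch (using numpy, with an arbitrary invertible matrix standing in for T; neither is part of the lecture) builds P as the positive square root of T^*T via an orthonormal eigenbasis and verifies that S := T P^{-1} is an isometry.

```python
import numpy as np

# An arbitrary invertible T (illustrative choice, not from the lecture).
T = np.array([[2.0, 1.0],
              [0.0, 3.0]])

M = T.conj().T @ T                        # T* T, a positive definite operator
w, V = np.linalg.eigh(M)                  # eigenvalues and orthonormal eigenbasis
P = V @ np.diag(np.sqrt(w)) @ V.conj().T  # P := sqrt(T* T), positive square root
S = T @ np.linalg.inv(P)                  # S := T P^{-1}

# S* S = I (S is an isometry), and T = S P is the polar decomposition.
assert np.allclose(S.conj().T @ S, np.eye(2))
assert np.allclose(S @ P, T)
```

Here `eigh` returns an orthonormal eigenbasis of the self-adjoint matrix T^*T, which is exactly the data the computational recipe below uses.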

Polar decomposition and SVD (5)

Proposition: every complex number z is expressible as z = r e^{iθ}, where r ≥ 0 and θ ∈ [0, 2π) (unique if z is nonzero). Equivalently: z = s·r, for |s| = 1 and r = |z| = √(z̄z) ≥ 0.

Theorem 1. Let V be a f.d. i.p.s. and T ∈ L(V). Then there is an expression T = SP, for S an isometry and P positive. P is unique, and P = √(T^*T). Moreover, S is unique iff T is invertible.

Corollary 2 (Singular Value Decomposition (SVD)). There exist orthonormal bases (e_1, ..., e_n) and (f_1, ..., f_n) of V such that T e_i = s_i f_i, for s_i ≥ 0 the singular values. Moreover, (e_1, ..., e_n) is an orthonormal eigenbasis of T^*T with eigenvalues s_i^2.

Proof: let (e_1, ..., e_n) be an orthonormal eigenbasis of T^*T and s_1, ..., s_n the nonnegative square roots of the eigenvalues. When s_i ≠ 0, set f_i := s_i^{-1} T e_i. Then extend the resulting f_i to an orthonormal basis.

Uniqueness of polar decomposition (6)

If T = SP, then T^*T = P^*S^*SP = P^*P = P^2, so P = √(T^*T). Thus P is unique (positive operators have unique positive square roots; see the slides for Lecture 18 or Axler). If T is invertible, then S = T P^{-1}, so S is unique. Conversely, if T is not invertible, neither is P, and we can replace S by SS′, where S′ ≠ I is an isometry such that S′v = v for all eigenvectors v of P with nonzero eigenvalue (so that S′P = P). So then S is not unique.

Existence of polar decomposition (7)

Set P := √(T^*T). range(P) is P-invariant, and P is an isomorphism there (it has an eigenbasis with nonzero eigenvalues). We may thus define P^{-1}|_{range(P)} : range(P) → range(P). Consider S_1 := T P^{-1}|_{range(P)} : range(P) → range(T). Then S_1^*S_1 = I, so ⟨u, v⟩ = ⟨S_1 u, S_1 v⟩ for all u, v ∈ range(P).

Recall: null(P) = null(T). So dim range(P) = dim range(T). Thus S_1 takes an orthonormal basis (e_1, ..., e_m) of range(P) to an orthonormal basis (f_1, ..., f_m) of range(T). Extend (e_i) and (f_i) to orthonormal bases of V, and extend S_1 to S ∈ L(V) by S(e_i) = f_i for i > m. So S takes an orthonormal basis to another orthonormal basis, i.e., it is an isometry.

Finally, T = SP, since this holds on range(P), while on null(T) = null(P) = range(P)^⊥ = Span(e_{m+1}, ..., e_n) both sides are zero.
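The existence proof can also be made concrete with a numerical SVD: if A = U Σ V^* (the convention of numpy's `svd`), then S := U V^* is an isometry and P := V Σ V^* = √(A^*A) is positive, giving A = SP even when A is not invertible. A sketch, assuming numpy; the singular matrix A below is an illustrative choice, not from the lecture.

```python
import numpy as np

# A rank-1 (non-invertible) matrix: S is then not unique, but P still is.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

U, s, Vh = np.linalg.svd(A)               # A = U @ diag(s) @ Vh
S = U @ Vh                                # isometry
P = Vh.conj().T @ np.diag(s) @ Vh         # positive operator sqrt(A* A)

assert np.allclose(S @ P, A)                    # A = S P
assert np.allclose(S.conj().T @ S, np.eye(2))   # S* S = I
assert np.all(np.linalg.eigvalsh(P) >= -1e-12)  # P has nonnegative eigenvalues
```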

Nonuniqueness of SVD (8)

Note: the SVD is not unique, even if T is invertible: the orthonormal eigenbasis (e_i) of T^*T is not unique (e.g., one can reorder the vectors and multiply them by ±1, at the least). On the other hand, the polar decomposition is unique iff T is invertible.

Example: T = T_A, A = ( 0 2 ; 1 0 ) (semicolons separate the rows).

We can guess that A = SP = ( 0 1 ; 1 0 )( 1 0 ; 0 2 ). So this is the answer (unique since A, equivalently P, is invertible).

For the SVD we could have e_1 = (1, 0), e_2 = (0, 1), s_1 = 1, s_2 = 2, f_1 = (0, 1), f_2 = (1, 0). But we could also swap everything: s_1 with s_2, e_1 with e_2, and f_1 with f_2. Or we could take e_1 to −e_1 (hence f_1 to −f_1) and/or e_2 to −e_2 (hence f_2 to −f_2).

Computing SVD and polar decomposition (9)

The best way to compute these is to do the SVD first; then let P be the operator with eigenvectors (e_i) and eigenvalues (s_i), and let S be the isometry with S e_i = f_i for all i.

To compute the SVD, given T, first compute T^*T. Then find the eigenvalues of T^*T (in the 2×2 case, use the characteristic polynomial: for A = M(T) in an orthonormal basis, these are the roots of x^2 − tr(Ā^tA)x + det(Ā^tA)). Find the eigenspaces and an orthonormal eigenbasis (e_i) of T^*T. Next, set P := √(T^*T), by taking the nonnegative square roots of the eigenvalues; these square roots are the s_i. Finally, let f_i := s_i^{-1} T e_i for the nonzero s_i; for the remaining f_i, just extend the ones we get to an orthonormal basis.

Example (10)

Example from before: A = ( 0 2 ; 1 0 ).

First, Ā^tA = ( 1 0 ; 0 4 ).

Next, an eigenbasis is e_1 = (1, 0) and e_2 = (0, 1), with eigenvalues 1 and 4. So P = √(Ā^tA) has the same eigenbasis, with eigenvalues s_1 = √1 = 1 and s_2 = √4 = 2.

Then f_1 = s_1^{-1} A e_1 = 1^{-1}·(0, 1) = (0, 1). Also f_2 = s_2^{-1} A e_2 = 2^{-1}·(2, 0) = (1, 0).

Now P = ( 1 0 ; 0 2 ) and S = ( 0 1 ; 1 0 ), as desired.

In general: P = (e_1 e_2) ( s_1 0 ; 0 s_2 ) (e_1 e_2)^{-1} and S = (f_1 f_2)(e_1 e_2)^{-1}.

Generalized eigenvectors (11)

Goal: although not all f.d. vector spaces with an operator admit an eigenbasis, over F = C they always admit a basis of generalized eigenvectors.

Definition 3. A generalized eigenvector v of T with eigenvalue λ is one such that, for some m ≥ 1, (T − λI)^m v = 0.

Examples: the case m = 1 above holds if and only if v is an (ordinary) eigenvector. If T is nilpotent, then all vectors are generalized eigenvectors of eigenvalue zero. So, even though a nonzero nilpotent T does not have an eigenbasis, every basis is a basis of generalized eigenvectors!

Definition 4. Let V(λ) be the generalized eigenspace of eigenvalue λ: the span of all generalized eigenvectors of eigenvalue λ.

Note that V(λ) is T-invariant, since (T − λI)^m v = 0 implies (T − λI)^m T v = T(T − λI)^m v = 0.

The decomposition theorem (12)

Theorem 5. Let V be f.d., F = C, and T ∈ L(V). Then V is the direct sum of its generalized eigenspaces: V = ⊕_λ V(λ).

First step:

Lemma 6. Suppose that λ ≠ µ. Then V(λ) ∩ V(µ) = {0}.

Proof. Suppose v ∈ V(λ) ∩ V(µ) is nonzero. Let m be such that (T − λI)^m v = 0 but (T − λI)^{m−1} v ≠ 0. So u := (T − λI)^{m−1} v is a nonzero eigenvector of eigenvalue λ. Now let m′ ≥ 1 be such that (T − µI)^{m′} v = 0. Then also

(T − µI)^{m′} u = (T − µI)^{m′}(T − λI)^{m−1} v = (T − λI)^{m−1}(T − µI)^{m′} v = 0.

But (T − µI)^{m′} u = (λ − µ)^{m′} u ≠ 0, since λ ≠ µ. This is a contradiction.

Lemma on generalized eigenspaces (13)

Lemma 7. V(λ) = null (T − λI)^{dim V}. I.e., if v is a generalized eigenvector of eigenvalue λ, we can take m = dim V above: (T − λI)^{dim V} v = 0.

Proof. Let U_i := (T − λI)^i V(λ). Since V(λ) is T-invariant (hence (T − λI)-invariant), U_0 ⊇ U_1 ⊇ ···. However, if U_i = U_{i+1}, then (T − λI) is injective on U_i. Since (T − λI) is nilpotent on V(λ), this implies U_i = {0}. So U_0 ⊋ U_1 ⊋ ··· ⊋ U_m = {0}, and dim U_i ≤ dim V(λ) − i. Hence m ≤ dim V(λ) ≤ dim V, and (T − λI)^{dim V} v = 0 for all v ∈ V(λ).

One more lemma (14)

Lemma 8. V = null (T − λI)^{dim V} ⊕ range (T − λI)^{dim V} = V(λ) ⊕ range (T − λI)^{dim V}.

Proof. Since the dimensions on the two sides are equal, we need only show that null (T − λI)^{dim V} ∩ range (T − λI)^{dim V} = {0}. Let v ∈ null (T − λI)^{dim V} ∩ range (T − λI)^{dim V}. Write v = (T − λI)^{dim V} u. Since (T − λI)^{2 dim V} u = (T − λI)^{dim V} v = 0, u is also a generalized eigenvector of eigenvalue λ. But then, by the last lemma, (T − λI)^{dim V} u = 0, so v = 0.

Proof of the decomposition theorem (15)

Theorem 9. Let V be f.d., F = C, and T ∈ L(V). Then V is the direct sum of its generalized eigenspaces: V = ⊕_λ V(λ).

Proof: by induction on dim V. Let λ be an eigenvalue of T, so V(λ) ≠ {0}. Write V = V(λ) ⊕ range (T − λI)^{dim V}. Since dim range (T − λI)^{dim V} < dim V, the inductive hypothesis shows that range (T − λI)^{dim V} is the direct sum of the generalized eigenspaces of T|_{range (T − λI)^{dim V}}.

To conclude, we claim that for µ ≠ λ, V(µ) ⊆ range (T − λI)^{dim V}. Thus V(µ) is a generalized eigenspace of T|_{range (T − λI)^{dim V}}. For this, we show that (T − λI)^{dim V} V(µ) = V(µ). First, V(µ) is T-invariant, so (T − λI)^{dim V} V(µ) ⊆ V(µ). We only need to show that (T − λI)^{dim V} is injective on V(µ), i.e., that V(µ) ∩ null (T − λI)^{dim V} = {0}. But this is V(µ) ∩ V(λ) = {0}, which holds by the lemmas.
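The decomposition theorem can be illustrated numerically via Lemma 7: dim V(λ) = dim null (T − λI)^{dim V}. The sketch below (using numpy; the upper-triangular matrix is an illustrative choice, not from the lecture) has eigenvalues 2 and 5 and no eigenbasis (only one independent eigenvector for eigenvalue 2), yet the generalized eigenspace dimensions sum to dim V = 3.

```python
import numpy as np

# Upper-triangular T: a 2x2 Jordan-type block for eigenvalue 2
# and a 1x1 block for eigenvalue 5 (illustrative choice).
T = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
n = T.shape[0]

def gen_eigenspace_dim(T, lam):
    """dim V(lam) = dim null (T - lam I)^(dim V), by Lemma 7."""
    N = np.linalg.matrix_power(T - lam * np.eye(n), n)
    return n - np.linalg.matrix_rank(N)

# The ordinary eigenspace for 2 is 1-dimensional, but V(2) is 2-dimensional.
assert n - np.linalg.matrix_rank(T - 2.0 * np.eye(n)) == 1
assert gen_eigenspace_dim(T, 2.0) == 2
assert gen_eigenspace_dim(T, 5.0) == 1
# The generalized eigenspaces together span V: dimensions add to dim V.
assert gen_eigenspace_dim(T, 2.0) + gen_eigenspace_dim(T, 5.0) == n
```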