Spectral Theorems in Euclidean and Hermitian Spaces


Chapter 10

Spectral Theorems in Euclidean and Hermitian Spaces

10.1 Normal Linear Maps

Let E be a real Euclidean space (or a complex Hermitian space) with inner product $(u, v) \mapsto \langle u, v \rangle$.

In the real Euclidean case, recall that $\langle -, - \rangle$ is bilinear, symmetric, and positive definite (i.e., $\langle u, u \rangle > 0$ for all $u \neq 0$).

In the complex Hermitian case, recall that $\langle -, - \rangle$ is sesquilinear, which means that it is linear in the first argument, semilinear in the second argument (i.e., $\langle u, \mu v \rangle = \overline{\mu} \langle u, v \rangle$), satisfies $\langle v, u \rangle = \overline{\langle u, v \rangle}$, and is positive definite (as above).

In both cases we let $\|u\| = \sqrt{\langle u, u \rangle}$, and the map $u \mapsto \|u\|$ is a norm.

Recall that every linear map $f \colon E \to E$ has an adjoint $f^*$, which is a linear map $f^* \colon E \to E$ such that
$$\langle f(u), v \rangle = \langle u, f^*(v) \rangle$$
for all $u, v \in E$.

Since $\langle -, - \rangle$ is symmetric, it is obvious that $f^{**} = f$.

Definition 10.1. Given a Euclidean (or Hermitian) space E, a linear map $f \colon E \to E$ is normal iff
$$f \circ f^* = f^* \circ f.$$
A linear map $f \colon E \to E$ is self-adjoint if $f^* = f$, skew-self-adjoint if $f^* = -f$, and orthogonal if $f \circ f^* = f^* \circ f = \mathrm{id}$.
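Over an orthonormal basis, the adjoint of a real matrix is its transpose, so Definition 10.1 becomes a matrix identity that can be tested numerically. The following sketch (assuming NumPy; the example matrices are hypothetical, not from the text) checks a rotation, a symmetric matrix, and a shear:

```python
import numpy as np

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # rotation by pi/2: orthogonal, hence normal
S = np.array([[2.0, 1.0],
              [1.0, 3.0]])    # symmetric: self-adjoint, hence normal
T = np.array([[1.0, 1.0],
              [0.0, 1.0]])    # shear: not normal

def is_normal(A):
    # f o f* = f* o f becomes A A^T = A^T A
    return np.allclose(A @ A.T, A.T @ A)

def is_self_adjoint(A):
    return np.allclose(A.T, A)

def is_orthogonal(A):
    I = np.eye(A.shape[0])
    return np.allclose(A @ A.T, I) and np.allclose(A.T @ A, I)

print(is_normal(R), is_normal(S), is_normal(T))
```

Note that the shear T fails normality even though it is invertible: being normal is a genuine restriction, not a regularity condition.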

Our first goal is to show that for every normal linear map $f \colon E \to E$ (where E is a Euclidean space), there is an orthonormal basis (w.r.t. $\langle -, - \rangle$) such that the matrix of f over this basis has an especially nice form: it is a block diagonal matrix in which the blocks are either one-dimensional matrices (i.e., single entries) or two-dimensional matrices of the form
$$\begin{pmatrix} \lambda & \mu \\ -\mu & \lambda \end{pmatrix}.$$

This normal form can be further refined if f is self-adjoint, skew-self-adjoint, or orthogonal.

As a first step, we show that f and $f^*$ have the same kernel when f is normal.

Proposition 10.1. Given a Euclidean space E, if $f \colon E \to E$ is a normal linear map, then $\mathrm{Ker}\, f = \mathrm{Ker}\, f^*$.

The next step is to show that for every linear map $f \colon E \to E$, there is some subspace W of dimension 1 or 2 such that $f(W) \subseteq W$.

When $\dim(W) = 1$, W is actually an eigenspace for some real eigenvalue of f.

Furthermore, when f is normal, there is a subspace W of dimension 1 or 2 such that $f(W) \subseteq W$ and $f^*(W) \subseteq W$.

The difficulty is that the eigenvalues of f are not necessarily real. One way to get around this problem is to complexify both the vector space E and the inner product $\langle -, - \rangle$.

First, we need to embed a real vector space E into a complex vector space $E_{\mathbb{C}}$.

Definition 10.2. Given a real vector space E, let $E_{\mathbb{C}}$ be the structure $E \times E$ under the addition operation
$$(u_1, u_2) + (v_1, v_2) = (u_1 + v_1, u_2 + v_2),$$
with multiplication by a complex scalar $z = x + iy$ defined such that
$$(x + iy) \cdot (u, v) = (xu - yv, yu + xv).$$
The space $E_{\mathbb{C}}$ is called the complexification of E.

It is easily shown that the structure $E_{\mathbb{C}}$ is a complex vector space. It is also immediate that
$$(0, v) = i(v, 0),$$
and thus, identifying E with the subspace of $E_{\mathbb{C}}$ consisting of all vectors of the form $(u, 0)$, we can write
$$(u, v) = u + iv.$$

Given a vector $w = u + iv$, its conjugate $\overline{w}$ is the vector $\overline{w} = u - iv$.

Observe that if $(e_1, \ldots, e_n)$ is a basis of E (a real vector space), then $(e_1, \ldots, e_n)$ is also a basis of $E_{\mathbb{C}}$ (recall that $e_i$ is an abbreviation for $(e_i, 0)$).

Given a linear map $f \colon E \to E$, the map f can be extended to a linear map $f_{\mathbb{C}} \colon E_{\mathbb{C}} \to E_{\mathbb{C}}$ defined such that
$$f_{\mathbb{C}}(u + iv) = f(u) + if(v).$$

For any basis $(e_1, \ldots, e_n)$ of E, the matrix $M(f)$ representing f over $(e_1, \ldots, e_n)$ is identical to the matrix $M(f_{\mathbb{C}})$ representing $f_{\mathbb{C}}$ over $(e_1, \ldots, e_n)$, where we view $(e_1, \ldots, e_n)$ as a basis of $E_{\mathbb{C}}$.

As a consequence,
$$\det(zI - M(f)) = \det(zI - M(f_{\mathbb{C}})),$$
which means that f and $f_{\mathbb{C}}$ have the same characteristic polynomial (which has real coefficients).

We know that every polynomial of degree n with real (or complex) coefficients always has n complex roots (counted with their multiplicity), and the roots of $\det(zI - M(f_{\mathbb{C}}))$ that are real (if any) are the eigenvalues of f.
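Numerically, viewing the same matrix over the complexification changes nothing about the characteristic polynomial, which is easy to confirm (a minimal sketch with a hypothetical 2×2 matrix, assuming NumPy):

```python
import numpy as np

A = np.array([[0.0, -2.0],
              [1.0,  3.0]])     # matrix M(f) of f over a basis of E
A_C = A.astype(complex)         # same matrix, viewed as M(f_C) over E_C

p_real = np.poly(A)             # coefficients of det(zI - M(f))
p_cplx = np.poly(A_C)           # coefficients of det(zI - M(f_C))
print(np.allclose(p_real, p_cplx))   # identical characteristic polynomial
```

For this particular A the roots are 1 and 2, both real, so they are eigenvalues of f itself, illustrating the last remark above.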

Next, we need to extend the inner product on E to an inner product on $E_{\mathbb{C}}$.

The inner product $\langle -, - \rangle$ on a Euclidean space E is extended to the Hermitian positive definite form $\langle -, - \rangle_{\mathbb{C}}$ on $E_{\mathbb{C}}$ as follows:
$$\langle u_1 + iv_1, u_2 + iv_2 \rangle_{\mathbb{C}} = \langle u_1, u_2 \rangle + \langle v_1, v_2 \rangle + i(\langle u_2, v_1 \rangle - \langle u_1, v_2 \rangle).$$

Then, given any linear map $f \colon E \to E$, it is easily verified that the map $f^*_{\mathbb{C}}$ defined such that
$$f^*_{\mathbb{C}}(u + iv) = f^*(u) + if^*(v)$$
for all $u, v \in E$ is the adjoint of $f_{\mathbb{C}}$ w.r.t. $\langle -, - \rangle_{\mathbb{C}}$.
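A quick numerical sanity check of both claims, taking $E = \mathbb{R}^3$ with the standard inner product (so the adjoint of a real matrix A is $A^{\top}$); the random matrix and vectors are hypothetical test data, and `herm` implements $\langle -, - \rangle_{\mathbb{C}}$, linear in the first argument and semilinear in the second:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))              # matrix of f (and of f_C)
w1 = rng.standard_normal(3) + 1j * rng.standard_normal(3)
w2 = rng.standard_normal(3) + 1j * rng.standard_normal(3)

def herm(p, q):
    # <p, q>_C: linear in the first argument, semilinear in the second
    return np.sum(p * q.conj())

# the displayed formula, written out with p = u1 + i v1, q = u2 + i v2
u1, v1, u2, v2 = w1.real, w1.imag, w2.real, w2.imag
formula = u1 @ u2 + v1 @ v2 + 1j * (u2 @ v1 - u1 @ v2)
ok_formula = np.isclose(herm(w1, w2), formula)

# the adjoint of f_C is (f*)_C, i.e. A^T acting on E_C
ok_adjoint = np.isclose(herm(A @ w1, w2), herm(w1, A.T @ w2))
print(ok_formula, ok_adjoint)
```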

Assuming again that E is a Hermitian space, observe that Proposition 10.1 also holds.

Proposition 10.2. Given a Hermitian space E, for any normal linear map $f \colon E \to E$, a vector u is an eigenvector of f for the eigenvalue $\lambda$ (in $\mathbb{C}$) iff u is an eigenvector of $f^*$ for the eigenvalue $\overline{\lambda}$.

The next proposition shows a very important property of normal linear maps: eigenvectors corresponding to distinct eigenvalues are orthogonal.

Proposition 10.3. Given a Hermitian space E, for any normal linear map $f \colon E \to E$, if u and v are eigenvectors of f associated with the eigenvalues $\lambda$ and $\mu$ (in $\mathbb{C}$) where $\lambda \neq \mu$, then $\langle u, v \rangle = 0$.
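Proposition 10.3 can be observed numerically on the plane rotation by $\pi/2$, a normal map with distinct eigenvalues $i$ and $-i$ (a sketch with a hypothetical example matrix, assuming NumPy):

```python
import numpy as np

# rotation by pi/2: normal, with distinct eigenvalues i and -i
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
w, V = np.linalg.eig(A)
u, v = V[:, 0], V[:, 1]

# Hermitian inner product <u, v>; ~0 by Proposition 10.3
print(abs(np.vdot(u, v)))
```

Here orthogonality is forced by the proposition, not by the algorithm: `eig` makes no orthogonality promise for general matrices.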

We can also show easily that the eigenvalues of a self-adjoint linear map are real.

Proposition 10.4. Given a Hermitian space E, the eigenvalues of any self-adjoint linear map $f \colon E \to E$ are real.

There is also a version of Proposition 10.4 for a (real) Euclidean space E and a self-adjoint map $f \colon E \to E$.

Proposition 10.5. Given a Euclidean space E, if $f \colon E \to E$ is any self-adjoint linear map, then every eigenvalue $\lambda$ of $f_{\mathbb{C}}$ is real and is actually an eigenvalue of f (which means that there is some real eigenvector $u \in E$ such that $f(u) = \lambda u$). Therefore, all the eigenvalues of f are real.

Given any subspace W of a Hermitian space E, recall that the orthogonal $W^{\perp}$ of W is the subspace defined such that
$$W^{\perp} = \{ u \in E \mid \langle u, w \rangle = 0 \text{ for all } w \in W \}.$$

Recall that $E = W \oplus W^{\perp}$ (construct an orthonormal basis of E using the Gram–Schmidt orthonormalization procedure). The same result also holds for Euclidean spaces.

As a warm-up for the proof of Theorem 10.9, let us prove that every self-adjoint map on a Euclidean space can be diagonalized with respect to an orthonormal basis of eigenvectors.

Theorem 10.6. Given a Euclidean space E of dimension n, for every self-adjoint linear map $f \colon E \to E$, there is an orthonormal basis $(e_1, \ldots, e_n)$ of eigenvectors of f such that the matrix of f w.r.t. this basis is a diagonal matrix
$$\begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{pmatrix},$$
with $\lambda_i \in \mathbb{R}$.
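In matrix terms, Theorem 10.6 is exactly what a symmetric eigensolver computes. A sketch with a hypothetical symmetric matrix (assuming NumPy; `eigh` is the solver for symmetric/Hermitian input):

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])     # symmetric: a self-adjoint map
w, P = np.linalg.eigh(A)            # real eigenvalues, eigenvectors as columns
D = np.diag(w)

print(np.allclose(P.T @ P, np.eye(3)))   # the eigenbasis is orthonormal
print(np.allclose(P @ D @ P.T, A))       # A = P D P^T with D diagonal
```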

One of the key points in the proof of Theorem 10.6 is that we found a subspace W with the property that $f(W) \subseteq W$ implies that $f(W^{\perp}) \subseteq W^{\perp}$.

In general, this does not happen, but normal maps satisfy a stronger property which ensures that such a subspace exists.

The following proposition provides a condition that will allow us to show that a normal linear map can be diagonalized. It actually holds for any linear map.

Proposition 10.7. Given a Hermitian space E, for any linear map $f \colon E \to E$ and any subspace W of E, if $f(W) \subseteq W$, then $f^*(W^{\perp}) \subseteq W^{\perp}$. Consequently, if $f(W) \subseteq W$ and $f^*(W) \subseteq W$, then $f(W^{\perp}) \subseteq W^{\perp}$ and $f^*(W^{\perp}) \subseteq W^{\perp}$.

The above proposition also holds for Euclidean spaces.

Although we are ready to prove that for every normal linear map f (over a Hermitian space) there is an orthonormal basis of eigenvectors, we now return to real Euclidean spaces.

If $f \colon E \to E$ is a linear map and $w = u + iv$ is an eigenvector of $f_{\mathbb{C}} \colon E_{\mathbb{C}} \to E_{\mathbb{C}}$ for the eigenvalue $z = \lambda + i\mu$, where $u, v \in E$ and $\lambda, \mu \in \mathbb{R}$, since
$$f_{\mathbb{C}}(u + iv) = f(u) + if(v)$$
and
$$f_{\mathbb{C}}(u + iv) = (\lambda + i\mu)(u + iv) = \lambda u - \mu v + i(\mu u + \lambda v),$$
we have
$$f(u) = \lambda u - \mu v \quad \text{and} \quad f(v) = \mu u + \lambda v,$$
from which we immediately obtain
$$f_{\mathbb{C}}(u - iv) = (\lambda - i\mu)(u - iv),$$
which shows that $\overline{w} = u - iv$ is an eigenvector of $f_{\mathbb{C}}$ for $\overline{z} = \lambda - i\mu$. Using this fact, we can prove the following proposition:
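The identities just derived can be checked on any real matrix with a complex eigenvalue (a sketch with a hypothetical 2×2 matrix whose eigenvalues are $1 \pm 2i$, assuming NumPy):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [2.0,  1.0]])        # eigenvalues 1 + 2i and 1 - 2i
zs, V = np.linalg.eig(A)
z, w = zs[0], V[:, 0]              # one eigenpair: f_C(w) = z w
u, v = w.real, w.imag              # w = u + iv
lam, mu = z.real, z.imag           # z = lambda + i mu

# f(u) = lambda*u - mu*v and f(v) = mu*u + lambda*v, as derived above
print(np.allclose(A @ u, lam * u - mu * v))
print(np.allclose(A @ v, mu * u + lam * v))
# and w-bar = u - iv is an eigenvector for z-bar
print(np.allclose(A @ w.conj(), z.conjugate() * w.conj()))
```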

Proposition 10.8. Given a Euclidean space E, for any normal linear map $f \colon E \to E$, if $w = u + iv$ is an eigenvector of $f_{\mathbb{C}}$ associated with the eigenvalue $z = \lambda + i\mu$ (where $u, v \in E$ and $\lambda, \mu \in \mathbb{R}$), if $\mu \neq 0$ (i.e., z is not real), then $\langle u, v \rangle = 0$ and $\langle u, u \rangle = \langle v, v \rangle$, which implies that u and v are linearly independent, and if W is the subspace spanned by u and v, then $f(W) = W$ and $f^*(W) = W$. Furthermore, with respect to the (orthogonal) basis $(u, v)$, the restriction of f to W has the matrix
$$\begin{pmatrix} \lambda & \mu \\ -\mu & \lambda \end{pmatrix}.$$
If $\mu = 0$, then $\lambda$ is a real eigenvalue of f, and either u or v is an eigenvector of f for $\lambda$. If W is the subspace spanned by u if $u \neq 0$, or spanned by $v \neq 0$ if $u = 0$, then $f(W) \subseteq W$ and $f^*(W) \subseteq W$.

Theorem 10.9. (Main Spectral Theorem) Given a Euclidean space E of dimension n, for every normal linear map $f \colon E \to E$, there is an orthonormal basis $(e_1, \ldots, e_n)$ such that the matrix of f w.r.t. this basis is a block diagonal matrix of the form
$$\begin{pmatrix} A_1 & & & \\ & A_2 & & \\ & & \ddots & \\ & & & A_p \end{pmatrix}$$
such that each block $A_j$ is either a one-dimensional matrix (i.e., a real scalar) or a two-dimensional matrix of the form
$$A_j = \begin{pmatrix} \lambda_j & \mu_j \\ -\mu_j & \lambda_j \end{pmatrix},$$
where $\lambda_j, \mu_j \in \mathbb{R}$, with $\mu_j > 0$.
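The easy converse direction of Theorem 10.9 can be checked numerically: a block diagonal matrix of the stated shape, conjugated by any orthogonal change of basis, is normal, and its complex eigenvalues are exactly the block data $\lambda_j \pm i\mu_j$ (a sketch with hypothetical numbers, assuming NumPy):

```python
import numpy as np

# one scalar block (5) and one (lambda, mu) = (2, 3) block
lam, mu = 2.0, 3.0
D = np.array([[5.0, 0.0, 0.0],
              [0.0, lam, mu],
              [0.0, -mu, lam]])
# a random orthonormal basis change
P, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((3, 3)))
A = P @ D @ P.T

print(np.allclose(A @ A.T, A.T @ A))          # A is normal
print(np.sort_complex(np.linalg.eigvals(A)))  # 2 - 3j, 2 + 3j, 5
```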

After this relatively hard work, we can easily obtain some nice normal forms for the matrices of self-adjoint, skew-self-adjoint, and orthogonal linear maps. However, for the sake of completeness, we state the following theorem.

Theorem 10.10. Given a Hermitian space E of dimension n, for every normal linear map $f \colon E \to E$, there is an orthonormal basis $(e_1, \ldots, e_n)$ of eigenvectors of f such that the matrix of f w.r.t. this basis is a diagonal matrix
$$\begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{pmatrix},$$
where $\lambda_j \in \mathbb{C}$.

Remark: There is a converse to Theorem 10.10, namely, if there is an orthonormal basis $(e_1, \ldots, e_n)$ of eigenvectors of f, then f is normal.

10.2 Self-Adjoint, Skew-Self-Adjoint, and Orthogonal Linear Maps

Theorem 10.11. Given a Euclidean space E of dimension n, for every self-adjoint linear map $f \colon E \to E$, there is an orthonormal basis $(e_1, \ldots, e_n)$ of eigenvectors of f such that the matrix of f w.r.t. this basis is a diagonal matrix
$$\begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{pmatrix},$$
where $\lambda_i \in \mathbb{R}$.

Theorem 10.11 implies that if $\lambda_1, \ldots, \lambda_p$ are the distinct real eigenvalues of f and $E_i$ is the eigenspace associated with $\lambda_i$, then
$$E = E_1 \oplus \cdots \oplus E_p,$$
where $E_i$ and $E_j$ are orthogonal for all $i \neq j$.

Theorem 10.12. Given a Euclidean space E of dimension n, for every skew-self-adjoint linear map $f \colon E \to E$, there is an orthonormal basis $(e_1, \ldots, e_n)$ such that the matrix of f w.r.t. this basis is a block diagonal matrix of the form
$$\begin{pmatrix} A_1 & & & \\ & A_2 & & \\ & & \ddots & \\ & & & A_p \end{pmatrix}$$
such that each block $A_j$ is either 0 or a two-dimensional matrix of the form
$$A_j = \begin{pmatrix} 0 & -\mu_j \\ \mu_j & 0 \end{pmatrix},$$
where $\mu_j \in \mathbb{R}$, with $\mu_j > 0$. In particular, the eigenvalues of $f_{\mathbb{C}}$ are pure imaginary of the form $\pm i\mu_j$, or 0.
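The eigenvalue statement of Theorem 10.12 is easy to observe (a sketch with a hypothetical skew-symmetric matrix consisting of one $\mu = 2$ block and one zero block, assuming NumPy):

```python
import numpy as np

B = np.array([[0.0, -2.0, 0.0],
              [2.0,  0.0, 0.0],
              [0.0,  0.0, 0.0]])
ev = np.linalg.eigvals(B)

print(np.allclose(B.T, -B))          # B is skew-symmetric
print(np.allclose(ev.real, 0.0))     # eigenvalues are pure imaginary or 0
```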

Theorem 10.13. Given a Euclidean space E of dimension n, for every orthogonal linear map $f \colon E \to E$, there is an orthonormal basis $(e_1, \ldots, e_n)$ such that the matrix of f w.r.t. this basis is a block diagonal matrix of the form
$$\begin{pmatrix} A_1 & & & \\ & A_2 & & \\ & & \ddots & \\ & & & A_p \end{pmatrix}$$
such that each block $A_j$ is either 1, $-1$, or a two-dimensional matrix of the form
$$A_j = \begin{pmatrix} \cos\theta_j & -\sin\theta_j \\ \sin\theta_j & \cos\theta_j \end{pmatrix},$$
where $0 < \theta_j < \pi$. In particular, the eigenvalues of $f_{\mathbb{C}}$ are of the form $\cos\theta_j \pm i\sin\theta_j$, or 1, or $-1$.
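As a numerical illustration of Theorem 10.13, a matrix assembled from one rotation block and one $-1$ block is orthogonal, and all of its complex eigenvalues lie on the unit circle (hypothetical example, assuming NumPy):

```python
import numpy as np

theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta),  0.0],
              [np.sin(theta),  np.cos(theta),  0.0],
              [0.0,            0.0,           -1.0]])
ev = np.linalg.eigvals(A)

print(np.allclose(A @ A.T, np.eye(3)))   # A is orthogonal
print(np.allclose(np.abs(ev), 1.0))      # eigenvalues on the unit circle
```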

It is obvious that we can reorder the orthonormal basis of eigenvectors given by Theorem 10.13 so that the matrix of f w.r.t. this basis is a block diagonal matrix of the form
$$\begin{pmatrix} A_1 & & & & \\ & \ddots & & & \\ & & A_r & & \\ & & & -I_q & \\ & & & & I_p \end{pmatrix},$$
where each block $A_j$ is a two-dimensional rotation matrix $A_j \neq \pm I_2$ of the form
$$A_j = \begin{pmatrix} \cos\theta_j & -\sin\theta_j \\ \sin\theta_j & \cos\theta_j \end{pmatrix},$$
with $0 < \theta_j < \pi$.

The linear map f has an eigenspace $E(1, f) = \mathrm{Ker}(f - \mathrm{id})$ of dimension p for the eigenvalue 1, and an eigenspace $E(-1, f) = \mathrm{Ker}(f + \mathrm{id})$ of dimension q for the eigenvalue $-1$.

If $\det(f) = +1$ (f is a rotation), the dimension q of $E(-1, f)$ must be even, and the entries in $-I_q$ can be paired to form two-dimensional blocks, if we wish.

Remark: Theorem 10.13 can be used to prove a sharper version of the Cartan–Dieudonné theorem.

Theorem 10.14. Let E be a Euclidean space of dimension $n \geq 2$. For every isometry $f \in O(E)$, if $p = \dim(E(1, f)) = \dim(\mathrm{Ker}(f - \mathrm{id}))$, then f is the composition of $n - p$ reflections, and $n - p$ is minimal.

The theorems of this section and of the previous section can be immediately applied to matrices.

10.3 Normal, Symmetric, Skew-Symmetric, Orthogonal, Hermitian, Skew-Hermitian, and Unitary Matrices

First, we consider real matrices.

Definition 10.3. Given a real $m \times n$ matrix A, the transpose $A^{\top}$ of A is the $n \times m$ matrix $A^{\top} = (a^{\top}_{ij})$ defined such that
$$a^{\top}_{ij} = a_{ji}$$
for all $i, j$, $1 \leq i \leq m$, $1 \leq j \leq n$. A real $n \times n$ matrix A is

1. normal iff $AA^{\top} = A^{\top}A$,
2. symmetric iff $A^{\top} = A$,
3. skew-symmetric iff $A^{\top} = -A$,
4. orthogonal iff $AA^{\top} = A^{\top}A = I_n$.

Theorem 10.15. For every normal matrix A, there is an orthogonal matrix P and a block diagonal matrix D such that $A = PDP^{\top}$, where D is of the form
$$D = \begin{pmatrix} D_1 & & & \\ & D_2 & & \\ & & \ddots & \\ & & & D_p \end{pmatrix}$$
such that each block $D_j$ is either a one-dimensional matrix (i.e., a real scalar) or a two-dimensional matrix of the form
$$D_j = \begin{pmatrix} \lambda_j & \mu_j \\ -\mu_j & \lambda_j \end{pmatrix},$$
where $\lambda_j, \mu_j \in \mathbb{R}$, with $\mu_j > 0$.

Theorem 10.16. For every symmetric matrix A, there is an orthogonal matrix P and a diagonal matrix D such that $A = PDP^{\top}$, where D is of the form
$$D = \begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{pmatrix},$$
where $\lambda_i \in \mathbb{R}$.

Theorem 10.17. For every skew-symmetric matrix A, there is an orthogonal matrix P and a block diagonal matrix D such that $A = PDP^{\top}$, where D is of the form
$$D = \begin{pmatrix} D_1 & & & \\ & D_2 & & \\ & & \ddots & \\ & & & D_p \end{pmatrix}$$
such that each block $D_j$ is either 0 or a two-dimensional matrix of the form
$$D_j = \begin{pmatrix} 0 & -\mu_j \\ \mu_j & 0 \end{pmatrix},$$
where $\mu_j \in \mathbb{R}$, with $\mu_j > 0$. In particular, the eigenvalues of A are pure imaginary of the form $\pm i\mu_j$, or 0.

Theorem 10.18. For every orthogonal matrix A, there is an orthogonal matrix P and a block diagonal matrix D such that $A = PDP^{\top}$, where D is of the form
$$D = \begin{pmatrix} D_1 & & & \\ & D_2 & & \\ & & \ddots & \\ & & & D_p \end{pmatrix}$$
such that each block $D_j$ is either 1, $-1$, or a two-dimensional matrix of the form
$$D_j = \begin{pmatrix} \cos\theta_j & -\sin\theta_j \\ \sin\theta_j & \cos\theta_j \end{pmatrix},$$
where $0 < \theta_j < \pi$. In particular, the eigenvalues of A are of the form $\cos\theta_j \pm i\sin\theta_j$, or 1, or $-1$.

We now consider complex matrices.

Definition 10.4. Given a complex $m \times n$ matrix A, the transpose $A^{\top}$ of A is the $n \times m$ matrix $A^{\top} = (a^{\top}_{ij})$ defined such that
$$a^{\top}_{ij} = a_{ji}$$
for all $i, j$, $1 \leq i \leq m$, $1 \leq j \leq n$. The conjugate $\overline{A}$ of A is the $m \times n$ matrix $\overline{A} = (b_{ij})$ defined such that
$$b_{ij} = \overline{a_{ij}}$$
for all $i, j$, $1 \leq i \leq m$, $1 \leq j \leq n$. Given an $n \times n$ complex matrix A, the adjoint $A^*$ of A is the matrix defined such that
$$A^* = \overline{(A^{\top})} = (\overline{A})^{\top}.$$
A complex $n \times n$ matrix A is

1. normal iff $AA^* = A^*A$,
2. Hermitian iff $A^* = A$,
3. skew-Hermitian iff $A^* = -A$,
4. unitary iff $AA^* = A^*A = I_n$.

Theorem 10.10 can be restated in terms of matrices as follows. We can also say a little more about eigenvalues (easy exercise left to the reader).

Theorem 10.19. For every complex normal matrix A, there is a unitary matrix U and a diagonal matrix D such that $A = UDU^*$. Furthermore, if A is Hermitian, then D is a real matrix; if A is skew-Hermitian, then the entries in D are pure imaginary or null; and if A is unitary, then the entries in D have absolute value 1.
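A numerical sketch of Theorem 10.19 for the unitary case (hypothetical example, assuming NumPy): the cyclic-shift matrix is unitary, hence normal; its eigenvalues are the cube roots of unity, all of absolute value 1, and since they are distinct, the normalized eigenvectors returned by `eig` form a unitary U with $A = UDU^*$.

```python
import numpy as np

A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=complex)   # cyclic shift: unitary
ev, U = np.linalg.eig(A)

print(np.allclose(A @ A.conj().T, np.eye(3)))        # A is unitary
print(np.allclose(np.abs(ev), 1.0))                  # |eigenvalues| = 1
print(np.allclose(U @ np.diag(ev) @ U.conj().T, A))  # A = U D U^*
```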
