Spectral Theorem for Self-adjoint Linear Operators


Notes for the undergraduate lecture by David Adams.

(These are the notes I would write if I were teaching a course on this topic. I have included more material than I will cover in the 45-minute lecture; the intention is to show how I would cover this topic in its entirety. In practice I would hand out an exercise sheet in connection with this material, but I will skip that here.)

Spectral Theorem for Self-adjoint Linear Operators (finite-dimensional case)

Let $V$ be a finite-dimensional vector space, either real or complex, equipped with an inner product $\langle\cdot,\cdot\rangle$. Let $A : V \to V$ be a linear operator. Recall that the adjoint of $A$ is the linear operator $A^* : V \to V$ characterized by
$$\langle A^* v, w \rangle = \langle v, A w \rangle \qquad \forall v, w \in V. \tag{0.1}$$
$A$ is called self-adjoint (or Hermitian) when $A^* = A$.

Spectral Theorem. If $A$ is self-adjoint then there is an orthonormal basis (o.n.b.) of $V$ consisting of eigenvectors of $A$. Each eigenvalue is real.

Before proceeding to the proof, let us note why this theorem is important. Recall that, after making a choice of basis $v_1, \dots, v_n$ for $V$, a linear operator $A$ corresponds to an $n \times n$ matrix $A = (A_{ij})$, where $n = \dim V$. The correspondence is given by
$$A v_j = \sum_{i=1}^{n} A_{ij} v_i, \qquad j = 1, \dots, n. \tag{0.2}$$
The spectral theorem tells us that if $A$ is self-adjoint then $V$ admits a basis of eigenvectors, $A v_j = \lambda_j v_j$, in which case the matrix for $A$ is diagonal:
$$A = \begin{pmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_n \end{pmatrix}. \tag{0.3}$$
Thus $A$ admits a very simple representation in this case. As an example of the useful consequences of this, we can evaluate $A^p$ from the fact that
$$A^p = \begin{pmatrix} \lambda_1^p & & 0 \\ & \ddots & \\ 0 & & \lambda_n^p \end{pmatrix}. \tag{0.4}$$
The practical significance of the eigenvector basis being orthonormal is that it is easier to determine in explicit calculations. In fact, a basis of eigenvectors usually exists for non-self-adjoint linear operators as well.¹ However, it is usually much more difficult to determine the basis explicitly in that case.

¹ This can be shown when the eigenvalues of the operator are all distinct. However, when the eigenvalues are not all distinct there are cases where a basis of eigenvectors does not exist.
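The diagonal representation (0.4) is easy to check numerically. Here is a minimal sketch using NumPy; the specific matrix and variable names are my own illustration, not from the notes:

```python
import numpy as np

# A small real symmetric (hence self-adjoint) matrix -- an illustrative choice.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is NumPy's eigensolver for Hermitian/symmetric matrices: it returns
# real eigenvalues and an orthonormal basis of eigenvectors (columns of U).
eigvals, U = np.linalg.eigh(A)

# In the eigenvector basis, A^p is computed by raising the eigenvalues
# to the p-th power, as in (0.4).
p = 5
A_p_spectral = U @ np.diag(eigvals**p) @ U.T

# Compare against the direct matrix power.
A_p_direct = np.linalg.matrix_power(A, p)
print(np.allclose(A_p_spectral, A_p_direct))  # True
```

For large $p$ this costs one eigendecomposition plus scalar powers, rather than repeated matrix multiplications.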

A specific example: For $A = \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}$, regarded as a linear operator on $\mathbb{R}^2$, one can derive the formula
$$\begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}^{p} = \begin{pmatrix} a(p) & a(p+1) \\ a(p+1) & a(p+2) \end{pmatrix}, \qquad a(p) = \frac{\lambda_+^{\,p-1} - \lambda_-^{\,p-1}}{\lambda_+ - \lambda_-}, \quad \lambda_\pm = \tfrac{1}{2}\bigl(1 \pm \sqrt{5}\bigr), \tag{0.5}$$
where $\lambda_\pm$ are the eigenvalues of $\begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}$. The derivation of this is given as an exercise (Exercise 1 in the exercise section; hints are provided).

Proof of the Spectral Theorem. We use induction on the dimension of $V$. First we show that the theorem holds when $\dim V = 1$. Then we assume that it holds when the dimension is $n-1$ and show that it holds when the dimension is $n$.

When $\dim V = 1$ all linear operators are multiplication by scalars: $Av = \lambda v$ for all $v \in V$. The scalar $\lambda$ is seen to be real as follows: for arbitrary nonzero $v \in V$ we have
$$\lambda \|v\|^2 = \langle Av, v \rangle = \langle v, Av \rangle = \overline{\langle Av, v \rangle} = \overline{\lambda \|v\|^2} = \bar{\lambda}\, \|v\|^2, \tag{0.6}$$
hence $\lambda = \bar{\lambda}$. Thus the theorem is verified in this case.

Assuming now that the theorem holds when the vector space has dimension $n-1$, we show that it holds when the dimension is $n$. There are two main steps:

Step 1. Show that $A$ has at least one eigenvector $v_1$, with real eigenvalue $\lambda_1$. This is the hard part. We do it further below.

Step 2. Define $W$ to be the vector subspace of $V$ orthogonal to $v_1$:
$$W := \{\, v \in V \mid \langle v, v_1 \rangle = 0 \,\}. \tag{0.7}$$
Show that $A(W) \subseteq W$, i.e., $W$ is invariant under $A$.

Step 2 is easily done by noting that
$$\langle v, v_1 \rangle = 0 \;\Longrightarrow\; \langle Av, v_1 \rangle = \langle v, Av_1 \rangle = \langle v, \lambda_1 v_1 \rangle = \lambda_1 \langle v, v_1 \rangle = 0, \tag{0.8}$$
where we used $A v_1 = \lambda_1 v_1$ from Step 1 (and the fact that $\lambda_1$ is real). This shows that if $v \in W$ then $Av \in W$, so $A(W) \subseteq W$ as claimed.

The result of Step 2 implies that $A$ restricts to a self-adjoint linear operator on $W$:
$$A : W \to W. \tag{0.9}$$
Since $\dim W = \dim V - 1 = n - 1$, the induction hypothesis implies that $W$ has an o.n.b. $\{v_2, \dots, v_n\}$ of eigenvectors for $A$ with real eigenvalues $\lambda_2, \dots, \lambda_n$. Since these eigenvectors are all contained in $W$ they are all orthogonal to $v_1$, so $\{v_1, v_2, \dots, v_n\}$ is an o.n.b. for $V$. (We normalized $v_1$ so that $\|v_1\| = 1$.) Thus we obtain an o.n.b. for $V$ of eigenvectors for $A$ with real eigenvalues, and the Spectral Theorem is established.
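The eigenvalue formula for powers of $\begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}$ can be checked numerically. The sketch below assumes the Binet-type closed form $a(p) = (\lambda_+^{\,p-1} - \lambda_-^{\,p-1})/(\lambda_+ - \lambda_-)$, which matches the matrix identity (the helper name `a` is mine):

```python
import numpy as np

M = np.array([[0, 1],
              [1, 1]])

# Eigenvalues of M: the golden ratio and its conjugate.
lam_p = (1 + np.sqrt(5)) / 2
lam_m = (1 - np.sqrt(5)) / 2

def a(p):
    # Binet-type closed form; a(p) is the Fibonacci number F(p-1).
    return (lam_p**(p - 1) - lam_m**(p - 1)) / (lam_p - lam_m)

p = 6
predicted = np.array([[a(p),     a(p + 1)],
                      [a(p + 1), a(p + 2)]])
print(np.linalg.matrix_power(M, p))
print(np.allclose(np.linalg.matrix_power(M, p), predicted))  # True
```

The entries of $M^p$ grow like $\lambda_+^p$, since $|\lambda_-| < 1$ makes the second term negligible for large $p$.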

To summarize, we have shown that the Spectral Theorem holds if we can do Step 1. There are several ways to do this step. The quickest way is to use the fundamental theorem of algebra (f.t.a.), as follows. The f.t.a. asserts that any non-constant complex polynomial $p(z)$ vanishes at at least one value of the complex variable $z$. We apply this to the determinant
$$p(z) = \det(A - z\mathbf{1}), \tag{0.10}$$
where $\mathbf{1}$ is the identity map on $V$. The f.t.a. asserts that there exists a $z_0$ such that $\det(A - z_0\mathbf{1}) = 0$. But this implies that there is a non-zero solution $v$ to
$$(A - z_0\mathbf{1})\, v = 0. \tag{0.11}$$
The solution is an eigenvector: $Av = z_0 v$. The eigenvalue $z_0$ is seen to be real by noting that $z_0 \|v\|^2 = \langle Av, v \rangle$ and performing the same calculation as in (0.6) to find $z_0 = \bar{z}_0$. Setting $v_1 = v/\|v\|$ and $\lambda_1 = z_0$ completes Step 1, and with this we have completed the proof of the Spectral Theorem.

An alternative way to do Step 1: We now present another way to do Step 1. It is longer than the preceding one, but more elementary: it does not require the fundamental theorem of algebra and relies only on a basic property of continuous functions. Define the function $f : V \to \mathbb{R}$ by
$$f(v) = \langle Av, v \rangle. \tag{0.12}$$
Note that $f$ is real-valued since $\langle Av, v \rangle = \langle v, Av \rangle = \overline{\langle Av, v \rangle}$. Let $S$ denote the unit sphere in $V$:
$$S := \{\, v \in V \mid \|v\| = 1 \,\}. \tag{0.13}$$
Note that $f$ is a continuous function and $S$ is a compact subset of $V$. It follows from a basic theorem on continuous functions that $f$ attains a finite maximum on $S$; i.e., there is a $v_1 \in S$ such that
$$f(v) \le f(v_1) \qquad \forall v \in S. \tag{0.14}$$
We now show that $v_1$ is an eigenvector for $A$ and that its eigenvalue is $\lambda = f(v_1)$. For arbitrary $v \in V$ and $t \in \mathbb{R}$ (with $v_1 + tv \ne 0$), the inequality (0.14) implies
$$f\!\left( \frac{v_1 + tv}{\|v_1 + tv\|} \right) \le f(v_1), \tag{0.15}$$
which can be re-expressed as
$$\langle A(v_1 + tv),\, v_1 + tv \rangle - f(v_1)\, \langle v_1 + tv,\, v_1 + tv \rangle \le 0. \tag{0.16}$$
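The f.t.a. route can be illustrated numerically: for a Hermitian matrix, the roots of the characteristic polynomial (the candidates $z_0$) all come out real. A minimal sketch, with a test matrix of my own choosing:

```python
import numpy as np

# A complex Hermitian matrix: A equals its own conjugate transpose.
A = np.array([[1.0, 2.0 - 1.0j],
              [2.0 + 1.0j, 5.0]])
assert np.allclose(A, A.conj().T)

# np.poly gives the coefficients of det(z*I - A), which has the same
# roots as p(z) = det(A - z*I) in (0.10).
coeffs = np.poly(A)
roots = np.roots(coeffs)

# Each root z0 admits a nonzero solution of (A - z0*I) v = 0; for a
# self-adjoint A these roots (the eigenvalues) are real.
print(np.allclose(roots.imag, 0))  # True
```

This is only an illustration of the statement, of course; the f.t.a. is what guarantees a root exists in the first place.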

Defining $g_v(t)$ to be the function on the left-hand side of (0.16), we have
$$g_v(t) \le 0 \ \text{ and } \ g_v(0) = 0 \qquad \forall v \in V,\; t \in \mathbb{R}. \tag{0.17}$$
It follows that $g_v'(0) = 0$.² This gives
$$0 = g_v'(0) = \langle Av, v_1 \rangle + \langle Av_1, v \rangle - f(v_1)\, \langle v, v_1 \rangle - f(v_1)\, \langle v_1, v \rangle. \tag{0.18}$$
When $v$ is orthogonal to $v_1$ (i.e., $\langle v, v_1 \rangle = 0$) this becomes
$$0 = \langle Av, v_1 \rangle + \langle Av_1, v \rangle = \overline{\langle Av_1, v \rangle} + \langle Av_1, v \rangle = 2\,\mathrm{Re}\, \langle Av_1, v \rangle, \tag{0.19}$$
where we have used $\langle Av, v_1 \rangle = \langle v, Av_1 \rangle = \overline{\langle Av_1, v \rangle}$. If the vector space $V$ is complex, then by replacing $v$ by $iv$ in this argument we find from (0.19) that the imaginary part of $\langle Av_1, v \rangle$ also vanishes. Thus we have shown that
$$\langle Av_1, v \rangle = 0 \qquad \forall v \in V \text{ with } \langle v, v_1 \rangle = 0. \tag{0.20}$$
This implies that $Av_1$ has no component orthogonal to $v_1$ and therefore must be proportional to $v_1$, i.e.,
$$A v_1 = \lambda v_1 \tag{0.21}$$
for some scalar $\lambda$. Thus $v_1$ is an eigenvector for $A$, as claimed. To see that its eigenvalue $\lambda$ is real we note that, since $\|v_1\| = 1$,
$$\lambda = \langle Av_1, v_1 \rangle = f(v_1), \tag{0.22}$$
which, as noted earlier, is real. This completes the alternative way to do Step 1.

Spectral Theorem for Hermitian matrices

We now specialize to the case where the linear operator is a matrix $A = (A_{ij})$ on the complex vector space $\mathbb{C}^n$ (or real vector space $\mathbb{R}^n$) equipped with the standard inner product. Recall that the adjoint of the matrix $A$ (also called the conjugate transpose or Hermitian transpose) is $A^* = (A^*_{ij})$ where
$$A^*_{ij} = \overline{A_{ji}}. \tag{0.23}$$
The matrix is called Hermitian when it is its own adjoint, i.e., when
$$\overline{A_{ji}} = A_{ij} \qquad \forall i, j = 1, \dots, n. \tag{0.24}$$

² This is easy to see from (0.17) after recalling that $g_v'(0)$ gives the slope of the tangent to the graph of $g_v(t)$ at $t = 0$: since $t = 0$ is a maximum, that tangent is horizontal.
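The variational argument can be illustrated numerically: for a self-adjoint $A$, the function $f(v) = \langle Av, v \rangle$ on the unit sphere never exceeds the largest eigenvalue, and a top eigenvector attains that value. A minimal sketch (the random test matrix and sample size are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random real symmetric matrix, i.e. self-adjoint on R^3.
B = rng.standard_normal((3, 3))
A = (B + B.T) / 2

eigvals, eigvecs = np.linalg.eigh(A)  # eigenvalues in ascending order
lam_max = eigvals[-1]
v1 = eigvecs[:, -1]                   # unit eigenvector for lam_max

# Sample f(v) = <Av, v> at many random unit vectors: it never exceeds
# lam_max, while the top eigenvector attains it, f(v1) = lam_max.
vs = rng.standard_normal((10000, 3))
vs /= np.linalg.norm(vs, axis=1, keepdims=True)
f_vals = np.einsum('ij,jk,ik->i', vs, A, vs)

print(f_vals.max() <= lam_max + 1e-12)   # True
print(np.isclose(v1 @ A @ v1, lam_max))  # True
```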

In the case of a real matrix this becomes $A_{ji} = A_{ij}$, in which case the matrix is called symmetric. The spectral theorem for self-adjoint linear operators corresponds to the following theorem for complex Hermitian (or real symmetric) matrices. It is very useful in both pure and applied mathematics, having important and sometimes surprising applications to a wide range of problems. (Several of these will be given in the Exercises.)

Spectral theorem for complex Hermitian (or real symmetric) matrices: Every complex Hermitian (or real symmetric) matrix $A$ can be diagonalized by a unitary transformation. Specifically,
$$U^* A U = D, \tag{0.25}$$
where $D$ is a diagonal matrix with real entries and $U$ is a unitary matrix, i.e., $U^* = U^{-1}$. The matrices $U$ and $D$ are obtained as follows. Let $v_1, \dots, v_n$ be an o.n.b. of eigenvectors for $A$ (whose existence is guaranteed by the previous Spectral Theorem), and $\lambda_1, \dots, \lambda_n$ the corresponding eigenvalues. Then (0.25) holds with $U$ being the matrix whose $j$th column is $v_j$, and $D$ the diagonal matrix whose $j$th diagonal entry is $\lambda_j$, for all $j = 1, \dots, n$.

Proof. From the definition of $U$ it is straightforward to show that
$$v_j = U e_j \quad \text{and} \quad e_j = U^* v_j. \tag{0.26}$$
From this it readily follows that
$$U^* U = \mathbf{1}, \tag{0.27}$$
showing that $U$ is unitary as claimed. (Exercise: verify (0.26) and (0.27).) Now, using the notation $U = (v_1, \dots, v_n)$ to indicate that the $j$th column of $U$ is $v_j$, we have
$$AU = (A v_1, \dots, A v_n) = (\lambda_1 v_1, \dots, \lambda_n v_n) \tag{0.28}$$
and hence
$$U^* A U = (\lambda_1 U^* v_1, \dots, \lambda_n U^* v_n) = (\lambda_1 e_1, \dots, \lambda_n e_n) = D, \tag{0.29}$$
where (0.26) was used to obtain the second equality. This completes the proof.

Note that the matrix spectral theorem can be re-expressed as
$$A = U D U^*. \tag{0.30}$$
Since $U$ is unitary it follows that
$$A^p = U D^p U^* \tag{0.31}$$
for any power $p$. (Exercise: show this.)
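As a concrete illustration of the matrix spectral theorem, NumPy's `eigh` produces exactly such a pair $U$, $D$ for a Hermitian input; the example matrix below is my own:

```python
import numpy as np

# A complex Hermitian matrix (equal to its own conjugate transpose).
A = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
assert np.allclose(A, A.conj().T)

# eigh returns real eigenvalues and a matrix U whose j-th column is
# an eigenvector for the j-th eigenvalue.
lam, U = np.linalg.eigh(A)

# U is unitary: U* U = 1, as in (0.27).
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True

# U* A U = D, the real diagonal matrix of eigenvalues, as in (0.25).
D = U.conj().T @ A @ U
print(np.allclose(D, np.diag(lam)))  # True

# Equivalently A = U D U*, and powers reduce to powers of eigenvalues:
# A^p = U D^p U*, as in (0.30)-(0.31).
print(np.allclose(np.linalg.matrix_power(A, 3),
                  U @ np.diag(lam**3) @ U.conj().T))  # True
```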