
LECTURE VI: SELF-ADJOINT AND UNITARY OPERATORS
MAT 204 - FALL 2006, PRINCETON UNIVERSITY
ALFONSO SORRENTINO

1. Adjoint of a linear operator

Note: In these notes, $V$ will denote an $n$-dimensional euclidean vector space and we will denote its inner product by $\langle \cdot,\cdot \rangle$.

Definition. Let $(V, \langle \cdot,\cdot \rangle)$ be an $n$-dimensional euclidean vector space and $T: V \to V$ a linear operator. We call the adjoint of $T$ the linear operator $S: V \to V$ such that
$$\langle T(u), v \rangle = \langle u, S(v) \rangle \qquad \text{for all } u, v \in V.$$

Proposition 1. Let $(V, \langle \cdot,\cdot \rangle)$ be an $n$-dimensional euclidean vector space and $T: V \to V$ a linear operator. The adjoint of $T$ exists and is unique. Moreover, if $E$ denotes an orthonormal basis of $V$ (with respect to $\langle \cdot,\cdot \rangle$) and $T$ has matrix $B$ with respect to $E$ (i.e., $T(E) = EB$), then the adjoint of $T$ is the linear operator $S: V \to V$ that has matrix $B^T$ with respect to $E$. For this reason, the adjoint of $T$ is sometimes called the transpose operator of $T$ and denoted $T^T$.

Proof. Let $E$ be an arbitrary basis of $V$, let $T(E) = EB$ with $B \in M_n(\mathbb{R})$, and let $A$ be the matrix of $\langle \cdot,\cdot \rangle$ (w.r.t. $E$). Therefore
$$\langle u, v \rangle = \langle Ex, Ey \rangle = x^T A y \qquad \text{for all } u = Ex,\; v = Ey \in V$$
[observe that $A \in GL_n(\mathbb{R})$, since $\langle \cdot,\cdot \rangle$ is an inner product]. Let us denote by $S: V \to V$ an arbitrary linear operator with matrix $C \in M_n(\mathbb{R})$ (w.r.t. $E$), i.e., $S(E) = EC$. For any $u, v \in V$ we have:
$$\langle T(u), v \rangle = \langle T(Ex), Ey \rangle = \langle EBx, Ey \rangle = (Bx)^T A y = x^T (B^T A) y,$$
$$\langle u, S(v) \rangle = \langle Ex, S(Ey) \rangle = \langle Ex, ECy \rangle = x^T A (Cy) = x^T (AC) y.$$
Hence:
$$S \text{ is the adjoint of } T \iff B^T A = AC \iff C = A^{-1} B^T A.$$
Therefore the adjoint of $T$ exists and is unique (in fact, its matrix w.r.t. $E$ is uniquely determined by $A$ and $B$). Moreover, if $E$ is orthonormal, then $A = I_n$ and $C = B^T$.

Definition. Let $(V, \langle \cdot,\cdot \rangle)$ be an $n$-dimensional euclidean vector space and $T: V \to V$ a linear operator. $T$ is said to be self-adjoint [or symmetric] if $T = T^T$; i.e.,
$$\langle T(u), v \rangle = \langle u, T(v) \rangle \qquad \text{for all } u, v \in V.$$

Remark. Let $T: V \to V$ be a linear operator and $E$ an orthonormal basis of $V$. If $T(E) = EB$, then it follows from the previous proposition that
$$T \text{ is self-adjoint} \iff B = B^T.$$
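The recipe in the proof of Proposition 1 is easy to check numerically. The following NumPy sketch (not taken from these notes; the matrices $A$ and $B$ are arbitrary choices) builds $C = A^{-1} B^T A$ and verifies the defining identity $\langle T(u), v \rangle = \langle u, S(v) \rangle$ in coordinates.

import numpy as np

# Hypothetical data: the Gram matrix A of an inner product (symmetric,
# positive definite) and the matrix B of an operator T, both w.r.t. a
# fixed, not necessarily orthonormal, basis E.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])        # <Ex, Ey> = x^T A y
B = np.array([[1.0, 2.0],
              [0.0, 3.0]])        # T(E) = E B

# Matrix of the adjoint S, as in Proposition 1: C = A^{-1} B^T A.
C = np.linalg.solve(A, B.T @ A)

# Check <T(u), v> = <u, S(v)> on random coordinate vectors.
rng = np.random.default_rng(0)
x, y = rng.standard_normal(2), rng.standard_normal(2)
lhs = (B @ x) @ A @ y             # <T(u), v> = x^T (B^T A) y
rhs = x @ A @ (C @ y)             # <u, S(v)> = x^T (A C) y
assert np.isclose(lhs, rhs)

# If E is orthonormal (A = I), the formula reduces to C = B^T.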

Hence, self-adjoint operators on $V$ (with $\dim V = n$) are in 1-1 correspondence with symmetric matrices in $M_n(\mathbb{R})$. In particular, they form a vector subspace of $\mathrm{End}(V)$ that is isomorphic to the vector subspace of symmetric matrices in $M_n(\mathbb{R})$. Notice that if $E$ is NOT orthonormal, then a self-adjoint operator might not have a symmetric matrix (w.r.t. that basis); vice versa, an operator with a symmetric matrix (w.r.t. a non-orthonormal basis) might not be self-adjoint.

Example. Let $V$ be a two-dimensional euclidean vector space, with inner product $\langle \cdot,\cdot \rangle$ defined by
$$A = \begin{pmatrix} 2 & -1 \\ -1 & 1 \end{pmatrix}$$
with respect to a fixed basis $E$. Let $T: V \to V$ be the linear operator defined by $T(E) = EB$, where
$$B = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}.$$
Let us verify that $T$ is not self-adjoint (although $B$ is symmetric) and find the matrix of its adjoint operator $S$ (w.r.t. $E$). Observe that, if $T$ were self-adjoint, we would have
$$\langle T(Ex), Ey \rangle = \langle Ex, T(Ey) \rangle \qquad \text{for all } Ex, Ey \in V,$$
from which
$$x^T (B^T A) y = x^T (AB) y \qquad \text{for all } x, y \in M_{n,1}(\mathbb{R}).$$
Therefore, we should have $B^T A = AB$, but:
$$B^T A = \begin{pmatrix} 0 & 1 \\ 3 & -1 \end{pmatrix} \neq \begin{pmatrix} 0 & 3 \\ 1 & -1 \end{pmatrix} = AB.$$
The adjoint of $T$ is defined by:
$$S(E) = EC \quad \text{with} \quad C = A^{-1} B^T A = \begin{pmatrix} 3 & 0 \\ 6 & -1 \end{pmatrix}.$$

2. Spectral theorem for self-adjoint operators

The main theorem of this section states that if $T$ is a self-adjoint operator on a euclidean vector space, then there exists an orthonormal basis of $V$ formed by eigenvectors of $T$.

Theorem 1 (Spectral theorem for self-adjoint operators). Let $V$ be an $n$-dimensional euclidean vector space and $T: V \to V$ a self-adjoint linear operator. Then, there exists an orthonormal basis of $V$ formed by eigenvectors of $T$.

The proof of this theorem is based on the following preliminary results.

Proposition 2. Let $A \in M_n(\mathbb{R})$ be a symmetric matrix. Then, its characteristic polynomial $P = P_A$ has $n$ real roots (each counted with its algebraic multiplicity); hence, $P$ can be factorized into the product of $n$ linear polynomials in $\mathbb{R}[x]$.

Proof. From the Fundamental Theorem of Algebra, it follows that $P$ can always be linearly factorized in $\mathbb{C}[x]$. We only need to show that each root $\lambda \in \mathbb{C}$ of $P$ is indeed a real number, i.e., $\lambda \in \mathbb{R}$. Let $T: \mathbb{C}^n \to \mathbb{C}^n$ be the linear operator with matrix $A$ with respect to the canonical basis $E$ of $\mathbb{C}^n$. Since $\lambda$ is an eigenvalue of $T$, there exists a non-zero vector $z$ such that $T(z) = \lambda z$, i.e.,
$$(1) \qquad Az = \lambda z.$$

Let us recall some definitions and simple results about complex numbers. For any $\alpha = x + iy \in \mathbb{C}$, we define its conjugate as $\bar\alpha = x - iy$. In particular, the following hold:
- $\alpha\bar\alpha = x^2 + y^2$ [it is called the norm of $\alpha$]; $\alpha\bar\alpha \geq 0$ and $\alpha\bar\alpha = 0 \iff \alpha = 0$;
- $\bar{\bar\alpha} = \alpha$;
- $\bar\alpha = \alpha \iff \alpha \in \mathbb{R}$;
- $\overline{\alpha + \beta} = \bar\alpha + \bar\beta$ and $\overline{\alpha\beta} = \bar\alpha \bar\beta$.

Let us consider a non-zero vector $z = (z_1, \dots, z_n)^T \in \mathbb{C}^n$ and its conjugate $\bar z = (\bar z_1, \dots, \bar z_n)^T \in \mathbb{C}^n$. One can easily verify that
$$\bar z^T z = \sum_{i=1}^{n} \bar z_i z_i > 0 \quad \text{[at least one } z_i \text{ is different from zero]};$$
$$\overline{\bar z^T A z} = z^T \bar A \bar z = z^T A \bar z \quad \text{[since } A \text{ is real]}.$$
If we multiply (1) by $\bar z^T$ (on the left), we get
$$\bar z^T A z = \bar z^T \lambda z = \lambda (\bar z^T z),$$
and consequently
$$\lambda = \frac{1}{\bar z^T z} (\bar z^T A z).$$
In order to show that $\lambda \in \mathbb{R}$, it suffices to verify that $\bar z^T A z \in \mathbb{R}$, or equivalently that
$$\overline{\bar z^T A z} = \bar z^T A z.$$
In fact, using that $A$ is symmetric:
$$\overline{\bar z^T A z} = z^T A \bar z = (z^T A \bar z)^T = \bar z^T A^T z = \bar z^T A z.$$

Proposition 3. Let $T$ be a self-adjoint operator on an $n$-dimensional euclidean vector space $V$ and $u$ an eigenvector. Then $T(u^\perp) \subseteq u^\perp$; i.e., $T$ leaves the subspace $u^\perp$ invariant.

Proof. Let $T(u) = \lambda u$. For any $v \in u^\perp$ [i.e., $\langle v, u \rangle = 0$] we have:
$$\langle T(v), u \rangle = \langle v, T(u) \rangle = \langle v, \lambda u \rangle = \lambda \langle v, u \rangle = 0.$$
Therefore, $T(v) \in u^\perp$.

We can now prove the Spectral Theorem.

Proof [Spectral Theorem]. We proceed by induction on $n = \dim(V)$. If $\dim V = 1$, then the assertion is evident (it is sufficient to choose any basis $E = (e_1)$ with $\|e_1\| = 1$). Suppose that $n \geq 2$ and assume that the theorem is true for self-adjoint operators on euclidean vector spaces of dimension $n - 1$. According to Proposition 2, $T$ has at least one real eigenvalue $\lambda$; call $e_1$ one of the corresponding eigenvectors. We can assume that $\|e_1\| = 1$ (otherwise, we just divide this vector by its norm).

One can observe that $e_1^\perp$ is a euclidean vector subspace of $V$ of dimension $n - 1$ (see Lecture V, §2). From Proposition 3, $T(e_1^\perp) \subseteq e_1^\perp$; therefore we can restrict $T$ to the subspace $e_1^\perp$. We obtain a new linear operator $T': e_1^\perp \to e_1^\perp$, which is still self-adjoint (since it acts like $T$ on the vectors in $e_1^\perp$). Using the inductive hypothesis, $T'$ has an orthonormal basis $\{e_2, \dots, e_n\}$ of $e_1^\perp$ formed by eigenvectors. Since $e_1 \perp e_i$ (for all $i = 2, \dots, n$), the vectors $e_1, \dots, e_n$ are pairwise orthogonal and therefore linearly independent. These vectors form the desired basis.

Remark. Let $T$ be a self-adjoint operator and $F$ an orthonormal basis of eigenvectors of $T$. We want to point out that, with respect to this basis, $T$ is represented by a diagonal matrix:
$$D = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix},$$
where $\lambda_1, \dots, \lambda_n$ are the eigenvalues of $T$. They are not necessarily distinct: each of them appears $h_{\lambda_i}$ times on the diagonal (where $h_{\lambda_i}$ is its algebraic multiplicity = geometric multiplicity).

Proposition 4. Let $T: V \to V$ be a self-adjoint operator. If $u, v$ are eigenvectors corresponding to distinct eigenvalues, then they are orthogonal. Therefore, the eigenspaces of $T$ are pairwise orthogonal.

Proof. Let $T(u) = \lambda u$ and $T(v) = \mu v$, with $\lambda, \mu \in \mathbb{R}$ and $\lambda \neq \mu$. We have:
$$\langle T(u), v \rangle = \langle \lambda u, v \rangle = \lambda \langle u, v \rangle \quad \text{and} \quad \langle u, T(v) \rangle = \langle u, \mu v \rangle = \mu \langle u, v \rangle.$$
Since $\langle T(u), v \rangle = \langle u, T(v) \rangle$, then $\lambda \langle u, v \rangle = \mu \langle u, v \rangle$ and therefore (since $\lambda \neq \mu$) $\langle u, v \rangle = 0$. It follows also that $E_\lambda \subseteq E_\mu^\perp$ and $E_\mu \subseteq E_\lambda^\perp$ (where $E_\lambda$ and $E_\mu$ are the associated eigenspaces).

Remark. From the previous proposition, it follows that in order to compute an orthonormal basis $F$ of eigenvectors of a self-adjoint operator, it is enough to find a basis of each eigenspace $E_\lambda$ and orthonormalize it (using Gram-Schmidt, for instance). The union of such bases provides the desired one.
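In practice, the computation described in this remark is what a routine such as numpy.linalg.eigh performs for a real symmetric matrix: it returns real eigenvalues together with an orthonormal matrix of eigenvectors. A minimal sketch, with an arbitrarily chosen symmetric test matrix:

import numpy as np

# An arbitrary real symmetric matrix: it represents a self-adjoint
# operator w.r.t. an orthonormal basis.
A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

# eigh is specialized to symmetric (Hermitian) matrices.
eigvals, Q = np.linalg.eigh(A)

# The eigenvalues are real and the eigenvector matrix Q is orthogonal,
# so Q^T A Q is the diagonal matrix D of the Spectral Theorem.
assert np.allclose(Q.T @ Q, np.eye(3))
assert np.allclose(Q.T @ A @ Q, np.diag(eigvals))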

Example. In $\mathbb{R}^4$ with the canonical inner product, consider the linear operator $T$ defined (w.r.t. the canonical basis $E$ of $\mathbb{R}^4$) by:
$$A = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \end{pmatrix}.$$
Determine an orthonormal basis $F$ of eigenvectors of $T$ and write the matrix of $T$ with respect to $F$.

Solution: $T$ has characteristic polynomial $P = (x - 1)^2 (x + 1)^2$ and therefore its spectrum is $\Lambda(T) = \{1, -1\}$, with multiplicities $h_1 = 2$ and $h_{-1} = 2$. The eigenspace $E_1$ has equations:
$$\begin{cases} -2x_2 = 0 \\ -x_3 + x_4 = 0 \end{cases}$$
and therefore: $E_1 = \langle (1, 0, 0, 0), (0, 0, 1, 1) \rangle$. This basis is already orthogonal, but we need to normalize the vectors, dividing by their norms:
$$E_1 = \left\langle (1, 0, 0, 0), \left(0, 0, \tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}\right) \right\rangle.$$
The eigenspace $E_{-1}$ has equations:
$$\begin{cases} 2x_1 = 0 \\ x_3 + x_4 = 0 \end{cases}$$
and therefore: $E_{-1} = \langle (0, 1, 0, 0), (0, 0, 1, -1) \rangle$. This basis is already orthogonal, but we need to normalize the vectors, dividing by their norms:
$$E_{-1} = \left\langle (0, 1, 0, 0), \left(0, 0, \tfrac{1}{\sqrt{2}}, -\tfrac{1}{\sqrt{2}}\right) \right\rangle.$$
Concluding, an orthonormal basis $F$ for $T$ is given by:
$$F = EC, \quad \text{where} \quad C = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & \tfrac{1}{\sqrt{2}} & 0 & \tfrac{1}{\sqrt{2}} \\ 0 & \tfrac{1}{\sqrt{2}} & 0 & -\tfrac{1}{\sqrt{2}} \end{pmatrix} \in O_4(\mathbb{R}).$$
The matrix of $T$ with respect to this basis is:
$$T(F) = FD, \quad \text{where} \quad D = C^T A C = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & -1 \end{pmatrix}.$$

3. Unitary operators

Definition. Let $(V, \langle \cdot,\cdot \rangle)$ be an $n$-dimensional euclidean vector space and $T: V \to V$ a linear operator. We say that $T$ is unitary if:
$$\langle T(u), T(v) \rangle = \langle u, v \rangle \qquad \text{for all } u, v \in V$$
[i.e., $T$ preserves the inner product $\langle \cdot,\cdot \rangle$ on $V$].

Proposition 5. Let $T$ be a linear operator on $(V, \langle \cdot,\cdot \rangle)$. We have:
$$T \text{ is unitary} \iff T \text{ is invertible and } T^{-1} = T^T.$$

Proof. ($\Rightarrow$) It is sufficient to verify that $T^T \circ T = \mathrm{Id}$, which is equivalent to $T^T(T(v)) = v$ for all $v \in V$. In fact, from the definition above and that of the adjoint of $T$, one can conclude:
$$\langle u, v \rangle = \langle T(u), T(v) \rangle = \langle u, T^T(T(v)) \rangle;$$
therefore,
$$\langle u, v - T^T(T(v)) \rangle = 0 \qquad \text{for all } u \in V.$$
We can deduce from this (using the non-degeneracy of the inner product) that $T^T(T(v)) = v$ for any $v \in V$.
($\Leftarrow$) We have that, for any $u, v \in V$:
$$\langle T(u), T(v) \rangle = \langle u, T^T(T(v)) \rangle = \langle u, T^{-1}(T(v)) \rangle = \langle u, v \rangle.$$
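Proposition 5 can also be checked numerically: for the matrix of a unitary operator in an orthonormal basis, the inverse coincides with the transpose and the canonical inner product is preserved. A minimal sketch (not part of the original notes), using a plane rotation by an arbitrary angle as the test operator:

import numpy as np

# Rotation of R^2 by an arbitrary angle t, w.r.t. the canonical
# (orthonormal) basis; this is a unitary operator.
t = 0.7
B = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

# Proposition 5: T is unitary  <=>  T is invertible and T^{-1} = T^T.
assert np.allclose(np.linalg.inv(B), B.T)

# Definition: the inner product is preserved.
rng = np.random.default_rng(1)
u, v = rng.standard_normal(2), rng.standard_normal(2)
assert np.isclose((B @ u) @ (B @ v), u @ v)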

Let us now try to deduce some information about the matrices of these unitary operators. Let $E$ be a basis for $(V, \langle \cdot,\cdot \rangle)$ and suppose that $\langle Ex, Ey \rangle = x^T A y$. If $T$ is a linear operator on $V$ such that $T(E) = EB$, then:
$$T \text{ is unitary} \iff \langle T(Ex), T(Ey) \rangle = \langle Ex, Ey \rangle \text{ for all } Ex, Ey \in V$$
$$\iff \langle EBx, EBy \rangle = \langle Ex, Ey \rangle \text{ for all } Ex, Ey \in V$$
$$\iff (Bx)^T A (By) = x^T A y \text{ for all } x, y \in M_{n,1}(\mathbb{R})$$
$$\iff x^T (B^T A B) y = x^T A y \text{ for all } x, y \in M_{n,1}(\mathbb{R})$$
$$\iff B^T A B = A.$$

Corollary 1. Let $(V, \langle \cdot,\cdot \rangle)$ be an $n$-dimensional euclidean vector space with an orthonormal basis $E$, and let $T: V \to V$ be a linear operator such that $T(E) = EB$. Then,
$$T \text{ is unitary} \iff B \in O_n(\mathbb{R}) \text{ [i.e., } B \text{ is an orthogonal matrix]}.$$

Proof. By what was observed above, $T$ is unitary if and only if $B^T A B = A$. Since $E$ is orthonormal, $A = I_n$ and we get:
$$T \text{ is unitary} \iff B^T B = I_n \iff B \in O_n(\mathbb{R}).$$

Remark.
i) Let $T$ be a unitary operator. If $E$ is an orthonormal basis, then $T(E)$ is also an orthonormal basis (see Lecture V, Prop. 8). Therefore, unitary operators send orthonormal bases into orthonormal bases.
ii) If $T$ is a unitary operator, then $\det T = \pm 1$. In fact, if $T(E) = EB$ with $E$ orthonormal, then $B \in O_n(\mathbb{R})$ and $\det T = \det B = \pm 1$. In particular, unitary operators with $\det T = 1$ are called special unitary operators (or rotations) of $V$.
iii) If $T$ is unitary, then its spectrum $\Lambda_T \subseteq \{1, -1\}$. In fact, if $T(u) = \lambda u$, then:
$$\langle u, u \rangle = \langle T(u), T(u) \rangle = \langle \lambda u, \lambda u \rangle = \lambda^2 \langle u, u \rangle;$$
therefore $\lambda^2 = 1$, which implies $\lambda = \pm 1$.
iv) Let us point out that, in general, a unitary linear operator might not be diagonalizable. Consider, for instance, $\mathbb{R}^2$ with the canonical inner product and the operator defined (w.r.t. the canonical basis) by
$$R_{\pi/2} = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \in SO_2(\mathbb{R}) \subset O_2(\mathbb{R}).$$
This is a unitary operator and it has no (real) eigenvalues, hence it is not diagonalizable.
v) If $T$ is unitary and $\Lambda_T = \{1, -1\}$, then the eigenspaces $E_1$ and $E_{-1}$ are orthogonal to each other. In fact, if $T(u) = u$ and $T(v) = -v$, then:
$$\langle u, v \rangle = \langle T(u), T(v) \rangle = \langle u, -v \rangle = -\langle u, v \rangle;$$
this implies that $2\langle u, v \rangle = 0$ and therefore $\langle u, v \rangle = 0$.
vi) If $T$ is unitary and $u$ is one of its eigenvectors, then $T(u^\perp) \subseteq u^\perp$. In fact, for any $v \in u^\perp$ (remember that $\lambda = \pm 1$):
$$\langle T(v), u \rangle = \frac{1}{\lambda} \langle T(v), \lambda u \rangle = \frac{1}{\lambda} \langle T(v), T(u) \rangle = \frac{1}{\lambda} \langle v, u \rangle = 0;$$
hence, $T(v) \in u^\perp$.

Department of Mathematics, Princeton University
E-mail address: asorrent@math.princeton.edu