Math 408 Advanced Linear Algebra


Chi-Kwong Li

Chapter 4: Hermitian and symmetric matrices

Basic properties

Theorem. Let A ∈ M_n. The following are equivalent.
(a) A is Hermitian, i.e., A = A^*.
(b) x^*Ax ∈ R for all x ∈ C^n.
(c) A is normal with real eigenvalues.

Remark.
(a) Let A ∈ M_n. Then A = H + iG, where H = (A + A^*)/2 and G = (A - A^*)/(2i) are Hermitian. Also, AA^* and A^*A are Hermitian. If A is Hermitian, then A^k is Hermitian for every positive integer k, and A^{-1} is Hermitian if A is invertible.
(b) The set of Hermitian matrices forms a real linear space.
(c) The product of Hermitian matrices may not be Hermitian.
(d) The product of two Hermitian matrices is Hermitian if and only if they commute.
(e) A matrix A is the product of two Hermitian matrices if and only if A is similar to A^* (via a Hermitian matrix), equivalently, A is similar to a real matrix.

Proof. (a) If A is Hermitian, then A = U^*DU for some unitary U and real diagonal matrix D. So A^k = U^*D^kU is Hermitian for positive integers k (and for negative integers if A^{-1} exists).

(b) Let H_n be the set of n × n Hermitian matrices. Then 0 ∈ H_n, and rA + sB ∈ H_n for any A, B ∈ H_n and r, s ∈ R.

(c) Consider A = E_{12} + E_{21} and B = E_{11} - E_{22}; then AB = E_{21} - E_{12} is not Hermitian.

(d) If A, B ∈ H_n commute, then A = U^*D_1U and B = U^*D_2U for some unitary U and real diagonal D_1, D_2. Thus AB = U^*(D_1D_2)U is Hermitian. Conversely, if A, B, AB ∈ H_n, then AB = (AB)^* = B^*A^* = BA.

(e) Suppose A = HK with H, K ∈ H_n. Write H = U^*DU for some unitary U and D = D_1 ⊕ 0_{n-k}, where D_1 is an invertible real diagonal matrix in H_k. Writing UKU^* = (K_{ij}) in the same block form, we get UAU^* = D(UKU^*) = [D_1K_{11} D_1K_{12} ; 0 0] and UA^*U^* = (UKU^*)D = [K_{11}D_1 0 ; K_{21}D_1 0]. Now K_{11}D_1 and D_1K_{11} have the same Jordan blocks for the nonzero eigenvalues, and rank(A^m) = rank((A^*)^m) for all m. Thus A and A^* have the same Jordan form and are similar.

Conversely, suppose AS = SA^* for some invertible S = H + iG with H, G Hermitian. Then AS = SA^* and its conjugate transpose S^*A^* = AS^* give A(H + iG) = (H + iG)A^* and A(H - iG) = (H - iG)A^*. Hence AH = HA^* and AG = GA^*, so AK = KA^* for every K = H + tG with t ∈ R. Since det(H + tG) is a polynomial in t that is nonzero at t = i, it is nonzero for some real t, so we may choose an invertible K = H + tG. Thus A = K(A^*K^{-1}), where (A^*K^{-1})^* = K^{-1}A = A^*K^{-1}, so A is a product of two Hermitian matrices.
Suppose A and A^* are similar. Then they have the same Jordan blocks. Thus the Jordan blocks J_m(λ) and J_m(λ̄), with λ̄ the complex conjugate of λ, occur together in A for any nonreal eigenvalue λ, and each such pair can be combined into a real Jordan block. Thus A is similar to a direct sum of real Jordan blocks, hence to a real matrix.
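The Cartesian decomposition A = H + iG in Remark (a) can be checked numerically. A minimal pure-Python sketch (the helper names `adjoint`, `combine`, `is_hermitian` are mine, not from the notes), using 2 × 2 matrices as nested lists of complex numbers:

```python
# Sanity check (not from the notes): A = H + iG with H = (A + A*)/2 and
# G = (A - A*)/(2i), both Hermitian; matrices are nested lists of complex.

def adjoint(A):
    """Conjugate transpose of a square matrix given as nested lists."""
    n = len(A)
    return [[A[j][i].conjugate() for j in range(n)] for i in range(n)]

def combine(A, B, ca, cb):
    """Entrywise ca*A + cb*B."""
    n = len(A)
    return [[ca * A[i][j] + cb * B[i][j] for j in range(n)] for i in range(n)]

def is_hermitian(A, tol=1e-12):
    As = adjoint(A)
    n = len(A)
    return all(abs(A[i][j] - As[i][j]) < tol for i in range(n) for j in range(n))

A = [[1 + 2j, 3 - 1j],
     [0 + 1j, 4 + 0j]]
H = combine(A, adjoint(A), 0.5, 0.5)          # H = (A + A*)/2
G = combine(A, adjoint(A), 1 / 2j, -1 / 2j)   # G = (A - A*)/(2i)

assert is_hermitian(H) and is_hermitian(G)
R = combine(H, G, 1, 1j)                      # recover A as H + iG
assert all(abs(R[i][j] - A[i][j]) < 1e-12 for i in range(2) for j in range(2))
```

The same helpers also confirm that AA^* and A^*A are Hermitian for any A.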

Suppose S^{-1}AS = R for a real matrix R. Since every matrix is similar to its transpose, R = T^{-1}R^tT for some invertible T. As R is real, R^t = R^* = (S^{-1}AS)^* = S^*A^*(S^*)^{-1}, so R = T^{-1}S^*A^*(S^*)^{-1}T. Thus A and A^* are similar.

Eigenvalue inequalities

Denote by λ_1(A) ≥ ... ≥ λ_n(A) the eigenvalues of a Hermitian matrix A ∈ M_n.

Theorem (Courant-Fischer-Weyl). Suppose A ∈ M_n is Hermitian. Then for k ∈ {1, ..., n},

λ_k(A) = max_{dim W = k} min_{v ∈ W, v^*v = 1} v^*Av = min_{dim W = n-k+1} max_{v ∈ W, v^*v = 1} v^*Av.

Proof. For simplicity, write λ_j = λ_j(A) for all j. Let {u_1, ..., u_n} be a set of orthonormal eigenvectors of A with Au_j = λ_ju_j. Any subspace W of dimension k contains a unit vector v in the span of {u_k, ..., u_n}: say v = Σ_{j=k}^n μ_ju_j with Σ_{j=k}^n |μ_j|^2 = 1, so that

v^*Av = Σ_{j=k}^n |μ_j|^2 λ_j ≤ Σ_{j=k}^n |μ_j|^2 λ_k = λ_k.

On the other hand, let W be the span of u_1, ..., u_k. Any unit vector v ∈ W has the form v = Σ_{j=1}^k γ_ju_j, so that

v^*Av = Σ_{j=1}^k |γ_j|^2 λ_j ≥ Σ_{j=1}^k |γ_j|^2 λ_k = λ_k,

with equality when v = u_k. This proves the first equality; the second follows by applying the first to -A. The proof is complete.

Corollary (Rayleigh principle). Let A ∈ M_n be Hermitian. For any unit vector x ∈ C^n,

λ_n(A) ≤ x^*Ax ≤ λ_1(A).

Corollary (Interlacing inequalities). Let A ∈ M_n be Hermitian with eigenvalues a_1 ≥ ... ≥ a_n, and let B ∈ M_{n-1} be a principal submatrix of A with eigenvalues b_1 ≥ ... ≥ b_{n-1}. Then

a_1 ≥ b_1 ≥ a_2 ≥ b_2 ≥ ... ≥ a_{n-1} ≥ b_{n-1} ≥ a_n.

Theorem (Fan-Pall). Suppose a_1, ..., a_n and b_1, ..., b_{n-1} satisfy the interlacing inequalities. Then there is a Hermitian matrix A ∈ M_n with a principal submatrix B ∈ M_{n-1} such that a_1, ..., a_n and b_1, ..., b_{n-1} are their respective eigenvalues.

Proof. By the discussion in class, we may assume that a_1 > b_1 > a_2 > ... > a_{n-1} > b_{n-1} > a_n. Let D = diag(a_1, ..., a_n). We show that there is a unitary U such that if we delete the first row and first column of the matrix A = UDU^*, we get a matrix B ∈ H_{n-1} with eigenvalues b_1 > ... > b_{n-1}. To see this, note that for t ∉ {a_1, ..., a_n}, the matrix tI - A = U(tI - D)U^* is invertible with inverse

(tI - A)^{-1} = U(tI - D)^{-1}U^*.   (1)

Recall that the (1,1) entry of an invertible matrix C equals det(C(1,1))/det(C) by the adjugate (classical adjoint) formula, where C(1,1) is obtained from C by removing its first row and first column. So the (1,1) entry of (tI - A)^{-1} equals det(tI - B)/det(tI - A), as B is the matrix obtained from A by removing the first row and the first column. Now consider the right side of (1). If the first row of U equals (x_1, ..., x_n), then the (1,1) entry of the matrix is

f(t) = |x_1|^2/(t - a_1) + ... + |x_n|^2/(t - a_n).

Equating the two expressions for the (1,1) entry of (tI - A)^{-1}, we have

det(tI - B) = [Σ_j |x_j|^2/(t - a_j)] det(tI - A) = Σ_j |x_j|^2 Π_{k≠j} (t - a_k).

Homework 1.
(a) Show that for all i = 1, ..., n, the value w_i = Π_{j=1}^{n-1}(a_i - b_j) / Π_{j≠i}(a_i - a_j) is positive.
By our assumption that a_1 > b_1 > a_2 > ... > b_{n-1} > a_n, for j ≠ i we have a_i > b_j if and only if a_i > a_j. Thus the numerator and the denominator have the same number of negative factors, and the quotient is positive.
(b) If we choose x_i = √w_i for all i = 1, ..., n, show that

Σ_{j=1}^n |x_j|^2 Π_{k≠j}(t - a_k) = Π_{j=1}^{n-1}(t - b_j)

for t = a_1, ..., a_n, and hence the two polynomials in t are identical.
If we substitute t = a_i on the left-hand side, all the summands vanish except the one involving |x_i|^2, and by the definition of x_i the expression reduces to Π_{j=1}^{n-1}(a_i - b_j), the value of the right-hand side at t = a_i. Since both sides are polynomials of degree n - 1 that agree at the n points a_1, ..., a_n, the two polynomials are identical.
(c) Show that if we choose a unitary U whose first row x = (x_1, ..., x_n) satisfies (b), then A = UDU^* has a principal submatrix with eigenvalues b_1 > ... > b_{n-1}, as asserted.
(Comparing leading coefficients in (b) gives Σ_j w_j = 1, so x is a unit vector and can indeed be the first row of a unitary U.) By parts (a) and (b) and the previous discussion, such a choice of U yields a submatrix B of A = UDU^* with det(tI - B) = Π_{j=1}^{n-1}(t - b_j). Thus B has the desired eigenvalues.

Definition. Let c = (c_1, ..., c_n) and d = (d_1, ..., d_n) be real vectors.
We say that c is majorized by d, denoted c ≺ d, if the sum of the k largest entries of c is not larger than that of d for k ∈ {1, ..., n-1}, and the sum of all entries of c is the same as that of d.

Theorem. Suppose a = (a_1, ..., a_n) and d = (d_1, ..., d_n) are real vectors. Then there is a Hermitian matrix in M_n with eigenvalues a_1, ..., a_n and diagonal entries d_1, ..., d_n if and only if d ≺ a.
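The majorization condition is easy to test numerically. A small pure-Python sketch (the helper names `majorized` and `eig2` are mine, not from the notes): it verifies that the diagonal (2, 2) of the symmetric matrix [[2, 1], [1, 2]] is majorized by its eigenvalue vector (3, 1), as the theorem requires.

```python
# Check of the majorization condition d ≺ a (not part of the notes):
# the diagonal of a Hermitian matrix is majorized by its eigenvalues.
import math

def majorized(c, d, tol=1e-12):
    """True if c is majorized by d: partial sums of the decreasingly
    sorted entries of c never exceed those of d, and totals agree."""
    cs = sorted(c, reverse=True)
    ds = sorted(d, reverse=True)
    pc = pd = 0.0
    for k in range(len(c) - 1):
        pc += cs[k]
        pd += ds[k]
        if pc > pd + tol:
            return False
    return abs(sum(cs) - sum(ds)) < tol

def eig2(a, b, c):
    """Eigenvalues of the symmetric matrix [[a, b], [b, c]], largest first."""
    m, r = (a + c) / 2, math.hypot((a - c) / 2, b)
    return (m + r, m - r)

diag = (2.0, 2.0)
eigs = eig2(2.0, 1.0, 2.0)      # eigenvalues of [[2, 1], [1, 2]]: (3.0, 1.0)
assert majorized(diag, eigs)    # (2, 2) ≺ (3, 1)
assert not majorized(eigs, diag)
```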

Homework 2.
(a) Suppose n = 2. If (d_1, d_2) ≺ (a_1, a_2), show that there is a unit vector v ∈ R^2 such that v^tDv = d_1, where D = diag(a_1, a_2). Furthermore, if P is an orthogonal matrix with first column equal to v, then P^tDP has diagonal entries d_1, d_2.
Note that a_2 ≤ d_1 ≤ a_1. Let v^t = (cos θ, sin θ). Then f(θ) = v^tDv = a_1 cos^2 θ + a_2 sin^2 θ. Now f(0) = a_1 ≥ d_1 and f(π/2) = a_2 ≤ d_1. By continuity, there is θ ∈ [0, π/2] such that f(θ) = d_1. Let w^t = (-sin θ, cos θ). Then P = [v w] is orthogonal and P^tDP has (1,1) entry d_1. Since a_1 + a_2 = d_1 + d_2, the trace condition shows that the (2,2) entry of P^tDP is d_2.
(b) Suppose n ≥ 3, and suppose (d_1, ..., d_n) ≺ (a_1, ..., a_n), where a_1 ≥ ... ≥ a_n and d_1 ≥ ... ≥ d_n. Let k be the largest integer such that a_k ≥ d_1.
(b.1) If a_1 ≠ a_n, show that k < n and a_k ≥ d_1 > a_{k+1}.
If k = n, then a_1 ≥ ... ≥ a_n ≥ d_1 ≥ ... ≥ d_n, and a_1 + ... + a_n = d_1 + ... + d_n implies that a_1 = ... = a_n = d_1 = ... = d_n, which is a contradiction. Thus k < n, and a_k ≥ d_1 > a_{k+1} by the choice of k.
(b.2) Show that d̃ = (d_2, ..., d_n) is majorized by ã = (a_1, ..., a_{k-1}, ã_{k+1}, a_{k+2}, ..., a_n), where ã_{k+1} = a_k + a_{k+1} - d_1.
Note that ã is already arranged in decreasing order, since a_{k-1} ≥ a_k > ã_{k+1} = a_{k+1} + (a_k - d_1) ≥ a_{k+1} ≥ a_{k+2}. Consider the sum of the ℓ largest entries of ã.
Case 1. If ℓ < k, this sum is Σ_{j=1}^ℓ a_j ≥ Σ_{j=1}^ℓ d_j ≥ Σ_{j=1}^ℓ d_{j+1}, the sum of the ℓ largest entries of d̃.
Case 2. If ℓ ≥ k, this sum equals Σ_{j=1}^{ℓ+1} a_j - d_1 ≥ Σ_{j=1}^{ℓ+1} d_j - d_1 = Σ_{j=1}^ℓ d_{j+1}, the sum of the ℓ largest entries of d̃, by the original majorization assumption. Finally, the total sums agree: Σ ã = Σ a - d_1 = Σ d - d_1 = Σ d̃.
(b.3) (Optional) Show by induction that one can construct a real symmetric matrix with eigenvalues a_1, ..., a_n and diagonal entries d_1, ..., d_n.
We prove this by induction on n. If n = 2, the result follows from (a). Suppose n > 2. Construct the vectors d̃ and ã as in (b). By the induction hypothesis, there is a unitary (real orthogonal) V ∈ M_{n-1} such that V^*D̃V has diagonal entries d_2, ..., d_n, where D̃ = diag(ã_{k+1}, a_1, ..., a_{k-1}, a_{k+2}, ..., a_n). Also, by (a), there is a unitary (real orthogonal) P ∈ M_2 such that P^* diag(a_k, a_{k+1}) P has diagonal entries d_1 and ã_{k+1}. Let Q = (P ⊕ I_{n-2})([1] ⊕ V).
Then Q^* diag(a_k, a_{k+1}, a_1, ..., a_{k-1}, a_{k+2}, ..., a_n) Q has diagonal entries d_1, ..., d_n.

Theorem (Lidskii). Let A, B ∈ M_n be Hermitian with eigenvalues a_1 ≥ ... ≥ a_n and b_1 ≥ ... ≥ b_n, and suppose C = A + B has eigenvalues c_1 ≥ ... ≥ c_n. Then for any 1 ≤ i_1 < ... < i_m ≤ n,

Σ_{s=1}^m b_{n-s+1} ≤ Σ_{s=1}^m (c_{i_s} - a_{i_s}) ≤ Σ_{s=1}^m b_s.

Corollary (Weyl). Let A, B ∈ M_n be Hermitian and C = A + B. Then for any j, k ∈ {1, ..., n} with j + k - 1 ≤ n, we have

λ_j(A) + λ_k(B) ≥ λ_{j+k-1}(C).
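Weyl's inequality can be sanity-checked on 2 × 2 real symmetric matrices, whose eigenvalues follow from the quadratic formula. A small sketch (not from the notes; matrices here are my own arbitrary examples):

```python
# Numerical illustration of Weyl's inequality
# lambda_j(A) + lambda_k(B) >= lambda_{j+k-1}(A+B) for 2x2 symmetric matrices.
import math

def eig2(a, b, c):
    """Eigenvalues of [[a, b], [b, c]], in decreasing order."""
    m, r = (a + c) / 2, math.hypot((a - c) / 2, b)
    return (m + r, m - r)

A = (1.0, 2.0, -1.0)                     # encodes [[1, 2], [2, -1]]
B = (0.0, 1.0, 3.0)                      # encodes [[0, 1], [1, 3]]
C = tuple(x + y for x, y in zip(A, B))   # A + B, entrywise

la, lb, lc = eig2(*A), eig2(*B), eig2(*C)
# j = k = 1: lambda_1(A) + lambda_1(B) >= lambda_1(A+B)
assert la[0] + lb[0] >= lc[0] - 1e-12
# j = 1, k = 2 (so j + k - 1 = n = 2): lambda_1(A) + lambda_2(B) >= lambda_2(A+B)
assert la[0] + lb[1] >= lc[1] - 1e-12
```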

Proof. Suppose A has eigenvalues a_1 ≥ ... ≥ a_n and B has eigenvalues b_1 ≥ ... ≥ b_n. We may replace (A, B) by (A - a_jI, B - b_kI) and assume that a_j = 0 = b_k. Let A = UD_1U^* and B = VD_2V^*, where U, V ∈ M_n are unitary, D_1 = diag(a_1, ..., a_n) and D_2 = diag(b_1, ..., b_n). Set A_1 = U diag(a_1, ..., a_{j-1}, 0, ..., 0)U^* and B_1 = V diag(b_1, ..., b_{k-1}, 0, ..., 0)V^*. Note that A_1 + B_1 has rank at most j + k - 2, i.e., A_1 + B_1 has at most j + k - 2 nonzero eigenvalues, so λ_{j+k-1}(A_1 + B_1) = 0. Set X = A_1 + B_1, Y = A + B - A_1 - B_1, and Z = X + Y = A + B. By Lidskii's theorem (with m = 1), λ_{j+k-1}(Z) - λ_{j+k-1}(X) ≤ λ_1(Y), so λ_{j+k-1}(A + B) ≤ λ_1(A + B - A_1 - B_1). Next, set X = A - A_1, Y = B - B_1, and Z = X + Y. By Lidskii's theorem again, λ_1(Z) ≤ λ_1(X) + λ_1(Y). Thus

λ_{j+k-1}(A + B) ≤ λ_1(A + B - A_1 - B_1) ≤ λ_1(A - A_1) + λ_1(B - B_1) = λ_j(A) + λ_k(B).

Positive semi-definite matrices

Definition. A matrix A ∈ M_n is positive definite (positive semi-definite) if x^*Ax is positive (nonnegative) for all nonzero x ∈ C^n.

Theorem. Let A ∈ M_n. The following are equivalent.
(a) A is positive definite (semi-definite).
(b) A is normal with positive (nonnegative) eigenvalues.
(c) A = RR^*, where R ∈ M_n with det(R) > 0, and R can be chosen to be in triangular form or positive definite (respectively, R ∈ M_n with det(R) ≥ 0, and R can be chosen to be positive semi-definite).
(d) A is Hermitian with det(A_k) > 0 for k = 1, ..., n, where A_k is the leading k × k principal submatrix. (Respectively, there is a permutation matrix P such that PAP^t = Ã ⊕ 0_{n-m} with det(Ã_j) > 0 for j = 1, ..., m.)

Proof. The equivalence of (a) and (b) was shown in class.
To prove (b) ⇒ (c): Let A = UDU^*, where U is unitary and D = diag(a_1, ..., a_n). Then for √D = diag(√a_1, ..., √a_n), we have A = RR^* with R = U√DU^*. Evidently, R is positive definite. Further, we can write R = LV for some unitary V and lower triangular L, so that A = RR^* = LL^*.
To prove (c) ⇒ (a): Note that for any nonzero x ∈ C^n, the vector y = R^*x is nonzero as R is invertible. Thus x^*Ax = y^*y > 0, and A is positive definite.
To prove (c) ⇒ (d), choose R to be lower triangular in condition (c). Then det(A_k) = det(R_kR_k^*) = det(R_k) det(R_k^*) = |det(R_k)|^2 > 0.
To prove (d) ⇒ (c): by Gaussian elimination, we can choose L_1, obtained by changing the first column of I_n, so that L_1A has zero entries in the (i,1) positions for i = 2, ..., n. Since A is Hermitian and the first row of L_1A is the same as that of A, we have L_1AL_1^* = [a_{11}] ⊕ A_2, where a_{11} = det(A_1) > 0 is the (1,1) entry of A. Now change L_1 to L̃_1 = diag(a_{11}^{-1/2}, 1, ..., 1)L_1, so that L̃_1AL̃_1^* = [1] ⊕ A_2. Applying an induction argument to A_2 (whose leading principal minors are det(A_{k+1})/a_{11} > 0), we see that A_2 = L_2L_2^* for some lower triangular matrix L_2 with positive diagonal entries. Set R = L̃_1^{-1}([1] ⊕ L_2). Then A = RR^*.
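For a real symmetric matrix, the Gaussian-elimination argument in (d) ⇒ (c) is the classical Cholesky factorization A = LL^t. A minimal pure-Python sketch (the helper name `cholesky` is mine), assuming positive leading principal minors:

```python
# Sketch of the (d) => (c) construction for real symmetric A with positive
# leading principal minors: produce lower-triangular L with A = L L^t.
import math

def cholesky(A):
    """Return lower-triangular L with A = L L^t, for real symmetric A
    whose leading principal minors are all positive."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # positive by assumption
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

A = [[4.0, 2.0, 0.0],
     [2.0, 5.0, 3.0],
     [0.0, 3.0, 6.0]]   # leading minors 4, 16, 60 are all positive
L = cholesky(A)
# Verify A = L L^t entrywise.
for i in range(3):
    for j in range(3):
        assert abs(sum(L[i][k] * L[j][k] for k in range(3)) - A[i][j]) < 1e-12
```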

Definition. Let A ∈ M_n. Then AA^* and A^*A have the same collection of nonnegative eigenvalues. The number s_j(A) = λ_j(AA^*)^{1/2} is the jth singular value of A for j = 1, ..., n.

Theorem. Let A ∈ M_n.
(a) (Singular value decomposition) There are unitary matrices U, V such that U^*AV is a diagonal matrix with diagonal entries s_1(A), ..., s_n(A).
(b) (Polar decomposition) There are positive semi-definite matrices P, Q and unitary matrices Ṽ, Ũ such that A = PṼ = ŨQ.

Theorem. Let A ∈ M_n. Then the matrix

[0 A ; A^* 0]

has eigenvalues ±s_1(A), ..., ±s_n(A).

Remark. We can deduce singular value inequalities using the above (Wielandt) matrix.

Congruence

Theorem. Let A ∈ M_n be Hermitian, and let S ∈ M_n be invertible. Then S^*AS and A have the same number of positive and negative eigenvalues. In particular, there is an invertible T ∈ M_n such that T^*AT = I_p ⊕ -I_q ⊕ 0_s.

Theorem. Let A, B ∈ M_n be Hermitian. Then there is an invertible S ∈ M_n such that both S^*AS and S^*BS are in diagonal form if any one of the following is true.
(a) There are real numbers a, b such that aA + bB is positive definite.
(b) The matrix A is invertible and A^{-1}B is similar to a real diagonal matrix.

Symmetric matrices

For real symmetric matrices, the theory of Hermitian matrices applies. For complex symmetric matrices, we have the following results.

Theorem. Let A ∈ M_n. The following are equivalent.
(a) A is symmetric.
(b) There is a unitary U ∈ M_n such that A = UDU^t, where D is a diagonal matrix with nonnegative entries arranged in descending order.
(c) A = SS^t for some S ∈ M_n.

Theorem. Let A ∈ M_n. Then A is normal and symmetric if and only if there is a real orthogonal matrix Q such that QAQ^t is a diagonal matrix.

Theorem. Every matrix A is similar to a symmetric matrix. Consequently, every matrix A is a product of two symmetric matrices.

Theorem. Let A ∈ M_n be symmetric. Then A is diagonalizable if and only if there is a complex orthogonal matrix Q such that Q^tAQ is in diagonal form.
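The singular-value definition can be illustrated numerically for a real 2 × 2 matrix. A sketch (not from the notes; the helper name is mine) computing s_j(A) as square roots of the eigenvalues of A^tA, then checking the identities s_1 s_2 = |det A| and s_1^2 + s_2^2 = ||A||_F^2:

```python
# Companion check (not from the notes): singular values of a real 2x2 matrix
# from the eigenvalues of A^t A, which is symmetric positive semi-definite.
import math

def singular_values_2x2(a, b, c, d):
    """s_1 >= s_2 >= 0 for A = [[a, b], [c, d]]."""
    # A^t A = [[a^2 + c^2, ab + cd], [ab + cd, b^2 + d^2]]
    p, q, r = a*a + c*c, a*b + c*d, b*b + d*d
    m, rad = (p + r) / 2, math.hypot((p - r) / 2, q)
    return math.sqrt(m + rad), math.sqrt(max(m - rad, 0.0))

a, b, c, d = 3.0, 1.0, 1.0, 3.0          # symmetric example with eigenvalues 4, 2
s1, s2 = singular_values_2x2(a, b, c, d)
assert abs(s1 * s2 - abs(a*d - b*c)) < 1e-9            # s_1 s_2 = |det A|
assert abs(s1*s1 + s2*s2 - (a*a + b*b + c*c + d*d)) < 1e-9  # Frobenius identity
```

For this symmetric example the singular values coincide with the absolute values of the eigenvalues, consistent with the spectral theorem.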