Matrix Theory, Math6304 Lecture Notes from September 27, 2012 taken by Tasadduk Chowdhury


Last time (9/25/12):

- QR factorization: any matrix $A \in M_n$ has a QR factorization $A = QR$, where $Q$ is unitary and $R$ is upper triangular. In addition, we proved that if $A$ is non-singular, then $Q$ and $R$ are unique.

- Cholesky factorization: any $B \in M_n$ satisfying $B = A^*A$ for some $A \in M_n$ has the factorization $B = LL^*$, where $L \in M_n$ is lower triangular and non-negative on the diagonal.

- QR algorithm: details and convergence criteria.

Warm-up. If $A = A_0 \in M_n$ is normal and the QR algorithm converges (entry-wise), what can we say about the limiting matrix $A_\infty$?

Convergence implies that there is a unitary $Q_\infty$ and an upper triangular $R_\infty$ such that
$$A_\infty = Q_\infty R_\infty = R_\infty Q_\infty. \qquad (1)$$
From above, we see that $Q_\infty$ and $R_\infty$ commute. Also,
$$A_\infty Q_\infty = Q_\infty R_\infty Q_\infty = Q_\infty A_\infty, \qquad (2)$$
so $Q_\infty$ and $A_\infty$ also commute. Moreover, from (1) we get
$$Q_\infty^* A_\infty = R_\infty = R_\infty Q_\infty Q_\infty^* = A_\infty Q_\infty^*. \qquad (3)$$
By taking adjoints, we get $A_\infty^* Q_\infty^* = Q_\infty^* A_\infty^*$ from (2) and $Q_\infty A_\infty^* = A_\infty^* Q_\infty$ from (3). Thus,
$$R_\infty R_\infty^* = Q_\infty^* A_\infty A_\infty^* Q_\infty = A_\infty Q_\infty^* Q_\infty A_\infty^* \quad\text{(by commutativity)}$$
$$= A_\infty A_\infty^* = A_\infty^* A_\infty \quad\text{(by normality of } A_\infty\text{)}$$
$$= A_\infty^* Q_\infty Q_\infty^* A_\infty = R_\infty^* R_\infty,$$
so $R_\infty$ is normal.

(We claimed above that $A_\infty$ is normal. From last lecture's proposition on the QR algorithm we know that the iterates $A_n$ are all unitarily equivalent. It can be checked that any matrix unitarily equivalent to a normal matrix is also normal: say $X$ and $Y$ are unitarily equivalent with $Y = U^*XU$, and $X$ is normal. Then
$$Y^*Y = (U^*XU)^*(U^*XU) = U^*X^*UU^*XU = U^*X^*XU = U^*XX^*U = U^*XUU^*X^*U = YY^*,$$
and thus $Y$ is normal. Hence, since the $A_n$ are unitarily equivalent to $A_0 = A$, and $A$ is normal, the limiting matrix $A_\infty$ is also normal.)

By normality of $R_\infty$ and the fact that it is triangular, $R_\infty = D_\infty$ with $D_\infty$ a diagonal matrix, and $A_\infty = Q_\infty D_\infty = D_\infty Q_\infty$. Since $\{Q_\infty, D_\infty\}$ is a commuting family in $M_n$, there exists a unitary $U \in M_n$ that diagonalizes both: $U^* D_\infty U = D_\infty$ and $U^* Q_\infty U = \tilde{Q}_\infty$, where $\tilde{Q}_\infty$ is a diagonal matrix. We will denote the $i$-th diagonal entry of $\tilde{Q}_\infty$ by $\omega_i$. Note that $|\omega_i| = 1$, since the eigenvalues of a unitary matrix have modulus $1$, and the eigenvalues of $\tilde{Q}_\infty$ are its diagonal entries. Hence, we obtain the following result:
$$U^* D_\infty Q_\infty U = U^* D_\infty U\, U^* Q_\infty U = D_\infty \tilde{Q}_\infty = \begin{pmatrix} d_{11} & & \\ & \ddots & \\ & & d_{nn} \end{pmatrix} \begin{pmatrix} \omega_1 & & \\ & \ddots & \\ & & \omega_n \end{pmatrix} = \begin{pmatrix} d_{11}\omega_1 & & \\ & \ddots & \\ & & d_{nn}\omega_n \end{pmatrix}.$$
Since the matrix $D_\infty Q_\infty$ commutes with $A_\infty$, and $A_\infty$ is unitarily equivalent to $A$ (this follows from the QR algorithm), $D_\infty Q_\infty$ is unitarily equivalent to $A$. So the $\lambda_j = d_{jj}\omega_j$ are the eigenvalues of $A$, and thus we know $d_{jj} = |\lambda_j|$. Moreover, if the diagonal entries of $D_\infty$ are distinct (that is, the magnitudes of the eigenvalues of $A$ are distinct), then since $D_\infty Q_\infty = Q_\infty D_\infty$, the off-diagonal entries of $Q_\infty$ must be zero, and we have precisely $Q_\infty = \tilde{Q}_\infty$. Thus, $Q_\infty$ is diagonal and unitary, and
$$A_\infty = \begin{pmatrix} d_{11}\omega_1 & & \\ & \ddots & \\ & & d_{nn}\omega_n \end{pmatrix}.$$
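The warm-up can be illustrated numerically. The following is a minimal sketch (not from the lecture; the test matrix, the iteration count, and the name `qr_iteration` are illustrative choices) of the unshifted QR algorithm applied to a symmetric, hence normal, matrix whose eigenvalues have distinct magnitudes:

```python
import numpy as np

def qr_iteration(a, steps=300):
    """Unshifted QR algorithm: factor A_k = Q_k R_k, set A_{k+1} = R_k Q_k.

    Every iterate is unitarily equivalent to the input matrix.
    """
    a_k = np.array(a, dtype=float)
    for _ in range(steps):
        q, r = np.linalg.qr(a_k)
        a_k = r @ q
    return a_k

# Symmetric (hence normal) test matrix; its eigenvalues are real, distinct,
# and of distinct magnitude, so the limit should be diagonal.
a = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 1.0]])

a_inf = qr_iteration(a)

# Off-diagonal part decays to zero; the diagonal carries the eigenvalues of a.
print(np.max(np.abs(a_inf - np.diag(np.diag(a_inf)))))  # close to 0
print(np.sort(np.diag(a_inf)))
print(np.sort(np.linalg.eigvalsh(a)))  # matches the diagonal above
```

Without shifts the iteration can stall when two eigenvalues share the same magnitude, which is consistent with the distinct-magnitude assumption in the argument above.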

So if the magnitudes of the eigenvalues of a normal matrix $A$ are distinct and the QR algorithm converges, then it diagonalizes $A$.

1.5 Real Matrices (cont'd)

We proceed to show under which conditions a matrix with entries in $\mathbb{R}$ can be diagonalized in the same way as matrices with complex entries.

1.5.1 Definition. A matrix $A \in M_n(\mathbb{R})$ is similar to $B \in M_n(\mathbb{R})$ if there exists an invertible $S \in M_n(\mathbb{R})$ such that $B = S^{-1}AS$. The matrix $A$ is diagonalizable if $A$ is similar to a diagonal matrix.

1.5.2 Theorem. $A \in M_n(\mathbb{R})$ is diagonalizable if and only if there is a set of $n$ linearly independent eigenvectors.

Proof. As before, $S^{-1}AS = D$ with $D$ diagonal implies that $S = [x_1, x_2, \dots, x_n]$ contains a basis of $n$ eigenvectors, and vice versa.

1.5.3 Theorem. If $A \in M_n(\mathbb{R})$ has $n$ distinct (real) eigenvalues, then it is diagonalizable.

Proof. As before, we use the fact that eigenvectors belonging to distinct eigenvalues are linearly independent.

1.5.4 Theorem. A matrix $A \in M_n(\mathbb{R})$ is diagonalizable if and only if it has $n$ real eigenvalues (with multiplicities counted) and the geometric and algebraic multiplicities of each eigenvalue are equal.

Proof. Extends the preceding theorem; same strategy as before.

The following theorem is the real case of Schur's triangularization theorem.

1.5.5 Theorem. If $A \in M_n(\mathbb{R})$ has $n$ (real) eigenvalues (counting multiplicity), then there exists an orthogonal matrix $O \in M_n(\mathbb{R})$ such that $O^t A O = T$, where $T$ is triangular, and the eigenvalues of $A$ are the diagonal entries of $T$.

Proof. This proof is identical to the proof of Schur's theorem in the complex case. We prove it by induction on the dimension $n$. For $n = 1$, it is trivially true. Now suppose the theorem holds for all matrices in $M_{n-1}(\mathbb{R})$. Let $A \in M_n(\mathbb{R})$ have real eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$. Choose an eigenvector $x \in \mathbb{R}^n$ for $\lambda_1$ with $\|x\| = 1$. Now $x$ may be extended to a basis $\{x, y_2, \dots, y_n\}$ of $\mathbb{R}^n$.
Apply Gram-Schmidt orthonormalization to this basis to produce an orthonormal basis $\{x, z_2, \dots, z_n\}$ of $\mathbb{R}^n$. Define the matrix $O_1 = [x, z_2, \dots, z_n]$. Note that $O_1$ is orthogonal, since its columns are orthonormal. Thus,
$$O_1^t A O_1 = \begin{pmatrix} \lambda_1 & * \\ 0 & B \end{pmatrix},$$

where $B \in M_{n-1}(\mathbb{R})$. Now the characteristic polynomial of $A$ factors as $p_A(t) = (t - \lambda_1)\, p_B(t)$, where $p_B(t)$ is the characteristic polynomial of $B$. This means that $\lambda_2, \dots, \lambda_n$ are the eigenvalues of $B$. Now, by the induction assumption, there exists an orthogonal $\tilde{O} \in M_{n-1}(\mathbb{R})$ such that $\tilde{O}^t B \tilde{O} = \tilde{T}$, where $\tilde{T}$ is upper triangular with $\lambda_2, \dots, \lambda_n$ on the diagonal. Define
$$O_2 = \begin{pmatrix} 1 & 0 \\ 0 & \tilde{O} \end{pmatrix}.$$
Note that $O_2$ is orthogonal (its columns are orthonormal). Let $O = O_1 O_2$. Then $O$ is orthogonal, since
$$O^t = (O_1 O_2)^t = O_2^t O_1^t = O_2^{-1} O_1^{-1} = (O_1 O_2)^{-1} = O^{-1}.$$
Then
$$O^t A O = (O_1 O_2)^t A\, O_1 O_2 = O_2^t (O_1^t A O_1) O_2 = O_2^t \begin{pmatrix} \lambda_1 & * \\ 0 & B \end{pmatrix} O_2 = \begin{pmatrix} \lambda_1 & * \\ 0 & \tilde{O}^t B \tilde{O} \end{pmatrix} = \begin{pmatrix} \lambda_1 & * \\ 0 & \tilde{T} \end{pmatrix} = T.$$
Thus, we have triangularized $A$ to the matrix $T$, which has the eigenvalues of $A$ on its diagonal.

1.5.6 Theorem. If $A \in M_n(\mathbb{R})$ is symmetric, then there exists an orthogonal matrix $O \in M_n(\mathbb{R})$ such that $O^t A O = D$, where $D$ is a diagonal matrix containing the eigenvalues of $A$ as its diagonal entries.
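The inductive construction in the proof of Theorem 1.5.5 can be sketched numerically. The following is a NumPy sketch under illustrative assumptions (the helper `real_schur` and the test matrices are not from the lecture, and the input is assumed to have only real eigenvalues, as the theorem requires):

```python
import numpy as np

def real_schur(a):
    """Real Schur triangularization following the inductive proof of
    Theorem 1.5.5; assumes every eigenvalue of `a` is real."""
    n = a.shape[0]
    if n == 1:
        return np.eye(1), np.array(a, dtype=float)
    evals, evecs = np.linalg.eig(a)
    k = int(np.argmin(np.abs(evals.imag)))   # a real eigenvalue lambda_1
    x = np.real(evecs[:, k])
    x /= np.linalg.norm(x)                   # unit eigenvector
    # Complete x to an orthonormal basis of R^n (Gram-Schmidt, done via QR);
    # the columns form the orthogonal matrix O_1 = [x, z_2, ..., z_n].
    o1, _ = np.linalg.qr(np.column_stack([x, np.eye(n)])[:, :n])
    m = o1.T @ a @ o1                        # = [[lambda_1, *], [0, B]]
    o_sub, _ = real_schur(m[1:, 1:])         # triangularize B inductively
    o2 = np.eye(n)
    o2[1:, 1:] = o_sub                       # O_2 = diag(1, O-tilde)
    o = o1 @ o2
    return o, o.T @ a @ o

# A non-symmetric matrix with real eigenvalues 1, 2, 3 (built by similarity).
p = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
a = p @ np.diag([1.0, 2.0, 3.0]) @ np.linalg.inv(p)
o, t = real_schur(a)
print(np.round(np.tril(t, -1), 6))   # below-diagonal part: zero
print(np.sort(np.diag(t)))           # eigenvalues of a on the diagonal

# For symmetric input, T = O^t A O is symmetric and triangular, hence diagonal.
s = np.array([[2.0, 1.0], [1.0, 2.0]])
o_s, t_s = real_schur(s)
print(np.round(t_s, 6))              # diagonal, with eigenvalues 1 and 3
```

The symmetric example shows the situation of Theorem 1.5.6: the triangular matrix produced by the construction is automatically diagonal when $A$ is symmetric.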

Proof. By symmetry, all eigenvalues of $A$ are real (previously proven). This also applies to all (complex) eigenvalues obtained from factoring the characteristic polynomial $p_A$ of $A$ over $\mathbb{C}$. This implies $A$ has $n$ real eigenvalues. Consequently, Schur's triangularization (over $\mathbb{R}$) gives an orthogonal $O$ with $O^t A O = T$, with $T$ triangular. But $T$ is normal and triangular, thus $T$ is diagonal.

1.5.7 Question. What if $A$ is not symmetric?

1.6 Block Triangularization

1.6.8 Theorem. If $A \in M_n(\mathbb{R})$, then there exists a real orthogonal matrix $O \in M_n(\mathbb{R})$ such that
$$O^t A O = \begin{pmatrix} A_1 & * & \cdots & * \\ & A_2 & \ddots & \vdots \\ & & \ddots & * \\ & & & A_r \end{pmatrix},$$
with diagonal blocks $A_j \in M_1(\mathbb{R})$ or $M_2(\mathbb{R})$.

Proof. Repeat Schur's triangularization procedure. To begin with, if $\lambda_1$ is real, then there is a (real) $x_1 \in \mathbb{R}^n$ with $A x_1 = \lambda_1 x_1$. So normalizing $x_1$ and complementing it to an orthonormal basis of $\mathbb{R}^n$, viewed as columns of an orthogonal matrix, gives
$$\begin{pmatrix} x_1 & \cdots \end{pmatrix}^t A \begin{pmatrix} x_1 & \cdots \end{pmatrix} = \begin{pmatrix} \lambda_1 & * \\ 0 & * \end{pmatrix},$$
and we proceed with the lower right block as before. If $\lambda_1 = \alpha + i\beta$ with $\alpha, \beta \in \mathbb{R}$, $\beta \neq 0$, then there is an eigenvalue $\lambda_2$ such that $\lambda_2 = \overline{\lambda_1} = \alpha - i\beta$. Why? Take an eigenvector $x \in \mathbb{C}^n$ belonging to the eigenvalue $\lambda_1$, and take real and imaginary parts: $x = u + iv$, where $u, v \in \mathbb{R}^n$. Then
$$Ax = A(u + iv) = Au + iAv = \lambda_1(u + iv) = (\alpha + i\beta)(u + iv) = \alpha u - \beta v + i(\beta u + \alpha v).$$
Comparing real and imaginary parts, we have
$$Au = \alpha u - \beta v \quad\text{and}\quad Av = \beta u + \alpha v.$$

Consider $\bar{x} = u - iv$; then
$$A(u - iv) = \alpha u - \beta v - i(\beta u + \alpha v) = (\alpha - i\beta)(u - iv).$$
We conclude that $\bar{x}$ is an eigenvector belonging to the eigenvalue $\overline{\lambda_1}$. Since $\lambda_1 \neq \overline{\lambda_1}$, the set $\{x, \bar{x}\}$ is linearly independent. This means that $\{u, v\}$ is also linearly independent, since $u = \frac{1}{2}(x + \bar{x})$ and $v = \frac{1}{2i}(x - \bar{x})$. By Gram-Schmidt, we can find an orthonormal system with the same (real) span as $\{u, v\}$, say $\{z, w\}$. Since $\operatorname{span}\{u, v\}$ is an invariant subspace, complementing $\{z, w\}$ to an orthonormal basis of $\mathbb{R}^n$, viewed as columns of an orthogonal matrix, we get
$$A \begin{pmatrix} z & w \end{pmatrix} = \begin{pmatrix} a_{11} z + a_{21} w & a_{12} z + a_{22} w \end{pmatrix},$$
and
$$\begin{pmatrix} z & w & \cdots \end{pmatrix}^t A \begin{pmatrix} z & w & \cdots \end{pmatrix} = \begin{pmatrix} A_1 & * \\ 0 & * \end{pmatrix},$$
where $A_1 = (a_{ij})_{i,j=1}^{2}$, and in the block-partitioned matrix above, the $*$'s denote matrices of the appropriate dimensions. Iterating this procedure, we arrive at the claimed block-triangular form.
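One deflation step of this procedure can be sketched in NumPy as follows (the helper name `deflate_complex_pair` and the companion-matrix example are illustrative choices, not from the lecture):

```python
import numpy as np

def deflate_complex_pair(a):
    """One step of the proof of Theorem 1.6.8: for a real matrix with a
    complex conjugate eigenvalue pair, build an orthogonal O whose first two
    columns span the invariant subspace span{u, v}, so that O^t A O has a
    real 2x2 block A_1 in the top-left corner and zeros below it."""
    n = a.shape[0]
    evals, evecs = np.linalg.eig(a)
    k = int(np.argmax(np.abs(evals.imag)))     # eigenvalue with beta != 0
    u, v = evecs[:, k].real, evecs[:, k].imag  # eigenvector x = u + iv
    # Gram-Schmidt (via QR) on {u, v}, completed to an orthonormal basis.
    o, _ = np.linalg.qr(np.column_stack([u, v, np.eye(n)])[:, :n])
    return o, o.T @ a @ o

# Companion matrix of (t - 2)(t^2 + 1): eigenvalues 2, i, -i.
a = np.array([[0.0, 0.0,  2.0],
              [1.0, 0.0, -1.0],
              [0.0, 1.0,  2.0]])
o, t = deflate_complex_pair(a)
print(np.round(t[2, :2], 6))        # zeros below the leading 2x2 block
a1 = t[:2, :2]
# A_1 has eigenvalues alpha +- i*beta, so its trace is 2*alpha and its
# determinant is alpha^2 + beta^2; here ~0 and ~1 for the pair +-i.
print(np.trace(a1), np.linalg.det(a1))
print(t[2, 2])                      # the remaining (real) eigenvalue, 2
```

Iterating on the lower-right block, as in the proof, yields the full block-triangular form; library routines for the real Schur decomposition compute this quasi-triangular form directly.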