Linear Algebra 20 Symmetric Matrices and Quadratic Forms


Wei-Shi Zheng, wszheng@ieee.org, 2011

1 What Do You Learn from This Note

Do you still remember the following equation, or something like it, that I mentioned in class?

$$f = \vec{x}^T A \vec{x} \qquad (1)$$

It is exactly a quadratic form function, which will be detailed in what follows. It connects eigen-analysis and orthogonal matrices very well.

Basic concepts: symmetric matrix, quadratic form, quadratic form function, positive definite matrix, spectral decomposition, singular value decomposition.

2 What Is a Symmetric Matrix

Definition 1 (symmetric matrix). A square matrix $A$ is said to be symmetric iff $A^T = A$ (or equivalently, $[A]_{ij} = [A]_{ji}$ for all $i, j$).

Examples: $0_{n \times n}$, $I_n$, diagonal matrices, etc.

Theorem 2. All square matrices in $\mathbb{R}^{n \times n}$ form a vector space. Denote this vector space by $M_n$.
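Definition 1 is easy to check numerically. A minimal sketch using NumPy (the library choice and both matrices are illustrative assumptions, not from the notes):

```python
import numpy as np

# Definition 1, checked numerically: a matrix is symmetric iff it equals
# its transpose. Both matrices below are arbitrary illustrations.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 5.0, 6.0],
              [3.0, 6.0, 9.0]])
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])

print(np.array_equal(A, A.T))  # True: A is symmetric
print(np.array_equal(B, B.T))  # False: B is not
```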

Theorem 3. The set of all symmetric matrices of size $n$, denoted by $\mathrm{Sym}_n$, forms a subspace of $M_n$.

Proof. Let $A$ and $B$ be symmetric matrices of size $n$ and $c$ a scalar. Then:

1. $0_{n \times n}$ is obviously symmetric;
2. $A + B$ is symmetric, since $(A + B)^T = A^T + B^T = A + B$;
3. $cA$ is symmetric, since $(cA)^T = cA^T = cA$.

So $\mathrm{Sym}_n$ is a subspace of $M_n$.

3 Eigenvectors of a Symmetric Matrix

Theorem 4. Let $A \in \mathrm{Sym}_n(\mathbb{R})$, and let $\lambda_1, \lambda_2 \in \mathbb{R}$ be distinct eigenvalues of $A$ with eigenvectors $\vec{v}_1, \vec{v}_2$ respectively. Then $\vec{v}_1 \perp \vec{v}_2$.

Proof. We have
$$\lambda_1 (\vec{v}_1 \cdot \vec{v}_2) = (\lambda_1 \vec{v}_1) \cdot \vec{v}_2 = (A\vec{v}_1) \cdot \vec{v}_2 = \vec{v}_1^T A^T \vec{v}_2 = \vec{v}_1^T A \vec{v}_2 = \vec{v}_1 \cdot (A\vec{v}_2) = \vec{v}_1 \cdot (\lambda_2 \vec{v}_2) = \lambda_2 (\vec{v}_1 \cdot \vec{v}_2).$$
So $(\lambda_1 - \lambda_2)(\vec{v}_1 \cdot \vec{v}_2) = 0$. But $\lambda_1 - \lambda_2 \neq 0$ since the eigenvalues are distinct, which forces $\vec{v}_1 \cdot \vec{v}_2 = 0$.

Theorem 5. Let $A \in \mathrm{Sym}_n(\mathbb{R})$ and let $\lambda$ be a complex eigenvalue of $A$. Then $\lambda \in \mathbb{R}$; that is, $\lambda$ is a real eigenvalue of $A$.

Proof. Let $\vec{v} \in \mathbb{C}^n$ be a $\lambda$-eigenvector, so $A\vec{v} = \lambda\vec{v}$. Since $A$ is real and symmetric, $A\bar{\vec{v}} = \overline{A\vec{v}} = \bar{\lambda}\bar{\vec{v}}$. So
$$\lambda(\bar{\vec{v}}^T \vec{v}) = \bar{\vec{v}}^T (\lambda \vec{v}) = \bar{\vec{v}}^T A \vec{v} = (A\bar{\vec{v}})^T \vec{v} = (\bar{\lambda}\bar{\vec{v}})^T \vec{v} = \bar{\lambda}(\bar{\vec{v}}^T \vec{v}).$$
So $(\lambda - \bar{\lambda})\,\bar{\vec{v}}^T \vec{v} = 0$. Since $\vec{v} \neq \vec{0}$, it follows that $\bar{\vec{v}}^T \vec{v} = \bar{v}_1 v_1 + \cdots + \bar{v}_n v_n = |v_1|^2 + \cdots + |v_n|^2 > 0$, where $\vec{v} = (v_1 \cdots v_n)^T$, which forces $\lambda - \bar{\lambda} = 0$. So $\lambda$ is real.
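Theorems 4 and 5 can be verified numerically. The sketch below uses NumPy's `eigh` routine for symmetric matrices; the matrix itself is an arbitrary example, not one from the notes:

```python
import numpy as np

# An arbitrary real symmetric matrix with distinct eigenvalues 1, 2, 4.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# np.linalg.eigh is specialized for symmetric (Hermitian) matrices: it
# returns real eigenvalues in ascending order and orthonormal eigenvectors.
eigenvalues, V = np.linalg.eigh(A)

# Theorem 5: the eigenvalues are real (approximately 1, 2, 4 here).
print(eigenvalues)

# Theorem 4: eigenvectors for distinct eigenvalues are orthogonal, so the
# matrix of eigenvectors satisfies V^T V = I.
print(np.allclose(V.T @ V, np.eye(3)))   # True
```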

4 Orthogonality & Symmetry

Theorem 6. Let $A$ be any symmetric matrix and $U$ any orthogonal matrix. Then $U^{-1}AU$ is still a symmetric matrix.

Proof. Notice that $U^T = U^{-1}$. So $(U^{-1}AU)^T = U^T A^T (U^{-1})^T = U^{-1} A U$.

We can now prove the major result on symmetric matrices.

Theorem 7 (orthogonally diagonalizable). Let $A$ be a square matrix. Then $A$ is a symmetric matrix if and only if there exists an orthogonal matrix $U$ such that $U^{-1}AU$ is diagonal.

Proof. Firstly, it is easy to verify that if there exists an orthogonal matrix $U$ such that $U^{-1}AU$ is diagonal, then $A$ is symmetric: letting the diagonal matrix be $D$, we have $A = UDU^{-1} = UDU^T \in \mathrm{Sym}_n(\mathbb{R})$.

Secondly, conversely, if $A$ is symmetric, we need to prove that there exists an orthogonal matrix $U$ such that $U^{-1}AU$ is diagonal. This can be done by induction on $n$ as follows.

1. Step 1 (Base Case): $n = 1$. Trivial.
2. Step 2 (Inductive Hypothesis): Assume the result holds for $n - 1$.
3. Step 3 (Inductive Step): Let $\lambda_1$ be an eigenvalue of $A$, which is real by Theorem 5, and let $\vec{u}_1$ be a unit $\lambda_1$-eigenvector. We can extend $\{\vec{u}_1\}$ to a basis of $\mathbb{R}^n$, say $\{\vec{u}_1, \vec{v}_2, \ldots, \vec{v}_n\}$, and then, by applying the Gram-Schmidt process, obtain an orthonormal basis $\mathcal{U}_1 = \{\vec{u}_1, \vec{u}_2, \ldots, \vec{u}_n\}$ of $\mathbb{R}^n$. Define $U_1 = (\vec{u}_1 \cdots \vec{u}_n) \in O_n(\mathbb{R})$, the change-of-basis matrix from $\mathcal{U}_1$ to the standard basis $\mathcal{E}$. Let $F = U_1^{-1} A U_1$. Since $U_1^{-1} = U_1^T$, it is not hard to see that the first column of $F$ is $(\lambda_1\ 0 \cdots 0)^T$; moreover, $F$ is symmetric by Theorem 6. So $F$ has the form
$$F = \begin{pmatrix} \lambda_1 & \vec{0}_{n-1}^{\,T} \\ \vec{0}_{n-1} & A' \end{pmatrix},$$

where $A' \in \mathrm{Sym}_{n-1}(\mathbb{R})$. By the inductive hypothesis, there exists $U' \in O_{n-1}(\mathbb{R})$ such that $U'^{-1} A' U'$ is diagonal. Define
$$U_2 = \begin{pmatrix} 1 & \vec{0}_{n-1}^{\,T} \\ \vec{0}_{n-1} & U' \end{pmatrix},$$
which is an orthogonal matrix. Then
$$U_2^{-1} U_1^{-1} A U_1 U_2 = U_2^{-1} F U_2 = \begin{pmatrix} \lambda_1 & \vec{0}_{n-1}^{\,T} \\ \vec{0}_{n-1} & U'^{-1} A' U' \end{pmatrix},$$
which is diagonal. So take $U = U_1 U_2$; then $U$ is an orthogonal matrix and $U^{-1}AU$ is diagonal.

Remarks (Spectral Decomposition): If $A$ is orthogonally diagonalizable, with
$$A = PDP^T = (\vec{u}_1 \cdots \vec{u}_n) \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix} \begin{pmatrix} \vec{u}_1^T \\ \vdots \\ \vec{u}_n^T \end{pmatrix},$$
then
$$A = \lambda_1 \vec{u}_1 \vec{u}_1^T + \cdots + \lambda_n \vec{u}_n \vec{u}_n^T$$
is the spectral decomposition of the matrix $A$.

5 Singular Value Decomposition

Question: If a matrix $A$ is not symmetric, it may not be orthogonally diagonalizable. However, $A^T A$ is symmetric and can be orthogonally diagonalized as $PDP^{-1}$. Is it then possible to find an orthogonal matrix $U$ and a matrix $\Sigma$ such that
$$A^T A = PDP^{-1} = P \Sigma^T U^T U \Sigma P^{-1}, \quad \text{i.e.} \quad A = U \Sigma P^{-1} = U \Sigma V^T \ (\text{with } V = P)?$$

Answer: Yes, it can be done. The Singular Value Decomposition realizes exactly this!

Question: What are singular values?

Definition 8. Let $A$ be an $m \times n$ matrix. Then $A^T A$ can be orthogonally diagonalized. Let $\{\vec{v}_1, \ldots, \vec{v}_n\}$ be an orthonormal eigenvector basis of $A^T A$, with corresponding eigenvalues $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n \geq 0$. Then $\sigma_i = \sqrt{\lambda_i}$ are called the singular values of the matrix $A$.
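Theorem 7 and the spectral decomposition remark can be checked numerically. A minimal NumPy sketch (the matrix is an arbitrary example, not from the notes):

```python
import numpy as np

# An arbitrary symmetric matrix with eigenvalues 2 and 4.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

w, U = np.linalg.eigh(A)   # columns of U form an orthonormal eigenbasis

# Theorem 7: U^{-1} A U is diagonal (and U^{-1} = U^T since U is orthogonal).
print(np.allclose(U.T @ A @ U, np.diag(w)))   # True

# Spectral decomposition: A = lambda_1 u_1 u_1^T + ... + lambda_n u_n u_n^T.
A_rebuilt = sum(w[i] * np.outer(U[:, i], U[:, i]) for i in range(len(w)))
print(np.allclose(A, A_rebuilt))              # True
```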

Theorem 9. Let $A$ be an $m \times n$ matrix with $r = \mathrm{rank}(A)$. Then there exist an $m \times m$ orthogonal matrix $U$, an $n \times n$ orthogonal matrix $V$, and an $m \times n$ matrix
$$\Sigma = \begin{pmatrix} D & O_{r \times (n-r)} \\ O_{(m-r) \times r} & O_{(m-r) \times (n-r)} \end{pmatrix}, \quad \text{where } D = \mathrm{diag}(\sigma_1, \sigma_2, \ldots, \sigma_r)$$
and $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_r > 0$ are the nonzero singular values of $A$, such that $A = U \Sigma V^T$.

Proof.
STEP 1: Let $\{\vec{v}_1, \ldots, \vec{v}_n\}$ be an orthonormal eigenvector basis of $A^T A$, with corresponding eigenvalues $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_r > \lambda_{r+1} = \lambda_{r+2} = \cdots = \lambda_n = 0$. Then one can prove that $\{A\vec{v}_1, \ldots, A\vec{v}_r\}$ is an orthogonal basis of $\mathrm{Col}\,A$.

STEP 2: Normalize all vectors in the set $\{A\vec{v}_1, \ldots, A\vec{v}_r\}$ to obtain an orthonormal set $\{\vec{u}_1, \ldots, \vec{u}_r\}$, where
$$\vec{u}_i = \frac{1}{\|A\vec{v}_i\|} A\vec{v}_i = \frac{1}{\sigma_i} A\vec{v}_i, \quad \text{that is,} \quad A\vec{v}_i = \sigma_i \vec{u}_i, \quad 1 \leq i \leq r.$$

STEP 3: Extend $\{\vec{u}_1, \ldots, \vec{u}_r\}$ to an orthonormal basis $\{\vec{u}_1, \ldots, \vec{u}_r, \vec{u}_{r+1}, \ldots, \vec{u}_m\}$ of $\mathbb{R}^m$.

STEP 4: Let $U = [\vec{u}_1, \ldots, \vec{u}_m]$ and $V = [\vec{v}_1, \ldots, \vec{v}_n]$. Then they are orthogonal matrices, and
$$AV = [A\vec{v}_1, \ldots, A\vec{v}_r, \vec{0}, \ldots, \vec{0}] = [\sigma_1 \vec{u}_1, \ldots, \sigma_r \vec{u}_r, \vec{0}, \ldots, \vec{0}] = U\Sigma.$$
Hence $A = U \Sigma V^T$.

Example: Compute the singular value decomposition of the matrix
$$A = \begin{pmatrix} 4 & 11 & 14 \\ 8 & 7 & 2 \end{pmatrix}.$$

Applications: Image Processing (see the demo).
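The example can be checked with NumPy's `svd` routine, shown here as a numerical sketch rather than the hand computation the notes intend (the matrix is copied as printed above):

```python
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0,  7.0,  2.0]])

# Full SVD: U is 2x2 orthogonal, Vt is 3x3 orthogonal, s holds the
# singular values in decreasing order.
U, s, Vt = np.linalg.svd(A)

# Rebuild the 2x3 matrix Sigma and check A = U Sigma V^T (Theorem 9).
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)
print(np.allclose(A, U @ Sigma @ Vt))             # True

# Definition 8: the singular values are the square roots of the
# eigenvalues of A^T A, taken in decreasing order.
eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
print(np.allclose(s, np.sqrt(eig_AtA[:len(s)])))  # True
```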

6 Application: Quadratic Forms

Definition 10 (quadratic form). A real quadratic form $q(x_1, \ldots, x_n)$ in $n$ variables $x_1, \ldots, x_n$ is a polynomial over $\mathbb{R}$ of the form
$$q(x_1, \ldots, x_n) = \sum_{1 \leq i \leq j \leq n} c_{ij} x_i x_j.$$

Examples: $0$, $8x_1^2$, $x_2 x_7$, $x_1^2 + x_2^2 + x_3^2$, $x_1 x_2 + 12x_5^2$, $2x_3 x_4 - 4x_1^2$ are all quadratic forms.

Now suppose we are given a quadratic form $q(x_1, \ldots, x_n) = \sum_{1 \leq i \leq j \leq n} c_{ij} x_i x_j$. We can define a symmetric matrix $A$ by
$$[A]_{ij} = \begin{cases} c_{ij} & \text{if } i = j; \\ \frac{1}{2} c_{ij} & \text{if } i < j; \\ \frac{1}{2} c_{ji} & \text{if } i > j. \end{cases}$$
Then it is easy to verify that $q(x_1, \ldots, x_n) = (x_1 \cdots x_n)\, A\, (x_1 \cdots x_n)^T$. Conversely, for any $A \in \mathrm{Sym}_n(\mathbb{R})$,
$$q(x_1, \ldots, x_n) = (x_1 \cdots x_n)\, A\, (x_1 \cdots x_n)^T = \sum_{1 \leq i \leq n} a_{ii} x_i^2 + \sum_{1 \leq i < j \leq n} 2 a_{ij} x_i x_j$$
is a quadratic form, called the quadratic form corresponding to $A$ and denoted by $q_A(x_1, \ldots, x_n)$. Thus, any quadratic form in $n$ variables is represented by a unique symmetric matrix of size $n$.

Definition 11 (quadratic form function). Let $q(x_1, \ldots, x_n)$ be a real quadratic form. Then the function $f_q : \mathbb{R}^n \to \mathbb{R}$, $f_q(\vec{x}) = q(x_1, \ldots, x_n)$, where $\vec{x} = (x_1 \cdots x_n)^T$, is called the quadratic form function induced by $q$. If $q$ corresponds to $A \in \mathrm{Sym}_n$, then we write $f_A$ for $f_q$, and $f_A(\vec{x}) = \vec{x}^T A \vec{x}$.
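The coefficient-to-matrix rule can be illustrated with a small NumPy sketch; the particular quadratic form below is an invented example, not one from the notes:

```python
import numpy as np

# q(x1, x2, x3) = 2 x1^2 + 3 x2^2 - x3^2 + 4 x1 x2 - 6 x2 x3.
# Following the rule above: [A]_ii = c_ii, and each cross coefficient
# c_ij (i < j) is split in half between [A]_ij and [A]_ji.
A = np.array([[2.0,  2.0,  0.0],
              [2.0,  3.0, -3.0],
              [0.0, -3.0, -1.0]])

def q(x):
    x1, x2, x3 = x
    return 2*x1**2 + 3*x2**2 - x3**2 + 4*x1*x2 - 6*x2*x3

x = np.array([1.0, -2.0, 3.0])
print(q(x), x @ A @ x)   # both print 33.0: q(x) = x^T A x
```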

Question: Can we find a simpler quadratic form for the same quadratic form function?

Recall that for any $\vec{x} \in \mathbb{R}^n$ we have $\vec{x} = [\vec{x}]_{\mathcal{E}}$, where $\mathcal{E}$ is the standard basis. Suppose $\mathcal{B}$ is another basis of $\mathbb{R}^n$. Then $\vec{x} = [\vec{x}]_{\mathcal{E}} = P [\vec{x}]_{\mathcal{B}}$, where $P = [\mathcal{B}]_{\mathcal{E}}$ is the change-of-basis matrix. So for any quadratic form function $f_A$ we have
$$f_A(\vec{x}) = f_A(P\vec{y}) = \vec{y}^T P^T A P \vec{y} = f_{P^T A P}(\vec{y}), \quad \text{where } \vec{y} = [\vec{x}]_{\mathcal{B}}.$$
So $f_A$ and $f_{P^T A P}$ can be regarded as the same quadratic form function expressed with respect to different bases. For the sake of convenience, it is therefore natural to choose an invertible matrix $P$ such that $f_{P^T A P}$ has a simple form. Moreover, under the postulates of Euclidean geometry, only orthogonal coordinate transformations are admitted; that is, $P$ is required to be orthogonal. Now, by Theorem 7, we can choose an orthogonal matrix $U$ such that $U^T A U$ is diagonal. So we have:

Theorem 12. Let $f_A(\vec{x}) = \vec{x}^T A \vec{x}$ be a quadratic form function. Then there exists $U \in O_n$ such that
$$f_A(\vec{x}) = f_{U^T A U}(\vec{y}) = \vec{y}^T U^T A U \vec{y} = \lambda_1 y_1^2 + \cdots + \lambda_n y_n^2,$$
where $\vec{y} = (y_1 \cdots y_n)^T = U^T \vec{x}$ and $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$ counted with multiplicity.
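Theorem 12 can be demonstrated numerically: an orthogonal change of variables removes the cross terms. A NumPy sketch with an arbitrary 2x2 example (not from the notes):

```python
import numpy as np

# q(x) = x1^2 + x2^2 + 4 x1 x2, with symmetric matrix A below;
# its eigenvalues are -1 and 3.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

w, U = np.linalg.eigh(A)   # w = [-1, 3] (ascending), U orthogonal

x = np.array([0.5, -1.5])
y = U.T @ x                # coordinates of x in the eigenvector basis

# f_A(x) equals lambda_1 y_1^2 + lambda_2 y_2^2: no cross term remains.
print(np.isclose(x @ A @ x, w[0]*y[0]**2 + w[1]*y[1]**2))   # True
```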

6.1 Positive definite quadratic forms

Definition 13 (positive definite quadratic form). Let $q$ be a real quadratic form in $n$ variables. Then $q$ is said to be

1. positive definite if $f_q(\vec{x}) > 0$ for all $\vec{x} \neq \vec{0}$;
2. negative definite if $f_q(\vec{x}) < 0$ for all $\vec{x} \neq \vec{0}$;
3. indefinite if the values of $f_q(\vec{x})$ can be both positive and negative.

Theorem 14. Let $q_A$ be the quadratic form corresponding to $A \in \mathrm{Sym}_n(\mathbb{R})$, and let $\lambda_1, \ldots, \lambda_n$ be the eigenvalues of $A$. Then

1. $q_A$ is positive definite iff $\lambda_i > 0$ for all $i = 1, \ldots, n$;
2. $q_A$ is negative definite iff $\lambda_i < 0$ for all $i = 1, \ldots, n$;
3. $q_A$ is indefinite iff there exist $i, j$ such that $\lambda_i > 0$ and $\lambda_j < 0$.

Proof. Let $U$ be an orthogonal matrix such that $f_A(\vec{x}) = f_{U^T A U}(\vec{y}) = \lambda_1 y_1^2 + \cdots + \lambda_n y_n^2$. The range of $f_A$ is the same as that of $f_{U^T A U}$, so the result follows easily by examining the range of $\lambda_1 y_1^2 + \cdots + \lambda_n y_n^2$.

Theorem 15. Let $A \in \mathrm{Sym}_2(\mathbb{R})$ with eigenvalues $\lambda_1 \geq \lambda_2$. Then the equation $f_A(\vec{x}) = 1$ represents a conic section, which can be classified as:

1. an ellipse if $\lambda_1 \geq \lambda_2 > 0$;
2. a hyperbola if $\lambda_1 > 0 > \lambda_2$;
3. the empty set if $0 \geq \lambda_1 \geq \lambda_2$;
4. a pair of parallel lines if $\lambda_1 > \lambda_2 = 0$.

Proof. Let $U \in O_2$ be such that $f_A(\vec{x}) = f_{U^T A U}(\vec{y}) = \lambda_1 y_1^2 + \lambda_2 y_2^2$. Then the original equation becomes $\lambda_1 y_1^2 + \lambda_2 y_2^2 = 1$, and the result follows easily.
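Theorem 14 turns into a simple computational test: inspect the signs of the eigenvalues. A hypothetical helper, sketched with NumPy (the function name and test matrices are illustrative, not from the notes):

```python
import numpy as np

def classify(A):
    """Classify the quadratic form q_A by eigenvalue signs (Theorem 14)."""
    w = np.linalg.eigvalsh(A)   # real eigenvalues of the symmetric matrix A
    if np.all(w > 0):
        return "positive definite"
    if np.all(w < 0):
        return "negative definite"
    if w.min() < 0 < w.max():
        return "indefinite"
    return "semidefinite"       # some eigenvalue is exactly zero

print(classify(np.array([[2.0, 1.0], [1.0, 2.0]])))   # positive definite
print(classify(np.array([[1.0, 2.0], [2.0, 1.0]])))   # indefinite
```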

7 Geometric Understanding of Symmetric Matrices

Theorem 16. Let $A$ be a symmetric matrix. Then
$$M = \max\{\vec{x}^T A \vec{x} : \|\vec{x}\| = 1\} = \text{the maximum eigenvalue of } A,$$
$$m = \min\{\vec{x}^T A \vec{x} : \|\vec{x}\| = 1\} = \text{the minimum eigenvalue of } A.$$

[Figure: "The Cafe Terrace on the Place du Forum", by Van Gogh]
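Theorem 16 can be checked by sampling: no unit vector makes $\vec{x}^T A \vec{x}$ exceed the largest eigenvalue or fall below the smallest. A NumPy sketch (the matrix and the random sampling are illustrative assumptions):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

w, U = np.linalg.eigh(A)      # w[0] = smallest, w[-1] = largest eigenvalue

rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.normal(size=2)
    x /= np.linalg.norm(x)    # restrict to the unit sphere
    val = x @ A @ x
    assert w[0] - 1e-12 <= val <= w[-1] + 1e-12

# The extreme values M and m are attained at unit eigenvectors of A.
print(np.isclose(U[:, -1] @ A @ U[:, -1], w[-1]))   # True
print(np.isclose(U[:, 0] @ A @ U[:, 0], w[0]))      # True
```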