ICS 6N Computational Linear Algebra Symmetric Matrices and Orthogonal Diagonalization


Xiaohui Xie
University of California, Irvine
xhx@uci.edu

Symmetric matrices

An $n \times n$ matrix $A$ is symmetric if $A^T = A$. Componentwise: $A$ is symmetric if $a_{ij} = a_{ji}$ for $i, j = 1, 2, \dots, n$.
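As a quick illustration (not from the slides), a minimal NumPy check of symmetry, using the example matrix from the next slides:

```python
import numpy as np

A = np.array([[ 6, -2, -1],
              [-2,  6, -1],
              [-1, -1,  5]])

# A is symmetric iff it equals its own transpose
print(np.array_equal(A, A.T))  # True
```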

Matrix Diagonalization

Matrix $A$ is diagonalizable if there exist an invertible matrix $P$ and a diagonal matrix $\Lambda$ such that
$$A = P\Lambda P^{-1}$$
If $A$ can be diagonalized, then $A^k = P\Lambda^k P^{-1}$.

Not all matrices can be diagonalized: an $n \times n$ matrix can be diagonalized if and only if it has $n$ linearly independent eigenvectors.

Some special cases:
- If an $n \times n$ matrix $A$ has $n$ distinct eigenvalues, then it is diagonalizable.
- If $A$ is symmetric, then it is diagonalizable.
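A minimal NumPy sketch (an illustration, not part of the slides) of diagonalizing a matrix and computing a power via $A^k = P\Lambda^k P^{-1}$:

```python
import numpy as np

A = np.array([[ 6., -2., -1.],
              [-2.,  6., -1.],
              [-1., -1.,  5.]])

# Columns of P are eigenvectors; lam holds the eigenvalues
lam, P = np.linalg.eig(A)

# A^3 computed two ways: directly, and via the diagonalization
A_cubed = np.linalg.matrix_power(A, 3)
A_cubed_diag = P @ np.diag(lam**3) @ np.linalg.inv(P)
print(np.allclose(A_cubed, A_cubed_diag))  # True
```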

Diagonalization of symmetric matrices

Example: diagonalize the matrix
$$A = \begin{bmatrix} 6 & -2 & -1 \\ -2 & 6 & -1 \\ -1 & -1 & 5 \end{bmatrix}$$

The characteristic equation of $A$ is
$$0 = -\lambda^3 + 17\lambda^2 - 90\lambda + 144 = -(\lambda - 8)(\lambda - 6)(\lambda - 3)$$
so we have three distinct eigenvalues $\lambda_1 = 8$, $\lambda_2 = 6$, $\lambda_3 = 3$. Find corresponding eigenvectors:
$$v_1 = \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix}, \quad v_2 = \begin{bmatrix} -1 \\ -1 \\ 2 \end{bmatrix}, \quad v_3 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}$$

Note that $v_1^T v_2 = 0$, $v_1^T v_3 = 0$, $v_2^T v_3 = 0$, i.e., the eigenvectors are mutually orthogonal.
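A quick orthogonality check of these eigenvectors (an illustration, not from the slides):

```python
import numpy as np

v1 = np.array([-1.,  1., 0.])
v2 = np.array([-1., -1., 2.])
v3 = np.array([ 1.,  1., 1.])

# Eigenvectors of a symmetric matrix for distinct eigenvalues are orthogonal
print(v1 @ v2, v1 @ v3, v2 @ v3)  # 0.0 0.0 0.0
```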

Further normalize the eigenvectors to be unit vectors:
$$u_1 = \begin{bmatrix} -1/\sqrt{2} \\ 1/\sqrt{2} \\ 0 \end{bmatrix}, \quad u_2 = \begin{bmatrix} -1/\sqrt{6} \\ -1/\sqrt{6} \\ 2/\sqrt{6} \end{bmatrix}, \quad u_3 = \begin{bmatrix} 1/\sqrt{3} \\ 1/\sqrt{3} \\ 1/\sqrt{3} \end{bmatrix}$$

Let
$$P = \begin{bmatrix} -1/\sqrt{2} & -1/\sqrt{6} & 1/\sqrt{3} \\ 1/\sqrt{2} & -1/\sqrt{6} & 1/\sqrt{3} \\ 0 & 2/\sqrt{6} & 1/\sqrt{3} \end{bmatrix}, \quad D = \begin{bmatrix} 8 & 0 & 0 \\ 0 & 6 & 0 \\ 0 & 0 & 3 \end{bmatrix}$$

Then $A = PDP^T$, since $P$ is an orthogonal matrix ($P^{-1} = P^T$).
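A numerical check of the factorization $A = PDP^T$ (an illustration, not from the slides):

```python
import numpy as np

A = np.array([[ 6., -2., -1.],
              [-2.,  6., -1.],
              [-1., -1.,  5.]])

# Orthonormal eigenvector columns and the matching eigenvalues
P = np.column_stack([
    np.array([-1.,  1., 0.]) / np.sqrt(2),
    np.array([-1., -1., 2.]) / np.sqrt(6),
    np.array([ 1.,  1., 1.]) / np.sqrt(3),
])
D = np.diag([8., 6., 3.])

print(np.allclose(P @ D @ P.T, A))      # True: A = P D P^T
print(np.allclose(P.T @ P, np.eye(3)))  # True: P is orthogonal
```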

Spectral theorem

If $A$ is an $n \times n$ symmetric matrix:
1. All eigenvalues of $A$ are real.
2. $A$ has exactly $n$ real eigenvalues (counting multiplicity), but this doesn't mean they are distinct.
3. The geometric multiplicity of $\lambda$, $\dim(\mathrm{Null}(A - \lambda I))$, equals the algebraic multiplicity of $\lambda$.
4. The eigenspaces are mutually orthogonal: if $\lambda_1 \neq \lambda_2$ are two distinct eigenvalues, then their corresponding eigenvectors $v_1$, $v_2$ are orthogonal.

Proof

1. Let $\lambda$ be an eigenvalue of $A$ with corresponding eigenvector $x$, so $Ax = \lambda x$ and, taking complex conjugates ($A$ is real), $A\bar{x} = \bar{\lambda}\bar{x}$. Then
$$\bar{\lambda}\,\bar{x}^T x = (A\bar{x})^T x = \bar{x}^T A x = \lambda\,\bar{x}^T x.$$
Since $\bar{x}^T x = \|x\|^2 > 0$, we get $\bar{\lambda} = \lambda$, so $\lambda$ is real.

2. Let $x_1$ and $x_2$ be two eigenvectors corresponding to two distinct eigenvalues $\lambda_1$ and $\lambda_2$. Then
$$\lambda_1 x_1^T x_2 = (Ax_1)^T x_2 = x_1^T A^T x_2 = x_1^T A x_2 = \lambda_2 x_1^T x_2$$
$$\Rightarrow (\lambda_1 - \lambda_2)(x_1^T x_2) = 0.$$
Since $\lambda_1 \neq \lambda_2$, we have $x_1^T x_2 = 0$, so the eigenvectors are orthogonal.

Orthogonal diagonalization

If an $n \times n$ matrix $A$ is symmetric, its eigenvectors $v_1, \dots, v_n$ can be chosen to be orthonormal.
- If $A$ has $n$ distinct eigenvalues, then the $n$ eigenvectors are orthogonal; normalize these vectors to make them orthonormal.
- If an eigenvalue $\lambda$ has multiplicity greater than 1, find an orthonormal basis of the corresponding eigenspace, $\mathrm{Null}(A - \lambda I)$, and use the vectors in this basis as eigenvectors.

In this case, $P = [v_1\ v_2\ \dots\ v_n]$ is an orthogonal matrix, that is, $P^{-1} = P^T$, and $A$ can be orthogonally diagonalized:
$$A = P\Lambda P^T$$
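In NumPy, `numpy.linalg.eigh` implements exactly this for symmetric matrices: it returns real eigenvalues and orthonormal eigenvector columns, even when an eigenvalue is repeated. A minimal sketch (an illustration, not from the slides), using the matrix of the next example:

```python
import numpy as np

A = np.array([[ 3., -2.,  4.],
              [-2.,  6.,  2.],
              [ 4.,  2.,  3.]])

# eigh is specialized for symmetric (Hermitian) matrices
lam, P = np.linalg.eigh(A)

print(lam)                                     # [-2.  7.  7.]
print(np.allclose(P @ np.diag(lam) @ P.T, A))  # True: A = P Lambda P^T
```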

Orthogonal diagonalization: an example

Orthogonally diagonalize the matrix
$$A = \begin{bmatrix} 3 & -2 & 4 \\ -2 & 6 & 2 \\ 4 & 2 & 3 \end{bmatrix}$$

Characteristic equation:
$$0 = -\lambda^3 + 12\lambda^2 - 21\lambda - 98 = -(\lambda - 7)^2(\lambda + 2)$$

Produce bases for the eigenspaces by solving linear equations:
$$\lambda = 7:\ v_1 = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix},\ v_2 = \begin{bmatrix} -1/2 \\ 1 \\ 0 \end{bmatrix}; \quad \lambda = -2:\ v_3 = \begin{bmatrix} -1 \\ -1/2 \\ 1 \end{bmatrix}$$

Apply Gram-Schmidt to produce an orthogonal basis for the eigenspace of $\lambda = 7$.

The component of $v_2$ orthogonal to $v_1$ is
$$z_2 = v_2 - \frac{v_2 \cdot v_1}{v_1 \cdot v_1} v_1 = \begin{bmatrix} -1/4 \\ 1 \\ 1/4 \end{bmatrix}$$

Normalize $v_1$, $z_2$:
$$u_1 = \begin{bmatrix} 1/\sqrt{2} \\ 0 \\ 1/\sqrt{2} \end{bmatrix}, \quad u_2 = \begin{bmatrix} -1/\sqrt{18} \\ 4/\sqrt{18} \\ 1/\sqrt{18} \end{bmatrix}$$

Normalize $v_3$ to obtain $u_3$. Then $A = PDP^T$, where $P = [u_1\ u_2\ u_3]$ and $D = \mathrm{diag}(7, 7, -2)$.
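A minimal Gram-Schmidt sketch in NumPy for this step (an illustration, not from the slides; `gram_schmidt` is a hypothetical helper, not a library routine):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        # Subtract the components along the vectors already in the basis
        w = v - sum((v @ u) * u for u in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

v1 = np.array([1., 0., 1.])
v2 = np.array([-0.5, 1., 0.])
u1, u2 = gram_schmidt([v1, v2])
print(u1)  # [0.7071 0. 0.7071], i.e. (1/sqrt(2), 0, 1/sqrt(2))
print(u2)  # approximately (-1/sqrt(18), 4/sqrt(18), 1/sqrt(18))
```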

Application 1: Quadratic Forms

Any quadratic form in $x$ can be expressed as
$$Q(x) = x^T A x$$
where $x$ is a vector in $\mathbb{R}^n$ and $A$ is an $n \times n$ symmetric matrix. More explicitly,
$$x^T A x = \sum_{i=1}^{n} \sum_{j=1}^{n} a_{ij} x_i x_j$$

Example

For example, $Q(x) = 2x_1^2 + 3x_2^2 + 4x_3^2 + 5x_2x_3 + 6x_1x_2$ can be written as a quadratic form with matrix
$$A = \begin{bmatrix} 2 & 3 & 0 \\ 3 & 3 & 5/2 \\ 0 & 5/2 & 4 \end{bmatrix}$$
(each cross term $c\,x_i x_j$ contributes $c/2$ to both $a_{ij}$ and $a_{ji}$).
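A quick check that this matrix reproduces the polynomial (an illustration, not from the slides):

```python
import numpy as np

A = np.array([[2.,  3.,  0. ],
              [3.,  3.,  2.5],
              [0.,  2.5, 4. ]])

x1, x2, x3 = 1.0, 2.0, 3.0
x = np.array([x1, x2, x3])

quadratic_form = x @ A @ x
direct = 2*x1**2 + 3*x2**2 + 4*x3**2 + 5*x2*x3 + 6*x1*x2
print(np.isclose(quadratic_form, direct))  # True (both equal 92.0)
```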

Optimizing quadratic functions

Consider the following optimization problem (without cross-product terms):
$$\max\ Q(x) = 2x_1^2 + 3x_2^2 + 4x_3^2 \quad \text{subject to } \|x\| = 1$$

Solution: Since $2x_1^2 \le 4x_1^2$ and $3x_2^2 \le 4x_2^2$, we have
$$Q(x) \le 4x_1^2 + 4x_2^2 + 4x_3^2 = 4\|x\|^2 = 4.$$
In addition, we can choose $x_1 = 0$, $x_2 = 0$, $x_3 = 1$ to attain the maximum.

A more general problem:
$$\max\ Q(x) = x^T A x \quad \text{subject to } \|x\| = 1$$

Solution: Use $A = P\Lambda P^T$ to transform the problem into an easier form:
$$Q(x) = x^T P \Lambda P^T x = (P^T x)^T \Lambda (P^T x)$$

Change variables with $y = P^T x$; since $P$ is orthogonal, $\|y\| = \|x\|$. The problem becomes
$$\max\ Q(y) = y^T \Lambda y = \lambda_1 y_1^2 + \dots + \lambda_n y_n^2 \quad \text{subject to } \|y\| = 1$$

Therefore:
- $\max\ x^T A x$ subject to $\|x\| = 1$ is $\lambda_{\max}(A)$
- $\min\ x^T A x$ subject to $\|x\| = 1$ is $\lambda_{\min}(A)$
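A small NumPy sketch of this result (an illustration, not from the slides): the constrained maximum equals the largest eigenvalue, attained at the corresponding unit eigenvector. The matrix is the one from the next example.

```python
import numpy as np

A = np.array([[ 1., -4.],
              [-4., -5.]])

lam, P = np.linalg.eigh(A)  # eigenvalues in ascending order
u_max = P[:, -1]            # unit eigenvector for the largest eigenvalue

print(lam[-1])              # 3.0 = max of x^T A x over ||x|| = 1
print(u_max @ A @ u_max)    # 3.0, attained at u_max
```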

Optimizing quadratic functions: example
$$\max\ Q(x) = x_1^2 - 8x_1x_2 - 5x_2^2 \quad \text{subject to } \|x\| = 1$$

Solution: The matrix of the quadratic form is
$$A = \begin{bmatrix} 1 & -4 \\ -4 & -5 \end{bmatrix}$$

Orthogonally diagonalize $A$:
$$P = \begin{bmatrix} 2/\sqrt{5} & 1/\sqrt{5} \\ -1/\sqrt{5} & 2/\sqrt{5} \end{bmatrix}, \quad D = \begin{bmatrix} 3 & 0 \\ 0 & -7 \end{bmatrix}$$

Change variables from $x$ to $y = P^T x$ and rewrite the objective function:
$$x_1^2 - 8x_1x_2 - 5x_2^2 = x^T A x = (Py)^T A (Py) = y^T D y = 3y_1^2 - 7y_2^2$$

So the maximum of $Q(x)$ over $\|x\| = 1$ is $3$.
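A quick numerical check of this diagonalization (an illustration, not from the slides):

```python
import numpy as np

A = np.array([[ 1., -4.],
              [-4., -5.]])
P = np.array([[ 2.,  1.],
              [-1.,  2.]]) / np.sqrt(5)
D = np.diag([3., -7.])

print(np.allclose(P @ D @ P.T, A))  # True: A = P D P^T
```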

Application 2: Principal Component Analysis (PCA)

Problem: Given a set of data points $\{x^{(1)}, x^{(2)}, \dots, x^{(m)}\}$ in $\mathbb{R}^n$, find the axis along which the data points have maximal variance.

Assume the data are centered around the origin; if not, subtract the mean from each data point.

Use a unit vector $u$ in $\mathbb{R}^n$ to denote the direction of the axis. Project each data point onto $u$ to obtain $\{y^{(1)}, y^{(2)}, \dots, y^{(m)}\}$, where $y^{(i)} = u^T x^{(i)}$.

The variance of the projected points is
$$\sigma^2 = \frac{1}{m} \sum_{i=1}^{m} (y^{(i)})^2 = \frac{1}{m} \sum_{i=1}^{m} u^T x^{(i)} (x^{(i)})^T u = u^T X u$$
where the matrix $X$, defined by
$$X = \frac{1}{m} \sum_{i=1}^{m} x^{(i)} (x^{(i)})^T,$$
is called the covariance matrix.

Reformulate the problem as a quadratic optimization problem:
$$\max\ u^T X u \quad \text{subject to } \|u\| = 1$$
where $X = \frac{1}{m} \sum_{i=1}^{m} x^{(i)} (x^{(i)})^T$ is the covariance matrix.

Solution: $u$ is the eigenvector corresponding to the largest eigenvalue of $X$. The resulting $y$ values are called the first principal component.
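A minimal NumPy sketch of the whole procedure (an illustration, not from the slides; the data are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: m = 200 points in R^2, stretched along one direction
data = rng.normal(size=(200, 2)) @ np.array([[3., 1.],
                                             [0., 0.5]])
data -= data.mean(axis=0)             # center the data at the origin

m = data.shape[0]
X = (data.T @ data) / m               # covariance matrix (1/m) sum x x^T

lam, P = np.linalg.eigh(X)            # X is symmetric, so use eigh
u = P[:, -1]                          # eigenvector of the largest eigenvalue

y = data @ u                          # first principal component scores
print(np.allclose(y.var(), lam[-1]))  # projected variance = largest eigenvalue
```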