EIGENVALUE PROBLEMS AND THE SVD [5.1 to 5.3 & 7.4]

Eigenvalue Problems. Introduction

Let A be an n x n real nonsymmetric matrix. The eigenvalue problem:

    Au = λu,    λ ∈ C : eigenvalue,    u ∈ C^n, u ≠ 0 : eigenvector

Example:

    A = [ 2  0 ]
        [ 2  1 ]

    λ_1 = 1 with eigenvector u_1 = (0, 1)^T
    λ_2 = 2 with eigenvector u_2 = (1, 2)^T

The set of eigenvalues of A is called the spectrum of A.

Eigenvalue Problems. Their origins

Structural engineering [Ku = λMu]
Stability analysis [e.g., electrical networks, mechanical systems, ...]
Quantum chemistry and electronic structure calculations [Schrödinger equation, ...]
Application of the new era: page ranking on the world-wide web.

Basic definitions and properties

A scalar λ is called an eigenvalue of a square matrix A if there exists a nonzero vector u such that Au = λu. The vector u is called an eigenvector of A associated with λ. The set of all eigenvalues of A is the spectrum of A. Notation: Λ(A).

λ is an eigenvalue iff the columns of A − λI are linearly dependent.
λ is an eigenvalue iff det(A − λI) = 0.

Compute the eigenvalues of the matrix:

    A = [ 2  1  0 ]
        [ 1  0  1 ]
        [ 0  1  2 ]

Eigenvectors?
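
A quick numerical check of these definitions (a numpy sketch of my own, not part of the original slides; the matrices are the ones above):

import numpy as np

# The 2x2 example: A = [[2, 0], [2, 1]].
A = np.array([[2.0, 0.0],
              [2.0, 1.0]])
lam, U = np.linalg.eig(A)   # eigenvalues, eigenvectors as columns of U
print(lam)                  # 2.0 and 1.0 (the ordering is not guaranteed)

# Verify Au = lambda * u for each eigenpair:
for i in range(2):
    assert np.allclose(A @ U[:, i], lam[i] * U[:, i])

# The 3x3 matrix from the question above happens to be symmetric,
# so its eigenvalues are real and eigvalsh applies:
B = np.array([[2.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 2.0]])
print(np.linalg.eigvalsh(B))  # real eigenvalues in ascending order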

Basic definitions and properties (cont.)

An eigenvalue is a root of the characteristic polynomial:

    p_A(λ) = det(A − λI)

So there are n eigenvalues, counted with their multiplicities. The multiplicities of these eigenvalues as roots of p_A are called their algebraic multiplicities.

Find all the eigenvalues of the matrix:

    A = [ 1  2  4 ]
        [ 0  1  2 ]
        [ 0  0  2 ]

Find the associated eigenvectors. How many eigenvectors can you find if a_33 is replaced by one? Same questions if a_12 is replaced by zero. What are all the eigenvalues of a diagonal matrix?

Two matrices A and B are similar if there exists an invertible matrix V such that

    A = V B V^{-1}

A and B represent the same linear mapping in 2 different bases. Explain why. [Hint: Assume a column of V represents one basis vector of the new basis expressed in the old basis...]

Solution: Let A be a linear mapping represented in the standard basis e_1, ..., e_n (the old basis). Consider a new basis v_1, v_2, ..., v_n. Assume each v_i is expressed in the old basis and let V = [v_1, v_2, ..., v_n]. A vector s in the new basis is expressed as Vs in the old basis (explain). The linear mapping applied to this vector is t = A(Vs), which is expressed in the old basis. Then t = V(V^{-1}AV s) expresses the result in the new basis: B = V^{-1}AV represents the mapping A in the basis V.

Show: A and B have the same eigenvalues. What about eigenvectors?

Definition: A is diagonalizable if it is similar to a diagonal matrix.

Note: not all matrices are diagonalizable.

Theorem 1: A matrix is diagonalizable iff it has n linearly independent eigenvectors.
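
To make the similarity discussion concrete, here is a small numpy sketch (my illustration, with an arbitrary random V) checking that B = V^{-1}AV has the same eigenvalues as A and that eigenvectors transform by V^{-1}:

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
V = rng.standard_normal((4, 4))   # generic, hence (almost surely) invertible
B = np.linalg.inv(V) @ A @ V      # B = V^{-1} A V is similar to A

# Same spectrum (sorted so the two lists can be compared):
eigA = np.sort_complex(np.linalg.eigvals(A))
eigB = np.sort_complex(np.linalg.eigvals(B))
assert np.allclose(eigA, eigB)

# If Au = lambda*u, then B(V^{-1}u) = lambda*(V^{-1}u):
lam, U = np.linalg.eig(A)
u = U[:, 0]
w = np.linalg.solve(V, u)         # w = V^{-1} u
assert np.allclose(B @ w, lam[0] * w)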

Example: Which of these matrices is/are diagonalizable?

    A = [ 1  1  0 ]    B = [ 1  1  0 ]    C = [ 1  1  0 ]
        [ 0  2  1 ]        [ 0  1  1 ]        [ 0  1  1 ]
        [ 0  0  3 ]        [ 0  0  1 ]        [ 0  0  2 ]

Theorem 2: The eigenvectors associated with distinct eigenvalues are linearly independent.

Prove the result for 2 distinct eigenvalues.

Solution: Let Au_1 = λ_1 u_1 and Au_2 = λ_2 u_2 with λ_1 ≠ λ_2. We prove that if α_1 u_1 + α_2 u_2 = 0 then we must have α_1 = α_2 = 0. Multiply α_1 u_1 + α_2 u_2 = 0 by A − λ_1 I:

    (A − λ_1 I)[α_1 u_1 + α_2 u_2] = 0
    α_1 (A − λ_1 I) u_1 + α_2 (A − λ_1 I) u_2 = 0
    0 + α_2 (λ_2 − λ_1) u_2 = 0

Since λ_2 ≠ λ_1 we must have α_2 = 0. A similar argument will show that α_1 = 0.

Consequence: if all eigenvalues of a matrix A are simple then A is diagonalizable.

Theorem 3: A symmetric matrix has real eigenvalues and is diagonalizable. In addition, A admits a set of orthonormal eigenvectors.

Transformations that Preserve Eigenstructure

Shift: B = A − σI. Av = λv ⇒ Bv = (λ − σ)v. Eigenvalues move, eigenvectors remain the same.

Polynomial: B = p(A) = α_0 I + ... + α_n A^n. Av = λv ⇒ Bv = p(λ)v. Eigenvalues are transformed, eigenvectors remain the same.

Invert: B = A^{-1}. Av = λv ⇒ Bv = λ^{-1} v. Eigenvalues are inverted, eigenvectors remain the same.

Let A be diagonalizable. How would you compute p(A) if p is a high degree polynomial? [Hint: start with A^k.]

The Singular Value Decomposition (SVD)

Theorem: For any matrix A ∈ R^{m×n} there exist orthogonal matrices U ∈ R^{m×m} and V ∈ R^{n×n} such that

    A = UΣV^T

where Σ is a diagonal matrix with entries σ_ii ≥ 0 and

    σ_11 ≥ σ_22 ≥ ... ≥ σ_pp ≥ 0,    p = min(m, n)

The σ_ii are the singular values of A. σ_ii is denoted simply by σ_i.
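
The hint about computing p(A) can be turned into code: if A = V D V^{-1}, then A^k = V D^k V^{-1}, so p(A) = V p(D) V^{-1} with p applied only to the diagonal entries. A sketch (mine; the polynomial coefficients are arbitrary, chosen for illustration):

import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])   # distinct eigenvalues -> diagonalizable
d, V = np.linalg.eig(A)           # A = V @ diag(d) @ inv(V)

p = np.polynomial.Polynomial([1.0, -2.0, 0.5])   # p(x) = 1 - 2x + 0.5 x^2

# p(A) via the diagonalization: p acts entrywise on the eigenvalues.
pA = V @ np.diag(p(d)) @ np.linalg.inv(V)

# Check against the naive evaluation of p(A):
naive = 1.0 * np.eye(3) - 2.0 * A + 0.5 * (A @ A)
assert np.allclose(pA, naive)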

The two cases of the SVD [shapes shown as pictures in the original slide]:

Case 1 (m ≥ n): A = UΣV^T with Σ "tall" (its rows below the n-th are zero).
Case 2 (m < n): A = UΣV^T with Σ "wide".

The thin SVD

Consider Case 1. The decomposition can be rewritten as

    A = [U_1  U_2] [ Σ_1 ] V^T
                   [  0  ]

which gives:

    A = U_1 Σ_1 V^T

where U_1 is m×n (same shape as A), and Σ_1 and V are n×n. This is referred to as the thin SVD. It is important in practice.

How can you obtain the thin SVD from the QR factorization of A and the SVD of an n×n matrix?

A few properties

Assume that σ_1 ≥ σ_2 ≥ ... ≥ σ_r > 0 and σ_{r+1} = ... = σ_p = 0. Then:

rank(A) = r = number of nonzero singular values.
Ran(A) = span{u_1, u_2, ..., u_r}
Null(A) = span{v_{r+1}, v_{r+2}, ..., v_n}
The matrix A admits the SVD expansion:

    A = Σ_{i=1}^{r} σ_i u_i v_i^T

Rank and approximate rank of a matrix

The number of nonzero singular values r equals the rank of A. In practice, zero singular values are replaced by small values due to noise. We can therefore define an approximate rank: the rank obtained by neglecting the smallest singular values.

Example: Let A be a matrix with singular values

    σ_1 = 10.0; σ_2 = 6.0; σ_3 = 3.0; σ_4 = 0.030; σ_5 = 0.013; σ_6 = 0.001

σ_4, σ_5, σ_6 are likely due to noise, so the approximate rank is 3. A rigorous way of stating this exists but is beyond the scope of this class [see CSCI 5304].
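
In numpy the thin SVD of a tall matrix is obtained with full_matrices=False, and the approximate-rank idea is a simple threshold on the singular values. A sketch (my code; the tolerance 0.1 is an assumed noise level, not a universal choice):

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 6))                   # tall: m >= n (Case 1)

U1, s, Vt = np.linalg.svd(A, full_matrices=False)   # thin SVD
print(U1.shape, s.shape, Vt.shape)                  # (100, 6) (6,) (6, 6)
assert np.allclose(A, U1 @ np.diag(s) @ Vt)

# Approximate rank: count singular values above a noise tolerance,
# using the singular values from the slide's example.
sigma = np.array([10.0, 6.0, 3.0, 0.030, 0.013, 0.001])
tol = 0.1
print(int(np.sum(sigma > tol)))                     # 3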

Right and left singular vectors:

    A v_i = σ_i u_i,    A^T u_j = σ_j v_j

Consequence:

    A^T A v_i = σ_i^2 v_i    and    A A^T u_i = σ_i^2 u_i

The right singular vectors (the v_i's) are eigenvectors of A^T A.
The left singular vectors (the u_i's) are eigenvectors of A A^T.

It is possible to get the SVD from the eigenvectors of A A^T and A^T A, but there are difficulties due to the non-uniqueness of the SVD.

A few applications of the SVD

Many methods require approximating the original data (matrix) by a low-rank matrix before attempting to solve the original problem.
Regularization methods require solving a least-squares linear system Ax = b approximately in the dominant singular space of A.
The Latent Semantic Indexing (LSI) method in information retrieval performs the query in the dominant singular space of A.
Methods utilizing Principal Component Analysis, e.g., face recognition.

Information Retrieval: Vector Space Model

Given: a collection of documents (columns of a matrix A) and a query vector q.

The collection is represented by an m×n term-by-document matrix with a_ij = L_ij G_i N_j.
Queries ("pseudo-documents") q are represented similarly to a column.

Problem: find a column of A that best matches q.

Similarity metric: the angle between a column c and the query q:

    cos θ(c, q) = c^T q / (‖c‖ ‖q‖)

To rank all documents we need to compute s = A^T q; s is the similarity vector. Literal matching is not very effective. Problems with literal matching: polysemy, synonymy, ...
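
The relations above are easy to verify numerically; this sketch (mine, with a random matrix) checks that the v_i are eigenvectors of A^T A and the u_i of A A^T, with eigenvalues σ_i^2:

import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

for i in range(3):
    v = Vt[i]        # i-th right singular vector (row of V^T)
    u = U[:, i]      # i-th left singular vector
    assert np.allclose(A.T @ A @ v, s[i]**2 * v)
    assert np.allclose(A @ A.T @ u, s[i]**2 * u)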

Use of the SVD

Solution: extract intrinsic, underlying semantic information. LSI replaces the matrix A by a low-rank approximation obtained from the Singular Value Decomposition (SVD):

    A = UΣV^T  →  A_k = U_k Σ_k V_k^T

U_k : term space; V_k : document space. Refer to this as the Truncated SVD (TSVD) approach. It amounts to replacing the small singular values of A by zeros.

New similarity vector:

    s_k = A_k^T q = V_k Σ_k U_k^T q

LSI: an example

%% D1 : INFANT & TODDLER first aid
%% D2 : BABIES & CHILDREN s room for your HOME
%% D3 : CHILD SAFETY at HOME
%% D4 : Your BABY s HEALTH and SAFETY: From INFANT to TODDLER
%% D5 : BABY PROOFING basics
%% D6 : Your GUIDE to easy rust PROOFING
%% D7 : Beanie BABIES collector s GUIDE
%% D8 : SAFETY GUIDE for CHILD PROOFING your HOME
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% TERMS: 1:BABY 2:CHILD 3:GUIDE 4:HEALTH 5:HOME
%%        6:INFANT 7:PROOFING 8:SAFETY 9:TODDLER

Source: Berry and Browne, SIAM, '99. Number of documents: 8. Number of terms: 9.

Raw matrix (before scaling):

    A =     d1  d2  d3  d4  d5  d6  d7  d8
      bab        1       1   1       1
      chi        1   1                   1
      gui                        1   1   1
      hea                1
      hom        1   1                   1
      inf    1           1
      pro                    1   1       1
      saf            1   1               1
      tod    1           1

Get the answer to the query "Child Safety", i.e. q = [0 1 0 0 0 0 0 1 0]^T, using cosines, and then using LSI with k = 3.
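
A possible end-to-end sketch of this exercise in numpy (my code; the matrix is the one tabulated above, and the cosine and LSI formulas are those of the preceding slides):

import numpy as np

# Term-by-document matrix: rows = terms (bab, chi, gui, hea, hom,
# inf, pro, saf, tod), columns = documents d1..d8.
A = np.array([
    [0, 1, 0, 1, 1, 0, 1, 0],   # bab
    [0, 1, 1, 0, 0, 0, 0, 1],   # chi
    [0, 0, 0, 0, 0, 1, 1, 1],   # gui
    [0, 0, 0, 1, 0, 0, 0, 0],   # hea
    [0, 1, 1, 0, 0, 0, 0, 1],   # hom
    [1, 0, 0, 1, 0, 0, 0, 0],   # inf
    [0, 0, 0, 0, 1, 1, 0, 1],   # pro
    [0, 0, 1, 1, 0, 0, 0, 1],   # saf
    [1, 0, 0, 1, 0, 0, 0, 0],   # tod
], dtype=float)
q = np.array([0, 1, 0, 0, 0, 0, 0, 1, 0], dtype=float)  # "Child Safety"

# Plain vector-space ranking: cosines between q and the columns of A.
cos = (A.T @ q) / (np.linalg.norm(A, axis=0) * np.linalg.norm(q))
print("cosines:", np.round(cos, 2))

# LSI: truncated SVD with k = 3, then rank with A_k in place of A.
U, sig, Vt = np.linalg.svd(A, full_matrices=False)
k = 3
Ak = U[:, :k] @ np.diag(sig[:k]) @ Vt[:k]
cos_k = (Ak.T @ q) / (np.linalg.norm(Ak, axis=0) * np.linalg.norm(q))
print("LSI cosines:", np.round(cos_k, 2))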