MATH 434/534 Theoretical Assignment 7 Solution, Chapter 7

(7.1) Let $H = I - \frac{2uu^T}{u^Tu}$ be a Householder matrix. Then prove the following: (i) $Hu = -u$; (ii) $Hv = v$ if $v^Tu = 0$.

Solution. (i)
$$Hu = \left(I - \frac{2uu^T}{u^Tu}\right)u = u - \frac{2u(u^Tu)}{u^Tu} = u - 2u = -u.$$

(ii)
$$Hv = \left(I - \frac{2uu^T}{u^Tu}\right)v = v - \frac{2u(u^Tv)}{u^Tu} = v - 0 = v,$$
because $v^Tu = 0$ implies $u^Tv = 0$.

(7.2) Let $x$ be an $n$-vector. Develop an algorithm to compute a Householder matrix $H = I - \frac{2uu^T}{u^Tu}$ such that $Hx$ has zeros in positions $(r+1)$ through $n$, $r < n$. How many flops will be required to implement this algorithm? Given $x = (1, 2, 3)^T$, apply your algorithm to construct $H$ such that $Hx$ has a zero in the third position.

Solution. For $x = (x_1, \ldots, x_n)^T$, let $y = (x_r, x_{r+1}, \ldots, x_n)^T$. Now find $\hat{u}$ such that
$$\hat{H} = I - \frac{2\hat{u}\hat{u}^T}{\hat{u}^T\hat{u}} \qquad \text{and} \qquad \hat{H}y = (*, 0, \ldots, 0)^T.$$
Then $u = (\underbrace{0, \ldots, 0}_{r-1 \text{ zeros}}, \hat{u}^T)^T$ will generate a Householder transform $H$ such that
$$Hx = (\underbrace{*, \ldots, *}_{r \text{ nonzeros}}, 0, \ldots, 0)^T.$$
This takes $3(n-r)$ flops.
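The text does not carry out the $x = (1, 2, 3)^T$ example, so here is a minimal numpy sketch of the construction above, assuming the standard sign choice that avoids cancellation; the function name `householder_zero_tail` is mine, not the text's. It also checks the two identities of 7.1.

```python
import numpy as np

def householder_zero_tail(x, r):
    """H = I - 2uu^T/(u^T u) such that Hx is zero in positions r+1 through n."""
    x = np.asarray(x, dtype=float)
    n = x.size
    y = x[r - 1:]                      # trailing subvector y = (x_r, ..., x_n)^T
    u_hat = y.copy()
    # Add sign(y_1) * ||y|| to the first entry to avoid cancellation.
    u_hat[0] += (1.0 if y[0] >= 0 else -1.0) * np.linalg.norm(y)
    u = np.zeros(n)
    u[r - 1:] = u_hat                  # u = (0, ..., 0, u_hat)^T with r-1 leading zeros
    return np.eye(n) - 2.0 * np.outer(u, u) / (u @ u)

x = np.array([1.0, 2.0, 3.0])
H = householder_zero_tail(x, r=2)
print(H @ x)                           # approx [1., -3.60555, 0.], i.e. (1, -sqrt(13), 0)^T

# 7.1 sanity checks: Hu = -u, and Hv = v whenever v^T u = 0.
u = np.array([1.0, 2.0, 2.0])
Hu = np.eye(3) - 2.0 * np.outer(u, u) / (u @ u)
v = np.array([2.0, -1.0, 0.0])         # v^T u = 0
assert np.allclose(Hu @ u, -u) and np.allclose(Hu @ v, v)
```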

(7.7) Using the Cholesky decomposition of $A^TA$, where $A$ is $m \times n$ $(m \geq n)$ and has full rank, prove that $A$ has a unique reduced QR factorization with positive diagonal entries of $R$. What are the computational drawbacks of this approach for finding a reduced QR factorization numerically? Illustrate the difficulties with an example.

Solution. Let $A^TA = HH^T = R_1^TR_1$ be the Cholesky decomposition, where $R_1$ is upper triangular with positive diagonal entries. Let $Q_1 = AR_1^{-1}$. Then $A = Q_1R_1$. Considering $Q_1^TQ_1$, we have
$$Q_1^TQ_1 = (R_1^T)^{-1}A^TAR_1^{-1} = (R_1^T)^{-1}R_1^TR_1R_1^{-1} = I.$$
This shows that $Q_1$ has orthonormal columns. Thus $A = Q_1R_1$ is a reduced QR factorization of $A$.

To prove uniqueness, assume $A = \tilde{Q}_1\tilde{R}_1$ is another reduced QR factorization of $A$, where $\tilde{Q}_1^T\tilde{Q}_1 = I$ and $\tilde{R}_1$ is upper triangular with positive diagonal entries. Then
$$A^TA = \tilde{R}_1^T\tilde{Q}_1^T\tilde{Q}_1\tilde{R}_1 = \tilde{R}_1^T\tilde{R}_1,$$
so $\tilde{R}_1^T\tilde{R}_1$ is a Cholesky factorization of $A^TA$. Since the Cholesky factorization is unique, $\tilde{R}_1 = R_1$. Furthermore, $\tilde{Q}_1 = A\tilde{R}_1^{-1} = AR_1^{-1} = Q_1$. Therefore the reduced QR factorization of $A$ is unique.

Computational drawbacks. Forming $A^TA$ squares the condition number of $A$, so $A^TA$ may not be formed accurately in floating point. The columns of the computed $Q$ may then be far from orthogonal, and the entries of $R$ may not be computed accurately.
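The text asks for an illustrating example but does not give one. Below is a hedged numpy sketch using a Läuchli-type matrix (my choice of example, not the text's): its nearly dependent columns make $\mathrm{Cond}_2(A^TA) \approx \mathrm{Cond}_2(A)^2$, and the Cholesky-based $Q$ visibly loses orthogonality while Householder-based QR does not.

```python
import numpy as np

# Lauchli-type matrix: columns nearly dependent, so cond(A^T A) ~ cond(A)^2
# and information is lost when A^T A is formed in floating point.
eps = 1e-7
A = np.array([[1.0, 1.0],
              [eps, 0.0],
              [0.0, eps]])

# Cholesky-QR: R is the Cholesky factor of A^T A, then Q = A R^{-1}.
R = np.linalg.cholesky(A.T @ A).T          # upper-triangular factor, A^T A = R^T R
Q_chol = A @ np.linalg.inv(R)

# Householder-based reduced QR for comparison.
Q_house, _ = np.linalg.qr(A)

I2 = np.eye(2)
print(np.linalg.norm(Q_chol.T @ Q_chol - I2))    # typically around 1e-2: orthogonality lost
print(np.linalg.norm(Q_house.T @ Q_house - I2))  # around 1e-16: orthonormal to machine precision
```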

(7.16) (a) Let $A$ be $m \times n$, and let $U$ and $V$ be orthogonal. Using the definition of singular values, prove that the singular values of $A$ and $V^TAV$ are the same. What about the singular vectors? (b) How are the singular vectors of $A$ related to those of $U^TAV$? (c) How are the singular values of a symmetric matrix related to its eigenvalues?

Solution. (a) Applying the singular value decomposition to $A$, we have $A = U_1\Sigma_1V_1^T$. Multiplying by the orthogonal matrices $V^T$ and $V$, we have
$$V^TAV = V^TU_1\Sigma_1V_1^TV = P\Sigma_1Q^T,$$
where $P = V^TU_1$ and $Q = V^TV_1$ are orthogonal. This is a singular value decomposition of $V^TAV$ with the same $\Sigma_1$; therefore $A$ and $V^TAV$ have the same singular values. The singular vectors, however, change: the columns of $P$ and $Q$ replace those of $U_1$ and $V_1$.

(b) Now applying the singular value decomposition, we have
$$U^TAV = U^TU_1\Sigma_1V_1^TV = (U^TU_1)\Sigma_1(V^TV_1)^T.$$
The singular vectors of $A$ are the columns of $U_1$ and $V_1$; the left and right singular vectors of $U^TAV$ are the columns of $U^TU_1$ and $V^TV_1$, respectively.

(c) Let $A = U_1\Sigma_1V_1^T$ be the singular value decomposition of the symmetric matrix $A$. Then
$$A^2 = A^TA = V_1\Sigma_1^TU_1^TU_1\Sigma_1V_1^T = V_1\Sigma_1^T\Sigma_1V_1^T.$$
Since $V_1^T = V_1^{-1}$ and orthogonal similarity does not change eigenvalues, the eigenvalues of $A^2$ are the squares of the singular values of $A$. As the eigenvalues of $A^2$ are also the squares of the eigenvalues of $A$, the singular values of a symmetric matrix are the absolute values of its eigenvalues.
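A quick numerical check of (a) and (c), as a sketch with randomly generated matrices (not part of the original solution):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))    # a random orthogonal V

# (a) A and V^T A V have the same singular values.
assert np.allclose(np.linalg.svd(A, compute_uv=False),
                   np.linalg.svd(V.T @ A @ V, compute_uv=False))

# (c) For symmetric S, the singular values are the absolute eigenvalues.
S = A + A.T
sv = np.linalg.svd(S, compute_uv=False)             # sorted descending
ev = np.linalg.eigvalsh(S)                          # sorted ascending
assert np.allclose(sv, np.sort(np.abs(ev))[::-1])
```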

(7.19) For an $m \times n$ matrix $A$, prove the following results using the SVD of $A$:
(i) $\mathrm{rank}(A^TA) = \mathrm{rank}(AA^T) = \mathrm{rank}(A) = \mathrm{rank}(A^T)$;
(ii) $A^TA$ and $AA^T$ have the same nonzero eigenvalues;
(iii) if the eigenvectors $u_1$ and $u_2$ of $A^TA$ are orthogonal, then $Au_1$ and $Au_2$ are orthogonal.

Without loss of generality, we assume $m \geq n$ for all parts of 7.19.

Singular Value Decomposition. For a matrix $A \in \mathbb{R}^{m \times n}$, there exist orthogonal matrices $U \in \mathbb{R}^{m \times m}$ and $V \in \mathbb{R}^{n \times n}$ such that $A = U\Sigma V^T$, where $\Sigma = \mathrm{diag}(\sigma_1, \ldots, \sigma_n) \in \mathbb{R}^{m \times n}$ and $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_n \geq 0$.

(i) From the SVD of $A$, there are orthogonal matrices $U$ and $V$ such that $A = U\Sigma V^T$. Then $A^T = V\Sigma^TU^T$, with $\Sigma^T = \mathrm{diag}(\sigma_1, \ldots, \sigma_n) \in \mathbb{R}^{n \times m}$. Also, $\mathrm{rank}(A)$ equals the number of nonzero singular values. Since the singular values of both $A$ and $A^T$ are $\sigma_1, \sigma_2, \ldots, \sigma_n$, the number of nonzero singular values of $A$ equals that of $A^T$. Therefore $\mathrm{rank}(A) = \mathrm{rank}(A^T)$.

Now consider $AA^T$ and $A^TA$. Since $A = U\Sigma V^T$ and $A^T = V\Sigma^TU^T$, we have
$$AA^T = (U\Sigma V^T)(V\Sigma^TU^T) = U(\Sigma\Sigma^T)U^T, \tag{1}$$
$$A^TA = (V\Sigma^TU^T)(U\Sigma V^T) = V(\Sigma^T\Sigma)V^T. \tag{2}$$
Looking at $\Sigma\Sigma^T$ and $\Sigma^T\Sigma$, we can see
$$\Sigma\Sigma^T = \mathrm{diag}(\sigma_1^2, \ldots, \sigma_n^2, \underbrace{0, \ldots, 0}_{m-n}) \in \mathbb{R}^{m \times m}, \qquad \Sigma^T\Sigma = \mathrm{diag}(\sigma_1^2, \ldots, \sigma_n^2) \in \mathbb{R}^{n \times n}.$$
From (1) and (2), the singular values of both $AA^T$ and $A^TA$ are $\sigma_1^2, \sigma_2^2, \ldots, \sigma_n^2$, so the number of nonzero singular values of $AA^T$ equals that of $A^TA$; that is, $\mathrm{rank}(AA^T) = \mathrm{rank}(A^TA)$. Likewise, since the singular values of $A$ are $\sigma_1, \ldots, \sigma_n$ and those of $AA^T$ are $\sigma_1^2, \ldots, \sigma_n^2$, the number of nonzero singular values of $A$ equals that of $AA^T$, so $\mathrm{rank}(A) = \mathrm{rank}(AA^T)$. Altogether, we have shown $\mathrm{rank}(A^TA) = \mathrm{rank}(AA^T) = \mathrm{rank}(A) = \mathrm{rank}(A^T)$.

(ii) By the singular value decomposition, $A = U\Sigma V^T$ with $\Sigma_{m \times n} = \mathrm{diag}(\sigma_1, \sigma_2, \ldots, \sigma_n)$, and $A^T = V\Sigma^TU^T$ with $\Sigma^T_{n \times m} = \mathrm{diag}(\sigma_1, \sigma_2, \ldots, \sigma_n)$. By the definition of singular values, for a real matrix $A$ the square roots of the eigenvalues of $A^TA$ are the singular values of $A$; that is, the eigenvalues of $A^TA$ are $\sigma_1^2, \sigma_2^2, \ldots, \sigma_n^2$. Since $AA^T = (A^T)^TA^T$ and the singular values of $A^T$ are $\sigma_1, \sigma_2, \ldots, \sigma_n$, the nonzero eigenvalues of $AA^T$ are also among $\sigma_1^2, \sigma_2^2, \ldots, \sigma_n^2$ (together with $m - n$ extra zero eigenvalues when $m > n$). Therefore $A^TA$ and $AA^T$ have the same nonzero eigenvalues.

(iii) Suppose that the eigenvectors $u_1$ and $u_2$ of $A^TA$ are orthogonal. Then we have
$$(A^TA)u_1 = \lambda_1u_1, \tag{1}$$
$$(A^TA)u_2 = \lambda_2u_2, \tag{2}$$
$$u_1^Tu_2 = u_2^Tu_1 = 0. \tag{3}$$
Consider $(Au_1)^T(Au_2)$ and $(Au_2)^T(Au_1)$. Using (2) and (3), we have
$$(Au_1)^T(Au_2) = u_1^TA^TAu_2 = u_1^T(\lambda_2u_2) = \lambda_2u_1^Tu_2 = 0.$$
Similarly, using (1) and (3), we have
$$(Au_2)^T(Au_1) = u_2^TA^TAu_1 = u_2^T(\lambda_1u_1) = \lambda_1u_2^Tu_1 = 0.$$
Therefore $Au_1$ and $Au_2$ are orthogonal.
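The three parts of 7.19 can be checked numerically; here is a short sketch with a random $5 \times 3$ matrix (my example, not the text's):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))                     # m >= n, full rank almost surely

# (i) All four ranks coincide.
assert len({np.linalg.matrix_rank(M) for M in (A.T @ A, A @ A.T, A, A.T)}) == 1

# (ii) A^T A and A A^T share nonzero eigenvalues; A A^T has m - n extra zeros.
ev_small = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
ev_big = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]
assert np.allclose(ev_small, ev_big[:3], atol=1e-10)

# (iii) Orthogonal eigenvectors of A^T A remain orthogonal after applying A.
_, Q = np.linalg.eigh(A.T @ A)                      # orthonormal eigenvector columns
u1, u2 = Q[:, 0], Q[:, 1]
assert abs((A @ u1) @ (A @ u2)) < 1e-10
```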

(7.21) Let $U$ have orthonormal columns. Then, using the SVD, prove that (i) $\|AU\|_2 = \|A\|_2$ and (ii) $\|AU\|_F = \|A\|_F$.

Solution. Again without loss of generality, we assume $m \geq n$ for all parts of 7.21. Since $U = [u_1\ u_2\ \cdots\ u_n]$ has orthonormal columns $u_i = (u_{1i}, u_{2i}, \ldots, u_{mi})^T$ satisfying
$$u_i^Tu_j = \begin{cases} 0 & \text{if } i \neq j, \\ 1 & \text{if } i = j, \end{cases}$$
we have $U^TU = I$ and, being square here, $U$ is an orthogonal matrix.

(i) From the singular value decomposition, for the matrix $A$ there exist orthogonal matrices $\hat{U}$ and $V$ such that $A = \hat{U}\Sigma V^T$, where $\Sigma = \mathrm{diag}(\sigma_1, \sigma_2, \ldots, \sigma_n)$. Multiplying $A = \hat{U}\Sigma V^T$ by $U$, we have $AU = \hat{U}\Sigma V^TU$. Since the product of orthogonal matrices is an orthogonal matrix, letting $W^T = V^TU$ gives
$$AU = \hat{U}\Sigma W^T. \tag{$*$}$$
Since $\|A\|_2 = \sigma_{\max}$ and $(*)$ is a singular value decomposition of $AU$ with the same $\Sigma$, we have $\|AU\|_2 = \sigma_{\max}$. Therefore $\|AU\|_2 = \|A\|_2$.

(ii) Since $\|A\|_F = (\sigma_1^2 + \sigma_2^2 + \cdots + \sigma_n^2)^{1/2}$, again from $(*)$ we have
$$\|AU\|_F = (\sigma_1^2 + \sigma_2^2 + \cdots + \sigma_n^2)^{1/2}.$$
Therefore we showed $\|A\|_F = \|AU\|_F$.
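A numerical sanity check of both norm identities with a random orthogonal $U$ (a sketch, not part of the original solution):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))    # random orthogonal U

# Right-multiplication by U preserves both the 2-norm and the Frobenius norm.
assert np.isclose(np.linalg.norm(A @ U, 2), np.linalg.norm(A, 2))
assert np.isclose(np.linalg.norm(A @ U, 'fro'), np.linalg.norm(A, 'fro'))
```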

(7.22) Let $U\Sigma V^T$ be the SVD of $A$. Then prove that $\|U^TAV\|_F^2 = \sum_{i=1}^{p} \sigma_i^2$, where the $\sigma_i$ are the singular values of $A$.

Solution. From the singular value decomposition, for an $m \times n$ matrix $A$ there exist $U \in \mathbb{R}^{m \times m}$ and $V \in \mathbb{R}^{n \times n}$ such that $A = U\Sigma V^T$, where $\Sigma = \mathrm{diag}(\sigma_1, \ldots, \sigma_p) \in \mathbb{R}^{m \times n}$ and $p = \min(m, n)$. Since $U$ and $V$ are orthogonal matrices, we have
$$A = U\Sigma V^T \implies U^TAV = \Sigma.$$
Now taking the Frobenius norm of $U^TAV = \Sigma$, we have
$$\|U^TAV\|_F^2 = \|\Sigma\|_F^2. \tag{1}$$
By the definition of the Frobenius norm, $\|A\|_F = \left[\sum_{j=1}^{n}\sum_{i=1}^{m} |a_{ij}|^2\right]^{1/2}$, and since $\Sigma = \mathrm{diag}(\sigma_1, \ldots, \sigma_p)$, we have
$$\|\Sigma\|_F^2 = \sum_{i=1}^{p} \sigma_i^2. \tag{2}$$
From (1) and (2), we showed that $\|U^TAV\|_F^2 = \sum_{i=1}^{p} \sigma_i^2$.
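A corresponding check of 7.22 using numpy's SVD (a sketch with a random matrix):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))

U, s, Vt = np.linalg.svd(A)                         # full SVD: A = U @ Sigma @ Vt
assert np.isclose(np.linalg.norm(U.T @ A @ Vt.T, 'fro') ** 2, np.sum(s ** 2))
```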

(7.23) Let $A$ be an $m \times n$ matrix.
(a) Using the SVD of $A$, prove that (i) $\|A^TA\|_2 = \|A\|_2^2$; (ii) $\mathrm{Cond}_2(A^TA) = (\mathrm{Cond}_2(A))^2$; (iii) $\mathrm{Cond}_2(A) = \mathrm{Cond}_2(U^TAV)$, where $U$ and $V$ are orthogonal.
(b) Let $\mathrm{rank}(A_{m \times n}) = n$, and let $B_{m \times r}$ be a matrix obtained by deleting $(n-r)$ columns from $A$. Then prove that $\mathrm{Cond}_2(B) \leq \mathrm{Cond}_2(A)$.

Solution. (a) Using the singular value decomposition, we have orthogonal matrices $U \in \mathbb{R}^{m \times m}$ and $V \in \mathbb{R}^{n \times n}$ such that $A = U\Sigma V^T$. Without loss of generality, we assume $m \geq n$.

(i) By the singular value decomposition, $A = U\Sigma_{m \times n}V^T$, so $\|A\|_2 = \sigma_{\max} = \sigma_1$. Also, $A^T = V\Sigma^T_{n \times m}U^T$. Considering $A^TA$, we have
$$A^TA = V\Sigma^T_{n \times m}U^TU\Sigma_{m \times n}V^T = V(\Sigma^T\Sigma)V^T.$$
Since $\Sigma^T\Sigma = \mathrm{diag}(\sigma_1^2, \sigma_2^2, \ldots, \sigma_n^2)$ is $n \times n$, $\|A^TA\|_2 = \sigma_{\max}^2 = \sigma_1^2$. Therefore $\|A\|_2^2 = \|A^TA\|_2$.

(ii)
$$\mathrm{Cond}_2(A^TA) = \frac{\sigma_{\max}(A^TA)}{\sigma_{\min}(A^TA)} = \frac{(\sigma_{\max}(A))^2}{(\sigma_{\min}(A))^2} = (\mathrm{Cond}_2(A))^2.$$

(iii) By the singular value decomposition, we have
$$A = U\Sigma V^T \implies U^TAV = \Sigma.$$
Since $\mathrm{Cond}_2(\Sigma) = \sigma_{\max}/\sigma_{\min}$, we have
$$\mathrm{Cond}_2(U^TAV) = \mathrm{Cond}_2(\Sigma) = \frac{\sigma_{\max}}{\sigma_{\min}}.$$
Also, $\mathrm{Cond}_2(A) = \sigma_{\max}/\sigma_{\min}$. Therefore $\mathrm{Cond}_2(A) = \mathrm{Cond}_2(U^TAV)$.

(b) Write $A_{m \times n} = (a_{ij})$ and, after permuting columns if necessary, let $B_{m \times r}$ consist of the first $r$ columns of $A$. Any $x \in \mathbb{R}^r$ with $\|x\|_2 = 1$ extends to $\tilde{x} \in \mathbb{R}^n$ by padding with zeros in the deleted positions, so that $\|\tilde{x}\|_2 = 1$ and $Bx = A\tilde{x}$. Hence
$$\sigma_{\max}(B) = \max_{\|x\|_2=1}\|Bx\|_2 \leq \max_{\|\tilde{x}\|_2=1}\|A\tilde{x}\|_2 = \sigma_{\max}(A) = \sigma_1,$$
$$\sigma_{\min}(B) = \min_{\|x\|_2=1}\|Bx\|_2 \geq \min_{\|\tilde{x}\|_2=1}\|A\tilde{x}\|_2 = \sigma_{\min}(A) = \sigma_n > 0,$$
the last inequality because $\mathrm{rank}(A) = n$, so all $n$ singular values of $A$ are nonzero. This tells us that
$$\mathrm{Cond}_2(B) = \frac{\sigma_{\max}(B)}{\sigma_{\min}(B)} \leq \frac{\sigma_1}{\sigma_n} = \mathrm{Cond}_2(A).$$
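Finally, a sketch checking all parts of 7.23 numerically; the random matrices and the column deleted in (b) are my choices:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((6, 4))
cond2 = lambda M: np.linalg.cond(M, 2)              # sigma_max / sigma_min

# (a)(i)-(ii): 2-norm and condition number of A^T A.
assert np.isclose(np.linalg.norm(A.T @ A, 2), np.linalg.norm(A, 2) ** 2)
assert np.isclose(cond2(A.T @ A), cond2(A) ** 2)

# (a)(iii): orthogonal transformations preserve Cond_2.
U, _ = np.linalg.qr(rng.standard_normal((6, 6)))
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))
assert np.isclose(cond2(U.T @ A @ V), cond2(A))

# (b): deleting a column cannot increase the condition number.
B = A[:, :3]
assert cond2(B) <= cond2(A) + 1e-12                 # small slack for roundoff
```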