Problem set 5: SVD, Orthogonal projections, etc.

February 21, 2017

1 SVD

1. Work out again the SVD theorem done in class: If A is a real m × n matrix, then there exist orthogonal matrices

       U = [u_1, ..., u_m] ∈ R^{m×m}   and   V = [v_1, ..., v_n] ∈ R^{n×n}

   such that

       A = U Σ V^T,   Σ = diag(σ_1, σ_2, ..., σ_p),   p = min{m, n},

   where σ_1 ≥ σ_2 ≥ ... ≥ σ_p ≥ 0.

2. Suppose that σ_1 ≥ σ_2 ≥ ... ≥ σ_r > σ_{r+1} = ... = σ_p = 0. Then show that

   (a) rank(A) = r
   (b) null(A) = span{v_{r+1}, ..., v_n}
   (c) range(A) = span{u_1, ..., u_r}
   (d) If U_r = U(:, 1:r), Σ_r = Σ(1:r, 1:r) and V_r = V(:, 1:r), then

           A_r = U_r Σ_r V_r^T = Σ_{i=1}^{r} σ_i u_i v_i^T

3. Show that

   (a) ‖A‖_F^2 = σ_1^2 + σ_2^2 + ... + σ_p^2
   (b) ‖A‖_2 = σ_1
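These identities are easy to sanity-check numerically. Below is a minimal Python/NumPy sketch (np.linalg.svd returns U, the singular values in descending order, and V^T); the matrix and variable names are illustrative only, not part of the problem set.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 3))       # random 5 x 3 test matrix

    U, s, Vt = np.linalg.svd(A)           # A = U Sigma V^T, s sorted descending
    print(np.allclose(A, U[:, :len(s)] * s @ Vt))    # reconstruction check

    # Problem 3(a): ||A||_F^2 = sum of squared singular values.
    print(np.isclose(np.linalg.norm(A, 'fro')**2, np.sum(s**2)))

    # Problem 3(b): ||A||_2 = largest singular value.
    print(np.isclose(np.linalg.norm(A, 2), s[0]))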

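A companion sketch for problem 2, using a deliberately rank-deficient matrix; the tolerance-based numerical rank is an assumption standing in for the exact r.

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 4))   # rank 2

    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > 1e-10 * s[0]))     # numerical rank; problem 2(a)
    print(r)                              # 2

    # Problem 2(b): the trailing right singular vectors lie in null(A).
    print(np.allclose(A @ Vt[r:].T, 0))

    # Problem 2(d): with r = rank(A), the truncated sum reproduces A exactly.
    print(np.allclose(A, U[:, :r] * s[:r] @ Vt[:r]))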
4. Let the SVD of A be as above. If k < r = rank(A) and

       A_k = Σ_{i=1}^{k} σ_i u_i v_i^T,

   then show that

       min_{rank(B)=k} ‖A − B‖_2 = ‖A − A_k‖_2 = σ_{k+1}

5. Show (without using condition numbers) that if A is square (n × n) and σ_n > 0 is small, then computing x = A^{-1} b is unstable.

6. Show that in the 2-norm,

       cond(A) = σ_1 / σ_n

2 Least-squares

1. Consider the least-squares problem: Find the least-squares solution to the m × n set of equations Ax = b, where m > n and rank(A) = n. Show that the following steps constitute a solution (a Python sketch of these steps appears after problem 2 below):

   (a) Find the SVD A = U Σ V^T.
   (b) Set b′ = U^T b.
   (c) Find the vector y defined by y_i = b′_i / σ_i, where σ_i is the i-th diagonal entry of Σ.
   (d) The solution is x = V y.

2. Consider the least-squares problem: Find the general least-squares solution to the m × n set of equations Ax = b, where m > n and rank(A) = r < n. Show that the following steps constitute a solution:

   (a) Find the SVD A = U Σ V^T.
   (b) Set b′ = U^T b.
   (c) Find the vector y defined by y_i = b′_i / σ_i for i = 1, ..., r, and y_i = 0 otherwise.
   (d) The solution x of minimum norm ‖x‖ is x = V y.
   (e) The general solution is

           x = V y + λ_{r+1} v_{r+1} + ... + λ_n v_n,

       where v_{r+1}, ..., v_n are the last n − r columns of V.
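As promised above, a minimal Python/NumPy sketch of the steps in problem 1, assuming full column rank; the helper name lstsq_svd is mine, not part of the problem set. It matches NumPy's built-in least-squares solver:

    import numpy as np

    def lstsq_svd(A, b):
        """Least-squares solution of Ax = b for full column rank A,
        following steps (a)-(d) of problem 1."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)   # (a) thin SVD
        b_prime = U.T @ b                                  # (b) b' = U^T b
        y = b_prime / s                                    # (c) y_i = b'_i / sigma_i
        return Vt.T @ y                                    # (d) x = V y

    rng = np.random.default_rng(2)
    A = rng.standard_normal((8, 3))
    b = rng.standard_normal(8)
    print(np.allclose(lstsq_svd(A, b), np.linalg.lstsq(A, b, rcond=None)[0]))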

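A corresponding sketch for the rank-deficient recipe of problem 2; the tolerance and the helper name min_norm_lstsq are illustrative assumptions. The result agrees with the pseudo-inverse solution x = A^+ b of problem 3 below:

    import numpy as np

    def min_norm_lstsq(A, b, tol=1e-10):
        """Minimum-norm least-squares solution when rank(A) = r < n,
        following steps (a)-(d) of problem 2."""
        U, s, Vt = np.linalg.svd(A)           # (a)
        b_prime = U.T @ b                     # (b)
        r = int(np.sum(s > tol))
        y = np.zeros(A.shape[1])
        y[:r] = b_prime[:r] / s[:r]           # (c) invert only nonzero sigma_i
        return Vt.T @ y                       # (d) x = V y

    rng = np.random.default_rng(3)
    A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 4))   # rank 2 < 4
    b = rng.standard_normal(6)
    print(np.allclose(min_norm_lstsq(A, b), np.linalg.pinv(A) @ b))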
3. Show that the least-squares solution to an m × n system of equations Ax = b of rank n is given by x = A^+ b, where A^+ is the pseudo-inverse of A. In the case of a rank-deficient system, x = A^+ b is the solution that minimizes ‖x‖.

4. Show that if A is an m × n matrix of rank n, then

       A^+ = (A^T A)^{-1} A^T

   and that, in general, a least-squares solution can be obtained by solving the normal equations

       (A^T A) x = A^T b

5. Weighted least-squares: Let C be a positive definite matrix. Then the C-norm is defined as ‖a‖_C = (a^T C a)^{1/2}. The weighted least-squares problem is that of minimizing ‖Ax − b‖_C. The most common weighting is when C is diagonal. Show that the weighted least-squares solution can be obtained by solving

       (A^T C A) x = A^T C b

6. Consider the constrained least-squares problem: Given A of size n × n, find x that minimizes ‖Ax‖ subject to ‖x‖ = 1. Show that the solution is given by the last column of V, where A = U Σ V^T is the SVD of A. (A numerical sketch follows problem 7 below.)

7. Consider the following constrained least-squares problem: Given an m × n matrix A with m ≥ n, find the vector x that minimizes ‖Ax‖ subject to ‖x‖ = 1 and Cx = 0. Show that a solution is given as follows:

   (a) If C has fewer rows than columns, then add rows of zeros to C to make it square. Compute the SVD C = U Σ V^T. Let C^⊥ be the matrix obtained from V by deleting the first r columns, where r is the number of non-zero entries in Σ.
   (b) Find the solution x′ to the minimization of ‖A C^⊥ x′‖ subject to ‖x′‖ = 1.
   (c) The solution is obtained as x = C^⊥ x′.
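A minimal numerical sketch for problem 6 (names are illustrative): the minimizer is the right singular vector for the smallest singular value, and the attained minimum is σ_n.

    import numpy as np

    rng = np.random.default_rng(4)
    A = rng.standard_normal((5, 5))

    U, s, Vt = np.linalg.svd(A)
    x = Vt[-1]                            # last column of V = last row of V^T

    print(np.isclose(np.linalg.norm(x), 1.0))          # unit-norm constraint
    print(np.isclose(np.linalg.norm(A @ x), s[-1]))    # ||Ax|| = sigma_min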

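And a sketch of the recipe in problem 7, again with illustrative names and a hard-coded tolerance. Note that C^⊥ has orthonormal columns, so ‖C^⊥ x′‖ = ‖x′‖ and the unit-norm constraint is preserved.

    import numpy as np

    rng = np.random.default_rng(5)
    A = rng.standard_normal((8, 5))
    C = rng.standard_normal((2, 5))       # fewer rows than columns

    # (a) Pad C to square, take its SVD, and keep the trailing columns of V,
    #     which span null(C): these form C_perp.
    m_c, n = C.shape
    C_sq = np.vstack([C, np.zeros((n - m_c, n))])
    _, sc, Vct = np.linalg.svd(C_sq)
    r = int(np.sum(sc > 1e-10))
    C_perp = Vct[r:].T                    # n x (n - r) orthonormal basis

    # (b) Minimize ||A C_perp x'|| subject to ||x'|| = 1: last right
    #     singular vector of A @ C_perp (as in problem 6).
    _, _, Vt2 = np.linalg.svd(A @ C_perp)
    x_prime = Vt2[-1]

    # (c) x = C_perp x' satisfies both constraints.
    x = C_perp @ x_prime
    print(np.isclose(np.linalg.norm(x), 1.0), np.allclose(C @ x, 0))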
8. Consider the following constrained least-squares problem: Given an m × n matrix A with m ≥ n, find the vector x that minimizes ‖Ax‖ subject to ‖x‖ = 1 and x ∈ range(G). Show that a solution is given as follows:

   (a) Compute the SVD G = U Σ V^T.
   (b) Let U′ be the matrix of the first r columns of U, where r = rank(G), so that range(G) = range(U′).
   (c) Find the solution x′ to the minimization of ‖A U′ x′‖ subject to ‖x′‖ = 1.
   (d) The solution is obtained as x = U′ x′.

3 Orthogonal Projections

Let S ⊆ R^n be a subspace. P ∈ R^{n×n} is the orthogonal projection onto S if range(P) = S, P^2 = P and P^T = P.

1. Show the following:

   (a) If x ∈ R^n and P is the orthogonal projection onto S (so Px ∈ S), then I − P is the orthogonal projection onto S^⊥ (so (I − P)x ∈ S^⊥), where S^⊥ is the orthogonal complement of S.
   (b) The orthogonal projection onto a subspace is unique.
   (c) If v ∈ R^n is non-zero, then P = v v^T / (v^T v) is the orthogonal projection onto S = span{v}.
   (d) If the columns of V = [v_1, ..., v_k] form an orthonormal basis for S, then V V^T is the unique orthogonal projection onto S.

2. Suppose that the SVD of A is A = U Σ V^T and rank(A) = r. Partition

       U = [U_r  Ũ_r],   V = [V_r  Ṽ_r],

   where U_r and V_r consist of the first r columns of U and V, and Ũ_r and Ṽ_r of the remaining m − r and n − r columns, respectively. Then show that

   (a) V_r V_r^T = projection onto null(A)^⊥ = range(A^T)
   (b) Ṽ_r Ṽ_r^T = projection onto null(A)
   (c) U_r U_r^T = projection onto range(A)
   (d) Ũ_r Ũ_r^T = projection onto range(A)^⊥ = null(A^T)
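A short numerical check of the projectors in problem 2, in the same illustrative style as the earlier sketches:

    import numpy as np

    rng = np.random.default_rng(6)
    A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 4))   # rank 2

    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > 1e-10))
    Vr, Vr_tilde = Vt[:r].T, Vt[r:].T
    P = Vr_tilde @ Vr_tilde.T             # candidate projector onto null(A)

    # P is idempotent and symmetric, and A P = 0 since P maps into null(A).
    print(np.allclose(P @ P, P), np.allclose(P, P.T), np.allclose(A @ P, 0))

    # The projectors in (a) and (b) are complementary on R^n.
    print(np.allclose(Vr @ Vr.T + P, np.eye(4)))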

3. Let v ∈ R^n be non-zero. The n × n matrix

       P = I − 2 v v^T / (v^T v)

   is the Householder reflection determined by v. Show that:

   (a) P is symmetric and orthogonal (P^T = P and P^T P = I); in particular P^2 = I, so P is a reflection, not a projection.
   (b) When a vector x ∈ R^n is multiplied by P, it is reflected in the hyperplane span{v}^⊥.
   (c) If

           v = x ± ‖x‖_2 e_1,

       then

           Px = ∓‖x‖_2 e_1 ∈ span{e_1}.
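Finally, a small Python/NumPy sketch of problem 3(c), using the sign choice that avoids cancellation (variable names are illustrative, not part of the problem set):

    import numpy as np

    rng = np.random.default_rng(7)
    x = rng.standard_normal(5)

    # Choose v = x + sign(x_1) ||x||_2 e_1 to avoid cancellation in v_1.
    v = x.copy()
    v[0] += np.copysign(np.linalg.norm(x), x[0])
    P = np.eye(5) - 2.0 * np.outer(v, v) / (v @ v)

    # (a) P is symmetric and orthogonal; (c) Px lands in span{e_1}.
    print(np.allclose(P, P.T), np.allclose(P @ P.T, np.eye(5)))
    print(np.round(P @ x, 12))            # (-sign(x_1) ||x||_2, 0, ..., 0)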