The Singular Value Decomposition
1  The Singular Value Decomposition: An Important Topic in NLA

Radu Tiberiu Trîmbiţaş, Babeş-Bolyai University, February 23, 2009
2  The SVD: The Main Idea

Motivation: the image of the unit sphere under any m × n matrix is a hyperellipse.
3  The SVD: Brief Description

Suppose (for the moment) that A is m × n with m ≥ n and of full rank n. Choose orthonormal bases v_1, ..., v_n for the row space and u_1, ..., u_n for the column space such that Av_i is in the direction of u_i:

    Av_i = σ_i u_i.

(In MATLAB, the demo eigshow illustrates this.) The singular values satisfy σ_1 ≥ σ_2 ≥ ... ≥ σ_n > 0.
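The defining relation Av_i = σ_i u_i is easy to check numerically. A minimal NumPy sketch (the random test matrix and seed are illustrative assumptions, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))  # m = 5, n = 3; full rank with probability 1

# Columns of U are left singular vectors, columns of V right singular vectors.
U, s, Vh = np.linalg.svd(A, full_matrices=False)
V = Vh.conj().T

# Check A v_i = sigma_i u_i for each i.
for i in range(3):
    assert np.allclose(A @ V[:, i], s[i] * U[:, i])

# Singular values come out in nonincreasing order, all positive here.
assert s[0] >= s[1] >= s[2] > 0
```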
4  The SVD: Brief Description

In matrix form, Av_i = σ_i u_i becomes

    AV = Û Σ̂,    where Σ̂ = diag(σ_1, σ_2, ..., σ_n),

that is, A = Û Σ̂ V*. This is the reduced singular value decomposition. Extending Û by m − n further orthonormal columns and appending m − n zero rows to Σ̂ yields the full singular value decomposition A = UΣV*.
5  Reduced SVD

A compact representation is the reduced SVD, for m ≥ n:

    A = Û Σ̂ V*,

where Û is m × n, Σ̂ is n × n, and V is n × n.
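In NumPy the full and reduced factorizations are selected with the `full_matrices` flag; a small sketch of the resulting shapes (the example matrix is an assumption for illustration):

```python
import numpy as np

m, n = 6, 4
rng = np.random.default_rng(1)
A = rng.standard_normal((m, n))

# Full SVD: U is m x m, V is n x n; Sigma is m x n (only min(m, n) values stored).
U, s, Vh = np.linalg.svd(A, full_matrices=True)
assert U.shape == (m, m) and Vh.shape == (n, n) and s.shape == (n,)

# Reduced SVD: U_hat is m x n, Sigma_hat is n x n, V is n x n.
Uhat, shat, Vhat_h = np.linalg.svd(A, full_matrices=False)
assert Uhat.shape == (m, n)

# Both reproduce A.
assert np.allclose(Uhat @ np.diag(shat) @ Vhat_h, A)
```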
6  Full SVD

Let A be an m × n matrix. The singular value decomposition of A is the factorization A = UΣV*, where

- U is m × m unitary (its columns are the left singular vectors of A),
- V is n × n unitary (its columns are the right singular vectors of A),
- Σ is m × n diagonal (its entries are the singular values of A).
8  Formal Definition

Let m and n be arbitrary; we do not require m ≥ n.

Definition. Given A ∈ C^{m×n} (not necessarily of full rank), a singular value decomposition (SVD) of A is a factorization

    A = UΣV*    (1)

where U ∈ C^{m×m} is unitary, V ∈ C^{n×n} is unitary, and Σ ∈ R^{m×n} is diagonal. In addition it is assumed that the diagonal entries σ_j of Σ are nonnegative and in nonincreasing order; that is, σ_1 ≥ σ_2 ≥ ... ≥ σ_p ≥ 0, where p = min(m, n).
9  The SVD and the Eigenvalue Decomposition

The eigenvalue decomposition A = XΛX^{-1}

- uses the same basis X for row and column space, whereas the SVD uses two different bases V and U;
- generally does not use an orthonormal basis, whereas the SVD does;
- is defined only for square matrices, whereas the SVD exists for all matrices.

For symmetric positive definite matrices A, the eigenvalue decomposition and the SVD coincide.
10  Existence and Uniqueness

Theorem. Every matrix A ∈ C^{m×n} has a singular value decomposition (1). Furthermore, the singular values {σ_j} are uniquely determined, and, if A is square and the σ_j are distinct, the left and right singular vectors {u_j} and {v_j} are uniquely determined up to complex signs (i.e., complex scalar factors of modulus 1).
11  Existence and Uniqueness: Proof of Existence

Proof. To prove the existence of the SVD, we isolate the direction of the largest action of A, and then proceed by induction on the dimension of A.

Set σ_1 = ‖A‖_2. By a compactness argument, there must be a vector v_1 ∈ C^n with ‖v_1‖_2 = 1 and ‖u_1‖_2 = σ_1, where u_1 = Av_1. Consider any extension of v_1 to an orthonormal basis {v_j} of C^n and of u_1 to an orthonormal basis {u_j} of C^m, and let U_1 and V_1 denote the unitary matrices with columns u_j and v_j, respectively. Then we have

    U_1* A V_1 = S = [σ_1, w*; 0, B],    (2)

where 0 is a column vector of dimension m − 1, w* is a row vector of dimension n − 1, and B has dimensions (m − 1) × (n − 1).
12  Existence and Uniqueness: Existence (continued)

Furthermore,

    ‖ [σ_1, w*; 0, B] [σ_1; w] ‖_2 ≥ σ_1^2 + w*w = (σ_1^2 + w*w)^{1/2} ‖ [σ_1; w] ‖_2,

implying ‖S‖_2 ≥ (σ_1^2 + w*w)^{1/2}. Since U_1 and V_1 are unitary, we know that ‖S‖_2 = ‖A‖_2 = σ_1, so this implies w = 0.

If n = 1 or m = 1, we are done. Otherwise, the submatrix B describes the action of A on the subspace orthogonal to v_1. By the induction hypothesis, B has an SVD B = U_2 Σ_2 V_2*. Now it is easily verified that

    A = U_1 [1, 0; 0, U_2] [σ_1, 0; 0, Σ_2] [1, 0; 0, V_2]* V_1*

is an SVD of A, completing the proof of existence.
13  Existence and Uniqueness: Proof of Uniqueness

First we note that σ_1 is uniquely determined by the condition that it is equal to ‖A‖_2, as follows from (1). Now suppose that, in addition to v_1, there is another linearly independent vector w with ‖w‖_2 = 1 and ‖Aw‖_2 = σ_1. Define a unit vector v_2, orthogonal to v_1, as a linear combination of v_1 and w:

    v_2 = (w − (v_1* w) v_1) / ‖w − (v_1* w) v_1‖_2.

Since ‖A‖_2 = σ_1, ‖Av_2‖_2 ≤ σ_1; but this must be an equality, for otherwise, since w = v_1 c + v_2 s for some constants c and s with |c|^2 + |s|^2 = 1, we would have ‖Aw‖_2 < σ_1. This vector v_2 is a second right singular vector of A corresponding to the singular value σ_1; it leads to the appearance of a vector y (equal to the last n − 1 components of V_1* v_2) with ‖y‖_2 = 1 and ‖By‖_2 = σ_1. We conclude that, if the singular vector v_1 is not unique, then the corresponding singular value σ_1 is not simple.
14  Existence and Uniqueness: Proof of Uniqueness (continued)

To complete the uniqueness proof we note that, as indicated above, once σ_1, v_1, and u_1 are determined, the remainder of the SVD is determined by the action of A on the space orthogonal to v_1. Since v_1 is unique up to sign, this orthogonal space is uniquely defined, and the uniqueness of the remaining singular values and vectors now follows by induction.
15  Matrix Properties via the SVD

Theorem. The rank of A is r, the number of nonzero singular values.

Proof. The rank of a diagonal matrix equals the number of its nonzero entries, and in the decomposition A = UΣV*, U and V are of full rank. Therefore rank(A) = rank(Σ) = r.
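In floating-point practice, "nonzero" means "above a tolerance". A sketch of numerical rank computation, using the customary tolerance max(m, n) · eps · σ_1 (the example matrix is an assumption for illustration):

```python
import numpy as np

# A 4 x 3 matrix of rank 2: the third column is the sum of the first two.
A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [1., 1., 2.],
              [2., 0., 2.]])

s = np.linalg.svd(A, compute_uv=False)

# Count singular values above the conventional tolerance.
tol = max(A.shape) * np.finfo(float).eps * s[0]
rank = int(np.sum(s > tol))

assert rank == np.linalg.matrix_rank(A)
```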
16  Matrix Properties via the SVD

Theorem. range(A) = ⟨u_1, ..., u_r⟩ and null(A) = ⟨v_{r+1}, ..., v_n⟩.

Proof. This is a consequence of the fact that range(Σ) = ⟨e_1, ..., e_r⟩ ⊆ C^m and null(Σ) = ⟨e_{r+1}, ..., e_n⟩ ⊆ C^n.
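These bases can be read off directly from a computed SVD; a sketch (the rank-deficient test matrix is an assumption for illustration):

```python
import numpy as np

# Rank-2 matrix: the third column equals the first plus the second.
A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [1., 1., 2.]])

U, s, Vh = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))

# range(A) is spanned by u_1, ..., u_r; null(A) by v_{r+1}, ..., v_n.
range_basis = U[:, :r]
null_basis = Vh[r:, :].conj().T

# Null-space vectors are annihilated by A.
assert np.allclose(A @ null_basis, 0)

# Every column of A lies in span(range_basis): projecting onto it changes nothing.
proj = range_basis @ (range_basis.conj().T @ A)
assert np.allclose(proj, A)
```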
17  Matrix Properties via the SVD

Theorem. ‖A‖_2 = σ_1 and ‖A‖_F = (σ_1^2 + σ_2^2 + ... + σ_r^2)^{1/2}.

Proof. The first result was already established in the proof of Theorem 2: since A = UΣV* with unitary U and V, ‖A‖_2 = ‖Σ‖_2 = max_j |σ_j| = σ_1. For the second, note that the Frobenius norm is invariant under unitary multiplication, so ‖A‖_F = ‖Σ‖_F, and the conclusion follows.
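Both norm identities can be confirmed numerically; a sketch (random test matrix assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))
s = np.linalg.svd(A, compute_uv=False)

# The 2-norm is the largest singular value.
assert np.isclose(np.linalg.norm(A, 2), s[0])

# The Frobenius norm is the root of the sum of squared singular values.
assert np.isclose(np.linalg.norm(A, 'fro'), np.sqrt(np.sum(s**2)))
```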
18  Matrix Properties via the SVD

Theorem. The nonzero singular values of A are the square roots of the nonzero eigenvalues of A*A or AA*. (These matrices have the same nonzero eigenvalues.)

Proof. From the calculation

    A*A = (UΣV*)*(UΣV*) = V Σ* U* U Σ V* = V (Σ*Σ) V*,

we see that A*A is similar to Σ*Σ and hence has the same n eigenvalues. The eigenvalues of the diagonal matrix Σ*Σ are σ_1^2, σ_2^2, ..., σ_p^2, with n − p additional zero eigenvalues if n > p. A similar calculation applies to the m eigenvalues of AA*.
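A quick numerical check of this correspondence (real matrix assumed, so A* = A^T):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3))  # m = 4, n = 3, p = 3

s = np.linalg.svd(A, compute_uv=False)
eigs = np.linalg.eigvalsh(A.T @ A)  # A^T A is symmetric; eigenvalues are real

# Squared singular values match the eigenvalues of A^T A.
assert np.allclose(np.sort(s**2), np.sort(eigs))

# A A^T has the same nonzero eigenvalues, padded with m - p zeros.
eigs2 = np.linalg.eigvalsh(A @ A.T)
assert np.allclose(np.sort(eigs2)[-3:], np.sort(eigs))
```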
19  Matrix Properties via the SVD

Theorem. If A = A*, then the singular values of A are the absolute values of the eigenvalues of A.

Proof. A Hermitian matrix A has a complete set of orthogonal eigenvectors and real eigenvalues, so A = XΛX^{-1} holds with X = Q unitary and Λ real diagonal. But we can write

    A = QΛQ* = Q |Λ| sign(Λ) Q*,    (3)

where |Λ| and sign(Λ) are the diagonal matrices with entries |λ_j| and sign(λ_j). Since sign(Λ)Q* is unitary whenever Q is unitary, (3) is an SVD of A, with singular values equal to the diagonal entries of |Λ|, that is, |λ_j|. If desired, these numbers can be put into nonincreasing order by inserting suitable permutation matrices as factors in the left-hand unitary matrix Q and the right-hand unitary matrix sign(Λ)Q*.
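For a concrete (real symmetric, hence Hermitian) example with eigenvalues of both signs — the matrix below is an assumption chosen for illustration:

```python
import numpy as np

A = np.array([[ 2., 1., 0.],
              [ 1., -3., 1.],
              [ 0., 1., 1.]])
assert np.allclose(A, A.T)  # symmetric, so Hermitian

eigs = np.linalg.eigvalsh(A)           # real eigenvalues, ascending
s = np.linalg.svd(A, compute_uv=False)  # singular values, descending

# Singular values are the absolute eigenvalues, sorted into nonincreasing order.
assert np.allclose(s, np.sort(np.abs(eigs))[::-1])
```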
20  Matrix Properties via the SVD

Theorem. For A ∈ C^{m×m}, |det(A)| = ∏_{i=1}^{m} σ_i.

Proof. The determinant of a product of square matrices is the product of the determinants of the factors. Furthermore, the determinant of a unitary matrix always has absolute value 1; this follows from the formula U U* = I and the fact that det(U*) is the complex conjugate of det(U). Therefore,

    |det(A)| = |det(U)| |det(Σ)| |det(V*)| = |det(Σ)| = ∏_{i=1}^{m} σ_i.
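Again easy to verify numerically (random square matrix assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))

s = np.linalg.svd(A, compute_uv=False)

# |det(A)| equals the product of the singular values.
assert np.isclose(abs(np.linalg.det(A)), np.prod(s))
```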
21  Low-Rank Approximations

The SVD can be written as a sum of rank-one matrices:

    A = Σ_{j=1}^{r} σ_j u_j v_j*.

The best rank-ν approximation of A in the 2-norm is

    A_ν = Σ_{j=1}^{ν} σ_j u_j v_j*,    with ‖A − A_ν‖_2 = σ_{ν+1}.

The same A_ν is also optimal in the Frobenius norm, with ‖A − A_ν‖_F = (σ_{ν+1}^2 + ... + σ_r^2)^{1/2}.
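The truncated SVD and both error formulas can be checked in a few lines; a sketch (random matrix and target rank are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((6, 5))
U, s, Vh = np.linalg.svd(A, full_matrices=False)

nu = 2  # target rank
# Truncated SVD: keep only the nu largest singular triplets.
A_nu = U[:, :nu] @ np.diag(s[:nu]) @ Vh[:nu, :]

assert np.linalg.matrix_rank(A_nu) == nu

# Error in the 2-norm is exactly sigma_{nu+1} (index nu, 0-based).
assert np.isclose(np.linalg.norm(A - A_nu, 2), s[nu])

# Error in the Frobenius norm is sqrt(sigma_{nu+1}^2 + ... + sigma_r^2).
assert np.isclose(np.linalg.norm(A - A_nu, 'fro'), np.sqrt(np.sum(s[nu:]**2)))
```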
30  Applications of the SVD

Calculation of matrix properties:
- Rank of a matrix (counting the σ_j's above a tolerance)
- Bases for range and nullspace (in U and V)
- Induced matrix 2-norm (= σ_1)
- Low-rank approximations (optimal in the 2-norm and the Frobenius norm)
- Least squares fitting (more later; another option is QR)

Signal and image processing:
- Compression (see next slide)
- Noise removal (noise tends to have small σ_j)
32  Application: Image Compression

View an m × n image as a (real) matrix A and find its best rank-ν approximation by the SVD. Storage: ν(m + n) numbers instead of mn.
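A sketch of the compression step and the storage count (the synthetic "image", its dimensions, and the rank ν are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)
m, n, nu = 120, 80, 10
img = rng.random((m, n))  # stand-in for a grayscale image

U, s, Vh = np.linalg.svd(img, full_matrices=False)

# Rank-nu reconstruction: scale the first nu left vectors by their sigmas.
compressed = (U[:, :nu] * s[:nu]) @ Vh[:nu, :]
assert compressed.shape == (m, n)

# Storing nu scaled singular vector pairs costs nu*(m + n) numbers, vs m*n.
storage_ratio = nu * (m + n) / (m * n)
```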
More informationft-uiowa-math2550 Assignment NOTRequiredJustHWformatOfQuizReviewForExam3part2 due 12/31/2014 at 07:10pm CST
me me ft-uiowa-math2550 Assignment NOTRequiredJustHWformatOfQuizReviewForExam3part2 due 12/31/2014 at 07:10pm CST 1. (1 pt) local/library/ui/eigentf.pg A is n n an matrices.. There are an infinite number
More informationSTA141C: Big Data & High Performance Statistical Computing
STA141C: Big Data & High Performance Statistical Computing Lecture 5: Numerical Linear Algebra Cho-Jui Hsieh UC Davis April 20, 2017 Linear Algebra Background Vectors A vector has a direction and a magnitude
More informationLinear Algebra Review. Fei-Fei Li
Linear Algebra Review Fei-Fei Li 1 / 51 Vectors Vectors and matrices are just collections of ordered numbers that represent something: movements in space, scaling factors, pixel brightnesses, etc. A vector
More informationMatrix Factorization and Analysis
Chapter 7 Matrix Factorization and Analysis Matrix factorizations are an important part of the practice and analysis of signal processing. They are at the heart of many signal-processing algorithms. Their
More informationIr O D = D = ( ) Section 2.6 Example 1. (Bottom of page 119) dim(v ) = dim(l(v, W )) = dim(v ) dim(f ) = dim(v )
Section 3.2 Theorem 3.6. Let A be an m n matrix of rank r. Then r m, r n, and, by means of a finite number of elementary row and column operations, A can be transformed into the matrix ( ) Ir O D = 1 O
More informationLet A an n n real nonsymmetric matrix. The eigenvalue problem: λ 1 = 1 with eigenvector u 1 = ( ) λ 2 = 2 with eigenvector u 2 = ( 1
Eigenvalue Problems. Introduction Let A an n n real nonsymmetric matrix. The eigenvalue problem: EIGENVALE PROBLEMS AND THE SVD. [5.1 TO 5.3 & 7.4] Au = λu Example: ( ) 2 0 A = 2 1 λ 1 = 1 with eigenvector
More information2. LINEAR ALGEBRA. 1. Definitions. 2. Linear least squares problem. 3. QR factorization. 4. Singular value decomposition (SVD) 5.
2. LINEAR ALGEBRA Outline 1. Definitions 2. Linear least squares problem 3. QR factorization 4. Singular value decomposition (SVD) 5. Pseudo-inverse 6. Eigenvalue decomposition (EVD) 1 Definitions Vector
More informationProblem # Max points possible Actual score Total 120
FINAL EXAMINATION - MATH 2121, FALL 2017. Name: ID#: Email: Lecture & Tutorial: Problem # Max points possible Actual score 1 15 2 15 3 10 4 15 5 15 6 15 7 10 8 10 9 15 Total 120 You have 180 minutes to
More informationLinear Algebra Methods for Data Mining
Linear Algebra Methods for Data Mining Saara Hyvönen, Saara.Hyvonen@cs.helsinki.fi Spring 2007 1. Basic Linear Algebra Linear Algebra Methods for Data Mining, Spring 2007, University of Helsinki Example
More informationMath 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008
Math 520 Exam 2 Topic Outline Sections 1 3 (Xiao/Dumas/Liaw) Spring 2008 Exam 2 will be held on Tuesday, April 8, 7-8pm in 117 MacMillan What will be covered The exam will cover material from the lectures
More information7. Symmetric Matrices and Quadratic Forms
Linear Algebra 7. Symmetric Matrices and Quadratic Forms CSIE NCU 1 7. Symmetric Matrices and Quadratic Forms 7.1 Diagonalization of symmetric matrices 2 7.2 Quadratic forms.. 9 7.4 The singular value
More informationReview of Linear Algebra
Review of Linear Algebra Definitions An m n (read "m by n") matrix, is a rectangular array of entries, where m is the number of rows and n the number of columns. 2 Definitions (Con t) A is square if m=
More informationMatrix Analysis and Algorithms
Matrix Analysis and Algorithms Andrew Stuart Jochen Voss 4th August 2009 2 Introduction The three basic problems we will address in this book are as follows. In all cases we are given as data a matrix
More informationLinear Algebra. Session 12
Linear Algebra. Session 12 Dr. Marco A Roque Sol 08/01/2017 Example 12.1 Find the constant function that is the least squares fit to the following data x 0 1 2 3 f(x) 1 0 1 2 Solution c = 1 c = 0 f (x)
More informationAMS526: Numerical Analysis I (Numerical Linear Algebra for Computational and Data Sciences)
AMS526: Numerical Analysis I (Numerical Linear Algebra for Computational and Data Sciences) Lecture 1: Course Overview; Matrix Multiplication Xiangmin Jiao Stony Brook University Xiangmin Jiao Numerical
More informationConceptual Questions for Review
Conceptual Questions for Review Chapter 1 1.1 Which vectors are linear combinations of v = (3, 1) and w = (4, 3)? 1.2 Compare the dot product of v = (3, 1) and w = (4, 3) to the product of their lengths.
More informationMatrix decompositions
Matrix decompositions Zdeněk Dvořák May 19, 2015 Lemma 1 (Schur decomposition). If A is a symmetric real matrix, then there exists an orthogonal matrix Q and a diagonal matrix D such that A = QDQ T. The
More informationft-uiowa-math2550 Assignment OptionalFinalExamReviewMultChoiceMEDIUMlengthForm due 12/31/2014 at 10:36pm CST
me me ft-uiowa-math255 Assignment OptionalFinalExamReviewMultChoiceMEDIUMlengthForm due 2/3/2 at :3pm CST. ( pt) Library/TCNJ/TCNJ LinearSystems/problem3.pg Give a geometric description of the following
More informationFunctional Analysis Review
Outline 9.520: Statistical Learning Theory and Applications February 8, 2010 Outline 1 2 3 4 Vector Space Outline A vector space is a set V with binary operations +: V V V and : R V V such that for all
More informationThis can be accomplished by left matrix multiplication as follows: I
1 Numerical Linear Algebra 11 The LU Factorization Recall from linear algebra that Gaussian elimination is a method for solving linear systems of the form Ax = b, where A R m n and bran(a) In this method
More informationStat 159/259: Linear Algebra Notes
Stat 159/259: Linear Algebra Notes Jarrod Millman November 16, 2015 Abstract These notes assume you ve taken a semester of undergraduate linear algebra. In particular, I assume you are familiar with the
More informationSingular Value Decomposition
Singular Value Decomposition CS 205A: Mathematical Methods for Robotics, Vision, and Graphics Doug James (and Justin Solomon) CS 205A: Mathematical Methods Singular Value Decomposition 1 / 35 Understanding
More informationMaths for Signals and Systems Linear Algebra in Engineering
Maths for Signals and Systems Linear Algebra in Engineering Lecture 18, Friday 18 th November 2016 DR TANIA STATHAKI READER (ASSOCIATE PROFFESOR) IN SIGNAL PROCESSING IMPERIAL COLLEGE LONDON Mathematics
More informationLEC 2: Principal Component Analysis (PCA) A First Dimensionality Reduction Approach
LEC 2: Principal Component Analysis (PCA) A First Dimensionality Reduction Approach Dr. Guangliang Chen February 9, 2016 Outline Introduction Review of linear algebra Matrix SVD PCA Motivation The digits
More information