Main matrix factorizations

A = P L U: P permutation matrix, L lower triangular, U upper triangular. Key use: solve square linear systems A x = b.

A = Q R: Q unitary, R upper triangular. Key use: solve square or overdetermined linear systems A x = b.

A = U Σ V*: U, V unitary, Σ diagonal with non-increasing, non-negative elements. Key uses: overdetermined linear systems; understand the effect of the matrix-vector product A x; extract and compress the information contained in A.

A = P L U

Solve A x = b: P L U x = b, so L U x = P^T b. This splits the original system into two successive triangular ones, L y = P^T b and U x = y. Stable. Factorization cost: O(N^3) operations. Can be obtained directly (Crout or Doolittle), or as a byproduct of Gaussian elimination. Cost for each additional RHS: O(N^2) operations (one forward and one back substitution).
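As a concrete illustration, here is a minimal NumPy/SciPy sketch (the 3 x 3 matrix and right-hand sides are made up for the example): the factorization is computed once, then reused for each right-hand side at O(N^2) cost.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Made-up example system: factor once, then solve for several right-hand sides.
A = np.array([[4.0, 3.0, 0.0],
              [6.0, 3.0, 1.0],
              [0.0, 2.0, 5.0]])
lu, piv = lu_factor(A)          # P L U factorization, O(N^3) work, done once

for b in (np.array([1.0, 0.0, 0.0]), np.array([2.0, -1.0, 3.0])):
    x = lu_solve((lu, piv), b)  # two triangular solves per RHS, O(N^2) work
    print(np.allclose(A @ x, b))
```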

A = Q R

A square or rectangular: A = Q R. Significance:

(1) Solve A x = b: Q R x = b, so R x = Q* b. Slightly higher cost than A = P L U; however, no pivoting is needed.

(2) Solve (least squares) overdetermined linear systems. The normal equations A* A x = A* b are often ill conditioned. If we instead start with A in the form A = Q R: R* Q* Q R x = R* Q* b; since Q* Q is the identity matrix, this becomes R* R x = R* Q* b. Now R* cancels analytically, leaving R x = Q* b, which is well conditioned.
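A small sketch with made-up overdetermined data, solving the least-squares problem via the thin QR factorization and comparing against the normal equations (which happen to be well enough conditioned here that the two answers agree):

```python
import numpy as np

# Made-up overdetermined system: 20 equations, 3 unknowns.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)

# Thin QR: A = Q R with Q of size 20 x 3 and R of size 3 x 3, upper triangular.
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)     # R x = Q* b

# Normal equations; can be ill conditioned, since cond(A* A) = cond(A)^2.
x_ne = np.linalg.solve(A.T @ A, A.T @ b)

print(np.allclose(x_qr, x_ne))
```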

A = U Σ V* (SVD)

Applies to A of any shape: A is m x n. If m < n: in the reduced form, only the first m columns of V and the leading m x m block of Σ are kept. If m > n: in the reduced form, only the first n columns of U and the leading n x n block of Σ are kept.
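In NumPy, the full and reduced forms correspond to the full_matrices flag of np.linalg.svd; a minimal sketch with a made-up tall matrix (m > n):

```python
import numpy as np

A = np.random.default_rng(1).standard_normal((7, 4))    # m = 7 > n = 4

# Full SVD: U is 7 x 7, Vt is 4 x 4, s holds the 4 singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Reduced SVD: only the first n = 4 columns of U are kept.
Ur, sr, Vtr = np.linalg.svd(A, full_matrices=False)

print(U.shape, Ur.shape)                        # (7, 7) (7, 4)
print(np.allclose(A, Ur @ np.diag(sr) @ Vtr))   # reduced form still reproduces A
```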

Existence and uniqueness of the SVD factorization

Theorem: Every m x n matrix A has an SVD factorization A = U Σ V*. The singular values σ_j are uniquely determined. For A square: if the σ_j are distinct, then the singular vectors u_j, v_j are uniquely determined apart from complex factors of magnitude one.

SVD - the effect of performing A x: 2 x 2 illustration

A = U Σ V*. The set of vectors A x, when A is a 2 x 2 matrix and x ranges over all unit-length vectors, is an ellipse with semiaxes σ_1 u_1 and σ_2 u_2; their pre-images are v_1 and v_2. The concept generalizes to all sizes of A.
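This can be checked numerically; a small sketch with a made-up 2 x 2 matrix, verifying that A v_i = σ_i u_i and that σ_1 is the largest |A x| over unit vectors x:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])       # made-up 2 x 2 example
U, s, Vt = np.linalg.svd(A)

# A maps the right singular vectors onto the ellipse semiaxes: A v_i = sigma_i u_i.
for i in range(2):
    print(np.allclose(A @ Vt[i], s[i] * U[:, i]))

# The largest |A x| over unit vectors x equals sigma_1.
theta = np.linspace(0.0, 2.0 * np.pi, 1000)
circle = np.vstack([np.cos(theta), np.sin(theta)])
print(np.isclose(np.max(np.linalg.norm(A @ circle, axis=0)), s[0], rtol=1e-4))
```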

The four fundamental subspaces of a matrix

Every matrix A has four fundamental subspaces:
Row space: space spanned by all the rows of A.
Null space: space of all column vectors v such that A v = 0.
Column space: space spanned by all the columns of A.
Left null space: space of all row vectors u such that u A = 0.

Orthogonality of the four fundamental subspaces

The orthogonality of the four fundamental subspaces is an immediate consequence of the SVD decomposition of any matrix A.
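A minimal sketch (made-up rank-deficient matrix) of how the SVD delivers orthonormal bases for all four subspaces: columns of U span the column space and left null space, columns of V span the row space and null space.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3)) @ rng.standard_normal((3, 7))   # 5 x 7, rank 3

U, s, Vt = np.linalg.svd(A)
r = np.sum(s > 1e-10 * s[0])      # numerical rank

col_space  = U[:, :r]             # orthonormal basis for the column space
left_null  = U[:, r:]             # ... for the left null space
row_space  = Vt[:r, :].T          # ... for the row space
null_space = Vt[r:, :].T          # ... for the null space

print(r)                                        # 3
print(np.allclose(A @ null_space, 0))           # A v = 0
print(np.allclose(left_null.T @ A, 0))          # u A = 0
print(np.allclose(col_space.T @ left_null, 0))  # the pairs are orthogonal
```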

A can be written as a sum of r rank-one matrices

Recall A = U Σ V*, i.e. A = σ_1 u_1 v_1* + σ_2 u_2 v_2* + ... + σ_r u_r v_r*. Let U_i = u_i v_i* and U_j = u_j v_j*, i ≠ j. Then every column (row) of U_i is orthogonal to every column (row) of U_j, and if we reorder U_i, U_j into single columns, they become orthogonal. Key use of the SVD: if the σ_k decay fast, the expansion can be truncated early - data compression, feature extraction.
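A short sketch (made-up low-rank-plus-noise matrix) of forming the truncated rank-one expansion; by the Eckart-Young theorem, the 2-norm truncation error equals the first omitted singular value:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((50, 8)) @ rng.standard_normal((8, 40))  # rank 8
A += 1e-3 * rng.standard_normal(A.shape)                         # plus small noise

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Sum of the first k rank-one terms sigma_i u_i v_i*.
k = 8
A_k = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(k))

# In the 2-norm, the truncation error equals the first omitted singular value.
print(np.isclose(np.linalg.norm(A - A_k, 2), s[k]))
```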

First example of data compression

Top left panel: a 256 x 256 image (matrix). Below it: its singular values. Remaining panels: the image (matrix) reconstructed keeping 20, 50, and 75 of the 256 singular values. However, .jpg etc. are more effective for image compression - they utilize more application-specific features.

Extraction of translation and rotation information

Left panel: a signature, digitized as a 2 x 262 matrix of (x, y)-coordinates at 262 sample points. Computation: form the SVD decomposition of A, A = U Σ V*, giving

U ≈ [0.9851 0.1718; 0.1718 0.9851], Σ ≈ diag(73.79, 19.52).

Right panel: principal axes and the (hyper)ellipse associated with U and Σ.

Facial image recognition

A comparison of some different biometrics: depending on the application, facial image recognition may be a reasonable compromise between reliability, acceptance, and cost.

Data set for facial image recognition

Two images from a 'training set' of around 600 images. Each image is 100 x 100 pixels, rearranged into a 10,000 x 1 vector. All the data together forms a 10,000 x 600 matrix A.

Average face

Shown: the previous two faces, the average face, and the corresponding caricatures - faces with the average subtracted.

SVD of matrix A based on caricatures

A = U Σ V*; plot of the singular values σ_k. To each singular value corresponds an 'eigenface' (e.g. the 1st, 3rd, 20th, and 40th are shown). The orthogonal eigenfaces form a very efficient basis for expressing real faces: a vast data reduction, from length-10,000 vectors to fewer than about 100 coefficients.
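A minimal sketch of this computation, with random stand-in data in place of the actual 10,000 x 600 training matrix: subtract the average face, take the reduced SVD of the caricature matrix, and the columns of U are the eigenfaces.

```python
import numpy as np

# Stand-in for the training data: 600 images of 100 x 100 = 10,000 pixels each.
rng = np.random.default_rng(4)
faces = rng.standard_normal((10_000, 600))

avg_face = faces.mean(axis=1, keepdims=True)
caricatures = faces - avg_face            # faces with the average subtracted

# Reduced SVD of the 10,000 x 600 caricature matrix.
U, s, Vt = np.linalg.svd(caricatures, full_matrices=False)

eigenfaces = U             # each column is one orthonormal 'eigenface'
print(eigenfaces.shape)    # (10000, 600)
```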

Representation of pictures not seen before

Find coefficients ξ_k such that x ≈ ξ_1 u_1 + ... + ξ_r u_r, the projection of x on the space of eigenfaces. Shown: a new data vector x and its reconstructions with 10 and 40 eigenfaces, respectively. For an image outside the training set (here a rose), the result is the best fit to the rose within the range of the training set.
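Continuing in the same spirit (again with random stand-in data and a hypothetical new image already reshaped to a 10,000-vector), the coefficients are just inner products with the eigenfaces, and a reconstruction with k eigenfaces keeps the first k terms:

```python
import numpy as np

rng = np.random.default_rng(4)
faces = rng.standard_normal((10_000, 600))          # stand-in training data
avg = faces.mean(axis=1)
U, _, _ = np.linalg.svd(faces - avg[:, None], full_matrices=False)

x = rng.standard_normal(10_000)                     # hypothetical new image as a vector
coeffs = U.T @ (x - avg)                            # projection onto the eigenfaces

for k in (10, 40):
    x_rec = avg + U[:, :k] @ coeffs[:k]             # reconstruction with k eigenfaces
    print(k, np.linalg.norm(x - x_rec))
```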

Recognizing faces

Shown: a facial image not in the training set; its coefficients, and those in the training set that match best; the identified image from the training set. Clearly the same person. Many practical issues are not discussed here, incl. corrections for perspective, lighting, changes in hair style, eye glasses, etc.

Numerical computation of the SVD factorization

A BAD way: If A = U Σ V*, then

A* A = (U Σ V*)* (U Σ V*) = V Σ* U* U Σ V* = V (Σ* Σ) V*,

since U* U is the identity. Thus (A* A) V = V (Σ* Σ), with A* A Hermitian and Σ* Σ diagonal: the columns of V are the eigenvectors of A* A, the singular values are σ_i = √λ_i, where λ_i are the eigenvalues of A* A, and U then follows directly from A = U Σ V* (i.e. A V = U Σ). The drawback: forming A* A squares the condition number, so accuracy in the small singular values is lost.
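A small sketch with a made-up matrix of widely spread singular values, showing the problem concretely: the eigenvalues of A^T A carry an absolute error on the order of eps · σ_1^2, so the small singular values come out wrong, while the direct SVD recovers them.

```python
import numpy as np

# Made-up matrix with singular values from 1 down to 1e-9.
rng = np.random.default_rng(5)
n = 6
true_s = np.logspace(0, -9, n)
Q1, _ = np.linalg.qr(rng.standard_normal((n, n)))
Q2, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q1 @ np.diag(true_s) @ Q2.T

# BAD way: eigenvalues of A^T A, then sigma_i = sqrt(lambda_i).
lam = np.linalg.eigvalsh(A.T @ A)[::-1]
s_bad = np.sqrt(np.clip(lam, 0.0, None))

# Direct SVD.
s_svd = np.linalg.svd(A, compute_uv=False)

print(s_bad)   # smallest singular values are badly wrong (A^T A has cond ~ 1e18)
print(s_svd)   # close to the prescribed values all the way down to 1e-9
```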

Numerical computation of the SVD factorization

A GOOD way (for A of size up to around 1000 x 1000). Compare with the QR algorithm for eigenvalues: phase 1 (direct, O(n^3) operations) reduces the full matrix to upper Hessenberg form; phase 2 (iterative, O(n) iterations at O(n^2) operations per iteration) reduces upper Hessenberg to upper triangular form. Now, for the SVD: phase 1 (direct, O(n^3) operations) reduces the full matrix to bidiagonal form; phase 2 (iterative, O(n) iterations at O(n) operations per iteration) reduces bidiagonal to diagonal form. Key tool in both phases in both cases: Householder transformations. For A sparse and very large: effective iterative methods are available for the leading singular values / vectors (Lanczos and subspace iteration methods).
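Phase 1 can be sketched directly; the following is a minimal, unoptimized illustration of Golub-Kahan bidiagonalization with Householder reflections (production codes use the LAPACK routines behind np.linalg.svd instead):

```python
import numpy as np

def householder(x):
    """Return v, beta so that (I - beta v v^T) x has zeros below its first entry."""
    v = x.astype(float).copy()
    norm_x = np.linalg.norm(x)
    if norm_x == 0.0:
        return v, 0.0
    v[0] += np.sign(x[0]) * norm_x if x[0] != 0 else norm_x
    return v, 2.0 / (v @ v)

def bidiagonalize(A):
    """Phase 1: reduce A (m >= n) to upper bidiagonal form B = U1^T A V1."""
    B = A.astype(float).copy()
    n = B.shape[1]
    for k in range(n):
        # Left reflection: zero out column k below the diagonal.
        v, beta = householder(B[k:, k])
        B[k:, k:] -= beta * np.outer(v, v @ B[k:, k:])
        if k < n - 2:
            # Right reflection: zero out row k to the right of the superdiagonal.
            v, beta = householder(B[k, k + 1:])
            B[k:, k + 1:] -= beta * np.outer(B[k:, k + 1:] @ v, v)
    return B

A = np.random.default_rng(6).standard_normal((9, 5))
B = bidiagonalize(A)

# The bidiagonal B has the same singular values as A (phase 2 would diagonalize B).
print(np.allclose(np.linalg.svd(A, compute_uv=False),
                  np.linalg.svd(B, compute_uv=False)))
```

Phase 2 (the iterative diagonalization of B) and the assembly of U and V are omitted here.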

Comparison of SVD vs. eigenvalues / eigenvectors

Formulas: SVD: A = U Σ V*. Eigenvalues / eigenvectors: A = X Λ X^(-1).

Properties: SVD: U, V unitary; Σ diagonal, non-negative, ordered. Eigenvalues / eigenvectors: X typically not unitary (it is unitary only if A is 'normal', i.e. A A* = A* A); Λ may contain 'Jordan boxes'.

Uses: SVD: reveals the 'character' of A and of its fundamental subspaces; gives the most general solution to A x = b, whatever the shape of A; feature extraction; data compression. Eigenvalues / eigenvectors: matrix powers A^k and functions of A; linear systems of ODEs; stability analysis.

A few references on the SVD

Trefethen, L.N. and Bau, D., Numerical Linear Algebra, SIAM (1997).
Golub, G.H. and Van Loan, C.F., Matrix Computations, The Johns Hopkins University Press (1996).
Fornberg, B. and Herbst, B., Modeling in Applied Mathematics, very preliminary manuscript, online at http://amath.colorado.edu/courses/4380/2009fall/manuscript.pdf