3D Computer Vision - WT 2004
Singular Value Decomposition
Darko Zikic
CAMP - Chair for Computer Aided Medical Procedures
November 4, 2004


Properties
For any given matrix A of size m x n there exists a decomposition A = U D V^T such that:
- U is an m x n matrix with orthogonal columns
- D is an n x n diagonal matrix with non-negative entries
- V^T is an n x n orthogonal matrix

SVD - Visualized
A = U D V^T, where
- A is an m x n matrix
- U is an m x n matrix with orthogonal columns
- D is an n x n diagonal matrix with non-negative entries
- V^T is an n x n orthogonal matrix

Properties
- The diagonal values of D are called the Singular Values of A.
- The column vectors of U are the Left Singular Vectors of A.
- The column vectors of V are the Right Singular Vectors of A.

Properties
- The SVD can be performed such that the diagonal values of D are descending, i.e. d_1 >= d_2 >= ... >= d_n >= 0. We will assume that the SVD is always performed in that way.
- The diagonal values of D are the square roots of the eigenvalues of A^T A and A A^T (hence the non-negativity of the elements of D).

Some More Properties
- For the left singular vectors u_i it holds that A A^T u_i = d_i^2 u_i.
- For the right singular vectors v_i it holds that A^T A v_i = d_i^2 v_i.
- The left singular vectors u_i are eigenvectors of A A^T.
- The right singular vectors v_i are eigenvectors of A^T A.
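
These relations are easy to verify numerically. Below is a minimal Matlab sketch (the random test matrix and all variable names are illustrative, not part of the lecture):

  % Illustrative check of the eigen-relations on a random m x n matrix
  A = randn(5, 3);
  [U, D, V] = svd(A, 'econ');          % economy-size SVD: U is 5x3, D and V are 3x3
  d = diag(D);                         % singular values, in descending order

  disp(norm(d.^2 - sort(eig(A'*A), 'descend')))   % ~0: d_i^2 are the eigenvalues of A'*A
  i = 1;                               % any index
  disp(norm(A*A'*U(:,i) - d(i)^2*U(:,i)))   % ~0: u_i is an eigenvector of A*A'
  disp(norm(A'*A*V(:,i) - d(i)^2*V(:,i)))   % ~0: v_i is an eigenvector of A'*A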

Even More Properties
The SVD explicitly constructs orthonormal bases for the null-space and the range of a matrix:
- the columns of U corresponding to non-zero elements of D span the range
- the columns of V corresponding to zero elements of D span the null-space

Properties Galore
The SVD allows a rank decision:
- rank(A) is the largest r such that d_r > 0
- there are m - r left singular vectors corresponding to the singular value 0
- there are n - r right singular vectors corresponding to the singular value 0
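
In floating-point arithmetic the "zero" singular values are only approximately zero, so the rank decision needs a tolerance. A minimal Matlab sketch; the tolerance choice here is an illustrative assumption:

  [U, D, V] = svd(A);                  % full SVD of the m x n matrix A
  d = diag(D);
  tol = max(size(A)) * eps(d(1));      % assumed tolerance for "zero" singular values
  r = sum(d > tol);                    % numerical rank of A
  range_basis = U(:, 1:r);             % orthonormal basis of the range of A
  null_basis  = V(:, r+1:end);         % orthonormal basis of the null-space of A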

Linear Minimization
The SVD can be used for linear optimization via the following property:
- Let v_n be the right singular vector corresponding to d_n (the smallest element of D).
- Among all x with ||x||_2 = 1, the norm ||Ax|| attains its minimal value for x = v_n.

Minimization by SVD
The minimizing property of the last right singular vector v_n can be used to solve the following minimization task:
- Given the linear function f = Ax, f: R^n -> R^m, to be minimized (in most applications m >> n)
- with the constraint that the solution x is not trivial (x != 0); we will assume that ||x||_2 = 1.

Minimization by SVD II
The minimization problem is thus

  minimize ||Ax||  subject to  ||x||_2 = 1

It can be shown that the solution is the right singular vector x = v_n corresponding to the smallest singular value d_n.

Proof
Problem: minimize ||Ax||_2 subject to ||x||_2 = 1.
- Because U is orthogonal, ||Ax||_2 = ||U D V^T x||_2 = ||D V^T x||_2, and because V is orthogonal, ||x||_2 = ||V^T x||_2.
- Hence we have to minimize ||D V^T x||_2 subject to ||V^T x||_2 = 1.
- Substituting y = V^T x: minimize ||Dy||_2 subject to ||y||_2 = 1.
- Since D is diagonal with descending entries, the minimum is attained for y = (0, 0, ..., 0, 1)^T.
- Since x = V y, we get x = v_n.

What's next...?
So in order to solve a linear minimization problem by SVD we have to do two things (see the sketch below):
1. State it in the form: minimize ||Ax|| subject to ||x||_2 = 1
2. Compute the SVD A = U D V^T and take the last right singular vector v_n as the solution
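
In Matlab the second step is two lines; a minimal sketch, assuming the problem has already been written as minimizing ||Ax|| with ||x||_2 = 1:

  [U, D, V] = svd(A);      % A is the matrix of the stated minimization problem
  x = V(:, end);           % last right singular vector v_n
  % x minimizes norm(A*x) over all x with norm(x) = 1 (unit norm holds by construction)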

Fitting Lines
Task: Given a set of n noisy points {p_i}, find the line l that goes through them.
We know: in homogeneous coordinates, p_i^T l = 0 if the point lies on the line.
For all points together we get

  [ p_1^T ]
  [ p_2^T ]
  [  ...  ]  l = 0
  [ p_n^T ]

Fitting Lines II
- Since the points are noisy, we can't satisfy the equation exactly.
- The best we can do is to find the minimal solution.
- Since we are not interested in the trivial solution l = 0, we set ||l||_2 = 1.

Fitting Lines III
So the problem is

  minimize ||P l||  subject to  ||l||_2 = 1,  with the rows of P being p_1^T, p_2^T, ..., p_n^T

We solve it by applying the SVD method.
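
As a concrete Matlab sketch, assuming the noisy points are given as an n x 2 array pts of inhomogeneous coordinates (the variable names are illustrative):

  n = size(pts, 1);
  P = [pts, ones(n, 1)];        % rows of P are the homogeneous points p_i^T
  [U, D, V] = svd(P);
  l = V(:, end);                % line l = (a, b, c)^T minimizing norm(P*l), with norm(l) = 1
  % the fitted line is a*x + b*y + c = 0

In practice one would usually center and scale the points before building P; the sketch skips that normalization.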

DLT Algorithm
- Computing the homography H between two images (next lecture)
- Solution by applying the SVD
- Only trick: state the problem the right way, i.e.
  - represent the homography as a vector h != 0 (||h||_2 = 1)
  - find a linear function A such that minimizing ||Ah|| does exactly what you want (a sketch follows below)
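
To make the "trick" concrete, here is a hedged Matlab sketch of one common way to build such a matrix A for the homography case; the point arrays x1, x2 and the row layout of h are assumptions for illustration, and the actual DLT algorithm follows next lecture. Each correspondence contributes two rows, so that A h = 0 holds exactly for noise-free data:

  % x1, x2: n x 2 arrays of corresponding points in image 1 and image 2 (assumed given)
  n = size(x1, 1);
  A = zeros(2*n, 9);
  for i = 1:n
      p  = [x1(i,1), x1(i,2), 1];       % point in image 1, homogeneous row vector
      xp = x2(i,1);  yp = x2(i,2);      % corresponding point in image 2
      A(2*i-1, :) = [ zeros(1,3), -p,          yp*p ];
      A(2*i,   :) = [ p,          zeros(1,3), -xp*p ];
  end
  [U, D, V] = svd(A);
  h = V(:, end);                        % minimizes norm(A*h) with norm(h) = 1
  H = reshape(h, 3, 3)';                % h stacks the rows of H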

Linear Triangulation
- Reconstructing the real-world structure from two or more images (coming soon in the lecture)
- Boiled down, that means finding the world point X by back-projecting two corresponding image points x and x'.

Notes
- Minimization by SVD is extremely simple. The tricky part is to state the problem the right way.
- The error minimized by the SVD is called the Algebraic Error or Algebraic Distance.
- A drawback of the algebraic error is that it is geometrically meaningless, so minimizing it can lead to completely meaningless results.

SVD Black Box
- The only thing left is to show how the SVD of a matrix A can be computed.
- But we won't do that. We'll use the SVD as a black box.
- The computation involves the QR procedure and Householder reduction.
- The original algorithm is by Golub and Reinsch.

Computation Properties
- The algorithm is extremely stable.
- Computation time for the SVD of an m x n matrix A:
  - computation of U, V and D: 4m^2 n + 8mn^2 + 9n^3
  - computation of V and D only: 4mn^2 + 8n^3
- Keep in mind that in most cases m >> n.

Implementation in Matlab
- Matlab has the command [U,S,V] = svd(X)
- Attention: the command returns V and not V^T
- Hence it holds that X = U*S*V'
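
A quick check of that convention on an arbitrary matrix (purely illustrative):

  X = randn(4, 3);
  [U, S, V] = svd(X);
  disp(norm(X - U*S*V'))    % ~0: the reconstruction needs the transpose of V
  disp(norm(X - U*S*V))     % not small in general: forgetting the transpose is a common mistake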

Summary
- Properties of the SVD
- Linear minimization using SVD

References
- Hartley and Zisserman. Multiple View Geometry in Computer Vision
- Walter Gander. Fitting Data by Least Squares - Algorithms and ...
- Gander and Hrebicek. Solving Problems in Scientific Computing using Maple and Matlab
- Vachenauer. Höhere Mathematik 1
- Press et al. Numerical Recipes in C
- Golub and Van Loan. Matrix Computations