CSE 252B: Computer Vision II

Lecturer: Serge Belongie
Scribes: Hamed Masnadi Shirazi, Solmaz Alipour
Department of Computer Science and Engineering, University of California, San Diego
April 12, 2004

LECTURE 5
Relationships between the Homography and the Essential Matrix

5.1. Introduction

In practice, especially when the scene is piecewise planar, we often need to compute the essential matrix E given a homography H that was estimated from four points known to be coplanar. The terminology is that H is induced by a plane P. In other cases, the essential matrix E may already have been estimated from points in general position, and we then want to compute the homography for a particular set of coplanar points. There are special relationships between H and E that let us go back and forth between them easily.

5.2. Special Properties between H and E

Recall the definitions of the essential matrix and of a plane-induced homography:

    E = \hat{T} R,        H = R + (1/d) T N^T,

where \hat{T} is the skew-symmetric matrix satisfying \hat{T} x = T × x, N is the unit normal of the plane P, and d is the distance from P to the first camera center.
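To make the definitions concrete, here is a minimal numpy sketch (the helper name hat and the synthetic scene values are my own, not from the lecture) that builds E and H from a ground-truth rotation R, unit translation T, plane normal N, and plane distance d:

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix such that hat(w) @ x == np.cross(w, x)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

# Synthetic rigid motion and scene plane (illustrative values only).
theta = 0.1
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])        # rotation about the optical axis
T = np.array([1.0, 0.2, 0.1])
T = T / np.linalg.norm(T)              # enforce ||T|| = 1
N = np.array([0.0, 0.0, 1.0])          # unit normal of the plane P
d = 5.0                                # distance from P to the first camera

E = hat(T) @ R                         # essential matrix
H = R + np.outer(T, N) / d             # homography induced by the plane P
```

With these in hand one can verify numerically that hat(T) @ H equals E and that H.T @ E + E.T @ H vanishes (up to floating-point error); these are properties (a) and (b) derived next.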

Without loss of generality we can write

    H = R + T u^T,

where R ∈ R^{3×3} is a general nonsingular 3×3 matrix, not necessarily a rotation matrix, so the following results apply to both the calibrated and the uncalibrated case. Also assume T, u ∈ R^3 and ||T|| = 1. For the assumed H and E, we have:

(a) E = \hat{T} H
(b) H^T E + E^T H = 0
(c) H = \hat{T}^T E + T v^T for some v ∈ R^3.

The proofs for each item are as follows.

(a)
    \hat{T} H = \hat{T}(R + T u^T)
              = \hat{T} R + \hat{T} T u^T        (note: \hat{T} T = 0)
              = \hat{T} R
              = E

(b)
    H^T E = (R + T u^T)^T \hat{T} R
          = R^T \hat{T} R + u T^T \hat{T} R      (again, T^T \hat{T} = 0)
          = R^T \hat{T} R

    E^T H = (H^T E)^T = (R^T \hat{T} R)^T = -R^T \hat{T} R        (\hat{T} is skew-symmetric)

The above shows that H^T E = R^T \hat{T} R is a skew-symmetric matrix, and hence H^T E + E^T H = 0. Alternatively, this can be shown as follows. Start with the epipolar constraint x_2^T E x_1 = 0. Substituting x_2 = H x_1, we get x_1^T H^T E x_1 = 0 for any x_1, which means H^T E is skew-symmetric. (See MaSKS Exercise 2.5.)

(c) Notice that

    \hat{T} H = \hat{T} R
              = \hat{T} \hat{T}^T \hat{T} R      (when ||T|| = 1, \hat{T} \hat{T}^T \hat{T} = \hat{T})
              = \hat{T} \hat{T}^T E              (since \hat{T} R = E)

(Recall that \hat{T} is rank deficient, so \hat{T} \hat{T}^T ≠ I.) Therefore \hat{T}(H - \hat{T}^T E) = 0. That is, all the columns of H - \hat{T}^T E are parallel to T, and hence

    H - \hat{T}^T E = T v^T

for some v ∈ R^3 (a rank-one outer product of the column vector T with the row vector v^T), i.e.,

    H = \hat{T}^T E + T v^T.

Since R ∈ R^{3×3} is not necessarily in SO(3), these results hold for both E and F, where F = K^{-T} E K^{-1} is the (uncalibrated) fundamental matrix.

5.3. From Homography to the Essential Matrix

Given the homography H, our task is to find E using two points p_1 and p_2 that do not lie on the plane P from which H was induced. We denote the image of p_j in view i = 1, 2 by x_i^j. This is illustrated in Figure 1. From item (a) we have E = \hat{T} H, where T ∼ l_2^1 × l_2^2 and ||T|| = 1. We know that

    l_2^i ∼ x_2^i × (H x_1^i),        i = 1, 2.

This expresses each epipolar line l_2^i as the cross product of two points lying on it: H x_1^1 with x_2^1, and H x_1^2 with x_2^2, respectively. We also know that l_2^1 × l_2^2 is the intersection of the two epipolar lines, and all epipolar lines intersect at the epipole. All that remains is to realize that the epipole equals T up to a scale factor:

    e_2 ∼ l_2^1 × l_2^2 ∼ T.

The above method can be called the 4 + 2 point algorithm, since it uses 4 points on a plane and 2 points off the plane. Let us look at an example. In Figure 2, the homography is applied to the left image and the result is averaged with the right image to produce the third image: the coplanar points on the paper are transferred correctly, but points not on the paper surface, such as the mug, form a ghosted image. Points farther from the plane exhibit more disparity along the epipolar lines. This is known as plane-induced parallax.
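A minimal numpy sketch of the 4 + 2 point algorithm just described (the function and variable names are my own; x1_off and x2_off are assumed to hold the two off-plane correspondences as homogeneous column vectors):

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix such that hat(w) @ x == np.cross(w, x)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def essential_from_homography(H, x1_off, x2_off):
    """4 + 2 point algorithm: recover E from a plane-induced homography H and
    two correspondences (columns of x1_off, x2_off, homogeneous 3-vectors)
    whose 3-D points do not lie on the inducing plane."""
    # Epipolar lines in view 2: l_2^i ~ x_2^i x (H x_1^i).
    l1 = np.cross(x2_off[:, 0], H @ x1_off[:, 0])
    l2 = np.cross(x2_off[:, 1], H @ x1_off[:, 1])
    # Their intersection is the epipole e_2 ~ T.
    T = np.cross(l1, l2)
    T = T / np.linalg.norm(T)          # enforce ||T|| = 1 (sign stays ambiguous)
    return hat(T) @ H                  # E = hat(T) H, up to scale and sign
```

As with any epipolar quantity, the recovered T and E are defined only up to scale (and a sign), so comparisons against a ground-truth E should be made after normalization.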

Figure 1. Figure from Ch. 5 of MaSKS. The original caption reads: "A homography H transfers two points x_1^1 and x_1^2 in the first image to two points H x_1^1 and H x_1^2 on the same epipolar lines as the respective true images x_2^1 and x_2^2 if the corresponding 3-D points p_1 and p_2 are not on the plane P from which H is induced."

5.4. From the Essential Matrix to a Homography

Now consider the opposite situation, where the essential matrix E and three point correspondences (x_1^i, x_2^i), i = 1, 2, 3, are given and we want to compute the homography induced by the plane specified by the three points. By item (c) it has the form

    H = \hat{T}^T E + T v^T,

where T can be found from E as previously shown, and v ∈ R^3 solves the system of equations

    x_2^i × ((\hat{T}^T E + T v^T) x_1^i) = 0,        i = 1, 2, 3.

Notice that this is just the point-transfer condition x_2^i × (H x_1^i) = 0, except that H is constrained to have the given form. The method for solving for v is left as an exercise (one possible approach is sketched below). While one can find an H for any four image points (using the four-point algorithm), one can only guarantee that H maps a point to its epipolar line if it has the form \hat{T}^T E + T v^T. Such an H is said to be consistent or compatible with E. We will use this method for epipolar (stereo) rectification.
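The lecture leaves solving for v as an exercise; one possible least-squares sketch (my own naming and formulation, not from the notes) stacks the three vector equations, each of which is linear in v:

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix such that hat(w) @ x == np.cross(w, x)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def homography_from_essential(E, T, x1, x2):
    """Compatible homography H = hat(T)^T E + T v^T from E, T (||T|| = 1), and
    three or more correspondences given as columns of x1, x2 (homogeneous 3-vectors).
    The constraint hat(x2^i) (hat(T)^T E + T v^T) x1^i = 0 is linear in v."""
    A_blocks, b_blocks = [], []
    for i in range(x1.shape[1]):
        X2 = hat(x2[:, i])
        # hat(x2^i) T (x1^i . v) = -hat(x2^i) hat(T)^T E x1^i
        A_blocks.append(np.outer(X2 @ T, x1[:, i]))     # 3x3 coefficient block for v
        b_blocks.append(-X2 @ hat(T).T @ E @ x1[:, i])  # 3-vector right-hand side
    A = np.vstack(A_blocks)
    b = np.concatenate(b_blocks)
    v, *_ = np.linalg.lstsq(A, b, rcond=None)           # least-squares solution for v
    return hat(T).T @ E + np.outer(T, v)
```

The same routine run with more than three correspondences is the least-squares version referred to in step (c) of the rectification algorithm below.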

Figure 2. Example of the 4 + 2 point algorithm. (Figure from Hartley & Zisserman.)

5.5. Epipolar Rectification

In a stereo rectification problem we have two views of a scene. To simplify the search for corresponding points, it is desirable to apply projective transformations to both images so that all epipolar lines correspond to horizontal scan lines. This entails finding two linear transformations, say H_1 and H_2, that map the epipoles to infinity. MaSKS Algorithm 11.9 shows how to do epipolar rectification.

(a) Compute E (or F) and e_2. The epipoles can be found as the left and right singular vectors of E associated with its zero singular value.

(b) Map e_2 to infinity using H_2, making the epipolar lines parallel (Hartley, 1997). There is a family of H_2's that will do this, parametrized by v ∈ R^3:

    H = \hat{T}^T E + T v^T.

Which one should we choose? Find the H_2 such that

    H_2 e_2 ∼ [1  0  0]^T

and where H_2 is as close as possible to a rigid body transformation (a numerical sketch of this construction is given after the algorithm). So first define G_T ∈ R^{3×3} as

          | 1  0  -o_x |
    G_T = | 0  1  -o_y |
          | 0  0   1   |

which translates the image center (o_x, o_y) to the origin. Now choose G_R ∈ SO(3), a rotation about the Z-axis that puts the translated epipole onto the x-axis:

    G_R G_T e_2 = [x_e  0  1]^T.

Finally pick G ∈ R^{3×3} defined as

        |   1     0  0 |
    G = |   0     1  0 |
        | -1/x_e  0  1 |

which sends the epipole to infinity. Choosing H_2 = G G_R G_T ∈ R^{3×3} completes the rectification of the second view.

(c) Find an H compatible with E (or F). Here is where we use the method of Section 5.4 for finding H from E. Use the least-squares version of H = \hat{T}^T E + T v^T over multiple points, and choose the H that minimizes the distortion induced by the rectification transformation.

(d) Compute the matching homography for the first view: H_1 = H_2 H.

(e) Apply H_1 and H_2 to the left and right images, respectively.

Figure 3. Example of epipolar rectification.

Figure 3 shows the original and rectified versions of two images; corresponding points in the two images lie on the same horizontal scan lines. The above algorithm will not work if the camera translates forward, toward the scene, because in that case the epipole lies inside the image. Pollefeys has developed rectification methods that handle that case.
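For completeness, here is a minimal numpy sketch of the H_2 construction in step (b) (the function name, the image-center arguments, and the normalization of the epipole are my own assumptions; it assumes the epipole is finite):

```python
import numpy as np

def rectifying_homography(e2, ox, oy):
    """Sketch of step (b): H_2 = G @ G_R @ G_T maps the (finite) epipole e2 of the
    second view, given as a homogeneous 3-vector, to infinity along the x-axis.
    (ox, oy) is the image center."""
    # G_T translates the image center to the origin.
    G_T = np.array([[1.0, 0.0, -ox],
                    [0.0, 1.0, -oy],
                    [0.0, 0.0, 1.0]])
    e = G_T @ e2
    e = e / e[2]                                  # normalize homogeneous scale
    # G_R rotates about the Z-axis so the translated epipole lands on the x-axis.
    angle = -np.arctan2(e[1], e[0])
    G_R = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                    [np.sin(angle),  np.cos(angle), 0.0],
                    [0.0, 0.0, 1.0]])
    x_e = (G_R @ e)[0]
    # G sends [x_e, 0, 1]^T to the point at infinity [x_e, 0, 0]^T.
    G = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [-1.0 / x_e, 0.0, 1.0]])
    return G @ G_R @ G_T
```

H_1 then follows from step (d) as H_2 @ H, with H the compatible homography obtained from the earlier least-squares sketch.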