Segmentation of Subspace Arrangements III: Robust GPCA
Berkeley CS 294-6, Lecture 25, Dec. 3, 2006

Generalized Principal Component Analysis (GPCA): an overview

Example: two subspaces in R^3, the plane V_1 = {x : x_3 = 0} and the line V_2 = {x : x_1 = x_2 = 0}. A point x lies on V_1 ∪ V_2 iff (x_3 = 0) or (x_1 = x_2 = 0), i.e., iff it satisfies both vanishing polynomials {x_1 x_3 = 0, x_2 x_3 = 0}.

Veronese map: given N samples x^1, ..., x^N ∈ R^3, embed each sample by
    ν_2(x) = ((x_1)^2, x_1 x_2, x_1 x_3, (x_2)^2, x_2 x_3, (x_3)^2)^T
and stack the embedded data matrix L_2 = [ν_2(x^1), ..., ν_2(x^N)] ∈ R^{M_2^[3] × N}.

The null space of L_2^T gives the coefficients of the vanishing polynomials:
    c_1 = [0, 0, 1, 0, 0, 0]^T  ⇒  p_1(x) = c_1^T ν_2(x) = x_1 x_3,
    c_2 = [0, 0, 0, 0, 1, 0]^T  ⇒  p_2(x) = c_2^T ν_2(x) = x_2 x_3.

Collect P(x) = [p_1(x), p_2(x)] = [x_1 x_3, x_2 x_3]; then

    ∇P = [∇p_1  ∇p_2] = [ x_3   0  ]
                        [  0   x_3 ]
                        [ x_1  x_2 ].

Evaluating ∇P at one sample per subspace gives normal vectors that span V_1^⊥ and V_2^⊥, so we can segment the samples and recover V_1 and V_2:
    x = [a, b, 0]^T ∈ V_1:  ∇P|_x = [0 0; 0 0; a b], whose columns span V_1^⊥;
    y = [0, 0, c]^T ∈ V_2:  ∇P|_y = [c 0; 0 c; 0 0], whose columns span V_2^⊥.
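A minimal numerical sketch of this example (assuming numpy; not the lecture's code): fit the degree-2 vanishing polynomials from sampled data, then read off each subspace's normal space from the gradient ∇P.

```python
# Toy GPCA: samples from the plane x3 = 0 and the line x1 = x2 = 0 in R^3.
import numpy as np

def veronese2(X):
    """Degree-2 Veronese map of the columns of X (3 x N) -> (6 x N)."""
    x1, x2, x3 = X
    return np.stack([x1*x1, x1*x2, x1*x3, x2*x2, x2*x3, x3*x3])

rng = np.random.default_rng(0)
plane = np.vstack([rng.standard_normal((2, 50)), np.zeros((1, 50))])  # V1: x3 = 0
line = np.vstack([np.zeros((2, 50)), rng.standard_normal((1, 50))])   # V2: x1 = x2 = 0
X = np.hstack([plane, line])

# Null space of L2^T -> coefficient vectors c of the vanishing polynomials.
L2 = veronese2(X)                        # 6 x N
_, S, Vt = np.linalg.svd(L2.T, full_matrices=True)
C = Vt[np.sum(S > 1e-8):].T              # 6 x 2, here spans {x1*x3, x2*x3}

def grad_P(x, C):
    """Gradients of the vanishing polynomials p_j(x) = c_j^T nu_2(x), as columns."""
    x1, x2, x3 = x
    # Jacobian of the Veronese map nu_2 at x (6 x 3).
    J = np.array([[2*x1, 0, 0], [x2, x1, 0], [x3, 0, x1],
                  [0, 2*x2, 0], [0, x3, x2], [0, 0, 2*x3]])
    return J.T @ C                       # 3 x (#polynomials)

# One sample per subspace: the gradient columns span the normal space.
for x in (np.array([1.0, 2.0, 0.0]), np.array([0.0, 0.0, 3.0])):
    G = grad_P(x, C)
    rank = np.linalg.matrix_rank(G, tol=1e-8)
    print(f"x = {x}: normal space has dimension {rank}")  # 1 for V1, 2 for V2
```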

Outline
1. Multibody Epipolar Constraint: Multibody Fundamental Matrix
2. GPCA-Voting: Noise Issue; GPCA-Voting; Comparison
3. Robust GPCA: Outlier Issue; Robustifying GPCA via Influence and MVT; Comparison
4. Applications: Affine Motion Detection; Vanishing-Point Detection
5. Conclusion and Discussion: Conclusion; Future Directions

Multibody Fundamental Matrix

Given K rigid bodies with fundamental matrices F_1, ..., F_K, an image correspondence (x_1, x_2) satisfies the multibody epipolar constraint:
    f(x_1, x_2) = (x_2^T F_1 x_1)(x_2^T F_2 x_1) ··· (x_2^T F_K x_1) = 0.

Using the Kronecker product, x_2^T F_i x_1 = (x_1 ⊗ x_2)^T F_i^s, so
    f(x_1, x_2) = ((x_1 ⊗ x_2)^T F_1^s) ··· ((x_1 ⊗ x_2)^T F_K^s).
Treating z = x_1 ⊗ x_2 ∈ R^9 as the new feature vector then gives f(x_1, x_2) = ν_K(z)^T c.

A second way to rewrite the bilinear constraint: for fixed x_2, f(x_1, x_2) is linear in ν_K(x_1), and for fixed x_1 it is linear in ν_K(x_2):
    f(x_1, x_2) = f_{x_2}(x_1) = c_{x_2}^T ν_K(x_1);   f(x_1, x_2) = f_{x_1}(x_2) = c_{x_1}^T ν_K(x_2).
Hence f(x_1, x_2) can be rewritten by applying the Veronese map to each view individually:
    f(x_1, x_2) = ν_K(x_2)^T F ν_K(x_1),  where F ∈ R^{M_K^[3] × M_K^[3]}.

Which representation is more compact? (See the size comparison below.)
1. K = 2: for c, M_2^[9] = C(2+9−1, 2) = 45; for F, (M_2^[3])^2 = 6^2 = 36.
2. K = 4: for c, M_4^[9] = 495; for F, (M_4^[3])^2 = 15^2 = 225.

Choose the second one. F is called the multibody fundamental matrix, by analogy with the (single-body) fundamental matrix F.
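A small sketch that makes the comparison concrete (assuming Python's math.comb): compute M_n^[D] = C(n + D − 1, n) and the size of each representation.

```python
# Compare sizes of the two representations of the multibody epipolar
# constraint: a single Veronese embedding of z = x1 (x) x2 in R^9 versus the
# multibody fundamental matrix F acting on per-view embeddings of R^3.
from math import comb

def M(D, n):
    """Number of degree-n monomials in D variables: C(n + D - 1, n)."""
    return comb(n + D - 1, n)

for K in (2, 3, 4):
    size_c = M(9, K)          # coefficients c of nu_K(x1 (x) x2)
    size_F = M(3, K) ** 2     # entries of F in R^{M_K^[3] x M_K^[3]}
    print(f"K={K}: |c| = {size_c}, |F| = {size_F}")
# K=2: |c| = 45,  |F| = 36
# K=4: |c| = 495, |F| = 225
```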

Segmentation of Multibody Motion

The multibody fundamental matrix F is solved for linearly from the vanishing polynomial:
    f(x_1, x_2) = ν_K(x_2)^T F ν_K(x_1) = (ν_K(x_1) ⊗ ν_K(x_2))^T F^s.

Differentiating with respect to x_2:
    ∇_{x_2} f(x_1, x_2) = Σ_{i=1}^K ( Π_{j≠i} x_2^T F_j x_1 ) (F_i x_1).

Suppose (x_1, x_2) lies on object k; then Π_{j≠i} (x_2^T F_j x_1) = 0 for every i ≠ k, so
    ∇_{x_2} f(x_1, x_2) = ( Π_{j≠k} x_2^T F_j x_1 ) (F_k x_1) ∝ F_k x_1 = l_k,
the epipolar line of x_1 in the second view. Hence ∇_{x_2} f(x_1, x_2) ∝ l_k, and each such epipolar line passes through the corresponding epipole: e_k^T l_k = 0.

Given K rigid-body motions, there exist K different epipoles e_1, ..., e_K in the second view such that every epipolar line l satisfies
    (e_1^T l)(e_2^T l) ··· (e_K^T l) = 0,
which is a standard subspace-segmentation problem: segment the lines into the K hyperplanes with normals e_1, ..., e_K.
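A quick numerical check of the gradient identity above (a sketch assuming numpy, with random rank-2 matrices standing in for calibrated fundamental matrices; not the lecture's code):

```python
# For a correspondence (x1, x2) obeying the k-th motion, the gradient of the
# multibody epipolar constraint w.r.t. x2 is parallel to the epipolar line F_k x1.
import numpy as np

rng = np.random.default_rng(1)

def random_fundamental():
    """A random rank-2 3x3 matrix, standing in for a fundamental matrix."""
    A = rng.standard_normal((3, 3))
    U, S, Vt = np.linalg.svd(A)
    return U @ np.diag([S[0], S[1], 0.0]) @ Vt

F = [random_fundamental() for _ in range(3)]   # K = 3 motions
k = 1
x1 = rng.standard_normal(3)
l = F[k] @ x1                                  # epipolar line of x1, motion k
x2 = np.cross(l, rng.standard_normal(3))       # any x2 with x2 . l = 0

# grad_{x2} f = sum_i (prod_{j != i} x2^T F_j x1) * F_i x1; only i = k survives.
eps = [x2 @ Fi @ x1 for Fi in F]
grad = sum(np.prod([eps[j] for j in range(3) if j != i]) * (F[i] @ x1)
           for i in range(3))

cos = grad @ l / (np.linalg.norm(grad) * np.linalg.norm(l))
print(f"|cos(grad, F_k x1)| = {abs(cos):.6f}")  # ~1: gradient parallel to l_k
```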

GPCA-Voting: A Stable Implementation (PDA on noisy data)

[Diagram: the GPCA pipeline — samples from V_1 ∪ V_2 ⊂ R^D are embedded by ν_n(x) into R^{M_n^[D]}; Null(L_n) yields the vanishing polynomials p(x) = c^T ν_n(x); their derivatives recover V_1 and V_2.]

Noise affects the algebraic PDA process in three ways:
1. The data matrix L_K is always full rank. Solution: use SVD to estimate Null(L_K) (see the sketch below).
2. How do we choose one point per subspace as the representative? Solution: the rule of thumb is to pick samples far away from the origin and from the intersections of the subspaces.
3. Even with a good sample, the evaluation of ∇P is still perturbed away from its true value. Solution: we propose a voting algorithm that evaluates ∇P at all samples.
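A short sketch of issue 1 (assuming numpy and the toy arrangement from the earlier sketch): with noise, the embedded data matrix becomes full rank, so Null(L_n) must be taken from the smallest singular vectors.

```python
import numpy as np

def veronese2(X):
    x1, x2, x3 = X
    return np.stack([x1*x1, x1*x2, x1*x3, x2*x2, x2*x3, x3*x3])

rng = np.random.default_rng(2)
plane = np.vstack([rng.standard_normal((2, 100)), np.zeros((1, 100))])
line = np.vstack([np.zeros((2, 100)), rng.standard_normal((1, 100))])
X = np.hstack([plane, line]) + 0.02 * rng.standard_normal((3, 200))  # add noise

S = np.linalg.svd(veronese2(X).T, compute_uv=False)
print(np.round(S, 3))   # all 6 singular values nonzero: L_2 is full rank
# Knowing the number of vanishing polynomials (here 2), take the 2 right
# singular vectors with smallest singular values as the estimate of Null(L_2).
```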

A Voting Scheme

Goal:
1. Average ∇P over more samples of the same subspace.
2. Recover the correct rank of ∇P.
Difficulty: we do not yet know which samples belong to the same subspace.

GPCA-Voting (a simple example; a sketch follows below):
1. Assume subspaces of dimensions (2, 1, 1) in R^3.
2. h_I(3) = 4 vanishing polynomials, so ∇P ∈ R^{3×4} at each sample.
3. Vote on the rank-1 and rank-2 codimensions with a tolerance threshold τ.
4. Average the normal vectors associated with the highest votes.
5. (Optional) Iteratively refine the segmentation via EM or K-Subspaces.
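A much-simplified sketch of the voting idea (assuming numpy and a grad_P function returning the 3 × m gradient matrix at a sample, as in the earlier sketch; not the lecture's implementation). Each sample proposes a normal space; proposals within angle τ of an existing candidate vote for it.

```python
import numpy as np

def normal_space(G, codim):
    """Orthonormal basis of the leading codim-dimensional column space of G."""
    U, _, _ = np.linalg.svd(G)
    return U[:, :codim]

def max_principal_angle(B1, B2):
    """Largest principal angle between subspaces with orthonormal bases B1, B2."""
    s = np.clip(np.linalg.svd(B1.T @ B2, compute_uv=False), -1.0, 1.0)
    return np.arccos(s.min())

def vote(samples, grad_P, codim, tau=0.3):
    candidates = []                      # list of [basis, votes]
    for x in samples:
        B = normal_space(grad_P(x), codim)
        for cand in candidates:
            if max_principal_angle(cand[0], B) < tau:
                cand[1] += 1             # vote for the existing candidate
                break
        else:
            candidates.append([B, 1])    # open a new candidate
    candidates.sort(key=lambda c: -c[1])
    return candidates                    # highest-vote normal spaces come first
```

For the (2, 1, 1) example, one would run the vote once with codim = 1 (for the plane) and once with codim = 2 (for the lines), then keep and average the bases with the highest vote counts.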

Simulation Results

1. Illustrations
[Figures: segmentation of the arrangements (2, 1, 1) in R^3 and (2, 2, 1) in R^3 at noise levels (a) 8%, (b) 12%, (c) 16%.]

2. Segmentation simulations

Table: Segmentation errors with 4% Gaussian noise added.

Subspace Dimensions   | EM   | K-Subspaces | PDA   | Voting | Voting+K-Subspaces
(2, 2, 1) in R^3      | 29%  | 27%         | 13.2% | 6.4%   | 5.4%
(4, 2, 2, 1) in R^5   | 53%  | 57%         | 39.8% | 5.7%   | 5.7%
(4, 4, 4, 4) in R^5   | 20%  | 25%         | 25.3% | 17%    | 11%

Outlier Issue

[Diagram: the GPCA pipeline again, with Rank(L_n) = M_n^[D] − h_I(n) and Null(L_n) yielding p(x) = c^T ν_n(x); an outlier corrupts the embedded data and hence the estimated null space.]

The breakdown point of GPCA is 0% because the breakdown point of PCA is 0%: a single large outlier can arbitrarily perturb Null(L_n) (see the sketch below).

We therefore seek a robust PCA to estimate Null(L_n), where L_n = [ν_n(x_1), ..., ν_n(x_N)].
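A sketch of the breakdown claim (assuming numpy, plus the veronese2 function and the noisy samples X from the earlier sketch): a single large outlier rotates the estimated Null(L_2) far from the true vanishing polynomials.

```python
import numpy as np

def null_basis(X, dim=2):
    """Estimate Null(L_2) as the `dim` smallest right singular vectors."""
    _, _, Vt = np.linalg.svd(veronese2(X).T)
    return Vt[-dim:].T                   # 6 x dim

C_clean = null_basis(X)
outlier = np.array([[100.0], [100.0], [100.0]])
C_bad = null_basis(np.hstack([X, outlier]))  # one outlier appended

# Largest principal angle between the two null-space estimates.
s = np.clip(np.linalg.svd(C_clean.T @ C_bad, compute_uv=False), -1.0, 1.0)
print(f"max principal angle: {np.degrees(np.arccos(s.min())):.1f} deg")
```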

Three approaches to tackle outliers:

1. Probability-based: reject small-probability samples.
   - Probability plots: [Healy 1968, Cox 1968]
   - PCs: [Rao 1964, Gnanadesikan & Kettenring 1972]
   - M-estimators: [Huber 1981, Campbell 1980]
   - Multivariate trimming (MVT): [Gnanadesikan & Kettenring 1972]
2. Influence-based: reject samples with large influence on the model parameters.
   - Parameter difference with and without a sample: [Hampel et al. 1986, Critchley 1985]
3. Consensus-based: reject samples not consistent with models of high consensus.
   - Hough: [Ballard 1981, Lowe 1999]
   - RANSAC: [Fischler & Bolles 1981, Torr 1997]
   - Least Median Estimate (LME): [Rousseeuw 1984, Stewart 1999]

Robust GPCA

STEP 1: Given the outlier percentage α%, robustify PCA.

Influence function:
1. Compute the null space C = {c_1, c_2, ..., c_m} of L_n = [ν_n(x_1), ..., ν_n(x_N)].
2. For each sample x_i, compute the null space C^(i) of L_n^(i) = [ν_n(x_1), ..., ν_n(x_N)] with the ith column left out.
3. Define the influence I(x_i) ≐ ∠(C, C^(i)), the angle between the two null spaces.
4. Reject the top α% of samples with the highest influence.

Multivariate trimming (MVT): assuming a Gaussian distribution, samples with large Mahalanobis distance are more likely to be outliers (a sketch of the loop follows below).
1. Compute a robust mean ū of the embedded samples, and set v_i = u_i − ū, where u_i, v_i ∈ R^{M_n^[D]}.
2. Initialize Σ_0 = I ∈ R^{M_n^[D] × M_n^[D]}.
3. In the kth iteration, sort v_1, ..., v_N by the Mahalanobis distance d_i = v_i^T Σ_{k−1}^{−1} v_i.
4. Update Σ_k from the (100 − α)% of samples with the smallest distances.
5. Stop when ‖Σ_{k−1} − Σ_k‖ is small.
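A minimal sketch of the MVT loop above (assuming numpy; U holds the embedded samples ν_n(x_i) as rows, and the coordinate-wise median stands in for the robust mean):

```python
import numpy as np

def mvt(U, alpha, max_iter=50, tol=1e-6):
    """Multivariate trimming: robust covariance of the rows of U (N x M)."""
    N, M = U.shape
    keep = int(round(N * (1 - alpha)))       # samples kept per iteration
    v = U - np.median(U, axis=0)             # robust center (assumption: median)
    cov = np.eye(M)                          # Sigma_0 = I
    for _ in range(max_iter):
        # Mahalanobis distances d_i = v_i^T Sigma^{-1} v_i for all rows at once.
        d = np.einsum('ij,jk,ik->i', v, np.linalg.pinv(cov), v)
        inliers = np.argsort(d)[:keep]       # (100 - alpha)% smallest distances
        new_cov = np.cov(v[inliers], rowvar=False)
        if np.linalg.norm(new_cov - cov) < tol:
            break                            # Sigma has stabilized
        cov = new_cov
    return cov, inliers
```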

STEP 2: Estimating the outlier percentage α%.

Do we need the exact percentage for estimation?
[Figures: (a) data with 16% outliers; (b) 7% rejected; (c) 38% rejected; (d) maximal sample residuals.]

1. With both noise and outliers present, it is unnecessary to distinguish outliers that lie close to the subspaces.
2. Robust PCA is moderately stable when the outlier percentage is over-estimated.

Outlier Percentage Test (based on the influence function).
Principle: once the true percentage is exceeded, further rejection only results in small changes in the model parameters and the sample residuals (with respect to a boundary threshold σ), i.e., the arrangement model stabilizes. A sketch of this test follows below.
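A sketch of the percentage test (hypothetical helpers: fit_model estimates the arrangement with α% of the samples rejected, and max_residual returns the largest inlier residual; both are assumptions, not the lecture's API):

```python
def estimate_outlier_percentage(X, fit_model, max_residual, sigma,
                                alphas=range(0, 52, 2)):
    """Sweep alpha upward; stop when the maximal residual stabilizes below sigma."""
    prev = None
    for alpha in alphas:
        model = fit_model(X, alpha / 100.0)
        r = max_residual(X, model, alpha / 100.0)
        # The model has stabilized once further rejection barely changes the
        # maximal residual and it has dropped below the boundary threshold.
        if prev is not None and abs(prev - r) < 0.05 * sigma and r < sigma:
            return alpha, model
        prev = r
    return alphas[-1], model
```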

Simulations on Robust GPCA (parameters fixed at τ = 0.3 rad and σ = 0.4)

[Figures: segmentation results of RGPCA-Influence and RGPCA-MVT at outlier percentages 12%, 32%, and 48%.]

Comparison with RANSAC

Accuracy
[Figures: segmentation accuracy for the arrangements (q) (2, 2, 1) in R^3, (r) (4, 2, 2, 1) in R^5, (s) (5, 5, 5) in R^6.]

Speed

Table: Average running time of RANSAC and RGPCA with 24% outliers.

Method    | (2,2,1) in R^3 | (4,2,2,1) in R^5 | (5,5,5) in R^6
RANSAC    | 44 s           | 5.1 min          | 3.4 min
MVT       | 46 s           | 23 min           | 8 min
Influence | 3 min          | 58 min           | 146 min

Limitations of RGPCA
1. Hardware limits for high subspace dimensions (> 10) or numbers of subspaces (> 6) in MATLAB.
2. The number of subspaces and their dimensions must be known in advance.
3. Overfitting when the outlier percentage is overestimated, especially for MVT.

[Animation]

In the next section, we show solutions to these limitations via a new lossy coding framework.

Experiment 1: Motion Segmentation under 3-D Affine Projection

[Figures: segmentation results by RANSAC, MVT, and Influence on the sequences parking-lot, segway, toys, man, and segway3.]

Feature extraction: Shi and Tomasi. Good features to track. CVPR 1994.

Experiment 2: Vanishing-Point Detection

RGPCA-Influence
[Figures: input images, line segments, and vanishing points detected by Influence.]

Feature extraction: Kahn et al. A fast line finder for vision-guided robot navigation. PAMI, 1990.

RGPCA-MVT
[Figures: input images, line segments, and vanishing points detected by MVT.]

RANSAC-on-Subspaces
[Figures: input images, line segments, and vanishing points detected by RANSAC.]

Conclusions

Estimation of hybrid subspace models is closely related to the study of subspace arrangements in algebraic geometry: the global structure of K subspaces is uniquely determined by the degree-K vanishing polynomials.

Two algorithms were proposed using the vanishing polynomials as a global signature:
1. Noise: GPCA-Voting.
2. Outliers: RGPCA.

Confluence of Algebra and Statistics: in the estimation of hybrid subspace models, algebra makes statistical algorithms well-conditioned, and statistics makes algebraic algorithms robust.

Future Directions

Mathematics: more complex models [Rao et al. ICCV 2005].
1. A union of quadratic surfaces.
2. A mixture of linear subspaces and quadratic surfaces.

Kernel GPCA: tackle the curse of dimensionality in the Veronese embedding.
[Diagram: the GPCA pipeline — ν_n embeds the data into R^{M_n^[D]}, with Rank(L_n) = M_n^[D] − h_I(n), and Null(L_n) yields p(x) = c^T ν_n(x).]

Model selection: a unified scheme for simultaneous model estimation and selection.

Applications:
1. Natural image compression and classification.
2. Hyperspectral image analysis.