Scalable Subspace Clustering

1 Scalable Subspace Clustering René Vidal Center for Imaging Science, Laboratory for Computational Sensing and Robotics, Institute for Computational Medicine, Department of Biomedical Engineering, Johns Hopkins University

2 High-Dimensional Data In many areas we deal with high-dimensional data: computer vision, medical imaging, medical robotics, signal processing, bioinformatics.

3 Low-Dimensional Manifolds Examples of data lying near low-dimensional manifolds: face clustering and classification, lossy image representation, motion segmentation, dynamic texture (DT) segmentation, video segmentation.

4 Subspace Clustering Problem Given a set of points lying in multiple subspaces, identify: the number of subspaces and their dimensions, a basis for each subspace, and the segmentation of the data points. Challenges: model selection, nonconvexity, combinatorial complexity. More challenges: noise, outliers, missing entries.
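
To make this setup concrete, here is a minimal Python sketch (function and parameter names are illustrative, not from the talk) that samples points from a union of random low-dimensional subspaces, the synthetic setting the theory below refers to.

```python
import numpy as np

def union_of_subspaces(D=100, d=5, n_subspaces=3, points_per_subspace=200,
                       noise=0.0, seed=0):
    """Sample unit-norm points from a union of random d-dimensional
    subspaces of R^D; returns the D x N data matrix and ground-truth labels."""
    rng = np.random.default_rng(seed)
    X, labels = [], []
    for s in range(n_subspaces):
        U, _ = np.linalg.qr(rng.standard_normal((D, d)))   # random orthonormal basis
        Y = U @ rng.standard_normal((d, points_per_subspace))
        Y += noise * rng.standard_normal(Y.shape)          # optional additive noise
        Y /= np.linalg.norm(Y, axis=0, keepdims=True)      # normalize each column
        X.append(Y)
        labels += [s] * points_per_subspace
    return np.hstack(X), np.array(labels)
```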

5 Subspace Clustering Problem: Challenges Even more challenges: angles between subspaces are small, and nearby points may lie in different subspaces. [Plots: percentage of subspace pairs vs. subspace angle (degrees), and percentage of data points vs. number of nearest neighbors, for the Hopkins 155 and Extended Yale B datasets.]

6 Prior Work: Sparse and Low-Rank Methods Approach: data are self-expressive; build a global affinity by convex optimization. Representative methods: Sparse Subspace Clustering (SSC) (Elhamifar-Vidal '09 '13, Candes-Soltanolkotabi '12 '13, Wang-Xu '13); Low-Rank Subspace Clustering (LRR and LRSC) (Costeira-Kanade '98, Kanatani '01, Vidal '08, Liu et al. '10 '13, Wei-Lin '10, Favaro-Vidal '11 '13); Least Squares Regression (LSR) (Lu '12); Sparse + Low-Rank (Luo '11, Wang '13); Sparse + Frobenius: Elastic Net (EnSC) (Dyer '13, You '16).

7 Prior Work: Sparse and Low-Rank Methods General formulation: min_{C,E} f(C) + λ g(E) s.t. X = XC + E. The choices of f and g distinguish the methods: Sparse Subspace Clustering (SSC): f = ||C||_1, g = ||E||_1 or ||E||_F^2; Least Squares Regression (LSR): f = ||C||_F^2, g = ||E||_F^2; Elastic Net Subspace Clustering (EnSC): f = ||C||_1 + ||C||_F^2, g = ||E||_F^2; Low Rank Representation (LRR): f = ||C||_*, g = ||E||_{2,1}; Low Rank Subspace Clustering (LRSC): f = ||C||_*, g = ||E||_1 or ||E||_F^2. Advantages: convex optimization, broad theoretical results, robust to noise/corruptions. Disadvantages / open problems: limited to low-dimensional subspaces, handling missing entries, and scalability (existing solvers can only handle on the order of 10,000 data points).
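
As an illustration of the self-expressive model above, the Frobenius-norm (LSR) choice f = ||C||_F^2, g = ||E||_F^2 admits a closed-form solution; the numpy sketch below is a simplification that drops the diag(C) = 0 constraint, with an illustrative regularization weight.

```python
import numpy as np

def lsr_self_expression(X, tau=10.0):
    """Closed-form LSR self-expression (diagonal constraint omitted).

    Minimizes ||X - X C||_F^2 + tau ||C||_F^2 over C, whose minimizer is
    C = (X^T X + tau I)^{-1} X^T X.  X is D x N with points as columns.
    """
    N = X.shape[1]
    G = X.T @ X                                    # N x N Gram matrix
    return np.linalg.solve(G + tau * np.eye(N), G)
```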

8 Talk Outline Sparse Subspace Clustering by Basis Pursuit (SSC-BP): theoretical guarantees for noiseless, noisy, and corrupted data; great performance for applications with around 1,000 data points. Scalable Sparse Subspace Clustering by Orthogonal Matching Pursuit (SSC-OMP): theoretical guarantees for noiseless data; scalable to 600,000 data points. Scalable Elastic Net Subspace Clustering (EnSC): theoretical guarantees for noiseless data; a new active-set algorithm that is scalable to 600,000 data points. E. Elhamifar and R. Vidal. Sparse Subspace Clustering. CVPR 2009. E. Elhamifar and R. Vidal. Sparse Subspace Clustering: Algorithm, Theory and Applications. TPAMI 2013. C. You, D. Robinson, R. Vidal. Scalable Sparse Subspace Clustering by Orthogonal Matching Pursuit. CVPR 2016. C. You, C. Li, D. Robinson, R. Vidal. Scalable Elastic Net Subspace Clustering. CVPR 2016.

9 Sparse Subspace Clustering by Basis Pursuit Ehsan Elhamifar and René Vidal Computer Science, Northeastern University Center for Imaging Science, Johns Hopkins University

10 Sparse Subspace Clustering: Spectral Clustering Spectral clustering: represent data points as nodes in a graph G; connect nodes i and j with weight c_ij; infer the clusters from the Laplacian of G. How to define a subspace-preserving affinity matrix C? We want c_ij ≠ 0 for points in the same subspace and c_ij = 0 for points in different subspaces.
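
Once some method has produced a coefficient matrix C, the clustering step is standard spectral clustering on a symmetrized affinity. A minimal sketch, assuming scikit-learn is available:

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def clusters_from_coefficients(C, n_clusters):
    """Segment the data from an N x N self-expressive coefficient matrix C.

    Builds the affinity W = |C| + |C|^T and applies spectral clustering,
    i.e., clustering based on the eigenvectors of the graph Laplacian of W.
    """
    W = np.abs(C) + np.abs(C).T
    return SpectralClustering(n_clusters=n_clusters,
                              affinity='precomputed').fit_predict(W)
```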

11 Sparse Subspace Clustering: Intuition Data in a union of subspaces (UoS) are self-expressive: x_j = sum_{i=1}^N c_ij x_i, i.e., x_j = X c_j and X = XC. Data in a UoS admit a subspace-preserving representation: c_ij ≠ 0 implies that x_i and x_j belong to the same subspace. Under what conditions is the solution to P0 subspace-preserving? P0: min_{c_j} ||c_j||_0 s.t. x_j = X c_j, c_jj = 0. E. Elhamifar and R. Vidal. Sparse Subspace Clustering. CVPR 2009. E. Elhamifar and R. Vidal. Clustering Disjoint Subspaces via Sparse Representation. ICASSP 2010. E. Elhamifar and R. Vidal. Sparse Subspace Clustering: Algorithm, Theory and Applications. TPAMI 2013.

12 Sparse Subspace Clustering: Noiseless Data Under what conditions on the subspaces and the data is the solution to P1 subspace-preserving? Point by point: min_{c_j} ||c_j||_1 s.t. x_j = X c_j, c_jj = 0. All points: min_C ||C||_1 s.t. X = XC, diag(C) = 0. Theorem 1: P1 gives a subspace-preserving representation if the subspaces are independent, i.e., dim(S_1 + ... + S_n) = dim(S_1) + ... + dim(S_n). E. Elhamifar and R. Vidal. Sparse Subspace Clustering. CVPR 2009. E. Elhamifar and R. Vidal. Sparse Subspace Clustering: Algorithm, Theory and Applications. TPAMI 2013.
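
A direct way to compute the P1 representation of a single point is to hand the l1 program to a generic convex solver; the sketch below uses cvxpy purely for illustration (the talk does not prescribe a particular solver).

```python
import numpy as np
import cvxpy as cp

def ssc_bp_column(X, j):
    """P1 for point j:  min ||c||_1  s.t.  x_j = X c,  c_j = 0."""
    N = X.shape[1]
    c = cp.Variable(N)
    problem = cp.Problem(cp.Minimize(cp.norm1(c)),
                         [X @ c == X[:, j], c[j] == 0])
    problem.solve()
    return c.value   # nonzero coefficients should select points from the same subspace
```

Solving N such programs is what limits SSC-BP to roughly a thousand to ten thousand points, which motivates the greedy and active-set variants later in the talk.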

13 Sparse Subspace Clustering: Noiseless Data Independence may be too restrictive, e.g., for articulated motions. Theorem 2: P1 gives a subspace-preserving representation if the subspaces are sufficiently separated and the data are well distributed inside the subspaces, i.e., if for all i = 1, ..., n the incoherence μ_i = max_{j≠i} cos(θ_ij) is smaller than the inradius r_i (μ_i < r_i). Theorem 3: for n d-dimensional subspaces drawn independently and uniformly at random, with ρd + 1 points per subspace drawn independently and uniformly at random, P1 gives a subspace-preserving representation with high probability if the subspace dimension d is small relative to the ambient dimension D, namely d < c^2(ρ) log(ρ) D / (12 log N). E. Elhamifar and R. Vidal. Clustering Disjoint Subspaces via Sparse Representation. ICASSP 2010. E. Elhamifar and R. Vidal. Sparse Subspace Clustering: Algorithm, Theory and Applications. TPAMI 2013. M. Soltanolkotabi and E. Candes. A geometric analysis of subspace clustering with outliers. Annals of Statistics, 40(4), 2012.

14 Sparse Subspace Clustering: Noisy Data Under what conditions on the subspaces and the data is the solution to the LASSO subspace-preserving? Noiseless (P1): min_C ||C||_1 s.t. X = XC, diag(C) = 0. Noisy data (LASSO): min_C ||C||_1 + (λ/2) ||X - XC||_F^2 s.t. diag(C) = 0. Theorem 4: the LASSO gives a subspace-preserving representation if the subspaces are sufficiently separated, the data are well distributed, the noise is small enough, and the LASSO parameter is well chosen; i.e., μ_i < r_i, the noise level is below r_i(r_i - μ_i)/(3 r_i^2 + 8 r_i + 2), and λ lies in a corresponding range. Y.-X. Wang and H. Xu. Noisy sparse subspace clustering. ICML 2013.
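
For noisy data the per-point problem becomes a LASSO, which can be solved column by column with any off-the-shelf solver. A sketch with scikit-learn follows; note that sklearn's alpha scales the l1 term differently from the λ above, so the value is illustrative.

```python
import numpy as np
from sklearn.linear_model import Lasso

def ssc_lasso_coefficients(X, alpha=0.01):
    """Column-wise LASSO self-expression for noisy data (SSC-BP style).

    For each j, regress x_j on the remaining columns of X with an l1
    penalty and place the coefficients back with a zero diagonal.
    """
    N = X.shape[1]
    C = np.zeros((N, N))
    for j in range(N):
        idx = np.delete(np.arange(N), j)                  # enforce c_jj = 0
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
        lasso.fit(X[:, idx], X[:, j])
        C[idx, j] = lasso.coef_
    return C
```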

15 Experiments on Face Clustering Faces of a subject under varying illumination lie near a 9D subspace. Extended Yale B dataset: 38 subjects, 64 images per subject, D = 2,016-dimensional data. Clustering error: SSC < 2.0% error for 2 subjects and < 11.0% error for 10 subjects. [Plot: clustering error (%) vs. number of subjects for SSC, LRSC, LRR-H, LRR, SCC, LSA.] E. Elhamifar and R. Vidal. Sparse Subspace Clustering: Algorithm, Theory, and Applications. TPAMI 2013.

16 Scalable Sparse Subspace Clustering by Orthogonal Matching Pursuit Chong You, Daniel Robinson, and René Vidal Center for Imaging Science, Johns Hopkins University Applied Mathematics and Statistics, Johns Hopkins University

17 Prior Work: Overview [Chart: methods arranged by how many data points they can handle (1K to 1M) and by the subspace assumption they require (independent to arbitrary); this work targets arbitrary subspaces at up to 1M data points.] [1] E. Elhamifar and R. Vidal. Sparse Subspace Clustering. CVPR 09. [2] G. Liu, Z. Lin, Y. Yu. Robust Subspace Segmentation by Low-Rank Representation. ICML 10. [3] C. Lu et al. Robust and efficient subspace segmentation via least squares regression. ECCV 12. [4] X. Chen and D. Cai. Large Scale Spectral Clustering with Landmark-based Representation. AAAI 11. [5] X. Peng, L. Zhang, Z. Yi. Scalable Sparse Subspace Clustering. CVPR 13. [6] A. Adler, M. Elad, Y. Hel-Or. Linear-Time Subspace Clustering via Bipartite Graph Modeling. TNNLS 15.

18 Sparse Subspace Clustering (SSC) [1] Elhamifar-Vidal, Sparse Subspace Clustering, CVPR 2009. [2] Dyer et al., Greedy Feature Selection for Subspace Clustering, JMLR 2014.

19 Sparse Subspace Clustering (SSC) [1] Elhamifar-Vidal, Sparse Subspace Clustering, CVPR 2009. [2] Dyer et al., Greedy Feature Selection for Subspace Clustering, JMLR 2014.

20 SSC by Orthogonal Matching Pursuit (SSC-OMP)
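
SSC-OMP replaces the convex l1 step by a greedy orthogonal matching pursuit that picks at most k other points to represent each point, which is what allows it to scale to very large N. A sketch using scikit-learn's OMP implementation (the sparsity level k is an illustrative choice):

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def ssc_omp_coefficients(X, k=5):
    """k-sparse self-expression of every point via orthogonal matching pursuit."""
    N = X.shape[1]
    C = np.zeros((N, N))
    for j in range(N):
        idx = np.delete(np.arange(N), j)                  # exclude the point itself
        omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
        omp.fit(X[:, idx], X[:, j])
        C[idx, j] = omp.coef_
    return C
```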

21 Guaranteed Correct Connections: Deterministic Model [3] M. Soltanolkotabi and E. Candes. A geometric analysis of subspace clustering with outliers. Annals of Statistics, 40(4), 2012.

22 Guaranteed Correct Connections: Deterministic Model SSC-OMP gives correct connections if

23 Guaranteed Correct Connections: Deterministic Model SSC-OMP gives correct connections if

24 Guaranteed Correct Connections: Random Model [3] M. Soltanolkotabi and E. Candes. A geometric analysis of subspace clustering with outliers. Annals of Statistics, 40(4), 2012.

25 Synthetic Experiments

26 Synthetic Experiments

27 Experiment on Extended Yale B

28 Experiment on MNIST

29 Conclusion

30 Scalable Elastic Net Subspace Clustering Chong You, Chun-Guang Li *, Daniel Robinson, and René Vidal Center for Imaging Science, Johns Hopkins University *SICE, Beijing University of Posts and Telecommunications Applied Mathematics and Statistics, Johns Hopkins University

31 Motivation

32 Elastic Net Subspace Clustering (EnSC)
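
The elastic net objective mixes the l1 and squared l2 penalties on the coefficients, trading subspace-preservation for connectivity. A per-column sketch using scikit-learn's ElasticNet follows; its (alpha, l1_ratio) parametrization differs from the paper's parameters, so these values are only illustrative.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

def ensc_coefficients(X, alpha=0.01, l1_ratio=0.9):
    """Elastic-net self-expression: l1 + squared-l2 penalty on each column c_j."""
    N = X.shape[1]
    C = np.zeros((N, N))
    for j in range(N):
        idx = np.delete(np.arange(N), j)                  # enforce c_jj = 0
        enet = ElasticNet(alpha=alpha, l1_ratio=l1_ratio,
                          fit_intercept=False, max_iter=5000)
        enet.fit(X[:, idx], X[:, j])
        C[idx, j] = enet.coef_
    return C
```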

33 Scalable Elastic Net Subspace Clustering Prior methods: ADMM, interior point, solution path, proximal gradient method, etc.

34 Geometry of the Elastic Net Solution

35 Correct Connections vs. Connectivity

36 Guaranteed Correct Connections

37 Oracle Guided Active Set (ORGEN) Algorithm

38 Oracle Guided Active Set (ORGEN) Algorithm
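
The slides do not transcribe the algorithm details, so the following is only a schematic sketch of the active-set idea behind ORGEN, not the authors' exact update rules: solve the elastic net restricted to a small active set, check the optimality conditions of the remaining columns against the residual, add the violators, and repeat.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def elastic_net_restricted(A, x, lam, gamma, n_iter=500):
    """ISTA for  min_c  lam*||c||_1 + (1-lam)/2*||c||_2^2 + gamma/2*||x - A c||_2^2."""
    c = np.zeros(A.shape[1])
    L = gamma * np.linalg.norm(A, 2) ** 2 + (1.0 - lam)    # Lipschitz constant of the smooth part
    for _ in range(n_iter):
        grad = -gamma * A.T @ (x - A @ c) + (1.0 - lam) * c
        c = soft_threshold(c - grad / L, lam / L)
    return c

def active_set_elastic_net(X, j, lam=0.9, gamma=50.0, init_size=50, n_rounds=10):
    """Schematic ORGEN-style active-set loop for one point x_j."""
    x = X[:, j]
    others = np.delete(np.arange(X.shape[1]), j)           # candidate columns (c_jj = 0)
    corr = np.abs(X[:, others].T @ x)
    T = others[np.argsort(-corr)[:init_size]]              # start from the most correlated columns
    T_solved, c_T = T, np.zeros(len(T))
    for _ in range(n_rounds):
        c_T = elastic_net_restricted(X[:, T], x, lam, gamma)
        T_solved = T
        residual = x - X[:, T] @ c_T
        outside = np.setdiff1d(others, T)
        # a zero coefficient at column i is optimal iff |gamma * x_i^T residual| <= lam
        violators = outside[np.abs(gamma * (X[:, outside].T @ residual)) > lam]
        if violators.size == 0:
            break                                          # restricted solution is globally optimal
        T = np.concatenate([T, violators])                 # enlarge the active set and re-solve
    c = np.zeros(X.shape[1])
    c[T_solved] = c_T
    return c
```

Each restricted solve touches only the columns in the active set, so the per-point cost stays small even when N is in the hundreds of thousands, which is the essence of why this approach scales.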

39 Experiments
database   # data    ambient dim.   # clusters
Coil-100   7,200     ...            100
PIE        11,...    ...            ...
MNIST      70,000    ...            10
CovType    581,012   ...            7

40 Experiments: clustering accuracy
database   # data    SSC-BP   SSC-OMP   EnSC
Coil-100   7,200     ...%     42.93%    69.24%
PIE        11,...    ...%     24.06%    52.98%
MNIST      70,000    -        ...%      93.79%
CovType    581,012   -        ...%      53.52%

41 Experiments: running time
database   # data    SSC-BP     SSC-OMP    EnSC
Coil-100   7,200     ... mins   3 mins     3 mins
PIE        11,...    ... mins   5 mins     13 mins
MNIST      70,000    -          6 mins     28 mins
CovType    581,012   -          ... mins   1452 mins

42 Conclusion

43 Conclusions Many problems in computer vision can be posed as subspace clustering problems, e.g., spatial and temporal video segmentation and face clustering under varying illumination. These problems can be solved using Sparse Subspace Clustering by Basis Pursuit (SSC-BP), Sparse Subspace Clustering by Orthogonal Matching Pursuit (SSC-OMP), or Elastic Net Subspace Clustering (EnSC). These algorithms are provably correct when the subspaces are sufficiently separated, the data are well distributed within each subspace, and the subspace dimension is small relative to the ambient dimension. SSC-OMP and EnSC are scalable to 1M data points.

44 Acknowledgements Vision Lab, Johns Hopkins University. Thank You!
