Semidefinite Programming Based Preconditioning for More Robust Near-Separable Nonnegative Matrix Factorization

Nicolas Gillis, Department of Mathematics and Operational Research, Université de Mons, Belgium. Joint work with Stephen Vavasis (University of Waterloo). Presented at iTWIST'14.

Outline
1. Why nonnegative matrix factorization (NMF)? Definition, motivations and applications.
2. How to solve NMF? A subclass of efficiently solvable NMF problems (separable NMF); projection methods; SDP-based preconditioning.

Part 1: Why?

Nonnegative Matrix Factorization (NMF)

Given a matrix $M \in \mathbb{R}^{p \times n}_+$ and a factorization rank $r \ll \min(p, n)$, find $U \in \mathbb{R}^{p \times r}$ and $V \in \mathbb{R}^{r \times n}$ such that
$$\min_{U \geq 0,\, V \geq 0} \|M - UV\|_F^2 = \sum_{i,j} (M - UV)_{ij}^2. \qquad \text{(NMF)}$$
NMF is a linear dimensionality reduction technique for nonnegative data:
$$\underbrace{M(:, i)}_{\geq 0} \;\approx\; \sum_{k=1}^{r} \underbrace{U(:, k)}_{\geq 0}\, \underbrace{V(k, i)}_{\geq 0} \quad \text{for all } i.$$
Why nonnegativity? Interpretability: nonnegativity constraints lead to easily interpretable factors (and a sparse, parts-based representation). Many applications: image processing, text mining, hyperspectral unmixing, community detection, clustering, etc.
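To make the column-wise reading of the factorization concrete, here is a small numerical illustration (my addition, not from the talk; the dimensions and random seed are arbitrary): every data column is an additive, nonnegative combination of the basis columns.

```python
import numpy as np

rng = np.random.default_rng(1)
U = rng.random((5, 3))   # nonnegative basis elements, U >= 0
V = rng.random((3, 8))   # nonnegative weights, V >= 0
M = U @ V                # nonnegative data matrix

# Column-wise reading: M(:, i) = sum_k U(:, k) V(k, i), all terms nonnegative.
i = 0
col = sum(U[:, k] * V[k, i] for k in range(3))
assert np.allclose(col, M[:, i])
```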

Example 1: topic recovery and document classification
Basis elements allow recovery of the different topics; weights allow each text to be assigned to its corresponding topics.

Example 2: feature extraction and classification
$U \geq 0$ constrains the basis elements to be nonnegative. Moreover, $V \geq 0$ imposes an additive reconstruction. The basis elements extract facial features such as eyes, noses and lips.

Example 3: blind hyperspectral unmixing
Figure: Urban hyperspectral image with 162 spectral bands and 307-by-307 pixels.

Example 3: blind hyperspectral unmixing
Basis elements allow recovery of the different materials; weights indicate which pixel contains which material.

Example 3: blind hyperspectral unmixing
Figure: decomposition of the Urban dataset.

Part 2: How?

Can we solve NMF problems?

Given a matrix $M \in \mathbb{R}^{p \times n}_+$ and a factorization rank $r \in \mathbb{N}$, find $U \in \mathbb{R}^{p \times r}$ and $V \in \mathbb{R}^{r \times n}$ minimizing $\|M - UV\|_F^2$ over $U \geq 0$, $V \geq 0$ (the NMF problem above).
NMF is NP-hard [V09] and highly ill-posed [G12]. In practice, it is often satisfactory to use locally optimal solutions for further analysis of the data; in other words, heuristics often solve the problem efficiently with acceptable answers. One can try to analyze this state of affairs by considering generative models and algorithms that can provably recover the hidden data.
[V09] Vavasis, On the Complexity of Nonnegative Matrix Factorization, SIAM J. Optim., 2009.
[G12] Gillis, Sparse and Unique Nonnegative Matrix Factorization Through Data Preprocessing, JMLR, 2012.
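As an illustration of such heuristics (my addition, not part of the talk), here is a minimal sketch of the classical Lee-Seung multiplicative updates, one of the most common NMF heuristics; it converges to a locally optimal solution only, and the initialization and iteration count below are arbitrary choices.

```python
import numpy as np

def nmf_multiplicative(M, r, n_iter=500, eps=1e-10, seed=0):
    """Lee-Seung multiplicative updates for min ||M - UV||_F^2 with U, V >= 0."""
    rng = np.random.default_rng(seed)
    p, n = M.shape
    U = rng.random((p, r))
    V = rng.random((r, n))
    for _ in range(n_iter):
        # Component-wise rescalings that keep U and V nonnegative and do not
        # increase the Frobenius error (eps guards against division by zero).
        V *= (U.T @ M) / (U.T @ U @ V + eps)
        U *= (M @ V.T) / (U @ V @ V.T + eps)
    return U, V
```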

Separability Assumption

Heuristics often solve the problem efficiently with acceptable answers (usually one needs additional regularization on U and V). One can try to analyze this state of affairs by considering generative models and algorithms that can provably recover the hidden data.
NMF can be solved in polynomial time if M is separable (a rather strong condition) [AGKM12].
Separability of M: there exists an NMF $(U, V) \geq 0$ with $M = UV$ where each column of U is equal to a column of M.
[AGKM12] Arora, Ge, Kannan, Moitra, Computing a Nonnegative Matrix Factorization Provably, STOC 2012.

Is separability a reasonable assumption?

Text mining: for each topic, there is a pure document on that topic; or, for each topic, there is an anchor word used only by that topic.
[KSK13] Kumar, Sindhwani, Kambadur, Fast Conical Hull Algorithms for Near-separable Non-negative Matrix Factorization, ICML 2013.
[A+13] Arora, Ge, Halpern, Mimno, Moitra, Sontag, Wu, Zhu, A Practical Algorithm for Topic Modeling with Provable Guarantees, ICML 2013.
[DRIS13] Ding, Rohban, Ishwar, Saligrama, Topic Discovery through Data Dependent and Random Projections, ICML 2013.
Hyperspectral unmixing: separability is particularly natural: for each constitutive material, there is a pure pixel containing only that material. This is the so-called pure-pixel assumption.
[Ma+13] Ma, Bioucas-Dias, Chan, Gillis, Gader, Plaza, Ambikapathi, Chi, A Signal Processing Perspective on Hyperspectral Unmixing, IEEE Signal Proc. Mag., 2014.
General image processing: no.

Geometric Interpretation of Separability

Using normalization, the entries of each column of M, U and V can be assumed to sum to one: for all j,
$$M(:, j) = \sum_{k=1}^{r} U(:, k)\, V(k, j), \quad \text{where } V(k, j) \geq 0 \text{ and } \sum_{k=1}^{r} V(k, j) = 1.$$
The columns of U are the vertices of the convex hull of the columns of M.

Separable NMF

$M$ is $r$-separable $\iff$ $M = U\,[I_r,\ V']\,\Pi$ for some $V' \geq 0$ and some permutation matrix $\Pi$.
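A toy construction of an $r$-separable matrix following this definition (a sketch, my addition; dimensions and seed are arbitrary). Columns are normalized to sum to one as on the previous slide, so the non-anchor columns are convex combinations of the columns of U.

```python
import numpy as np

rng = np.random.default_rng(0)
p, r, extra = 10, 4, 6

# Nonnegative basis U with columns normalized to sum to one.
U = rng.random((p, r))
U /= U.sum(axis=0)

# Mixing weights: columns of V' lie in the unit simplex.
Vp = rng.random((r, extra))
Vp /= Vp.sum(axis=0)

n = r + extra
Pi = np.eye(n)[rng.permutation(n)]       # random permutation matrix
M = U @ np.hstack([np.eye(r), Vp]) @ Pi  # r-separable matrix

# Each column of U appears verbatim among the columns of M.
for k in range(r):
    assert any(np.allclose(U[:, k], M[:, j]) for j in range(n))
```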

Near-Separable NMF

$M = U\,[I_r,\ V']\,\Pi + N$, where $N$ is noise.

Near-Separable NMF: Noise and Conditioning

We will assume that the noise is bounded (but otherwise arbitrary): $\|N(:, j)\|_2 \leq \epsilon$ for all $j$. Moreover, some dependence on a condition number of $U$ is unavoidable.

Successive Projection Algorithm (SPA)

0: Initially $K = \emptyset$.
For $i = 1 : r$
  1: Find $j^* = \arg\max_j \|M(:, j)\|_2$.
  2: $K = K \cup \{j^*\}$.
  3: $M \leftarrow (I - u u^T)\, M$, where $u = \frac{M(:, j^*)}{\|M(:, j^*)\|_2}$.
end
This is modified Gram-Schmidt with column pivoting.

Theorem. If $\epsilon \leq O\!\left(\frac{\sigma_{\min}(U)}{\sqrt{r}\,\kappa^2(U)}\right)$, SPA satisfies $\min_{V \geq 0} \|M - M(:, K)\,V\| \leq O\!\left(\epsilon\,\kappa^2(U)\right)$.

Advantages: extremely fast, no parameters.
Drawbacks: requires U to be full rank; the error bound is weak.
[GV12] Gillis, Vavasis, Fast and Robust Recursive Algorithms for Separable Nonnegative Matrix Factorization, IEEE Trans. Pattern Anal. Mach. Intell. 36(4), pp. 698-714, 2014.
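A direct NumPy implementation of the pseudocode above (a sketch; the function name spa is mine):

```python
import numpy as np

def spa(M, r):
    """Successive Projection Algorithm: return indices K of extracted columns."""
    R = M.astype(float).copy()
    K = []
    for _ in range(r):
        # Step 1: pick the column with the largest l2 norm.
        j = int(np.argmax(np.linalg.norm(R, axis=0)))
        K.append(j)
        # Step 3: project onto the orthogonal complement of the extracted
        # column (Gram-Schmidt step); that column becomes zero afterwards.
        u = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(u, u @ R)
    return K
```

On the separable toy matrix constructed earlier, spa(M, r) recovers the indices of the columns of U in the noiseless case.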

Pre-conditioning for More Robust SPA

Observation: pre-multiplying M preserves separability:
$$P M = P\,\big(U\,[I_r,\ V']\,\Pi + N\big) = (P U)\,[I_r,\ V']\,\Pi + P N.$$
Ideally, $P = U^{-1}$ so that $\kappa(P U) = 1$ (assuming $p = r$).
Solving for the minimum-volume ellipsoid centered at the origin and containing all the columns of M, which is SDP-representable,
$$\min_{A \in \mathbb{S}^r_+} \log \det(A)^{-1} \quad \text{s.t.} \quad m_j^T A\, m_j \leq 1 \ \text{ for all } j,$$
allows one to approximate $U^{-1}$: in fact, $A^* \approx (U U^T)^{-1}$, so that taking $P = (A^*)^{1/2}$ makes $\kappa(P U) \approx 1$.

Theorem. If $\epsilon \leq O\!\left(\frac{\sigma_{\min}(U)}{r \sqrt{r}}\right)$, preconditioned SPA satisfies $\min_{V \geq 0} \|M - M(:, K)\,V\| \leq O\big(\epsilon\,\kappa(U)\big)$.

[GV13] Gillis, Vavasis, SDP-based Preconditioning for More Robust Near-Separable NMF, arXiv:1310.2273, 2013.
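One possible realization of the preconditioning step using CVXPY (an assumption: the talk does not prescribe a solver or implementation). The data is first reduced to $r$ dimensions via a truncated SVD, since the ellipsoid lives in $\mathbb{R}^r$, and the matrix square root of the optimal $A$ is used as $P$:

```python
import numpy as np
import cvxpy as cp
from scipy.linalg import sqrtm

def precondition(M, r):
    """Sketch: minimum-volume origin-centered ellipsoid preconditioning."""
    # Reduce to r dimensions so that the ellipsoid problem is r x r.
    W, _, _ = np.linalg.svd(M, full_matrices=False)
    Mr = W[:, :r].T @ M                   # r x n reduced data

    # SDP: maximize log det(A) s.t. m_j^T A m_j <= 1 for all columns m_j.
    A = cp.Variable((r, r), PSD=True)
    constraints = [cp.quad_form(Mr[:, j], A) <= 1 for j in range(Mr.shape[1])]
    cp.Problem(cp.Maximize(cp.log_det(A)), constraints).solve()

    P = np.real(sqrtm(A.value))           # P^T P = A*, so kappa(P U) ~ 1
    return P @ Mr                         # preconditioned data, feed to SPA

# Preconditioned SPA (sketch): K = spa(precondition(M, r), r)
```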

Geometric Interpretation
Figure: geometric interpretation of the SDP-based preconditioning.

Synthetic data sets

Each entry of U is drawn uniformly at random in [0, 1]; each column is normalized. The other columns of M are the middle points of pairs of columns of U (hence, for $r = 20$, there are $\binom{20}{2} = 190$ of them). The noise moves these middle points toward the outside of the convex hull of the columns of U.
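A sketch of this data generation (my addition). The transcription elides the dimensions of U, so p = 20 is an assumption, and pushing each middle point away from the centroid of the columns of U is one plausible way to realize the "toward the outside of the convex hull" description; the exact noise formula is not given here.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
p, r, eps = 20, 20, 0.1      # p = 20 is an assumption; r = 20 per the slide

U = rng.random((p, r))
U /= U.sum(axis=0)            # normalize each column to sum to one

# Middle points of all pairs of columns of U: C(20, 2) = 190 of them.
pairs = list(combinations(range(r), 2))
mids = np.stack([(U[:, i] + U[:, j]) / 2 for i, j in pairs], axis=1)

# Push each middle point away from the centroid of the columns of U,
# i.e., toward the outside of their convex hull.
c = U.mean(axis=1, keepdims=True)
dirs = mids - c
dirs /= np.linalg.norm(dirs, axis=0)
M = np.hstack([U, mids + eps * dirs])
```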

Results for the synthetic data sets
Figure: average percentage of columns correctly extracted as a function of the noise level (for each noise level, 10 matrices are generated).

Hubble telescope hyperspectral image
Figure: sample of images from the Hubble telescope hyperspectral image with 100 spectral bands.

Hubble telescope hyperspectral image
Figure: spectral signatures extracted by SPA, corresponding to the constitutive materials (matrix U, with $\kappa(U) = 115$).

Hubble telescope hyperspectral image
Figure: reconstructed abundance maps (matrix V).

Hubble telescope with blur and noise
Figure: sample of images for the Hubble telescope.

Hubble telescope with blur and noise
Figure: spectral signatures extracted by SPA, corresponding to the constitutive materials (matrix U).

Hubble telescope with blur and noise
Figure: reconstructed abundance maps (matrix V). With blur and noise, SPA fails to identify good columns (two of them correspond to the same material).

Hubble telescope with blur and noise
Figure: spectral signatures extracted by preconditioned SPA, corresponding to the constitutive materials (matrix U).

Hubble telescope with blur and noise
Figure: reconstructed abundance maps (matrix V). With blur and noise, preconditioned SPA is able to identify the right columns.

Conclusion

1. Nonnegative matrix factorization (NMF): an easily interpretable linear dimensionality reduction technique for nonnegative data, with many applications.
2. Separable NMF: separability makes NMF problems efficiently solvable, and there is a need for fast, practical and robust algorithms. SPA is a fast and robust recursive algorithm for separable NMF (simple to implement, no parameter tuning, no need to recompute the solution from scratch when r is changed). Pre-conditioning makes it much more robust to noise. Other robust algorithms exist, e.g., based on linear optimization (Gillis and Luce, Robust Near-Separable Nonnegative Matrix Factorization Using Linear Optimization, JMLR, 2014).

Thank you for your attention!
Code and papers are available online.
[GV13] Gillis, Vavasis, SDP-based Preconditioning for More Robust Near-Separable NMF, arXiv:1310.2273, 2013.
